It’s impossible to leave no trace. When you leave a room, you always leave some microscopic piece of yourself behind (e.g., hair or clothing fibers). And whenever a fingertip brushes a surface or makes contact with an object in nearly any way, you imprint a mark that can be traced back uniquely to you.
Fingerprints, while unique to each individual, follow a general pattern of loops, arches, and whorls. Today, these patterns are used reliably for identification, most notably in criminal records. But while this is a practical application of these ever-present fingertip etchings, one has to wonder: why do we have fingerprints at all?
Why We Have Fingerprints
The likely reason for the existence of fingerprints dates back to human ancestors. These grooves, by limiting the area of skin that makes contact, help to improve friction between our fingers and the objects we hold, making it easier to grip. This makes sense for early humans, who depended on the ability to carry objects over distances and grasp rocks and tools. A more reliable grip pairs well with opposable thumbs.
Increased friction between fingertips and a gripped surface would have benefited earlier human ancestors as well. These hominins, which lived primarily in trees, would have used these grooves to help them grasp branches. This idea is supported by the presence of fingerprints in some of Homo sapiens’ closest living relatives, chimpanzees and gorillas. In fact, fingerprints are likely an attribute evolved by animals with an arboreal lifestyle. The strongest piece of evidence for this theory is that koalas, marsupials by no means closely related to primates, have fingerprints remarkably similar to humans’. Koalas would have independently developed fingerprints to help them climb and hang from the sides of trees.
Another popular theory for the existence of fingerprints postulates that the grooves improve our sense of touch. This, of course, would be useful to humans using their hands for a variety of purposes, as well as to arboreal animals gripping branches. In fact, grooved fingertips have been shown to produce greater vibrations when sliding against a slightly rough surface.
Why All Fingerprints Are Different
This all explains why fingerprints exist, but it doesn’t explain why they vary so greatly. The assortment among these digital valleys is so diverse that it is statistically unlikely that any two people in the course of human history have had identical fingerprints. In fact, each person’s fingerprints vary from digit to digit. The reason lies almost entirely in the randomness and chaos of the universe.
Fingerprints are fully formed before birth. The middle skin layer, the basal layer, which is sandwiched between the inner layer (the dermis) and the outer layer (the epidermis), grows faster than the other two. Around the tenth week of pregnancy, this growth begins to exert strain against the dermis and epidermis, and the pressure ultimately drives the epidermis to fold into the dermis, producing the nuanced ridge patterns known as fingerprints.
This pattern is set by the time a fetus reaches 17 weeks, and it remains essentially unchanged throughout a person’s lifetime. Certain repeated activities, however, can wear fingerprints down over time. Fingerprints can also be altered intentionally, by deliberately burning them off with acid or fire. Many will recall the scene in Men in Black in which Will Smith’s character has his fingerprints burned off by pressing them against a silver orb to erase records of his former identity. Such a sacrifice, however, might not actually be permanent: because fingerprints are imprinted in deeper skin layers, they will eventually grow back.
As with many developments that take place in the womb, fingerprint pattern formation is largely genetic. However, since fingerprint patterns vary among family members (even identical twins have different fingerprints), environmental factors are also at play. The general chaos of the womb means that blood pressure, blood oxygen levels, the mother’s nutrition, hormone levels, the exact position of the fetus in the womb at a particular time, and the exact composition and density of the amniotic fluid in contact with the fetus all help to craft the etchings a person will carry on his or her fingertips. Even the length of the umbilical cord plays a role.
Ultimately, while each is different, fingerprint ridge patterns fall into three basic classes (arches, loops, and whorls), plus a “composite” class that features a mixture of the primary three. Interestingly, these classes show general ethnic variation: people of African ancestry tend to have arches, those of European background often have loops, and those of Asian descent commonly have whorls.
How Fingerprints Are Used for Identification
It is often stated that no two fingerprints can be the same, but there is no real way of proving this claim. Regardless, a duplicate fingerprint would be remarkably unlikely. This fact has been appreciated for a very long time: fingerprints were used as a means of identification as far back as ancient Babylon, where fingerprints were pressed into the clay tablets used for business transactions.
Throughout the nineteenth century, these unique grooves began to be used to keep records of criminals, and Francis Galton, a British anthropologist, established the uniqueness of fingerprints and identified many of their distinguishing characteristics. At the turn of the century, Sir Edward Henry, an Inspector General of Police in Bengal, India, built on Galton’s work to develop the first system of classifying fingerprints. Adopted as the official system in England, the Henry Classification System quickly spread worldwide.
The Henry Classification System features four pattern groupings: arches (with subgroups plain arch and tented arch), loops (radial loop and ulnar loop), whorls (plain whorl, central pocket, double loop, and accidental whorl), and composites (central pocket loop, twinned loop, lateral pocket loop, and accidental loop). The system assigns a numerical value to each finger according to the pattern observed.
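The finger-value step can be illustrated with a short sketch of the Henry System’s “primary classification.” In the conventional scheme, fingers are numbered 1 through 10 starting from the right thumb, paired values of 16, 8, 4, 2, and 1 are assigned to successive finger pairs, and only fingers bearing whorls contribute their value. The function name below is our own; the arithmetic follows the standard formulation.

```python
# Sketch of the Henry System's "primary classification" step.
# Fingers are numbered 1-10 (right thumb = 1 ... left little finger = 10),
# and values are assigned in pairs: fingers 1-2 -> 16, 3-4 -> 8,
# 5-6 -> 4, 7-8 -> 2, 9-10 -> 1.
FINGER_VALUES = {1: 16, 2: 16, 3: 8, 4: 8, 5: 4, 6: 4, 7: 2, 8: 2, 9: 1, 10: 1}

def primary_classification(whorl_fingers):
    """Return the Henry primary classification as a 'numerator/denominator' string.

    whorl_fingers: set of finger numbers (1-10) that show a whorl pattern.
    Even-numbered fingers contribute to the numerator, odd-numbered fingers
    to the denominator; 1 is added to each sum, so the ratio ranges from
    1/1 (no whorls) to 32/32 (whorls on all ten fingers).
    """
    numerator = 1 + sum(FINGER_VALUES[f] for f in whorl_fingers if f % 2 == 0)
    denominator = 1 + sum(FINGER_VALUES[f] for f in whorl_fingers if f % 2 == 1)
    return f"{numerator}/{denominator}"

print(primary_classification(set()))              # no whorls -> 1/1
print(primary_classification(set(range(1, 11))))  # all whorls -> 32/32
```

With 1,024 possible primary ratios, this step let clerks sort a paper fingerprint file into bins before any ridge-by-ridge comparison was attempted, which is what made the system practical at scale.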
Within a decade of the Henry System’s establishment, fingerprints were accepted by U.S. courts as a reliable means of identification. In 1924, the Identification Division of the FBI was formed, and the federal organization became responsible for managing the national fingerprint collection. Notably, the FBI used a system that varied slightly from the Henry System in how fingerprints were valued.
In the 1980s, automated fingerprint identification processes were adopted. In the late 1990s, the Integrated Automated Fingerprint Identification System (IAFIS) was established as the national computerized system for storing, comparing, and exchanging fingerprint data. IAFIS and similar systems allow fingerprints to be compared quickly and accurately.
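To give a sense of what “comparing fingerprints” means to a computer: automated matchers generally work not on whole images but on minutiae, the points where ridges end or split, each described by a position and a ridge direction. The toy sketch below (not IAFIS’s actual algorithm; the tolerances and scoring are invented for illustration) scores two prints by counting minutiae that align within small distance and angle tolerances.

```python
# Toy illustration of minutiae-based fingerprint comparison.
# Each minutia is an (x, y, angle_degrees) tuple. Two prints are scored by
# the fraction of the probe's minutiae that have a nearby, similarly
# oriented counterpart in the candidate print. Real systems also align the
# prints first and use far more sophisticated scoring.
import math

def match_score(probe, candidate, dist_tol=10.0, angle_tol=15.0):
    """Fraction of minutiae in `probe` matched by some minutia in `candidate`."""
    if not probe:
        return 0.0
    matched = 0
    used = set()  # each candidate minutia may be claimed only once
    for (xa, ya, ta) in probe:
        for i, (xb, yb, tb) in enumerate(candidate):
            if i in used:
                continue
            close_enough = math.hypot(xa - xb, ya - yb) <= dist_tol
            # Compare angles on a circle, so 359 and 1 degree count as near.
            angle_diff = abs((ta - tb + 180) % 360 - 180)
            if close_enough and angle_diff <= angle_tol:
                matched += 1
                used.add(i)
                break
    return matched / len(probe)

probe = [(10, 12, 30), (40, 44, 90), (70, 20, 200)]
candidate = [(11, 13, 28), (41, 45, 95), (200, 200, 10)]
print(match_score(probe, candidate))  # 2 of 3 minutiae align
```

Because each print reduces to a few dozen such points, a database system can compare a crime-scene print against millions of records without ever comparing raw images.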
This system is used extensively for criminal identification, allowing fingerprints found at a crime scene to be cross-referenced against the database. However, fingerprinting goes beyond forensics and police work. For example, in accordance with the American National Standard ANSI/ASB 007, medical examiners can search automated fingerprint identification system databases against postmortem impressions.
Today, fingerprinting is just one of an assortment of biometric technologies, which also include gait analysis and iris and facial recognition. Like those technologies, it is useful for authenticating individuals in a variety of industries, such as financial services.