Individual Biometrics -

FINGERPRINT 

 

Basics:

 Fingerprinting takes an image (either using ink or a digital scan) of a person's fingertips and records their characteristics. Whorls, arches, and loops are recorded along with the patterns of ridges, furrows, and minutiae. This information may then be processed and stored either as an image or as data encoded by a computer algorithm, to be compared with other fingerprint records.

 

In the digital domain, the software maps the relative placement of minutiae points on the finger and then searches the database for similar minutiae information. Often an algorithm encodes this information into a character string that can be searched quickly, improving search time. In most cases no image of the fingerprint is actually retained, only a set of data that can be used for comparison. This method was meant to ease the public's fear of their fingerprints being recorded or stolen, but many people still misunderstand or distrust the method actually used.
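
As a rough illustration of this idea (not any particular vendor's format), a minutiae template can be pictured as a small set of points, each with a position, ridge angle, and type, serialized into a compact string for storage and searching. The data structure and encoding below are hypothetical, purely for illustration:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Minutia:
    # One illustrative minutia point: its position on the finger, the local
    # ridge direction, and whether it is a ridge ending or a bifurcation.
    x: int
    y: int
    angle_deg: int
    kind: str  # "ending" or "bifurcation"

def encode_template(minutiae: List[Minutia]) -> str:
    """Serialize the minutiae set into a compact, searchable string.
    No image of the fingerprint is kept -- only this derived data."""
    return ";".join(f"{m.x},{m.y},{m.angle_deg},{m.kind[0]}" for m in minutiae)

# A tiny example template of three minutiae points.
template = encode_template([
    Minutia(102, 57, 45, "ending"),
    Minutia(131, 88, 120, "bifurcation"),
    Minutia(76, 143, 310, "ending"),
])
print(template)  # 102,57,45,e;131,88,120,b;76,143,310,e
```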

  

  

 

How it Works:

 The user presses a finger gently against a small reader surface (optical or silicon), usually about two inches square. The reader, attached to a computer, captures the scan and sends the resulting data to the database, where it is compared with the stored records. The user is usually required to leave the finger on the reader for less than five seconds, during which time the identification or verification takes place.
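
A simplified sketch of that flow, using a toy similarity score rather than a real minutiae-pairing algorithm, shows the difference between verification (a 1:1 check against the claimed user's template) and identification (a 1:N search of the whole database):

```python
# Toy verification/identification sketch. Templates are just lists of (x, y)
# minutiae positions; the scoring function is a stand-in for real matchers.

def match_score(probe, enrolled, tolerance=10):
    """Fraction of enrolled minutiae that have a probe minutia nearby."""
    if not enrolled:
        return 0.0
    hits = sum(
        any(abs(px - ex) <= tolerance and abs(py - ey) <= tolerance
            for (px, py) in probe)
        for (ex, ey) in enrolled
    )
    return hits / len(enrolled)

def verify(claimed_user, probe, database, threshold=0.8):
    """Verification (1:1): check the probe against one enrolled template."""
    return match_score(probe, database[claimed_user]) >= threshold

def identify(probe, database, threshold=0.8):
    """Identification (1:N): search the whole database for the best match."""
    best = max(database, key=lambda user: match_score(probe, database[user]))
    return best if match_score(probe, database[best]) >= threshold else None

database = {"alice": [(10, 12), (40, 55), (70, 20)]}
print(verify("alice", [(11, 13), (39, 54), (72, 21)], database))  # True
```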

 

To prevent fake fingers from being used, many systems also measure blood flow, or check for correctly arrayed ridges at the edges of the fingers.  
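
Purely as an illustration of where such a liveness check sits in the pipeline (the checks themselves are sensor-specific; the fields below are invented for the sketch):

```python
# Illustrative liveness gate: the scan only reaches the matcher if the
# sensor-level checks pass. The scan fields are invented for this sketch.

def detects_blood_flow(scan):
    # Placeholder: a real sensor would measure pulse or blood flow directly.
    return scan.get("blood_flow", False)

def ridges_continue_to_edges(scan):
    # Placeholder: a real system would analyze ridge continuity at the edges.
    return scan.get("edge_ridges_ok", False)

def is_live_finger(scan):
    return detects_blood_flow(scan) and ridges_continue_to_edges(scan)

# A fake finger fails the gate before any matching is attempted.
print(is_live_finger({"blood_flow": False, "edge_ridges_ok": True}))  # False
```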

  

  

History:

 Fingerprinting of a sort was first used in 14th-century China as a method for parents to distinguish their children from those of others. Young children would have their feet and palms stamped in ink and then pressed onto paper to create a record unique to each child.

 

The English began using fingerprints in July 1858, when Sir William Herschel, Chief Magistrate of the Hooghly district in Jungipoor, India, reached the limit of his frustration with the dishonesty of the local population. On a whim, and with no thought toward personal identification, Herschel had Rajyadhar Konai, a local businessman, impress his handprint on the back of a contract.

 

In the latter half of the 19th century, Edward Richard Henry of Scotland Yard developed a method of categorizing and identifying the marks in fingerprints. This method, an advanced version of one first put forward by Francis Galton in 1892, was used on an experimental basis in the late 1890s and soon proved extremely reliable. After the failure of Bertillonage in 1903, fingerprinting became the method of choice for police around the world. Interestingly, another classification system was created almost concurrently, in 1891, by Juan Vucetich, and it is still used in most Spanish-speaking countries. International organizations such as Interpol now use both methods.

 

Still the biometric of choice for most law-enforcement agencies, the fingerprint is undergoing its first major change in decades as digital scanners begin to rival ink prints in quality and affordability. Highly effective and relatively simple, the fingerprint seems set to remain a viable biometric for the long run.

  

  

Use:

 Used in criminal investigations for over 100 years, and for identification purposes as far back as the 14th century, fingerprinting continues to expand into new applications every day. Fingerprint-scanning entry devices for building door locks and computer network access are becoming more common. Recently, a small number of banks have begun using fingerprint readers for authorization at ATMs, and grocery stores are experimenting with fingerprint-scan checkouts that automatically recognize a registered user and bill that user's credit card or debit account. The potential uses for this biometric appear to be limited only by people's willingness to use it.

  

  

Evaluation:

 The crossover accuracy of digital fingerprinting has been measured at about 1:500 for a single finger, and using multiple fingers improves the accuracy exponentially. Because of the large amount of data that can be drawn from fingerprints, this accuracy should improve further as systems become more accurate and powerful. Given the amount of information contained in a fingerprint, it is highly unlikely (estimated at 1 in 64 billion) that any two fingerprints would be identical enough to be mistaken for one another.
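
To make the "exponential" claim concrete, if each finger is treated as an independent check at the 1:500 crossover rate cited above, then requiring more fingers to match multiplies the error probabilities. This is a simplifying independence assumption for illustration, not a measured figure:

```python
# Simplified illustration: treating each finger as an independent check at
# the 1-in-500 crossover rate, the chance of a simultaneous false match on
# every finger is the per-finger rate raised to the number of fingers.
per_finger_error = 1 / 500

for fingers in (1, 2, 3):
    combined = per_finger_error ** fingers
    print(f"{fingers} finger(s): about 1 in {round(1 / combined):,}")

# 1 finger(s): about 1 in 500
# 2 finger(s): about 1 in 250,000
# 3 finger(s): about 1 in 125,000,000
```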

 

Another advantage of fingerprint technology is the fairly small storage space required for the biometric template, which reduces the amount of database memory needed. It is also one of the most developed biometrics, with a longer history and more research and design behind it than any other form. The traditional use of fingerprints on criminals has given the technique a public stigma that is slowly being overcome, but which often overshadows its usefulness.
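
A back-of-the-envelope comparison suggests why the template is so compact; the figures below are illustrative assumptions, not a standard:

```python
# Rough storage comparison under illustrative assumptions: a 500x500 pixel,
# 8-bit grayscale scan stored as a raw image, versus a template of about
# 40 minutiae at roughly 6 bytes each (x, y, angle, type).
image_bytes = 500 * 500 * 1        # ~250,000 bytes uncompressed
template_bytes = 40 * 6            # ~240 bytes

print(f"raw image: {image_bytes:,} bytes")
print(f"template:  {template_bytes:,} bytes")
print(f"template is roughly {image_bytes // template_bytes}x smaller")
```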

 

For those concerned about how easily a fingerprint reader can be fooled, companies are making rapid progress on "human-sensing" devices that can differentiate living human fingers from even some of the best replicas. And since the information in the database is encoded with a mathematical algorithm, reconstructing a fingerprint from it is extremely difficult, even on a limited scale, with most modern systems.