WO2001024700A1 - Spoof detection for biometric sensing systems - Google Patents

Spoof detection for biometric sensing systems

Info

Publication number
WO2001024700A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
fingerprint
captured
images
spoof
Prior art date
Application number
PCT/US2000/027782
Other languages
French (fr)
Inventor
Lawrence O'Gorman
Stephanie Schuckers
Reza Derakhshani
Lawrence Hornak
Xiongwu Xia
Michael D'Amour
Original Assignee
Veridicom, Inc.
Priority date
Filing date
Publication date
Application filed by Veridicom, Inc.
Priority to AU13305/01A (AU1330501A)
Publication of WO2001024700A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/117: Identification of persons
    • A61B5/1171: Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • A61B5/1172: Identification of persons based on the shapes or appearances of their bodies or parts thereof using fingerprinting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12: Fingerprints or palmprints
    • G06V40/1382: Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger
    • G06V40/1388: Detecting the live character of the finger, i.e. distinguishing from a fake or cadaver finger using image processing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40: Spoof detection, e.g. liveness detection
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00: Individual registration on entry or exit
    • G07C9/30: Individual registration on entry or exit not involving the use of a pass
    • G07C9/32: Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/37: Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7253: Details of waveform analysis characterised by using transforms
    • A61B5/7257: Details of waveform analysis characterised by using transforms using Fourier transforms
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235: Details of waveform analysis
    • A61B5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Abstract

A biometric sensing system and techniques are disclosed for detecting spoofs of a living finger. In accordance with an embodiment of the invention, unique biological and physical characteristics of a finger are captured by a fingerprint sensor over a sequence of images and interpreted from the captured images. The characteristics are extracted from an electrical representation of the finger, in what was in the past considered 'noise' in that representation. According to one embodiment, the system comprises an image capture device configured to sample an applied object and create an electrical representation of the applied object, and a spoof detection module configured to analyze the electrical representation of the applied object for relative intensity, density, geometric, or temporal anomalies indicative of a non-living applied object. Methods directed to the same end are also disclosed, including average intensity, pixel density, rate-of-warming, ridge uniformity, ridge signal strength, water droplet differential, fingerprint vitality, and inverted-spoof techniques.

Description

TITLE OF THE INVENTION
Spoof Detection for Biometric Sensing Systems

APPLICANT
Veridicom, Inc., 2040 Martin Ave., Santa Clara, CA 95050, U.S.A.
CROSS-REFERENCE TO RELATED APPLICATIONS
This application is related to U.S. Provisional Application Serial No. 60/158,423, filed October 7, 1999, entitled "Method and Apparatus for Determining a Living Fingerprint on a Fingerprint Sensor," which is incorporated herein by reference in its entirety, and to which priority is claimed.
BACKGROUND
1. Field of the Invention
The invention relates to biometric sensors, and more particularly to fingerprint sensors and techniques for distinguishing between living and non-living fingers placed on a sensor.

2. Background Information
Fingerprint identification and authentication systems rely on a user's unique fingerprint to identify whether the user is authorized to access the system. A significant challenge for fingerprint authentication systems is to prevent unauthorized access gained through the use of a spoof, i.e. a fingerprint from a non-living finger. A spoof can include an artificially produced fingerprint (e.g., a molded plastic or rubber 3-dimensional impression of a true fingerprint), a fingerprint from a finger of a dead person, or a fingerprint from a finger severed from a living person. Each of these spoofs circumvents the fingerprint authentication process, as the spoof can be used separately from its legitimate owner.
One solution to this problem is to require the use of an accompanying password or personal identification number (PIN). All bank cards and most other tokens require both the card and a PIN. This combination is a good defense against the spoof attack. Multi-factor authentication is required in many applications not only as a defense against spoofing, but also as an increased level of security.
However, multi-factor authentication has drawbacks. For example, the accounting or card-issuing organization faces significant administrative cost in handling the secret codes, and the card holders have to memorize them. In addition to the cost of managing the secret codes, the card-issuing organization faces significant cost in handling the cards, such as issuing new cards or dealing with lost cards, and the card holders have to carry the cards, which may be inconvenient in certain situations. Therefore, it is advantageous to capitalize on the inherent benefit of identification through the user's fingerprint, which does not require the user to remember any passwords or to retain any tokens. Accordingly, there is a need to be able to distinguish between living fingers and spoofs, so as to ensure that only authorized persons will be able to access the protected system.

For example, one fairly simple way of illegally accessing an optical fingerprint system is with a photographic copy of a true fingerprint. Older optical fingerprint systems are unable to discriminate between a real fingerprint and a photocopy. To overcome this problem, some optical systems now use frustrated total internal reflection (FTIR), while others employ ultrasound technology. FTIR systems are designed to reflect light from surfaces of the skin directly in contact with the scanner (e.g. fingerprint ridges) and not to reflect light from those surfaces out of contact with the scanner (e.g. fingerprint valleys). Ultrasonic systems, such as ones described in U.S. Patent No. 5,587,533, entitled "Surface Feature Mapping Using High Resolution C-Scan Ultrasonography," of Schneider et al., issued December 24, 1996, direct ultrasonic waves to the object placed on the scanner and monitor the waves that are returned. Both types of systems can discriminate between a flat photocopy of a fingerprint and a true fingerprint having 3-dimensional depth. One shortcoming, however, with both of these systems is that they can distinguish neither a living finger from a 3-dimensional finger molded from plastic or rubber, nor a dead finger from a living one.

Other approaches are used for non-optical fingerprint authentication systems. For example, some fingerprint capture devices measure the subject's skin resistance and temperature. These measurements can then be compared to normal values for a living finger. If the measurements deviate from normal values, then the fingerprint is identified as a spoof. The skin resistance and temperature safeguards, however, can easily be circumvented by warming up an artificial or dead finger or by designing an artificial finger with a resistance similar to a true finger.
Still other fingerprint authentication systems utilize other medical measurements to determine whether the fingerprint presented is from a living finger or a spoof. One such system is described in U.S. Patent No. 5,719,950, entitled "Biometric, Personal Authentication System" of Osten et al., issued February 17, 1998, which incorporates biological measurements, such as electrocardiograph signals and blood pressure, in conjunction with a fingerprint scan to determine whether the fingerprint presented originates from a living finger. Disadvantages of these systems include their complexity, large size and high cost. In addition, systems which probe medical signals of a person are highly intrusive.

There is a need, therefore, for an improved method for distinguishing a living finger from a spoof. In particular, the method should accurately discern between the two without being overly intrusive. The method also should not significantly increase the size or cost of the system.
SUMMARY OF THE INVENTION
A method and apparatus for employing spoof detection for a biometric sensing device is provided. The invention employs a number of computer-implemented techniques designed to distinguish between a living and a non-living biometric, particularly a fingerprint. Anomalies of an electrical representation of a fingerprint, captured from the biometric sensing device, are analyzed using any number of steps designed to detect certain characteristics that are difficult to spoof and largely unique to a living finger.
According to one embodiment, a solid state fingerprint sensor is employed. One such solid state sensor is a capacitive sensor that captures fingerprints by electrical means. Depth is determined by electric field strength, which is inversely proportional to distance. Because capacitive fingerprint sensors require a three-dimensional object, they do not accept a photographic copy of a true fingerprint. In addition, in order for these sensors to capture a fingerprint, the finger or object presented to the sensor must have electrostatic characteristics similar to the skin on a living finger. This will eliminate fingerprints from plastic or rubber molded fingers, as these materials are non-conductive.
The method further relies on additional characteristics of living fingers that cannot be easily replicated by an artificial, dead or severed finger. For example, living skin has the functions of excretion of substances through sweat glands and absorption of lipid-soluble substances. Sweat glands include pores, which are small openings of the sweat ducts in the skin surface. Pore patterns are unique and they do not disappear, move, or spontaneously change over time. The sweat glands also produce unique characteristics of a living finger. For example, a living finger is a heat source that warms up the sensor on which it is placed; a living finger perspires; and the skin of a living finger is hydrophilic. Accordingly, the invention analyzes the electrical representation of the fingerprint for characteristics such as intensity or density, as well as, in some embodiments, spatial, geometric, and/or temporal anomalies largely inconsistent with a living finger. The above characteristics may be observed over a series of captured images. With an increase in temperature, a living finger perspires. Therefore, areas surrounding pores in the finger produce stronger signals. Because the skin of a living finger is hydrophilic, it absorbs liquid. Thus, even with perspiration, a living finger placed on the sensor is capable of generating an image that exhibits good differentiation between the ridges and valleys of the finger.
According to one embodiment, the invention is a biometric sensing system including an image capture device configured to sample an applied object and create an electrical representation of the applied object and a spoof detection module configured to analyze the electrical representation of the applied object for relative intensity, density, geometric, or temporal anomalies indicative of a non-living applied object. Various methods for the spoof detection module are described below.
In one embodiment, the method includes capturing a sequence of images of the applied object from the sensor and calculating an average intensity for each image. In one embodiment, the average intensity is calculated based on the entire image. In another embodiment, the average intensity is calculated based on a portion of the image. In still another embodiment, the average intensity is ON-pixel normalized, meaning that only pixels having a signal exceeding a predetermined threshold value that indicates a fingerprint ridge are used for this calculation. The method further includes successively comparing the average intensities of the sequence of images to determine whether they vary beyond a predetermined threshold amount. If the averages vary beyond the threshold amount, then the system accepts the image as coming from a live person. If the averages do not vary beyond the threshold amount, then the system acquires an additional image of the applied object and calculates the average intensity for that image.
The average intensity of the additional image is compared with the prior averages for the sequence of images to determine whether the averages vary beyond the threshold amount. If the averages still do not vary beyond the threshold amount, then the system repeats the steps of acquiring another image, calculating the average intensity for that image, and comparing that average with the previous ones, until the maximum number of images have been captured. At that point, the system rejects the applied object as being a spoof, rather than a living finger.
In another embodiment, the density of the ON-pixels is calculated over a sequence of images instead of average intensity. In one embodiment, maximum and minimum pixel values for an image (either the entire image or a selected portion of an image) are determined. A mid-value is then determined. A pixel with a value greater than the mid-value is considered to be an ON-pixel. The number of ON-pixels over the number of total pixels (in either the entire image or a selected portion of an image) is the density of an image. The density of each image in the sequence is successively compared with the density of the previous image. An increase in density over the sequence of images exceeding a predetermined threshold amount indicates that the object is from a live person. Otherwise, the object is a spoof.
In still another embodiment, the average intensity is calculated separately for the middle fingerprint portion and the outer fingerprint portion for each image in the sequence of images. Each average intensity of the middle fingerprint portion is successively compared with the average intensity of the middle fingerprint portion of a previously captured image. Likewise, each average intensity of the outer fingerprint portion is successively compared with the average intensity of the outer fingerprint portion of a previously captured image. If the average intensity for the outer fingerprint portion increases faster than the average intensity for the middle fingerprint portion, then the fingerprint is determined to be coming from a spoof. Calculating and comparing the rate of increase for the middle fingerprint portion and the outer fingerprint portion protects the system from falsely accepting an artificially warmed spoof. This is because the middle fingerprint portion warms up faster than the outer fingerprint portion for a living finger, while the outer fingerprint portion warms up faster than the middle fingerprint portion for an artificially warmed spoof.
In an alternative embodiment, the density of the middle fingerprint portion and the density for the outer fingerprint portion are calculated and successively compared. If the density for the middle fingerprint portion increases at a rate that is faster than the outer fingerprint portion, then the image is coming from a living finger.
In one embodiment, the rate of warming for the sequence of captured images is calculated and compared with the rate of warming of the finger during enrollment or during a previous successful verification. The rate of warming, in one embodiment, is calculated using average intensity. In an alternative embodiment, the rate of warming is calculated using density. Comparing the rate of warming for an applied object with a previously determined rate of warming prevents false acceptance of an artificially warmed spoof, because the rate of warming of a living finger differs from the rate of warming of a spoof.

In one embodiment, the changes in intensity along the ridges are measured over a sequence of images. If the intensity along the ridges increases in a spatially non-uniform way, then the image is accepted as coming from a live person, because a spatially non-uniform increase in intensity along the ridges indicates pores emanating sweat, which is an indicator of a living finger. In an alternative embodiment, the nonuniformly increasing intensity is measured along the contours of ridges to obtain a linear sequence of intensity values. The linear sequence depends on the intensity along the ridge. For example, the typical ridge comprises pixels having intensities that increase as they get closer to a pore and decrease as they get further from the pore. Thus, the signal of the contour is typically sinusoidal and has a particular frequency. Since a spoof has no pores, the signal of the contour does not have a frequency related to pore-to-pore frequency. In other words, the signal from a spoof is not sinusoidal. In one embodiment, the intensity periodicity and nonuniformity of the pore-to-pore frequency content in a specific window are measured by Fourier transform.
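By way of illustration, a minimal sketch of the Fourier measurement follows, assuming a 1-D intensity profile already traced along a ridge contour, an illustrative sensor pitch of about 50 µm (the patent does not state a resolution), and the roughly 0.5 mm pore-to-pore spacing noted in the detailed description. The function name and the ±50% frequency window are hypothetical.

```python
import numpy as np

def pore_frequency_energy(ridge_profile, pixel_pitch_mm=0.05,
                          pore_spacing_mm=0.5, band=0.5):
    """Fraction of spectral energy near the expected pore-to-pore frequency.

    ridge_profile: 1-D intensities sampled along a ridge contour.
    pixel_pitch_mm: assumed sensor resolution (~500 dpi); not in the patent.
    pore_spacing_mm: ~0.5 mm, per the detailed description.
    """
    profile = np.asarray(ridge_profile, dtype=float)
    profile -= profile.mean()                          # remove the DC offset
    spectrum = np.abs(np.fft.rfft(profile)) ** 2
    freqs = np.fft.rfftfreq(profile.size, d=pixel_pitch_mm)  # cycles per mm
    f_pore = 1.0 / pore_spacing_mm                     # ~2 cycles per mm
    window = (freqs >= f_pore * (1 - band)) & (freqs <= f_pore * (1 + band))
    total = spectrum[1:].sum()                         # ignore the DC bin
    return spectrum[window].sum() / total if total > 0 else 0.0
```

A living finger should concentrate noticeably more of its spectral energy near the pore frequency than a pore-less spoof, whose contour signal is not sinusoidal.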
In one embodiment, the signals for the ridges of the sequence of images are compared. If there are no changes in the signals along a ridge, then the system determines whether the signals are at a maximum. In addition to no signal change along a ridge, a maximum signal strength can indicate a living finger that is wetted, a living finger that is saturated due to perspiration, or a spoof saturated with moisture. In one circumstance, the system accepts the image as coming from a living finger. In another embodiment, the system notifies the user to wipe excess moisture off the applied object or the sensor and to try again.

In one embodiment, water droplets are located and their sizes measured for each image in the sequence of images. In one embodiment, a water droplet image is a group of pixels having a width exceeding the width of a ridge (note that the ridge is narrow and long while the water droplet is fat and short). The sizes of the water droplets are successively compared. The same or decreasing sizes can indicate that the object is a wetted spoof, or that the object is a living finger with an over-accumulation of surface moisture. The system rejects the fingerprint. Again, the system can instruct the user to wipe off excessive moisture from the applied object or sensor and try again.
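The droplet test could be sketched as follows, using connected-component labeling as a stand-in for whatever segmentation the system actually uses; the ON threshold, the assumed ridge width in pixels, and the function names are illustrative, not taken from the patent.

```python
import numpy as np
from scipy import ndimage

def droplet_area(image, on_threshold=10, ridge_width_px=8):
    """Total area of 'fat and short' blobs wider than a typical ridge."""
    labels, _ = ndimage.label(image > on_threshold)
    area = 0
    for lab, sl in enumerate(ndimage.find_objects(labels), start=1):
        height = sl[0].stop - sl[0].start
        width = sl[1].stop - sl[1].start
        if min(height, width) > ridge_width_px:  # fat in both directions
            area += int((labels[sl] == lab).sum())
    return area

def droplets_never_grow(frames):
    """True (reject) when droplet area stays the same or shrinks
    across the capture sequence."""
    areas = [droplet_area(f) for f in frames]
    return all(b <= a for a, b in zip(areas, areas[1:]))
```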
In one embodiment, matching minutiae are further compared with respect to their minutia type, that is, endpoint or bifurcation. The ratio of mismatches of minutia type to the overall minutia matches is compared against a threshold amount. If the ratio exceeds the threshold value, then the sensed fingerprint is rejected as a spoof.

In still another embodiment, the same finger is placed on the sensor multiple times, e.g. twice. A sequence of images is captured each time. The sequences of images are then compared. For a true finger, the first image in the second sequence exhibits characteristics closer to the last image in the first sequence, while for a spoof, the first image in the second sequence exhibits characteristics closer to the first image in the first sequence.

In another embodiment, two-dimensional captured images are mapped into one-dimensional signals, as sketched below. Static (e.g. from one fingerprint image) and dynamic (e.g. temporal change of the perspiration pattern of the skin) measures are then extracted and used for classification. The static and dynamic measures can include those detectable within the same signal derived from one fingerprint image and those observed in temporal transition from one signal to the next, such as temporal changes over multiple image signals, respectively. The static measure can be used to detect variations in gray level due to darkening around the pores, and the dynamic measure can be used to detect the changes caused by perspiration over time. These measurements are quantified to produce a sweating pattern.
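A minimal sketch of the 1-D mapping and the two measure families might look like this, assuming the ridge path has already been traced; the particular statistics (standard deviation for the static measure, mean frame-to-frame change for the dynamic one) are plausible stand-ins rather than the patent's exact formulas.

```python
import numpy as np

def ridge_signal(image, ridge_path):
    """Map a 2-D image to a 1-D signal by reading intensities along an
    ordered list of (row, col) ridge coordinates."""
    return np.array([float(image[r, c]) for r, c in ridge_path])

def static_measure(signal):
    """Within-frame gray-level variation, reflecting darkening at pores."""
    return float(signal.std())

def dynamic_measure(signals):
    """Mean frame-to-frame change of the 1-D signal, reflecting the
    spread of perspiration over the capture sequence."""
    diffs = [np.abs(b - a).mean() for a, b in zip(signals, signals[1:])]
    return float(np.mean(diffs))
```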
In one embodiment, the calculated parameters, for example, the energy inside the normal pore frequency window, the total swing ratio, the minimum/maximum growth ratio, the last-first fingerprint signal difference mean, and the percentage change of standard deviations, are fed into a neural network. According to one instance, if the output of the neural network is in a predetermined range, then the applied object passes as coming from a living finger. In another instance, the predetermined range is positive. If the output of the neural network is in a second predetermined range, then it is determined whether a predetermined number of trials have been met. In one embodiment, the second predetermined range is negative. If the number of trials is less than a predetermined number of maximum trials, then the system prompts the user to wipe the applied object. The system then captures another sequence of images. The system repeats the process until the predetermined number of maximum trials has been met, at which time the system rejects the fingerprint as coming from a spoof.
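As a sketch of the classification stage, the five parameters could be fed to a small feedforward network; the patent does not specify the topology or training procedure, so the single hidden layer, the externally supplied weights, and the retry budget below are assumptions.

```python
import numpy as np

def classify_vitality(features, W1, b1, W2, b2):
    """Score the five perspiration parameters with a small MLP.

    features: [pore_window_energy, total_swing_ratio, min_max_growth_ratio,
               last_first_diff_mean, std_dev_pct_change]
    W1, b1, W2, b2: weights learned offline from live and spoof samples.
    Positive score -> living finger; negative -> retry or reject.
    """
    h = np.tanh(W1 @ np.asarray(features, dtype=float) + b1)  # hidden layer
    return float(W2 @ h + b2)

def verify_with_retries(capture_features, weights, max_trials=3):
    """Prompt-and-retry loop around the classifier (trial budget assumed)."""
    W1, b1, W2, b2 = weights
    for _ in range(max_trials):
        if classify_vitality(capture_features(), W1, b1, W2, b2) > 0:
            return True            # accept as a living finger
        print("Please wipe the finger and try again.")
    return False                   # reject as a spoof
```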
The methods are typically embodied in a computer software product executed and/or interpreted by one or more processors. The processor(s) can be part of a network system, a stand-alone computer, an automated teller machine, a wireless telephone, or a smart card, for instance.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1, which comprises FIG. 1A and FIG. 1B arranged in the manner shown in the key to FIG. 1, is a flowchart of a method for distinguishing between a living finger and a spoof by calculating average intensities of a sequence of images.

FIG. 2 shows a block diagram of a fingerprint system.

FIG. 3, which comprises FIG. 3A and FIG. 3B arranged in the manner shown in the key to FIG. 3, is a flowchart of a method for distinguishing between a living finger and a spoof by calculating the density of a sequence of images.

FIG. 4 is a flowchart of a method for distinguishing between a living finger and a spoof by calculating the average intensity change along a ridge.

FIG. 5 is a flowchart of a method for distinguishing between a living finger and a spoof by determining whether the average intensity changes along the ridges and whether the average intensities at the ridges are at a maximum value.

FIG. 6 is a flowchart of a method for distinguishing between a living finger and a spoof by comparing the size of water droplets on a sequence of images.

FIG. 7, which comprises FIG. 7A and FIG. 7B arranged in the manner shown in the key to FIG. 7, is a flowchart of a method for distinguishing between a living finger and a spoof based on perspiration characteristics.

FIG. 8 is a flowchart depicting a method for distinguishing between a living finger and an inverted spoof of the living finger.

FIG. 9A is a block diagram of an exemplary biometric sensing system. FIG. 9B is a block diagram of a capacitive fingerprint sensor.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
In accordance with the invention, computer-implemented methods and associated apparatuses for distinguishing a living finger from a spoof are provided. A living finger, as opposed to a fake or a dead finger, exhibits vitality, meaning that it has blood flow, perspiration, neuroactivity and other biological functions. For example, living skin has the functions of excretion of substances through sweat glands and absorption of lipid-soluble substances. Each square inch of skin includes approximately 600 sweat glands. Sweat glands include pores, which are small openings of the sweat ducts in the skin surface. Pore patterns are unique and they do not disappear, move, or spontaneously change over time. Typically, pore-to-pore distance is approximately 0.5 mm, or there are approximately 20.8 pores per centimeter (cm) on the ridge. The sweat glands produce some unique biological effects such as the production of heat and moisture (perspiration), as well as the absorption of moisture.
A living finger is a heat source. Therefore, after a living finger is placed on a fingerprint sensor, the finger will warm the sensor. This increase in sensor temperature over a sequence of images can indicate a living finger. In one embodiment, the temperature change is measured directly, such as with a heat sensor or the equivalent. In another embodiment, the temperature change is measured indirectly. For instance, increasing temperature causes increasing surface moisture on the finger (due to perspiration).
A living finger perspires. Moisture is conductive, so conductivity increases with increasing moisture. Therefore, if measured by a resistive sensor, a decrease in resistance can be observed; if measured by a capacitive sensor, an increase in capacitance can be observed. In addition to an increase/decrease in signal strength, the area of the increased/decreased signal strength expands as perspiration emanates from pore locations along the ridges.
Skin on a living finger is hydrophilic, that is, it absorbs water and other liquids applied at the surface of the skin. Hence, moisture is evenly distributed over the finger, and when a living finger is wetted, either with liquid or from perspiration, a quality image having good differentiation between ridges and valleys of the finger may still be generated. This is opposed to a wetted spoof, which can be hydrophobic and tends to pool the moisture in the valleys, thereby yielding little or no ridge-to-valley differentiation.
Moreover, the texture of the skin has particular characteristics where transitions between ridges and valleys, i.e., at endpoints and bifurcations, occur.
In accordance with an embodiment of the invention, these unique biological and physical characteristics are captured by a fingerprint sensor over a sequence of images and interpreted from the captured images. They are not necessarily measured directly by biosensors, but rather extracted from the electrical representation of the finger in what may have, in the past, been considered "noise" in the electrical representation.
SYSTEM OVERVIEW
FIGS. 9A-B are block diagrams of an embodiment of the biometric sensing system.
Biometric sensing system 900 comprises a processing unit 901 and a sensor 916. The two are communicatively coupled, for instance, by a serial port or a common bus. The processing unit 901 comprises a processor 902, such as a Motorola 6800 or 68000 series microprocessor. Further included in the processing unit is volatile memory 904 (e.g. RAM), in which temporary variables and executed program code can reside, and a non-volatile memory 906 (e.g., ROM, EPROM, EEPROM, or FLASH), in which persistent program code (to be later executed, for instance) and data reside.
According to one embodiment, the non-volatile memory 906 holds a minutia matching module 908, which is computer program code for finding matches between minutiae of the electrical representation of an applied object 914 (a finger 918). An exemplary matching module is described in U.S. Patent Application Serial No. 09/354,929, filed November 17, 1999, entitled "Method and System for Fingerprint Template Matching," which is incorporated herein by reference in its entirety. Another matching module can include the techniques embodied in U.S. Patent Application Serial No. 09/501,355, filed February 9, 2000, entitled "Biometric False Accept Detection," which is also incorporated herein by reference in its entirety.
An enrolled object 912, which was previously captured and verified, is also stored in non-volatile memory 906. One embodiment of a data structure that can be used to hold information representative of the enrolled object is described in PCT International Application Serial No. PCT/US00/18714, filed July 7, 2000, entitled "Multi-Dimensional Fingerprint Minutia Data Constellation," which is incorporated herein by reference in its entirety.
Furthermore, the non-volatile memory 906 includes a spoof detection module, which can also be computer program code, implementing the methods described in detail below, which distinguishes between a living applied object and a non-living applied object. (It is appreciated that the particular memory or component arrangement can vary; for instance, the various modules and memories might not all be physically resident in a single location, but can be distributed and "logically" arranged as depicted in FIG. 9A. The figure and description are intended to capture this logical arrangement.)
While the program code is ultimately embodied in a system as depicted, for instance, in FIG. 9A, the program code can be sequences of instructions for causing a processor (or distributed processors) to perform the functionalities described above (and more particularly below). The program code, or "software product", can be stored in a tangible medium, such as a CD-ROM, floppy disk, or a computer memory, for instance a shared memory or shared disk on a networked computer system. Further still, the software product can be embodied in downloadable, and perhaps compressed and encrypted, computer data files that are later loaded and executed (or interpreted) by a general purpose computer or an image capture device. According to one embodiment, the invention is completely represented by the spoof detection module software alone — and can be sold as a stand-alone product augmenting or improving an existing image capture device and/or biometric matching module.
The sensor 916 is preferably a capacitive fingerprint sensor. It is further described with reference to FIG. 9B, to which we now turn.
FIG. 9B shows a diagram of a fingerprint imaging device 932. In normal operation, the fingerprint imaging device 932 uses techniques derived from Coulomb's law to determine the location of ridges and valleys in a fingerprint surface 938. By modeling each sensing element 936 in the sensor chip 916 as one plate of a capacitor, with the finger surface 938 (that is, the ridges and valleys) being the second plate, it is possible to measure a relative distance between the ridges and valleys to construct an electrical representation of the fingerprint. According to one embodiment, a passivation layer 924 is disposed over the sensing elements 936 to form the capacitor at the ridges.
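As an illustrative model of this sensing principle, the following sketch maps a per-element skin distance to an 8-bit intensity using the parallel-plate relation C = ε0·εr·A/d; the permittivity and element pitch are example values, not taken from the patent.

```python
import numpy as np

def distance_to_image(distance_um, eps_r=3.9, pitch_um=50.0):
    """Map per-element skin-to-plate distances (microns) to 8-bit intensity.

    Parallel-plate model: C = eps0 * eps_r * A / d, so closer skin (a ridge)
    yields a stronger signal. eps_r and pitch_um are illustrative values.
    """
    eps0 = 8.854e-12                         # vacuum permittivity, F/m
    area = (pitch_um * 1e-6) ** 2            # plate area per sensing element
    cap = eps0 * eps_r * area / (np.asarray(distance_um, dtype=float) * 1e-6)
    return (255 * cap / cap.max()).astype(np.uint8)  # 255 = strongest signal
```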
Turning back to FIG. 9A, when the sensor 916 detects an applied object 918 and captures an image, the individual sensing elements 936 create an electrical representation 920 of the surface 938 of the skin. In an abstract form, the electrical representation may be modeled as the fingerprint 922, wherein particular minutiae are highlighted by boxes only for the purpose of illustration. What is captured, however, is more than simply data representative of ridges and valleys in a fingerprint; anomalies, or "noise" (as is mentioned above), in the image will inevitably occur. These anomalies are discarded by prior systems, but here they are used to computationally analyze the electrical representation 920 for indicia of a living finger.
FIG. 2 is a block diagram of an alternative embodiment of a fingerprint system 5 for implementing the processes described below. Fingerprint system 5, in one embodiment, includes a fingerprint sensor 10 coupled to a processor 6 via a communication line 9. Fingerprint sensor 10 is for capturing fingerprint images from an applied object (e.g. finger 16) and is described in detail below. Processor 6 may be any processor capable of executing a software program. For example, processor 6 may be a central processing unit (CPU) of a conventional personal computer having a monitor 7, keyboard 8 and other peripheral devices (not shown) such as a mouse, biomedical measurement devices, and/or other biometric devices coupled thereto. Communication line 9 can be a serial communication path, such as a universal serial bus, between a serial interface of processor 6 and fingerprint sensor 10.
AVERAGE INTENSITY TECHNIQUE
FIG. 1 is a flowchart of a process of determining whether a fingerprint is coming from a living finger by calculating and comparing the average intensity of each image in a sequence of captured images.
The process of determining whether a fingerprint is coming from a living finger starts in box 102. An image is captured from an applied object by a fingerprint sensor in step 104. The image can be captured using, for example, a solid state fingerprint sensor, such as a capacitive sensor described in U.S. Patent Nos. 6,016,355, entitled "Capacitive Fingerprint Sensor," and 6,049,620, entitled "Capacitive Fingerprint Sensor with Adjustable Gain," and U.S. Patent Application Serial No. 09/354,386, filed July 14, 1999, entitled "A Method of Constructing an Ultra-Rugged Biometric I.C. Sensor," and PCT International Application Serial No. PCT/US00/19227, filed July 13, 2000, entitled "Ultra-Rugged I.C. Sensor and Method for Making the Same," which are all incorporated herein by reference in their entirety. Capacitive sensors, such as those described in the above-mentioned references, are manufactured using conventional CMOS silicon fabrication technology and include an array of pixels, each pixel comprising a capacitive sensing element. An exemplary sensor has an array of 300x300 pixels. In another embodiment, the image can be captured by a radio frequency based sensor, the radio frequency based sensor detecting surface texture or depth differentials between one or more layers of skin. In still another embodiment, the sensor can be a thermal sensor. In yet another embodiment, the sensor can be a swipe type sensor. (It is noted that these sensor embodiments can be employed in any of the techniques described below.)
Because capacitive fingerprint sensors require a three-dimensional object, they do not accept a two-dimensional photographic copy of a fingerprint. In addition, in order for these sensors to capture a fingerprint, the finger or object presented to the sensor must have electrostatic characteristics similar to the skin on a living finger. In other words, the object must be conductive. By requiring such electrostatic characteristics, fingerprints from plastic or rubber molded fingers can be eliminated, as these materials are non-conductive. Of course, other types of fingerprint sensors, e.g., optical sensors and pressure sensors, can be used as well.
Typically, the system uses a raw fingerprint image captured by the sensor to perform the spoof detection methods described herein. The types of properties described herein, when electrically reproduced by the sensor, are not, strictly speaking, found in the ridge and valley data reproduced by the sensor. Rather, the properties are found in anomalies in the expected (perfect) fingerprint image and/or in other "non-linearities" or disturbances in the image. Thus, it may be undesirable in most, but not all, circumstances to process the raw fingerprint image to reduce the noise and enhance the raw image.
While some tests described below may benefit from such image processing, others may not. (Of course, alternatively, in some embodiments, once it has been determined that the sampled image is a living fingerprint (and not a spoof), it may then be desirable to further process the raw data.) So, by way of example, image processing and/or filtering techniques such as those described in U.S. Patent No. 6,049,620, incorporated herein by reference above, U.S. Patent Application Serial Nos. 08/971,455, filed November 17, 1997, entitled "Automatic Adjustment Processing for Sensor Devices," Serial No. 09/300,087, filed April 26, 1999, entitled "Method for Imaging Fingerprints and Concealing Latent Fingerprints," and Serial No. 09/560,702, filed April 27, 2000, entitled "Automatic Gain Amplifier," which are each incorporated herein by reference in their entirety, can be employed.
Measurement for each pixel in the image can be translated into an analog signal representing the intensity of the pixel. In one embodiment, the value of the intensity is represented by a grayscale of 0 to 255 or 8 bits, 0 being the lightest (white, no signal) and 255 being the darkest (black, strong signal).
In general, an image generated from the conductivity measurements of a capacitive fingerprint sensor exhibits darker and lighter areas. These areas represent various levels of conductivity measured at the capacitors. As the finger, which acts as one plate of the capacitor, becomes more conductive from moisture within the skin and on the skin surface (from perspiration), the image gets stronger, meaning that the intensity at each pixel increases. Furthermore, the darker intensity emanates from pore locations along the ridges as sweat is generated at these pore locations. As a result, image quality changes over time, and these changes are secondary effects of the above-described biological characteristics. Hence, a sequence of images that exhibits an increase in image intensity may be deemed to be coming from a living finger.
An average intensity is calculated for the captured image in step 106. In one embodiment, the average intensity is ON-pixel normalized. That is, the average intensity of a captured image is an average of the intensities of the pixels that exhibit signal strengths greater than an ON-pixel threshold amount. The ON-pixel threshold is important in picture processing for extracting objects from their background, and in this case, extracting ridges and valleys from a fingerprint. There are techniques known in the image processing arts for selecting an adequate threshold of gray level. One such technique is described by N. Otsu in "A Threshold Selection Method From Gray-Level Histograms," IEEE Trans. Systems, Man, and Cybernetics, Vol. SMC-9, No. 1, Jan. 1979, pp. 62-66, which is incorporated herein by reference. In one embodiment, the ON-pixel threshold is 10. Thus, a pixel having a signal strength of 10 or greater (e.g. on a scale of 0 to 255) is considered to be an ON-pixel (e.g. a ridge pixel). In one embodiment, the average intensity is calculated for the full image, which can consist of the entire sensor area, e.g. 300x300 pixels. In another embodiment, the average intensity is calculated based on a portion of the captured image, for example, approximately 1/3 to approximately 2/3 of the central portion of the captured image. Calculation based on a portion of the captured image can reduce computational complexity without losing significant data, because the central portion of the image typically contains the majority of the fingerprint image data.
The system then determines whether the captured image is a first image (step 108). If the captured image is a first captured image, then the capture device waits for a predetermined period of time in step 109, prior to capturing another fingerprint image (step 104). The predetermined time period is the actual capture rate, or the time between the capturing of two images. In one embodiment, the actual capture rate is approximately 0.2 seconds. In general, the actual capture rate depends on the capture device's capture rate (which may be limited due to a particular physical or electrical configuration). The actual capture rate should give sufficient time for a detectable change in the level of the average intensity.

If the currently captured image is not the first image (step 108), then a delta average intensity is calculated from the average intensities of the currently captured image and the previous captured image in step 110. The delta average intensity between each two successively captured images is calculated by subtracting the average intensity of the previous captured image from the average intensity of the currently captured image.
In the embodiment shown in FIG. 1, the delta average intensity is calculated as each image is captured, on an image-by-image basis. In an alternative embodiment, the delta average intensity between each two successively captured images is calculated after a predetermined minimum number of images have been captured.
The system then determines whether the delta average intensity is positive in step 112. A counter that records the number of positive delta average intensities is incremented by one in step 114 if the delta average intensity is positive. The system then determines whether N1 images have been captured in step 116. Similarly, if the delta average intensity is not positive (step 112), then the system does not increment the counter but determines whether N1 images have been captured in step 116.
The value of the number N1 depends on the rate of capture of the fingerprint sensor. According to one embodiment, using a capacitive fingerprint sensor, the minimum time for a sufficient change over the sequence of captured images is approximately 1.2 seconds. In this embodiment, at a capture rate of 0.2 seconds, six images are captured prior to any determination of whether the fingerprint is coming from a living finger. Similarly, if the capture device has a capture rate of, for example, 0.02 seconds per image, then a minimum of 60 images are captured prior to any determination. If N1 images have not been captured (step 116), then an additional fingerprint image is captured in step 104 after a time delay (step 109). Steps 104 through 116 are repeated until N1 images have been captured.
After N1 images have been captured, an average delta average intensity is calculated in step 118. The average delta average intensity is calculated as (average of delta average intensities / average of average intensities). For example, for six average intensities (from six captured images) of 80, 90, 100, 110, 120 and 130, the average delta average intensity is calculated as:

[(10+10+10+10+10) / 5] / [(80+90+100+110+120+130) / 6] × 100 ≈ 9.5%.
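The arithmetic can be checked in a few lines (illustrative only):

```python
avgs = [80, 90, 100, 110, 120, 130]
deltas = [b - a for a, b in zip(avgs, avgs[1:])]           # five deltas of 10
ratio = (sum(deltas) / len(deltas)) / (sum(avgs) / len(avgs))
print(round(100 * ratio, 2))                               # -> 9.52, i.e. ~9.5%
```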
The system then determines whether the average intensity is increasing monotonically for the sequence of captured images (step 120). A monotonically increasing average intensity indicates that the applied object is warming up, which is an indication of a living finger. A monotonically increasing average intensity is characterized by, for instance, 80% of the captured images having a positive delta average intensity.
An example is illustrative. If six images are captured, the same as in the above example, then the five delta average intensities are +10, +10, +10, +10, and +10. Since 100% of the images exhibit an increase in average intensity, the average intensity increases monotonically. If the average intensity increases monotonically (step 120), then the system determines whether the average delta intensity is greater than a predetermined threshold amount (step 122). In one embodiment, the predetermined threshold is 10%. This value is selected based on experimentation. If the average delta intensity is greater than the predetermined threshold amount (step 122), then the image is accepted as coming from a living finger (step 124) and the process ends in step 126. All counters are reset at step 126. It is noted that steps 120 and 122 may be reversed in sequence or executed in parallel.
After N1 images have been captured, if the average intensity does not increase monotonically (step 120), or the average delta average intensity does not exceed the predetermined threshold amount (step 122), then a decision cannot be made as to whether the captured image is coming from a living finger or a spoof. When the decision cannot be made in the minimum time, e.g. 1.2 seconds for 6 images captured at a rate of 0.2 seconds, it is determined whether N2 images have been captured (step 128). N2 is the maximum number of images captured for the system to make a decision. If N2 images have not been captured (step 128), then an additional image is captured (step 104) after the time delay (step 109) until N2 images have been captured or until the system determines that the image is coming from a living finger. In one embodiment, the value of N2 is 15 (or the capture time is 3 seconds).
The average intensity of the additional image is calculated in step 106. The delta average intensity is calculated (step 110). The average delta average intensity is calculated (step 118). The system then determines whether the average intensity increases monotonically (step 120) and whether the average delta average intensity is greater than the threshold amount (step 122), as described above. If the average intensity increases monotonically (step 120) and the average delta intensity is above a predetermined threshold amount (step 122), then the image is accepted as coming from a living finger (step 124).
If, however, the average intensity is not increasing monotonically, or the average delta intensity is not greater than the threshold amount, and N2 images have been captured, then the image is rejected as coming from a spoof (step 130) and the process ends in step 126.

In a static imaging environment, i.e. an environment where the applied object does not move relative to the sensor for image capture purposes, it is noted that the applied object, e.g. the finger, should remain on the fingerprint sensor until a decision is made. For example, the applied object must remain relatively stationary on the fingerprint sensor for a minimum of 1.2 seconds.
It is noted that in the embodiment above (FIG. 1), uniform sensor devices should be employed because different devices give different ranges of intensities. Since intensity ranges vary with different devices, it is difficult to use a fixed threshold while obtaining good or uniform results for all the devices. In addition, different fingers also give different ranges of intensities. Accordingly, the lack of uniformity makes it even more difficult to use a fixed threshold. Hence, where uniformity of the sensor is a concern, software and/or hardware based normalization may need to be performed on the sampled image(s) so that a different threshold does not need to be calculated for different people or different devices. Filtering and image processing techniques, such as those mentioned above, for instance U.S. Patent Application Serial No. 09/560,702, filed April 27, 2000, as well as those discussed below, can be employed to achieve such a normalization.
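Pulling the steps of FIG. 1 together, a condensed sketch of the average intensity technique might look like the following; the constants mirror the values given above (ON threshold 10, 0.2 s capture delay, N1 = 6, N2 = 15, 80% monotonicity, 10% delta threshold), while the function names and the callable-capture interface are assumptions.

```python
import time
import numpy as np

ON_THRESHOLD = 10         # grayscale floor for an ON (ridge) pixel
CAPTURE_DELAY = 0.2       # seconds between captures
N1, N2 = 6, 15            # minimum / maximum images
MONOTONIC_FRACTION = 0.8  # share of positive deltas deemed "monotonic"
DELTA_THRESHOLD = 10.0    # percent; experimentally chosen per the text

def on_pixel_average(image):
    """ON-pixel-normalized average intensity of one frame."""
    on = image[image > ON_THRESHOLD]
    return float(on.mean()) if on.size else 0.0

def is_living_finger(capture):
    """capture: callable returning one grayscale frame (numpy array)."""
    averages = []
    while len(averages) < N2:
        averages.append(on_pixel_average(capture()))
        if len(averages) >= N1:
            deltas = np.diff(averages)
            monotonic = (deltas > 0).mean() >= MONOTONIC_FRACTION
            pct = 100.0 * deltas.mean() / np.mean(averages)
            if monotonic and pct > DELTA_THRESHOLD:
                return True        # accept as a living finger
        time.sleep(CAPTURE_DELAY)
    return False                   # reject as a spoof
```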
PIXEL DENSITY TECHNIQUE
A more robust way of testing vitality is to use the density of ON-pixels along a ridge. Density is a measurement of the number of ON-pixels in a certain area, the certain area being the normalization factor. Increased moisture within and on the surface of the skin increases the signals, and as the finger warms up the sensor, the moisture increases, in particular, along the ridges. Hence, the ridge area is used in measuring the density of ON-pixels. FIG. 3 shows a flowchart for determining whether a fingerprint is coming from a living finger by calculating the density of ON-pixels in a predetermined area. The process starts in step 150. A fingerprint image is captured in step 152. A portion of the captured image is selected in step 154. In one embodiment, a central portion of an image is selected.
For example, approximately 1/3 to approximately 2/3 of the central portion of the image is selected. In step 156, a maximum intensity value and a minimum intensity value for the pixels in the selected area are established. A mid-value between the maximum intensity value and the minimum intensity value is then determined in step 158. A pixel in the selected portion having an intensity value above the mid-value indicates an ON-pixel, or a ridge pixel (step 160). The ON-pixels in the selected portion are counted in step 162. Also in step 162, the density of the full image, or a portion thereof, is calculated by dividing the number of ON-pixels by the total number of pixels in the selected portion.
The system then determines whether the captured image is the first image captured (step 164). If the captured image is the first image captured, then another fingerprint image is captured after a time delay (i.e. at a predetermined capture rate) in step 152. If the captured image is not the first image captured, then the change in density between the current captured image and the previous captured image is calculated in step 166.
The system then determines whether N3 images have been captured in step 168. Parameter N3 is the minimum number of images required prior to determining whether an image is coming from a living finger. If N3 images have not been captured, then another image is captured after a time delay (step 152). If N3 images have been captured, then the system determines whether the density increases monotonically in step 170. Similar to that described above, a counter counts the number of positive delta densities. If the number of positive delta densities exceeds a predetermined percentage, then the density of the sequence of images increases monotonically.
If the density does not increase monotonically, then the system checks whether N4 images have been captured in step 180. N4 is the maximum number of images captured for determining whether an image is coming from a living finger, similar to N2 described above. If the maximum number of images has not been captured, then another image is captured in step 152 after a time delay. If the maximum number of images has been captured, then the system rejects the image as coming from a spoof (step 182).
If the density increases monotonically, then the system calculates the average delta density (step 172). Specifically, the delta densities are averaged and divided by the average density. If the average delta density exceeds a predetermined threshold amount (step 174), then the image is accepted as coming from a living finger (step 176) and the process ends in step 178. In one embodiment, the threshold amount is 0.35%.
If, however, the average delta density does not exceed the threshold, then it is determined whether N4 images have been captured (step 180). If fewer than N4 images have been captured, then an additional fingerprint image is captured in step 152 after a time delay. It is noted that all counters are reset in end step 178.

In one embodiment, to protect the system from falsely accepting an artificially warmed spoof, e.g. a spoof that is warmed by a hair blower, image characteristics are measured at different portions of the fingerprint. For instance, measurements can be taken from the middle of the fingerprint (middle fingerprint portion) and the outer periphery of the fingerprint (outer fingerprint portion). The middle fingerprint portion typically warms up faster than the outer fingerprint portion for a living finger. To the contrary, the outer fingerprint portion of an artificially warmed spoof warms up faster than the middle fingerprint portion. Therefore, the system rejects the applied object as a spoof if the outer fingerprint portion warms up faster than the middle fingerprint portion.
In another embodiment, a predetermined portion of pixels is designated to be the middle fingerprint portion and the surrounding remaining area of pixels is designated as the outer fingerprint portion. For example, a center portion having 150x150 pixels is designated as the middle fingerprint portion and the surrounding remaining area of the 300x300 pixels is designated as the outer fingerprint portion. However, the number of pixels may be modified for different applications. According to another embodiment, the average intensity is calculated for the middle fingerprint portion and the outer fingerprint portion of each image. The average intensity is calculated in a manner described above with respect to FIG. 1. Specifically, the intensities of the ON-pixels of an image are averaged. The rates of increase, i.e. the average delta average intensity, for the middle fingerprint portion and the outer fingerprint portion are also calculated for each image. The rate of increase for the middle fingerprint portion and the rate of increase for the outer fingerprint portion are then compared. If the outer fingerprint portion has a rate that is greater than the rate for the middle fingerprint portion, then the system rejects the fingerprint as coming from a spoof. On the other hand, if the middle fingerprint portion has a rate of increase that is greater than the outer fingerprint portion, then the system accepts the fingerprint as coming from a living finger.
In an alternative embodiment, density is calculated for the middle fingerprint portion and the outer fingerprint portion of each image. The density is calculated in a manner described above with respect to FIG. 3. In one embodiment, ON-pixels in a center portion of the image and along the ridges are counted and the density of ON-pixels is then calculated. ON-pixels in an outer portion surrounding the central portion of the image are also counted and the density calculated for each image. The rates of density increase for the middle fingerprint portion and the outer fingerprint portion are also calculated for each image. The rate of increase for the middle fingerprint portion and the rate of increase for the outer fingerprint portion are then compared. If the outer fingerprint portion has a rate that is greater than the rate for the middle fingerprint portion, then the system rejects the fingerprint as coming from a spoof. On the other hand, if the middle fingerprint portion has a rate of increase that is greater than the outer fingerprint portion, then the system accepts the fingerprint as coming from a living finger.
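A compact sketch of the density calculation and the middle-versus-outer comparison follows; the mid-value thresholding and the 150x150 middle portion come from the text, while the helper names and the use of the mean delta density as the "rate of increase" are assumptions.

```python
import numpy as np

def on_pixel_density(region):
    """ON-pixel density using the mid-value between the region's minimum
    and maximum intensities as the threshold (per FIG. 3)."""
    mid = (float(region.max()) + float(region.min())) / 2.0
    return float((region > mid).mean())   # ON-pixels / total pixels

def region_densities(image, middle=150):
    """Densities of the 150x150 middle portion and the surrounding outer
    portion of a 300x300 image (sizes follow the example above)."""
    h, w = image.shape
    top, left = (h - middle) // 2, (w - middle) // 2
    mask = np.zeros(image.shape, dtype=bool)
    mask[top:top + middle, left:left + middle] = True
    return on_pixel_density(image[mask]), on_pixel_density(image[~mask])

def warms_like_live_finger(frames):
    """Live fingers warm from the middle outward, so the middle portion's
    density should rise faster than the outer portion's."""
    mids, outs = zip(*(region_densities(f) for f in frames))
    return np.diff(mids).mean() > np.diff(outs).mean()
```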
RATE OF WARMING TECHNIQUE
Another way to protect the system from falsely accepting an artificially warmed spoof is to measure the rate of warming due to the applied object and compare it to the rate of warming measured at the time of enrollment or during subsequent successful verifications of the same finger. The rate of warming, in one embodiment, is calculated using average intensity. For example, the average intensity of each captured image is calculated. The average intensity of a currently captured image is successively compared with the average intensity of a previous captured image. An average delta average intensity representing the rate of warming is then calculated. This rate is then compared with a rate of warming stored in a memory in the system. If the difference between the two rates exceeds a predetermined tolerance, then the system rejects the applied object as a spoof.
In an alternative embodiment, the rate of warming is calculated using density. For example, the density of each captured image is calculated. The density of each currently captured image is compared with the density of the previously captured image. An average delta density representing the rate of warming is then calculated. This rate is then compared with a rate of warming stored in a memory in the system. If the difference between the two exceeds a predetermined tolerance, then the system rejects the applied object as a spoof. Comparing the rate of warming for an applied object against a previously determined rate of warming prevents false acceptance of an artificially warmed spoof because the warming characteristics of a spoof differ from the warming characteristics of a living finger.
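A minimal sketch of the average-intensity variant follows; the tolerance value, the ON-pixel threshold and the function names are assumptions for illustration only.

```python
import numpy as np

TOLERANCE = 5.0  # assumed tolerance between stored and measured rates

def rate_of_warming(frames, threshold=100):
    """Average delta average intensity over a timed image sequence."""
    avgs = []
    for f in frames:
        on = f[f > threshold]
        avgs.append(float(on.mean()) if on.size else 0.0)
    return float(np.mean(np.diff(avgs)))

def matches_enrolled_rate(frames, enrolled_rate):
    """Accept only when the measured warming rate is within the
    predetermined tolerance of the rate stored at enrollment."""
    return abs(rate_of_warming(frames) - enrolled_rate) <= TOLERANCE
```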
RIDGE UNIFORMITY TECHNIQUE
FIG. 4 shows a flowchart for a process where the characteristic associated with perspiration is used to determine whether a fingerprint is from a living finger or a spoof by measuring the changes in intensity along the ridges over a sequence of images. The process starts in step 202. A plurality of images is sequentially captured at a predetermined rate, as is described above (step 204). The intensity along the ridges of each image is measured in step 206. The ridges are again represented by ON-pixels, the ON-pixels being the pixels that have intensities greater than a predetermined threshold.
The intensity along the ridges of each image is compared successively with the intensity along corresponding ridges of the previously captured image (step 208). For example, the intensity along the ridges of the second image is compared with the corresponding intensity along the ridges of the first image; the intensity along the ridges of the third image is compared with the corresponding intensity along the ridges of the second image; and so on. If the intensity along the ridges increases in a spatially non-uniform way (step 210), then the system accepts the image as coming from a live person in step 212. This is because a spatially non-uniform increase in intensity along the ridges indicates pores emanating sweat, which is an indicator of a living finger. However, if the intensity along the ridges does not increase in a spatially non-uniform way (step 210), then the image is rejected as coming from a spoof in step 216. The process ends in step 214. In an alternative embodiment, instead of a global measurement of non-uniformly increasing intensity, the non-uniformly increasing intensity is measured along the contours of the ridges to obtain a linear sequence of intensity values. The contours are located by first determining the location of the ridges. The ridges are in turn determined by pixels having intensity values greater than a threshold amount. Pixels having intensities greater than a threshold amount are ON-pixels, as is described above. The image is then binarized, i.e. the ON-pixels are set as black pixels and the rest are set as white pixels. After binarization, the ridge measurement is linear.
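One way to realize the successive comparison is sketched below. Judging "spatially non-uniform" by the spread of the per-pixel change is an assumption, as are the threshold values and names; the specification does not prescribe this particular statistic.

```python
import numpy as np

def ridge_delta(prev, curr, threshold=100):
    """Per-pixel intensity change along the ridges between two
    successive captures; the ridges are the ON-pixels of the
    earlier image."""
    ridge = prev > threshold
    return curr.astype(float)[ridge] - prev.astype(float)[ridge]

def accepts_as_live(images, min_mean=1.0, min_std=2.0):
    """Accept when intensity along the ridges rises between every pair
    of successive images and the rise is spatially non-uniform
    (spread above an assumed minimum)."""
    for prev, curr in zip(images, images[1:]):
        delta = ridge_delta(prev, curr)
        if not (delta.mean() > min_mean and delta.std() > min_std):
            return False
    return True
```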
A ridge follower (i.e. a computer technique that traces a line through analysis of a string of coterminous pixels) then tracks the ridges on the grayscale image, for example the original image prior to thresholding and binarizing. When the ridge follower is over a pore, the image signal is stronger, and the image signal decreases after the ridge follower passes the pore. The signal level while tracking a ridge is thus roughly sinusoidal as the signal goes from pore to pore. This non-uniformity in image signals does not exist for a dead or a fake finger
because they do not have pores. Therefore, the non-uniformity of the signals along a ridge indicates vitality. The periodicity and non-uniformity of the intensity, such as its frequency, may be measured, for example, by a Fourier transform.
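A minimal sketch of the Fourier measurement follows, assuming the ridge follower has already produced a one-dimensional gray-level sequence; the function name and the use of NumPy's real FFT are illustrative assumptions.

```python
import numpy as np

def dominant_ridge_frequency(ridge_signal):
    """Dominant spatial frequency of the gray-level signal produced by
    following a ridge. A live ridge is roughly sinusoidal from pore to
    pore, so a clear spectral peak is expected; a spoof, lacking pores,
    shows no such peak."""
    x = np.asarray(ridge_signal, dtype=float)
    x -= x.mean()                          # remove the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size)        # cycles per pixel
    peak = int(spectrum[1:].argmax()) + 1  # skip the zero-frequency bin
    return freqs[peak], float(spectrum[peak])
```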
RIDGE SIGNAL STRENGTH TECHNIQUE
FIG. 5 shows a flowchart for a method of determining whether a fingerprint image is coming from a living finger by determining whether a ridge has a maximum signal and the signal does not change. The method starts at box 302 and continues to step 304. In step 304, a sequence of images is captured. The images are compared successively in step 306, using either average intensity or density. If the average intensity or the density of the image increases monotonically (step 308) and the ridge signals are strong, indicating that the signal is near a maximum value (step 310), then the system accepts the image as coming from a living finger. In this case, the living finger may be a living finger that is wetted, or a living finger that is saturated due to perspiration caused by heat, exercise, etc. It is noted that a spoof is able to produce a strong signal and may be accepted falsely.
In another embodiment, depending on the security provisions, the system may prompt the user to wipe his finger and try again before accepting the image. In an alternative embodiment, the system may accept the fingerprint without further investigation (e.g. in a lower security application). If the comparison in step 306 yields a result that the signals between images have changed (step 308), for example the difference is greater than a threshold value, then the system cannot determine whether the image is coming from a living finger (step 316). Similarly, if the signals indicate no change between images and the image does not have a maximum signal value (i.e. the image strength is not as high as the image capture device can measure), then the system cannot determine whether the image is coming from a living finger (step 316). The process terminates at box 314.
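The decision logic of FIG. 5 reduces to a small sketch. The numeric levels below are assumptions, not values from the specification, and the function name is illustrative.

```python
import numpy as np

STRONG = 240      # assumed "near maximum" ridge signal level (0-255)
CHANGE_EPS = 2.0  # assumed tolerance for "no change" between images

def ridge_signal_strength_decision(avg_intensities):
    """Sketch of the FIG. 5 logic: a monotonic rise ending near the
    maximum signal is accepted as a (possibly wet) living finger;
    every other outcome is indeterminate and falls through to the
    other tests."""
    a = np.asarray(avg_intensities, dtype=float)
    deltas = np.diff(a)
    if (deltas > 0).all() and a[-1] >= STRONG:
        return "live"
    return "indeterminate"  # changed, static, or saturated signal
```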
In one embodiment, instead of ending the process, the system continues to look for other characteristics that are indicative of a living finger, such as checking the average intensity as described above with reference to FIG. 1 and/or checking the uniformity of intensity along the ridges over a sequence of images, as described above with reference to FIG. 4.
In sum, the overall objective of the technique is to recognize certain limitations in image capture devices, or image capture means. For instance, due to limitations in the device, there will be circumstances where the initial image has as high a signal strength as the
device can measure. When this occurs, there is little ability to determine whether the image strength over a sequence of images gets stronger, which is a characteristic of a living finger. Accordingly, in one embodiment, gain adjustment techniques, such as those described in U.S. Patent Application Serial No. 09/560,702, filed April 27, 2000, incorporated herein by reference in its entirety above, can be employed when the signal strength is too high: the gain adjustment technique lowers the image capture device gain and correspondingly lowers the strength of the captured images so that a change is detectable. An alternative technique is to measure the signal strength along the middle of the ridges, which is described below in further detail.
WATER DROPLET DIFFERENTIAL TECHNIQUE
FIG. 6 is a flowchart depicting a situation where randomly located water droplets are used to determine whether the fingerprint is coming from a living finger or a spoof. The process starts in box 402 and continues to step 404. In step 404, a sequence of fingerprint images is captured. In step 406, randomly located water droplets, i.e. droplets not all located along the ridges, are identified. Water droplets are indicated by splotches (e.g. globules), in this case clusters of pixels in the electrical representation that are representative of a splotch. Because water (at least perspiration) is conductive, water produces a strong signal. The system identifies a globule when an ON-pixel area is wider than a ridge (a ridge is usually narrow and long, whereas a globule has a larger width). If there is a large number of water droplets, then the applied object is likely either a wetted living finger or a wetted spoof.
The average size of the water droplets in a currently captured image is compared with the corresponding water droplets in a previously captured image in step 408. If the water droplet size increases (step 410), then the system accepts the image as coming from a living finger (step 416) and the process ends in step 414. This is because, in general, a perspiring sweat gland creates moisture at a faster rate than the rate at which the skin absorbs the moisture. Hence, for a living finger, the water droplet size increases over time.
If the water droplet size remains static or decreases over the timed sequence of images (step 410), or is too large, e.g. greater than approximately two or three ridge widths, then two situations are possible: 1) the applied object is a wetted spoof, or 2) the applied object is a living finger with over-accumulation of surface moisture. In either situation, the system does not accept the image as coming from a living finger (step 414), although the system may be falsely rejecting a living finger (as in the second situation).
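A minimal sketch of the droplet differential follows, finding globules as connected ON-pixel regions wider than a ridge with scipy.ndimage; the ridge width, threshold and function names are assumptions for illustration.

```python
import numpy as np
from scipy import ndimage

RIDGE_WIDTH = 5  # assumed ridge width in pixels

def droplet_sizes(img, threshold=100):
    """Areas of 'globules': connected ON-pixel regions whose bounding
    box is wider than a ridge in both directions."""
    labels, _ = ndimage.label(img > threshold)
    sizes = []
    for sl in ndimage.find_objects(labels):
        h, w = sl[0].stop - sl[0].start, sl[1].stop - sl[1].start
        if min(h, w) > RIDGE_WIDTH:  # wider than a ridge: a droplet
            sizes.append(h * w)
    return sizes

def droplets_growing(frames):
    """Accept as live only when the average droplet size increases over
    the timed sequence; static or shrinking droplets indicate a wetted
    spoof (or an over-wet finger)."""
    means = []
    for f in frames:
        sizes = droplet_sizes(f)
        means.append(float(np.mean(sizes)) if sizes else 0.0)
    return all(b > a for a, b in zip(means, means[1:]))
```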
In one embodiment, the system notifies the user to remove excessive moisture from the finger, e.g. by wiping the finger, and try again. When the user removes excess moisture from a true finger, the fingerprint will pass under the tests of FIGS. 1, 3, 4 and 5. A spoof, on the other hand, will either continue to fail under the test of FIG. 6 or, if the excessive moisture is removed from the spoof, will then fail under the tests of FIGS. 1, 3, 4, 7 or 8.
FINGERPRINT VITALITY TECHNIQUE
FIG. 7 shows a flowchart for a method of distinguishing a living finger from a spoof by identifying the vitality of a fingerprint. The vitality is identified by processing a plurality of fingerprint images captured over a predetermined period of time.
The process starts at box 502. In step 503, a first fingerprint image is captured by a fingerprint sensor from an applied object at time zero. A last fingerprint image is captured after a predetermined (and programmable) time in step 504. According to an embodiment, the predetermined time is five seconds. In general, the predetermined time is selected to maximize the differences that can be observed between the captured images over the predetermined time frame. In an embodiment, a sequence of fingerprint images, e.g. nine images, is captured by the fingerprint sensor from the applied object during the predetermined time period. In another embodiment, only the first and the last captured images from the sequence of images are used for classification.
Image differences caused by temporal change in a perspiration pattern are used to determine whether the fingerprint is coming from a living finger or a spoof. More specifically, a living finger perspires, causing the signal pattern to change over time, while changes due to perspiration do not occur in a dead or a fake finger. Hence, the finger should not be moisture-saturated when the first image is captured. In one embodiment, if the applied object gives a signal larger than a predetermined level, then the fingerprint verification system prompts the user to wipe his finger to eliminate excess moisture and try again.
Steps 506 through 512 describe a method for obtaining a mask from the last captured image. These steps are exemplary and other techniques can be employed. The sequence of raw images is then processed through an image processing stage, including noise reduction, image enhancement and contour extraction in step 506, which can include any of the techniques described above, such as that described in U.S. Patent
Application Serial No. 09/560,702, filed April 27, 2000, entitled "Automatic Gain Amplifier," which is incorporated herein by reference above.
In one embodiment, a blank image is captured from the fingerprint sensor, i.e. an image is captured from the fingerprint sensor without a finger placed on the sensor. This blank image represents the static in the background. A fingerprint is then captured from the fingerprint sensor and the blank image is subtracted from the captured fingerprint to eliminate the static. In one embodiment, pixels that change within 2% of the blank image are discarded because they are considered to be static noise. In still other embodiments, a median filter can be applied to fill in the white pixels in the middle of the pores and to further smooth the image. Also in step 506, the last captured image is translated into a binary representation. The last captured image is used because a fingerprint from a living finger darkens over time due to increased moisture on the surface of the skin. Therefore, in general, the last captured image has the best contrast, meaning that it has the most defined ridge structure.
In another embodiment, the last captured image is first transformed into grayscale signals, each represented by a number from 0 to 255. The last image is then transformed into a binary representation by mapping pixels having signals greater than a threshold amount to black pixels and pixels having signals less than the threshold amount to white pixels.
Binarizing, according to one embodiment, is accomplished with thresholding circuitry incorporated into the readout circuit of the sensor so that binary outputs are generated from the sense elements. The threshold can be set locally in specific regions of a fingerprint sensor to correct for variations across the sensor. A ridge is identified by the ON-pixels. A valley, on the other hand, is identified by the white pixels.
The binary image is then thinned so that the ridges are only one pixel wide (step 508). Thinning reduces the amount of information to be processed to the minimum necessary for the recognition of patterns. In addition, the shape of a thin-line representation of a pattern is more easily analyzed, permitting a simpler structural analysis and a more intuitive design of recognition algorithms. Preferably, a thinning algorithm compresses the data representing the fingerprint, retains significant features of the pattern and eliminates local noise without introducing distortions of its own. Various thinning methods are described by L. Lam et al. in "Thinning Methodologies - A Comprehensive Survey," IEEE Trans. Pattern Analysis and Machine Intelligence, Sept. 1992, pp. 869-885. In general, any suitable method may be used to reduce the number of pixels to a curve one pixel wide (i.e. a thin line).
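For illustration, the binarize-and-thin stage can be sketched with scikit-image's skeletonize, one readily available thinning method; it is an assumed substitute, not the specification's own algorithm, and since it approximates a medial-axis result, the shifting step described next may not be needed with it.

```python
from skimage.morphology import skeletonize

def binarize_and_thin(gray, threshold=100):
    """Map pixels above an assumed threshold to ON (ridges), then thin
    the ridges to one-pixel-wide curves. skeletonize approximates a
    medial-axis thinning, so its output already runs near the middle
    of the ridges."""
    binary = gray > threshold   # ON-pixels represent the ridges
    return skeletonize(binary)  # boolean image of one-pixel-wide ridges
```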
In one embodiment, a shift is performed after thinning the binary image if the result does not pass through the middle of the original ridges. Shifting is performed because the main part of a ridge is more desirable than the borders of the ridge. The center of the ridges is preferred because the unique features, e.g. pores, of a living finger reside in the center portion of the ridges rather than at the borders of the ridges. Shifting involves moving the thinned line so that it corresponds to the middle of the thicker line from which it came. Preferably, the result is a medial line, that is, one running down the middle of the thicker line. It is noted that, if thinning is accomplished by a medial axis method, then no such shifting is needed. The thinned line is moved from its current position so that it intersects the midpoints of the thicker line. The midpoints can be determined by creating a line perpendicular to the thick line and identifying the middle point between the intersections of the perpendicular line with the boundaries of the thick line.
The y-connections are removed in step 510 so that the contours consist only of individual curves. A y-connection is where a ridge splits in two (i.e. bifurcates) and has multiple values (each leg of the y is defined by an equation, thus the joint has multiple values). In one embodiment, y-connections are removed by a simple 3x3 non-overlapping neighbor operation.
A 3x3 non-overlapping neighbor operation is useful not only in removing y-connections, but also in filtering noise. The 3x3 non-overlapping neighbor operation applies a 3x3 pixel mask (or boundary) centered over a focal pixel (typically an ON-pixel). Thus, there exist a focal pixel and eight neighboring pixels. Next, the system examines each of the neighboring pixels. According to one embodiment, if no neighboring pixel is an ON-pixel, then the focal pixel is assumed to be noise. The process can be repeated for each pixel in the image. Furthermore, focal pixels with only one neighboring pixel that is an ON-pixel can be marked. The marked ON-pixel and focal pixel might, in fact, also be noise. So in a like manner, the marked ON-pixel can be processed as a new focal pixel and, if only one neighboring ON-pixel is found, then the marked ON-pixel and the original focal pixel are probably noise. Likewise, similar coterminous series of ON-pixels, of any desired length, can be eliminated in this manner. Of course, for series greater than two pixels in length, two neighboring ON-pixels, instead of one, would be the threshold criterion. Pixels assumed to be noise are turned off.
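The noise-filtering variant of the operation reduces to a short sketch; the function name is an assumption and only the simplest case (an isolated ON-pixel) is shown.

```python
import numpy as np

def remove_isolated_on_pixels(binary):
    """3x3 non-overlapping neighbor operation, noise-filtering variant:
    an ON focal pixel with no ON neighbor in its 3x3 neighborhood is
    assumed to be noise and is turned off."""
    out = binary.copy()
    h, w = binary.shape
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            if binary[r, c]:
                window = binary[r - 1:r + 2, c - 1:c + 2]
                if window.sum() == 1:  # only the focal pixel is ON
                    out[r, c] = False
    return out
```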
As for circumstances where a y-connection is to be removed, when a focal pixel (an ON-pixel) has two neighbors that are ON-pixels, and the neighboring pixels are separated by, for instance, at least one but not more than two OFF-pixels, then it is possible that the series of three ON-pixels (the focal pixel and the two neighboring ON-pixels) represents a y-connection. In such a circumstance, the two neighboring ON-pixels are turned off.
After the image is thinned, the resulting contour representing each ridge includes the edges of the ridge (i.e. extremes). Extremes are undesirable because they are not a main part of the ridge but rather lie on the border of the ridge. As discussed before, the center portion of the ridge contains biological features that are unique to a living finger. Therefore, the end points on a contour (i.e. extremes) are eliminated. The spurs, which are tick marks along ridges (usually a one-bit or two-bit intrusion), are eliminated as well. In one embodiment, extremes of the ridges and the spurs are removed using a 2-pixel erosion in step 512. A 2-pixel erosion can use the same basic 3x3 non-overlapping neighbor process. Here, a focal pixel with only one neighbor is assumed to be an extreme or a spur of a ridge. Consequently, the focal pixel is turned off. If extremes or spurs of greater than one pixel are to be removed, then the process can be performed in an iterative manner, once for each additional pixel of the extreme or spur to be removed.
In one embodiment, curves shorter than fifteen pixels are discarded (for instance, using the 3x3 non-overlapping neighbor process described above). Since the nominal pore-to-pore distance is approximately 0.5 mm, spanning approximately ten pixels, a curve shorter than fifteen pixels is too short to capture pores and their vicinity adequately. At the end of this step, the contours roughly follow the center of the ridges of the original captured image and largely contain useful information.
The results from the last captured image are used as a mask in step 514. The mask is placed over the first and the last captured images, and the gray scales along the contours (i.e. the mask) of each of the first and last captured images are converted into signals (i.e. strings). The resulting contours, which traverse the middle of the ridges, have varying gray levels of the fingerprint image. It is noted that the peaks of the gray levels denote the moist (e.g. pore) locations and the valleys of the gray levels show the drier regions, usually between each pair of pores. Typically, there is regularity in the spacing of the pores in a live fingerprint signal. For example, the peak-to-peak distance in a live fingerprint signal is approximately 10 pixels, or 0.5 mm. This is opposed to a fingerprint signal from a dead finger or a plastic finger, which does not have a specific periodicity because a spoof does not have evenly spaced perspiring pores.
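The masking step can be sketched as follows; the contour input format (ordered (row, col) points per contour) and the function names are assumptions about what the contour-extraction stage produces.

```python
import numpy as np

def contour_strings(gray, contours):
    """Gray levels sampled along each contour of the mask. Each contour
    is assumed to be an ordered list of (row, col) points produced by
    the contour-extraction stage."""
    return [np.array([gray[r, c] for r, c in contour], dtype=float)
            for contour in contours]

def fingerprint_signal(gray, contours):
    """Concatenate the per-contour strings into the long fingerprint
    signal used by the temporal measures in steps 516 through 526."""
    return np.concatenate(contour_strings(gray, contours))
```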
In step 528, the periodicity of the gray levels on each string in the first captured image is measured by a Fourier transform. In one embodiment, the FFT (Fast Fourier Transform) is used because it reveals periodicities (e.g. relative variability) observed across different frequencies of an image. The first captured image is typically "patchy," meaning that when a dry finger is first placed on a fingerprint sensor, gray scale only shows up around the pore areas due to droplets of sweat forming around the pores, and the rest of the ridges may not show clearly. Thus, an FFT is performed on the first captured image to quantify the variability in gray level along the ridges due to the pores and the presence of perspiration.
The FFT is a well-known algorithm that is computationally much faster than a direct Fourier transform for large numbers of samples in the sequence. Any type of FFT algorithm can be used; for example, the mixed radix method, the radix-2 method, the decimation-in-time method and the Danielson-Lanczos Lemma method are all well known and readily available. In another embodiment, a 256-point FFT is performed. In one embodiment, prior to taking the FFT, the DC signal is removed to eliminate a spike near zero frequency. The patchiness of a fingerprint changes over time as the ridges become more uniform.
This is because sweat emanating from a pore spreads and migrates to drier parts of the finger. When the last image is taken after a time delay, gray scales typically show up all along the ridges due to moisture spreading towards drier parts of the finger. Thus, there is less variability, and an FFT is not performed for the last captured image. Instead, the last captured image is used to extract temporal changes relative to the first capture, which is discussed below.
It is noted that if the finger is already saturated with moisture when the first image is captured, then the first image will not exhibit variability in gray scale. In addition, the last image captured after a time delay will not exhibit sufficient signal change from the first captured image if the finger is initially saturated with moisture. Hence, the finger should not be moisture-saturated when the first image is captured.
For a live fingerprint, the signal representing the first captured image exhibits periodic peaks and valleys, the peaks having a particular spatial frequency. The peaks correspond to pore locations. Also in step 528, the total energy that corresponds to the spatial frequency of the pores is calculated. In general, spatial frequency is defined in cycles/pixel or cycles/mm. The total energy is the approximation of the area under the signal curve that results from the FFT. All the values in a predetermined window corresponding to spatial distances of approximately 0.4 mm to approximately 1.2 mm are added up, the result of which is the total energy. Spatial distances of 0.4 mm and 1.2 mm are selected because, as discussed above, the average pore-to-pore distance is approximately 0.5 mm. Assuming a ±0.1 mm variation, for example, the minimum spatial distance is approximately 0.4 mm. A pore may be missing, e.g. a pore that is not sweating. Therefore, the maximum distance between two sweating pores is approximately 1.2 mm.
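The band-energy computation can be sketched directly from these numbers. The sensor resolution below follows from the text's 0.5 mm ≈ 10 pixels; everything else (names, the use of NumPy's real FFT) is an illustrative assumption.

```python
import numpy as np

PIXELS_PER_MM = 20  # implied by 0.5 mm ~ 10 pixels in the text

def pore_band_energy(signal):
    """Total spectral energy in the band corresponding to pore spacings
    of roughly 0.4 mm to 1.2 mm; negligible energy in this band
    suggests a spoof with no pores."""
    x = np.asarray(signal, dtype=float)
    x -= x.mean()                       # drop DC to avoid the zero spike
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size)     # cycles per pixel
    lo = 1.0 / (1.2 * PIXELS_PER_MM)    # longest pore spacing, 1.2 mm
    hi = 1.0 / (0.4 * PIXELS_PER_MM)    # shortest pore spacing, 0.4 mm
    band = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[band].sum())
```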
As discussed before, because a spoof does not have pores, no pore spatial frequency is present for a spoof. Hence, the total energy for a spoof is of a negligible value, much less than that of a living finger.
In parallel to step 528 are steps 516 through 526, where temporal changes of a fingerprint are quantified. In step 516, the signals representing the contours are connected to form a long signal representing each of the first captured fingerprint and the last captured fingerprint. The local maximums and local minimums of the first and the last fingerprint signals are detected in step 518. Local maximums are all the different peaks in the long signal and local minimums are all the different valleys in the long signal. Because the long signal exhibits periodic ups and downs, there are multiple peaks and valleys within one signal. This is opposed to a global maximum and a global minimum, where only one peak and one valley are selected from the long signal.
The local maximums and the local minimums are used to calculate a series of parameters quantifying the changes caused by the sweating process over time. For live fingerprint signals, the local maximums are typically fairly constant, but the local minimums typically rise over time due to the diffusion of perspiration. In general, the pixels near the pores are relatively saturated while the areas between the pores are relatively dry when the first image is captured, creating fluctuation in the fingerprint signal. However, as time progresses and the sweat diffuses toward drier regions, the signal becomes more homogeneous (and thus the gray level varies less along the ridges). In one embodiment, the series of parameters includes the total swing ratio (step 520), the minimum/maximum growth ratio (step 522), the last-first fingerprint signal difference (step 524) and the percentage change of standard deviation of the first and the last fingerprint signals (step 526). These four parameters are chosen because they exhibit sufficient differences between a dead and a live fingerprint signal.
In step 520, the total swing ratio of the first fingerprint signal to the last fingerprint signal is calculated. In general, the swing for a live fingerprint is larger than that of a spoof (a fake or dead finger). In addition, the swing is generally smaller for the last captured image as compared to the first captured image. As discussed above, moisture begins mainly around the pores, creating peaks in the signal. As time progresses, the moisture gradually spreads along the ridges. Thus, the total swing decreases over time. Because a spoof does not have perspiring pores, this general trend is not present for a spoof fingerprint signal. The total swing ratio of the first captured fingerprint signal to the last captured fingerprint signal is calculated as follows:
D.M.1 = Σ|C1,i − C1,i−1| / Σ|C2,i − C2,i−1|    (1)
where C1,i and C2,i are the intensity values of the image at point i along the contour, C1 denotes the first capture, C2 denotes the last capture, and each sum is taken over i = 2, 3, ..., length(C2). It is noted that length(C1) is equal to length(C2) because both the first and the last captured images use the same mask.
In step 522, the minimum/maximum growth ratio of the first to the last fingerprint signal is calculated. As discussed above, for a live fingerprint signal, the height of the local maximums (peaks) increases at a slower rate than the height of the local minimums (valleys). Therefore, the average ratio of the local minimum growth to the local maximum growth is typically larger for a live fingerprint signal as compared to a spoof fingerprint signal. The minimum/maximum growth ratio of the first to the last fingerprint signal is calculated using the following equation:
D.M.2 = Σ(C2min,j − C1min,j) / Σ(C2max,i − C1max,i)    (2)
where j and i are the local minimum and local maximum location indexes extracted from the last capture C2 (the first capture C1 minimums/maximums are read from the same locations accordingly).
In step 524, the last-first fingerprint signal difference mean is calculated. Because a non-living finger does not perspire, the difference obtained by subtracting the first captured fingerprint signal from the last captured fingerprint signal exhibits a lower degree of change for a spoof than for a living finger. The last-first fingerprint signal difference mean is calculated as follows:
D.M.3 = (Σ(C2,i − C1,i)) / n,  i = 1, 2, ..., n    (3)
where n = length(C2) = length(C1).
The percentage change of standard deviations of the first and last fingerprint signals is calculated in step 526 as follows:
D.M.4 = (SD(C1) − SD(C2)) / SD(C1)    (4)
where SD is the standard deviation operator: SD(X) = (Σ(xi − mean(X))² / (n − 1))^(1/2), n = length(X).
If the signal fluctuation decreases around the mean, which is typical for a live fingerprint signal, then the percentage change of standard deviation rises. Hence, a higher value of the percentage change of standard deviations of the first and the last fingerprint signals indicates a living finger. The results from steps 520 through 528 are then fed into a back-propagation neural network classifier. In step 530, a decision on vitality is made in accordance with the results from the back-propagation neural network classifier. It is noted that the back-propagation neural network classifier can classify a fingerprint signal based on any one of these five measurements. However, a combination of measurements can produce better classification precision, and thus a more robust system, than any one measurement alone.
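The four temporal measures follow directly from equations (1) through (4). A minimal sketch, assuming the first and last fingerprint signals have already been sampled over the same mask; the simple neighbor-based extrema detector is an assumption (any peak picker would serve).

```python
import numpy as np

def local_extrema(x):
    """Indexes of local maxima and minima by a simple neighbor test."""
    maxima = [i for i in range(1, len(x) - 1) if x[i - 1] < x[i] > x[i + 1]]
    minima = [i for i in range(1, len(x) - 1) if x[i - 1] > x[i] < x[i + 1]]
    return maxima, minima

def vitality_measures(c1, c2):
    """D.M.1 through D.M.4 per equations (1)-(4); c1 and c2 are the
    first and last fingerprint signals sampled over the same mask."""
    c1 = np.asarray(c1, dtype=float)
    c2 = np.asarray(c2, dtype=float)
    dm1 = np.abs(np.diff(c1)).sum() / np.abs(np.diff(c2)).sum()
    maxima, minima = local_extrema(c2)  # extrema located on the last capture
    dm2 = ((c2[minima] - c1[minima]).sum()
           / (c2[maxima] - c1[maxima]).sum())
    dm3 = (c2 - c1).mean()
    dm4 = (c1.std(ddof=1) - c2.std(ddof=1)) / c1.std(ddof=1)  # (n-1) form
    return dm1, dm2, dm3, dm4
```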
A neural network is a software (or hardware) simulation of a biological brain. One advantage a neural network has over digital computers is the ease of taking into account high-order statistical relationships of stochastic data. More specifically, a neural network learns to recognize patterns in input data by utilizing the principle of reward and punishment, back-propagating the information needed to dynamically alter the structure of its interconnections and the strengths, or weights, of those interconnections. Once the neural network has been trained on samples of input data, it can make predictions by detecting similar patterns in future data. The architecture of a back-propagation neural network consists of two major components: nodes and the connections between the nodes. The back-propagation neural network is a multi-layered network, with the output from one layer serving as input to the next. The layers with no external output connections are referred to as hidden layers. A back-propagation network with a single hidden layer consists of three layers of nodes (input, hidden and output) and full interconnection between the input and hidden layers and between the hidden and output layers. Residing on these connections are weights that multiply signals passing over the connection. The product of the signal magnitude and weight is summed over all connections leading to a particular node to give the initial output for that neuron (the actual result is dictated by the activation function chosen for the node). In general, the number of hidden layer neurons should be approximately one-half of the number of input layer nodes.
Training begins with all weights set to random numbers. For each data record, the predicted value is compared to the desired (actual) value and the weights are adjusted to move the prediction closer to the desired value. Many cycles are made through the entire set of training data, with the weights being continually adjusted to produce more accurate predictions. In one embodiment, a back-propagation neural network (a three-layer perceptron) with sigmoid non-linearity is used.
Training a neural network requires the use of the same parameters that will be used to make the classification. Training involves, for example, different data sets of the five parameters from subjects in each classification. For instance, one classification may be a living finger, one classification may be a dead finger, and one classification may be a plastic finger. In one instance, twelve data sets are collected from each class. According to one embodiment, the training algorithm uses gradient descent in conjunction with batch input-output training vectors to classify the input cases as live or spoof. In another embodiment, bipolar targets (+1, −1) are chosen to denote live and spoof, respectively.
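One way to realize such a classifier is sketched below with scikit-learn's MLPClassifier, an assumed off-the-shelf substitute for the specification's own trainer; the training data shown are random placeholders, and the hidden-layer size of three follows the roughly-half-the-inputs rule of thumb above.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Placeholder training data: rows of the five measurements
# (D.M.1-D.M.4 plus the pore-band total energy), e.g. twelve
# data sets per class, with bipolar labels (+1 live, -1 spoof).
X_train = np.random.rand(24, 5)
y_train = np.array([1, -1] * 12)

clf = MLPClassifier(hidden_layer_sizes=(3,),  # ~ half the input nodes
                    activation='logistic',    # sigmoid non-linearity
                    solver='sgd',             # gradient descent training
                    max_iter=2000)
clf.fit(X_train, y_train)

def classify_vitality(measures):
    """Return +1 (live) or -1 (spoof) for one set of five measures."""
    return int(clf.predict(np.asarray(measures, float).reshape(1, -1))[0])
```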
If the output of the back-propagation neural network indicates that the fingerprint is from a living finger (step 532), then the system passes the fingerprint as coming from a living finger (step 540) and the process ends in step 544. However, if the output of the back-propagation neural network indicates that the fingerprint is from a spoof (step 532), then it is determined whether the number of trials is equal to a predetermined number, e.g. 3 (step 534).
If the number of trials has reached the predetermined number, then the system indicates that the fingerprint failed and cannot be accepted as coming from a live person (step 542). The process ends in step 544. It is noted that each time input data travels through the back-propagation neural network, a counter recording the number of trials is incremented by one. The counter is reset in end step 544.
If the number of trials has not reached the predetermined number, then the system prompts the user to wipe his finger and try again in step 536. The process repeats by returning to the start step (step 538).
INVERTED SPOOF DETECTION TECHNIQUES
FIG. 8 depicts a method for distinguishing between a real fingerprint and an inverted spoof of the fingerprint. An inverted spoof is created when the ridges and valleys of an enrolled fingerprint are essentially turned inside-out. The inverted spoof is a fairly simple technique for fooling a fingerprint capture system. Indeed, it can be particularly powerful: turning the contour of the finger inside-out maintains the ridges and valleys, or at least their transition points (endpoints and bifurcations), which is what is compared according to an embodiment of the system.
In order to detect an inverted fingerprint, characteristics of the transition points are analyzed and compared between the enrolled fingerprint data and the captured fingerprint data. More particularly, in the embodiment described below, when a fingerprint contour is inverted, the bifurcations become endpoints and vice versa; the system, accordingly, performs a statistical analysis on the ratio of bifurcations to endpoints, with respect to the matching minutiae, to detect the inversion.
In step 804, the fingerprint sensor captures an image of the applied object (the finger). In step 808, minutiae are compared between the captured image and an enrolled image. A test is performed in step 812 to determine whether a match was found in step 808. If a match was found then, in step 816, a match counter is incremented (once for each match if multiple minutiae are compared in step 808).
In step 820, the matching minutiae are classified by type by extracting information from the image. Classifying each minutia by type means marking it as an endpoint or a bifurcation. If the matching minutiae of the enrolled image have not already been classified, then they are classified too. According to one embodiment, to classify the minutiae by type, the 3x3 non-overlapping neighbor technique, described above, can be employed. Y-connections (bifurcations) can be extracted by examining at least three ON-pixels in the proximity of the minutia, while endpoints can be extracted by examining at least two ON-pixels in the proximity of the minutia. If the image has been filtered in accordance with the process described above with reference to FIG. 7, then such information can be extracted at that point, rather than in a separate step as is depicted in FIG. 7.
In step 824, a test is performed to determine whether the captured and enrolled minutiae are of the same type. If they are not, then a type mismatch counter is incremented in step 828. If they are, or after step 828, then a next test (step 832) determines whether there are more minutiae that need to be compared. If there are more minutiae to be compared, then a next minutia is retrieved in step 836 and processing continues to step 808. If there are no more minutiae to be compared, then in step 840 the ratio of the type mismatch counter value to the match counter value is calculated.
A test is performed in step 844 to determine whether the calculated ratio exceeds a threshold value. According to one embodiment, the threshold value is 50%. This is because the inventors have found that, due to potential non-uniformities in, for instance, a capacitive fingerprint sensor, and/or environmental conditions, a number of minutiae will be misclassified, in particular where the pixel intensity at or near an endpoint or bifurcation is just below the ON-pixel threshold value. The threshold value can, however, be reduced to a much lower tolerance if proper filtering or more ideal environmental conditions are present. If the ratio exceeds the threshold, then in step 848 the captured image is rejected as an inverted spoof. Otherwise, in step 852, the captured image is identified as not being an inverted spoof and the captured image is either accepted, or other security constraints are enforced.
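The ratio test reduces to a short sketch. The input format (dicts mapping a minutia identifier to its type for the matching minutiae) and the function names are assumptions for illustration; the 50% default follows the embodiment above.

```python
def type_mismatch_ratio(captured, enrolled):
    """Ratio of the type mismatch count to the match count. captured
    and enrolled are assumed dicts mapping a minutia identifier to
    its type, 'endpoint' or 'bifurcation'."""
    matches = mismatches = 0
    for key, ctype in captured.items():
        if key in enrolled:            # a matching minutia
            matches += 1
            if ctype != enrolled[key]:
                mismatches += 1        # bifurcation <-> endpoint swap
    return mismatches / matches if matches else 0.0

def is_inverted_spoof(captured, enrolled, threshold=0.5):
    """Reject as an inverted spoof when more than the threshold
    fraction of matching minutiae change type."""
    return type_mismatch_ratio(captured, enrolled) > threshold
```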
ADDITIONAL TECHNIQUES
Additional tests may be executed to deal with varying environmental conditions. In one embodiment, the rate of effects under different environmental conditions is compared with the rate of effects under ambient conditions, such as temperature and humidity, for adjustment of certain predetermined parameters. For example, the rate of heating in freezing temperatures and the rate of heating in extreme heat differ from the rate of heating at ambient temperature. Likewise, the rate of heating in dry weather and the rate of heating in humid weather differ from the rate of heating under ambient conditions. Because the rate of heating varies under different environmental conditions, the minimum time and the maximum time for determining whether a fingerprint is coming from a living finger, for example, are modified according to the ambient conditions. For example, in cold weather, instead of a minimum time of 1.2 seconds, the system waits for 3 seconds, and instead of a maximum time of 3 seconds, the system allows 5 seconds to determine whether the finger is alive.
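The adjustment can be as simple as a lookup table keyed on the measured ambient condition; the ambient and cold values below come from the example above, while the other entries and all names are assumptions.

```python
# (min_time, max_time) in seconds for the liveness decision window.
TIMING = {
    "ambient": (1.2, 3.0),  # values from the example above
    "cold":    (3.0, 5.0),  # values from the example above
    "hot":     (1.0, 2.5),  # assumed value
    "humid":   (1.5, 3.5),  # assumed value
}

def decision_window(condition):
    """Minimum and maximum observation times, adjusted for the
    measured ambient condition; fall back to ambient defaults."""
    return TIMING.get(condition, TIMING["ambient"])
```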
In one embodiment, pore placement is measured along the ridges to detect whether the pore pattern matches that of the true enrolled finger or a previously successfully verified finger. Pore placement can be measured directly or by the locations of the sweat droplets, as described above. More specifically, in one embodiment, pores are detected by detecting the peaks of the sinusoidal curve. Since the pores are very shallow and small, their fine features are not easily replicated by a spoof. Hence, pore placement may serve as a test for a true, versus a fake, finger.
In another embodiment, a global measurement of the texture of a finger is performed and the result compared with the texture of the finger measured during enrollment or during a previous successful verification. The skin of a living finger has very fine features that cannot be easily reconstituted in a rendition of that finger. These fine features may be, for instance, measured and quantified by using a Fourier transform and examining the high-frequency details. A living finger, in general, has higher-frequency details than a plastic finger. The texture measured may include, but is not limited to, pores and other small features that cannot be readily and reliably reproduced. In one embodiment, the textures may be measured using a device that measures very fine features.
According to another embodiment, a biomedical measurement device is used to measure certain characteristics of a living finger, such as, but not limited to, skin resistance, temperature, pulse-oximetry (blood oxygen measured by absorption of near-infrared light and red light), electrocardiogram (electrical potential changes of cardiac activity versus time), laser Doppler measurement of blood flow, x-ray, pressure and other physiological vitality indicators. These measurements are compared with corresponding measurements taken during enrollment or during a previous successful verification.
In still another embodiment, the system is a multi-modal system which requires more than one biometric measurement, such as, but not limited to, fingerprint, voice, handwriting, retina, etc. By using a multi-modal system, an attacker must obtain multiple tools to manufacture different modes of spoof, thus making a spoof attack more difficult.
According to another embodiment, the system requests a user to put the same finger down on the capture device twice. A sequence of images is captured for each application. The first and the second sequences of images are then compared. It is noted that the fingerprint images from a living finger exhibit some hysteresis as the characteristics change. In particular, the first image of the second sequence exhibits characteristics closer to those of the last image of the first sequence. The first image of the second sequence of images from a spoof, on the other hand, displays characteristics that are closer to those of the first image of the first sequence captured from the initial application. This difference is due to the fact that a finger has an elastic surface; the surface of the finger stretches and varies under different conditions. Hence, the closer in time the fingerprints are taken, the closer the images. In contrast, a molded finger typically produces identical fingerprints on each application. Therefore, by comparing the images from two different applications, the system is able to determine whether the applied object is a living finger.
Although the invention has been described with reference to particular embodiments, the description is only an example of the invention's application and should not be taken as a limitation. For example, other types of classifiers may be used to classify the fingerprint signals. For instance, a classifier may be devised using the median values derived from a training set to determine a threshold. The classifier may then accept or reject a fingerprint signal based on the threshold value. Various other adaptations and combinations of features of the embodiments disclosed are within the scope of the invention as defined by the following claims.
In one embodiment, the system can include an algorithm that randomly selects and requests one or more fingers for measurement, the maximum allowable number of fingers requested being ten. Randomly requesting a different and varying number of fingers makes an attack more difficult because it is more difficult for an attacker to manufacture multiple spoofs. Furthermore, any of the particular spoof detection techniques described above can also be employed, on a case-by-case basis, to detect a spoof. Since the selection of the finger and/or algorithm is random, it adds unpredictability (with respect to potential spoofers), thus making the system more resistant to attacks.
CLAIMS
What is claimed is:
1. A biometric sensing system comprising: an image capture device configured to sample an applied object and create an electrical representation of the applied object; and a spoof detection module configured to analyze the electrical representation of the applied object for relative intensity, density, geometric, or temporal anomalies indicative of a non-living applied object.
2. The biometric sensing system of claim 1, wherein the spoof detection module employs an average intensity technique to detect and classify the anomalies, the average intensity technique configured to cause the system to capture a plurality of images of the applied object, and calculate an average intensity for each of the plurality of captured images.
3. The biometric sensing system of claim 2, wherein the average intensity technique is further configured to cause the system to reject the applied object as a spoof when the average intensity does not increase monotonically over the plurality of images.
4. The biometric sensing system of any of the above claims, wherein the spoof detection module employs a pixel density technique to detect and classify the anomalies, the pixel density technique configured to cause the system to capture a plurality of images of the applied object, determine an ON-pixel value based upon a first captured image, determine a pixel count for each image in the plurality of captured images, wherein the counted pixels exceed the ON-pixel value, and calculate a delta pixel count value over the plurality of images.
5. The biometric sensing system of claim 4, wherein the pixel density technique is further configured to cause the system to reject the applied object as a spoof when the delta pixel count does not increase monotonically over the plurality of images.
6. The biometric sensing system of any of the above claims, wherein the spoof detection module employs a ridge uniformity technique to detect and classify the anomalies, the ridge uniformity technique configured to cause the system to capture a plurality of images of the
applied object, measure pixel intensity along ridges in each of the plurality of captured images, determine whether the pixel intensity increases in a spatially non-uniform manner, and reject the applied object as a spoof when the pixel intensity does not increase in the spatially non-uniform manner.
7. The biometric sensing system of any of the above claims, wherein the spoof detection module employs a ridge uniformity technique to detect and classify the anomalies, the ridge uniformity technique configured to cause the system to capture a plurality of images of the applied object, measure pixel intensity along ridges in each of the plurality of captured images, determine whether the pixel intensity increases in a spatially non-uniform manner, and reject the applied object as a spoof when the pixel intensity does not increase in the spatially non-uniform manner.
8. The biometric sensing system of any of the above claims, wherein the spoof detection module employs a ridge uniformity technique to detect and classify the anomalies, the ridge uniformity technique configured to cause the system to capture a plurality of images of the applied object, measure pixel intensity values along contours of ridges in each of the plurality of captured images, binarize the pixel intensity values, measure pixel intensity variations along the ridges, and reject the applied object as a spoof when the measured pixel intensity variations are not roughly sinusoidal.
9. The biometric sensing system of any of the above claims, wherein the spoof detection module employs a water droplet differential technique to detect and classify the anomalies, the water droplet differential technique configured to cause the system to capture an image of an applied object, locate a first water droplet positioned within the image, capture a subsequent image of the applied object, locate a like-positioned water droplet within the subsequently captured image, compare a size of the first water droplet with a size of the like-positioned water droplet, and reject the applied object as a spoof when the size of the like-positioned water droplet is smaller than the size of the first water droplet.
10. The biometric sensing system of any of the above claims, wherein the spoof detection module employs a fingerprint vitality technique to detect and classify the anomalies, the fingerprint vitality technique configured to cause the system to capture a plurality of images
of an applied object, digitally process the plurality of captured images, form a fingerprint signal representative of fingerprint strings from each of the plurality of images, compare changes between the fingerprint signal corresponding to an initial image in the plurality of images and a fingerprint signal from a subsequently captured image in the plurality of images, and reject the applied object as a spoof when the changes exceed a threshold amount.
11. The biometric sensing system of claim 10, wherein the fingerprint vitality technique is further configured to cause the system to compare a total swing ratio of the fingerprint signal of the initial image and the fingerprint signal of the subsequently captured image.
12. The biometric sensing system of claim 10 or 11, wherein the fingerprint vitality technique is further configured to cause the system to compare a minimum or a maximum growth ratio as between the fingerprint signal of the initial image and the fingerprint signal of the subsequently captured image.
13. The biometric sensing system of claim 10, 11 or 12, wherein the fingerprint vitality technique is further configured to cause the system to compare a last to first fingerprint signal difference mean between the plurality of images.
14. The biometric sensing system of claim 10, 11, 12 or 13, wherein the fingerprint vitality technique is further configured to cause the system to compare a percentage change of standard deviations between the fingerprint signal of the initial image and the fingerprint signal of the subsequently captured image.
15. The biometric sensing system of any of claims 10-14, further comprising a neural network, wherein the neural network is configured to perform the steps of comparing.
16. The biometric sensing system of any of the above claims, wherein the spoof detection module is configured to extract minutia type information from the electrical representation, compare minutia type information with information corresponding to an enrolled object, calculate a ratio of mismatched minutia type information to matching minutia information, and reject the applied object as an inverted spoof when the ratio exceeds a threshold type mismatch ratio.
17. The biometric sensing system of any of the above claims, further comprising a minutia matching module configured to compare minutiae extracted from the electrical representation of the applied object with minutiae of an enrolled object.
18. The biometric sensing system of any of the above claims, wherein the image capture device is a capacitive fingerprint sensor.
19. A computer implemented method for detecting a spoof of a living finger, comprising: receiving one or more electrical representations representative of a plurality of images of an applied object; and analyzing the one or more electrical representations for relative intensity, density, or geometric anomalies indicative of a non-living applied object.
20. The method of claim 19, wherein the step of analyzing comprises: calculating an average intensity for each of the plurality of images; and rejecting the applied object as a spoof when the average intensity, as sequentially measured over the plurality of images, does not increase monotonically.
21. The method of claim 20, further comprising: calculating an average change in the average intensity of the plurality of images; and rejecting the applied object as a spoof when the average change in the average intensity is below a threshold average change in the average intensity value.
22. The method of claim 19, 20 or 21, wherein the step of analyzing comprises: selecting a portion of a first image in the plurality of images; determining an ON-pixel value based upon intensity values in the portion of the first image; counting a number of pixels exceeding the ON-pixel value in the portion of the first image; repeating the above steps of selecting, determining, and counting for a next image in the plurality of images; calculating an average density value between the first image and the next image; and
rejecting the applied object as a spoof when the average density value is not increasing monotonically.
23. The method of claim 22, further comprising: calculating an average change in average density between the plurality of images; and rejecting the applied object as a spoof when the average change in average density is less than a threshold average change in average density value.
24. The method of claim 19, 20, 21, 22 or 23, wherein the step of analyzing comprises: measuring intensity along ridges in each of the plurality of images; comparing the intensity along the ridges from each captured image with the intensity along the ridges from a coterminously captured image; determining whether the intensity along the ridges increases in a spatially non-uniform manner; and rejecting the applied object as a spoof when the intensity along the ridges does not increase in a spatially non-uniform manner.
25. The method of claim 19, 20, 21, 22, 23 or 24, wherein the step of analyzing comprises: comparing the electrical representations of each of the plurality of images for changes; comparing maximum signal values in the electrical representations when there are insignificant changes between the electrical representations; rejecting the applied object as a spoof when the electrical representation holds signal values indicative of insignificant changes between successive images.
26. The method of claim 19, 20, 21, 22, 23, 24 or 25, wherein the step of analyzing comprises: locating water droplets in each of the plurality of images; comparing a size of the water droplet in each of the plurality of images with a size of the water droplet in a subsequently captured image in the plurality of images; rejecting the applied object as a spoof when the water droplet size is static or decreases, over time, as represented in the plurality of images.
27. The method of claim 19, 20, 21, 22, 23, 24, 25 or 26, wherein the step of analyzing comprises: digitally processing a first electrical representation; saving the digitally processed electrical representation as a mask; applying the mask over subsequent electrical representations; converting the result of the mask of the subsequent electrical representations into fingerprint strings; connecting the fingerprint strings into a fingerprint signal for each image; analyzing the fingerprint signal for anomalies; and rejecting the fingerprint signal when the anomalies are not indicative of a living finger.
28. The method of claim 27, wherein analyzing the fingerprint signal for anomalies comprises calculating a total swing ratio of a first fingerprint signal to a last fingerprint signal; and rejecting the fingerprint signal when the total swing ratio is not indicative of a living finger.
29. The method of claim 27 or 28, wherein analyzing the fingerprint signal for anomalies comprises: calculating a minimum or maximum growth ratio as measured between a first fingerprint signal and a last fingerprint signal; and rejecting the fingerprint signal when the growth ratio is not indicative of a living finger.
30. The method of claim 27, 28 or 29, wherein analyzing the fingerprint signal for anomalies comprises calculating a first fingerprint signal to last fingerprint signal difference mean; and rejecting the fingerprint signal when the signal difference mean is not indicative of a living finger.
31. The method of claim 27, 28, 29 or 30,
wherein analyzing the fingerprint signal for anomalies comprises calculating a percentage change of standard deviation between a first fingerprint signal and a last fingerprint signal; and rejecting the fingerprint signal when the percentage change is not indicative of a living finger.
32. The method of claim 27, 28, 29, 30 or 31, further comprising: calculating a spatial frequency of peaks in the fingerprint strings; calculating a total energy for the fingerprint strings, based on the spatial frequency; and rejecting the fingerprint signal as a spoof when the total energy is below a threshold total energy.
33. The method of claim 27, 28, 29, 30, 31 or 32, further comprising, prior to rejecting the fingerprint signal as a spoof, sending the anomalies to a neural network for classification.
34. The method of claim 19-32 or 33, wherein the step of analyzing the one or more electrical representations comprises: extracting minutia type information from the electrical representation; determining whether the minutia type information matches a minutia type of a matching minutia from an enrolled finger; calculating a ratio of mismatched minutia types to matching minutiae; and rejecting the fingerprint signal as a spoof when the ratio of mismatched minutia types to matching minutiae exceeds a threshold mismatch value.
35. The method of claim 19-33 or 34, further comprising: capturing the plurality of images of the applied object with a fingerprint sensor; and converting the plurality of images into the one or more electrical representations of the applied object.
36. The method of claim 35, wherein the plurality of images is captured with a capacitive fingerprint sensor.
37. The method of claim 19-35 or 36, further comprising matching minutiae extracted from the one or more electrical representations with minutiae from an enrolled finger.
38. A computer software product having stored therein one or more sequences of instructions for causing one or more processors to perform any of the steps recited in claims 19 through 37.
PCT/US2000/027782 1999-10-07 2000-10-06 Spoof detection for biometric sensing systems WO2001024700A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU13305/01A AU1330501A (en) 1999-10-07 2000-10-06 Spoof detection for biometric sensing systems

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US15842399P 1999-10-07 1999-10-07
US60/158,423 1999-10-07

Publications (1)

Publication Number Publication Date
WO2001024700A1 true WO2001024700A1 (en) 2001-04-12

Family

ID=22568053

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/027782 WO2001024700A1 (en) 1999-10-07 2000-10-06 Spoof detection for biometric sensing systems

Country Status (2)

Country Link
AU (1) AU1330501A (en)
WO (1) WO2001024700A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0372748A2 (en) * 1988-11-25 1990-06-13 Fujitsu Limited Fingerprint-detecting device
JPH0353385A (en) * 1989-07-21 1991-03-07 Nippon Denki Sekiyuritei Syst Kk Feature extracting device
US5587533A (en) 1990-11-07 1996-12-24 Ultra-Scan Corporation Surface feature mapping using high resolution C-scan ultrasonography
US5719950A (en) 1994-03-24 1998-02-17 Minnesota Mining And Manufacturing Company Biometric, personal authentication system
JPH10290796A (en) * 1997-04-18 1998-11-04 Nec Corp Method and device for identifying living body

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 015, no. 205 (P - 1206) 27 May 1991 (1991-05-27) *
PATENT ABSTRACTS OF JAPAN vol. 1999, no. 02 26 February 1999 (1999-02-26) *

Cited By (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001065470A1 (en) * 2000-02-29 2001-09-07 Infineon Technologies Ag Method for analysing fingerprint images
US7133541B2 (en) 2000-02-29 2006-11-07 Infineon Technologies Ag Method for analyzing fingerprint images
US6944491B2 (en) 2000-08-01 2005-09-13 L'oreal Method of acquiring an image of a non-dermatoglyphic zone of the skin or of a zone of the hair by means of acquisition apparatus including a non-optical sensor
US7489960B2 (en) 2000-08-01 2009-02-10 L'oreal Method of acquiring an image of a non-dermatoglyphic zone of the skin or of a zone of the hair by means of acquisition apparatus including a non-optical sensor
WO2002093462A1 (en) * 2001-05-14 2002-11-21 Infineon Technologies Ag Method for detecting falsity in fingerprint recognition by classifying the texture of grey-tone differential values
US7215798B2 (en) 2001-05-14 2007-05-08 Infineon Technologies Ag Method for forgery recognition in fingerprint recognition by using a texture classification of gray scale differential images
EP1302908A2 (en) * 2001-10-09 2003-04-16 BMF Corporation Verification techniques for biometric identification systems
EP1302908A3 (en) * 2001-10-09 2004-06-09 BMF Corporation Verification techniques for biometric identification systems
EP1358842A1 (en) * 2002-04-09 2003-11-05 Omron Corporation Apparatus for measuring physiological data and exercise machines
US7212659B2 (en) 2002-04-09 2007-05-01 Omron Corporation Apparatus for measuring biological data and exercise machines
EP1545300A1 (en) * 2002-08-22 2005-06-29 Smiths Medical PM, Inc. Combination fingerprint and oximetry device
EP1545300A4 (en) * 2002-08-22 2009-04-22 Smiths Medical Pm Inc Combination fingerprint and oximetry device
US7362888B2 (en) 2002-09-13 2008-04-22 Fujitsu Limited Living-body detecting apparatus and method, and authenticating apparatus having living-body detecting function
EP1530949A1 (en) * 2002-09-13 2005-05-18 Fujitsu Limited Biosensing instrument and method and identifying device having biosensing function
WO2004023999A1 (en) 2002-09-13 2004-03-25 Fujitsu Limited Biosensing instrument and method and identifying device having biosensing function
EP1530949A4 (en) * 2002-09-13 2007-11-14 Fujitsu Ltd Biosensing instrument and method and identifying device having biosensing function
EP1403810A2 (en) * 2002-09-27 2004-03-31 Nec Corporation Fingerprint authentication with input image validation based on spatial frequency analysis
EP1403810A3 (en) * 2002-09-27 2004-05-26 Nec Corporation Fingerprint authentication with input image validation based on spatial frequency analysis
US7330572B2 (en) 2002-09-27 2008-02-12 Nec Corporation Fingerprint authentication method, program and device capable of judging inexpensively whether input image is proper or not
WO2004061758A1 (en) * 2002-12-20 2004-07-22 Sagem S.A. Method of determining the living character of an element bearing a fingerprint
US7657066B2 (en) 2002-12-20 2010-02-02 Sagem Securite Method of determining the living character of an element carrying a fingerprint
CN100388302C (en) * 2002-12-20 2008-05-14 萨甘股份有限公司 Method of determining the living character of an element bearing a fingerprint
FR2849246A1 (en) * 2002-12-20 2004-06-25 Sagem Fingerprint sensor operation method in which additional impedance measurements are used with a digital fingerprint image to determine whether the object placed on the sensor is a living finger or an imitation
EP1434163A2 (en) * 2002-12-26 2004-06-30 Kabushiki Kaisha Toshiba Apparatus and method for acquisition, validation and verification of biometric features
EP1434163A3 (en) * 2002-12-26 2006-06-07 Kabushiki Kaisha Toshiba Apparatus and method for acquisition, validation and verification of biometric features
EP1438922A1 (en) * 2003-01-14 2004-07-21 L'oreal Device and method for evaluating the hydration of skin and mucosa
FR2849763A1 (en) * 2003-01-14 2004-07-16 Oreal To evaluate the hydration of skin, and the like, a sensor with capacitative cells is connected to a micro-computer through an interface to give an image, with a link to a remote server
FR2849764A1 (en) * 2003-01-14 2004-07-16 Oreal To evaluate the hydration of skin, and the like, a sensor with capacitative cells is connected to a micro-computer through an interface to give an image, with a link to a remote server
US7402135B2 (en) 2003-01-14 2008-07-22 L'oreal Apparatus and method to evaluate hydration of the skin or the mucous membranes
WO2004084025A3 (en) * 2003-03-14 2004-11-25 Elo Touchsystems Inc Water tolerant touch sensor
WO2004084025A2 (en) * 2003-03-14 2004-09-30 Elo Touchsystems, Inc. Water tolerant touch sensor
US7116315B2 (en) 2003-03-14 2006-10-03 Tyco Electronics Corporation Water tolerant touch sensor
DE10319281B4 (en) * 2003-04-29 2006-06-29 United Business Gmbh & Co. Consulting Kg Apparatus and method for detecting the life of a body part in a biometric test method
DE10319281A1 (en) * 2003-04-29 2004-12-23 United Business Gmbh & Co. Consulting Kg Device and method for the live detection of a body part in a biometric test method
WO2006087656A1 (en) * 2005-02-18 2006-08-24 Koninklijke Philips Electronics N.V. Method and apparatus for biometric analysis of sweat ducts
EP1818084A1 (en) * 2006-02-13 2007-08-15 Precor Incorporated User identification for fitness trainers
EP1872719A1 (en) * 2006-06-27 2008-01-02 TBS Holding AG Liveness detection method
EP1988489A1 (en) * 2007-05-04 2008-11-05 STMicroelectronics (Research & Development) Limited Biometric sensor apparatus and method
US8031912B2 (en) 2007-05-04 2011-10-04 Stmicroelectronics (Research & Development) Limited Biometric sensor apparatus and method
US20090245596A1 (en) * 2008-03-27 2009-10-01 Fujitsu Limited Authentication apparatus and authentication method
US8520914B2 (en) * 2008-03-27 2013-08-27 Fujitsu Limited Authentication apparatus and authentication method
US11426103B2 (en) 2008-07-03 2022-08-30 Masimo Corporation Multi-stream data collection system for noninvasive measurement of blood constituents
US11751773B2 (en) 2008-07-03 2023-09-12 Masimo Corporation Emitter arrangement for physiological measurements
US11647914B2 (en) 2008-07-03 2023-05-16 Masimo Corporation User-worn device for noninvasively measuring a physiological parameter of a user
US11642037B2 (en) 2008-07-03 2023-05-09 Masimo Corporation User-worn device for noninvasively measuring a physiological parameter of a user
US11642036B2 (en) 2008-07-03 2023-05-09 Masimo Corporation User-worn device for noninvasively measuring a physiological parameter of a user
US11638532B2 (en) 2008-07-03 2023-05-02 Masimo Corporation User-worn device for noninvasively measuring a physiological parameter of a user
US11484230B2 (en) 2008-07-03 2022-11-01 Masimo Corporation User-worn device for noninvasively measuring a physiological parameter of a user
US11484229B2 (en) 2008-07-03 2022-11-01 Masimo Corporation User-worn device for noninvasively measuring a physiological parameter of a user
US8605961B2 (en) 2009-03-30 2013-12-10 Motorola Mobility Llc Method and apparatus for determining a physiological parameter using a fingerprint sensor on a portable electronic device
WO2010117542A1 (en) * 2009-03-30 2010-10-14 Motorola, Inc. Method and apparatus for determining a physiological parameter using a fingerprint sensor on a portable electronic device
US11754902B2 (en) 2009-12-22 2023-09-12 View, Inc. Self-contained EC IGU
US9144389B2 (en) 2010-03-12 2015-09-29 Tata Consultancy Services Limited System for vehicle security, personalization and cardiac activity monitoring of a driver
WO2011111056A1 (en) * 2010-03-12 2011-09-15 Tata Consultancy Services Limited A system for vehicle security, personalization and cardiac activity monitoring of a driver
US11681197B2 (en) 2011-03-16 2023-06-20 View, Inc. Onboard controller for multistate windows
US11687045B2 (en) 2012-04-13 2023-06-27 View, Inc. Monitoring sites containing switchable optical devices and controllers
US11868103B2 (en) 2014-03-05 2024-01-09 View, Inc. Site monitoring system
US11733660B2 (en) 2014-03-05 2023-08-22 View, Inc. Monitoring sites containing switchable optical devices and controllers
US11579571B2 (en) 2014-03-05 2023-02-14 View, Inc. Monitoring sites containing switchable optical devices and controllers
US11892737B2 (en) 2014-06-30 2024-02-06 View, Inc. Control methods and systems for networks of optically switchable windows during reduced power availability
US9633269B2 (en) 2014-09-05 2017-04-25 Qualcomm Incorporated Image-based liveness detection for ultrasonic fingerprints
US9953233B2 (en) 2014-09-05 2018-04-24 Qualcomm Incorporated Multi-stage liveness determination
US9639765B2 (en) 2014-09-05 2017-05-02 Qualcomm Incorporated Multi-stage liveness determination
WO2016037078A1 (en) * 2014-09-05 2016-03-10 Qualcomm Incorporated Multi-stage liveness determination
US9547083B2 (en) 2014-09-12 2017-01-17 Qualcomm Incorporated Determining whether a target object was present during a scanning operation
WO2016040898A1 (en) * 2014-09-12 2016-03-17 Qualcomm Incorporated Method and system for determining whether a target object was present during a scanning operation
US11948015B2 (en) 2014-12-08 2024-04-02 View, Inc. Multiple interacting systems at a site
US11740948B2 (en) 2014-12-08 2023-08-29 View, Inc. Multiple interacting systems at a site
KR102398972B1 (en) 2022-05-17 주식회사 리얼아이덴티티 Perspiration detecting apparatus, detecting method and authenticating apparatus using the same
KR20160108054A (en) * 2015-03-06 2016-09-19 주식회사 리얼아이덴티티 Perspiration detecting apparatus, detecting method and authenticating apparatus using the same
EP3309709A4 (en) * 2015-08-14 2018-07-18 Fujian Landi Commercial Equipment Co., Ltd Fingerprint identification method and system
CN107844754A (en) * 2015-10-19 2018-03-27 广东欧珀移动通信有限公司 A kind of decision method and device of Terminal fingerprints sensor surface foreign matter
US10628653B2 (en) 2015-10-19 2020-04-21 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for determining existence of foreign material on surface of fingerprint sensor of terminal
EP3312763A4 (en) * 2015-10-19 2018-08-15 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method, apparatus, and mobile terminal for determining foreign material on surface of fingerprint sensor
CN109923482B (en) * 2016-10-03 2022-11-01 唯景公司 Site monitoring system
CN109923482A (en) * 2016-10-03 2019-06-21 唯景公司 Website monitors system
WO2018151646A1 (en) * 2017-02-17 2018-08-23 Fingerprint Cards Ab Enabling identification of fingerprints from captured images using contour points
TWI597671B (en) * 2017-03-01 2017-09-01 映智科技股份有限公司 Method and circuit for fingerprint detection
EP3622437A4 (en) * 2017-05-11 2021-01-27 Fingerprint Cards AB Post verification fingerprint image capture
CN109409054A (en) * 2017-08-15 2019-03-01 南昌欧菲生物识别技术有限公司 Terminal and its unlocked by fingerprint method and apparatus
US11580775B2 (en) 2017-10-18 2023-02-14 Fingerprint Cards Anacatum Ip Ab Differentiating between live and spoof fingers in fingerprint analysis by machine learning
WO2019078769A1 (en) * 2017-10-18 2019-04-25 Fingerprint Cards Ab Differentiating between live and spoof fingers in fingerprint analysis by machine learning
CN110705411B (en) * 2019-09-23 2022-06-21 Oppo广东移动通信有限公司 Fingerprint identification method and device, equipment and storage medium
CN110705411A (en) * 2019-09-23 2020-01-17 Oppo广东移动通信有限公司 Fingerprint identification method and device, equipment and storage medium
US11750594B2 (en) 2020-03-26 2023-09-05 View, Inc. Access and messaging in a multi client network
US11882111B2 (en) 2020-03-26 2024-01-23 View, Inc. Access and messaging in a multi client network
WO2021222917A1 (en) * 2020-04-29 2021-11-04 Qualcomm Incorporated Fingerprint sensor with detection of latent fingerprints
US11631493B2 (en) 2020-05-27 2023-04-18 View Operating Corporation Systems and methods for managing building wellness
CN114511007A (en) * 2022-01-17 2022-05-17 上海梦象智能科技有限公司 Non-invasive electrical fingerprint identification method based on multi-scale feature perception

Also Published As

Publication number Publication date
AU1330501A (en) 2001-05-10

Similar Documents

Publication Publication Date Title
WO2001024700A1 (en) Spoof detection for biometric sensing systems
Tan et al. Spoofing protection for fingerprint scanner by fusing ridge signal and valley noise
Marasco et al. A survey on antispoofing schemes for fingerprint recognition systems
Shahin et al. Biometric authentication using fast correlation of near infrared hand vein patterns
Marasco et al. Combining perspiration-and morphology-based static features for fingerprint liveness detection
O’Gorman Fingerprint verification
US6876757B2 (en) Fingerprint recognition system
Abhyankar et al. Integrating a wavelet based perspiration liveness check with fingerprint recognition
Nixon et al. Spoof detection schemes
US7272247B2 (en) Method and system for fingerprint authentication
Ribarić et al. Multimodal biometric user-identification system for network-based applications
Gupta et al. Fingerprint image enhancement and reconstruction using the orientation and phase reconstruction
Marasco et al. An anti-spoofing technique using multiple textural features in fingerprint scanners
Ghiani et al. Experimental results on fingerprint liveness detection
Nagwanshi et al. Biometric authentication using human footprint
Nikam et al. Ridgelet-based fake fingerprint detection
Xie et al. Fingerprint quality analysis and estimation for fingerprint matching
Schuckers et al. Liveness detection in biometric devices
Jobin et al. Palm biometrics recognition and verification system
Anwar et al. Multibiometric systems based verification technique
Ribarić et al. A biometric identification system based on the fusion of hand and palm features
Yao et al. Fingerprint quality assessment
KR100359920B1 (en) Discrimination method for imitative iris in iris recognition system
Akintoye et al. Challenges of finger vein recognition system: a theoretical perspective
Tanisha Aggarwal Fake Fingerprint Detection Methods

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 10089987

Country of ref document: US

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP