CA1216946A - Fingerprint verification method - Google Patents

Fingerprint verification method

Info

Publication number
CA1216946A
CA1216946A
Authority
CA
Canada
Prior art keywords
domain
image
correlation
reference segment
positions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired
Application number
CA000452862A
Other languages
French (fr)
Inventor
Michael Schiller
Emily Ginsberg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fingermatrix Inc
Original Assignee
Fingermatrix Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fingermatrix Inc filed Critical Fingermatrix Inc
Application granted granted Critical
Publication of CA1216946A publication Critical patent/CA1216946A/en
Expired legal-status Critical Current

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/30Individual registration on entry or exit not involving the use of a pass
    • G07C9/32Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/33Individual registration on entry or exit not involving the use of a pass in combination with an identity check by means of a password
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1365Matching; Classification
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07CTIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00Individual registration on entry or exit
    • G07C9/30Individual registration on entry or exit not involving the use of a pass
    • G07C9/32Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/37Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition

Abstract

An input fingerprint image consists of ones and zeros pixels representing light and dark pixels which in turn correspond to ridge and valley formations. This image is compared with a reference file fingerprint to verify the identity of the input fingerprint. The reference file has two relatively small segments which are subfields of the entire field of pixels that constituted the original fingerprint image for the individual involved. Two substantially larger domain subfields are extracted from the input fingerprint image. The center of each segment corresponds to the center of a respective one of the domains. Each segment is scanned over its corresponding domain to determine the position of maximum ones correlation and maximum zeros correlation between each segment and its corresponding domain. These four positions together with the correlation values associated with each of these four positions are subjected to various criteria to provide positive or negative verification of the input image relative to the reference file. These criteria include (a) positional closeness to one another, (b) exclusion of the positions from a predetermined border of the domain, (c) closeness of the correlation values of certain of the positions, and (d) magnitude of the sum of the ones correlations and zeros correlations for certain of the positions.

Description

FINGERPRINT VERIFICATION METHOD

Background Of The Invention
The present invention relates to a fingerprint verification method for use in an access control system and more particularly to a fingerprint verification method which is fast, relatively inexpensive and has a low identification error rate.

Access control systems which use a fingerprint of an individual as the basis for verifying the identity of the individual have been proposed. These systems store an enrollment or reference fingerprint. Each time access is desired, a verification fingerprint scan is taken and compared to the enrollment or reference print.
Based upon the comparison, access is granted or denied.

The fingerprint verification method used in the access control system must be capable of quickly and inexpensively comparing the verification scan to the reference fingerprint.

In many systems, it is essential that the system have a very low type I error. A type I error is the incorrect rejection of an individual seeking access.
The installation may tolerate a somewhat higher type II error. Type II error is the incorrect admission of an individual seeking access. Nonetheless, it is also desirable to keep type II error low.

If the method used for comparing the input fingerprint and the file fingerprint involves the entire fingerprint, the result may be quite low in type I and type II errors but such a method tends to be too slow and too expensive for large scale control systems.

Accordingly, it is a purpose of the present invention to provide a fingerprint verification method having a very low incidence of type I error which operates at high speed and low cost.

Brief Description
One embodiment of the fingerprint identification technique of this invention uses two relatively small reference segments which are extracted from an enrolled fingerprint image. The reference segments are scanned across a somewhat larger domain of an input fingerprint image to establish positional correlation between enrollment image and input image.

The enrolled fingerprint is provided as an image having a series of light and dark pixels representing respectively ridge and valley zones. The light pixels are stored as having a "ones" value and the dark pixels are stored as having a "zeros" value. The fingerprint image used in the method of the present invention can be generated by known optical scanning techniques such as that described in United States Patent No. 4,322,163.
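As a concrete illustration, the pixel field described above can be held as a simple binary array. The sketch below is illustrative only and assumes a Python/NumPy representation and an arbitrary brightness threshold, neither of which is specified in the patent.

```python
import numpy as np

def binarize_scan(gray_image: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Turn a grayscale fingerprint scan into a field of ones and zeros pixels.

    Light pixels (ridges, per the description above) become 1 and dark pixels
    (valleys) become 0.  The threshold of 128 is an assumption for
    illustration; the patent does not say how the optical scanner quantizes
    brightness.
    """
    return (gray_image >= threshold).astype(np.uint8)
```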

When access is desired, a verification or input fingerprint scan is made to provide an input image. Predetermined domains from the input image and predetermined segments from the enrolled image are compared to locate (a) the maximum correlation between the ones pixels in the reference segments and the ones pixels in the corresponding domain subfields of the input image and (b) the maximum correlation between the zeros pixels in the reference segments and the zeros pixels in the corresponding domain subfields.
Specifically, first and second 32 x 32 pixel reference segment subfields from the enrollment image are scanned across respective first and second 91 x 91 pixel domain subfields from the input image to locate positions of highest correlation.

These reference segment positions of highest correlation are subjected to a series of verification criteria in order to determine whether or not the person desiring access is the same person from whom the enrollment print was taken. The verification criteria include positional closeness of the highest correlation positions and the relative magnitudes of the correlations at those positions.

The cascading of the requirement that two separate fingers of the same individual be identified improves type II error. Permitting access if verification is made on any one of multiple attempts improves type I error.

Brief Description Of The Drawings
FIG. 1 is a schematic view of the two reference segments from the enrolled image overlaid onto the two domains from the input fingerprint image. The input image is shown in the background.

FIG. 2 is a schematic view illustrating the scanning of one of the reference segments across a corresponding input fingerprint domain.

FIG. 3 is a block diagram illustrating the sequence of steps taken when scanning the segments across the domains to locate the positions "A", "B", "C" and "D" of maximum correlation between each segment and an underlying portion of the domain and to determine the correlation values (CVs) at each of these four positions.

FIG. 4 is a flow chart illustration of the screening stages using the position and correlation value (CV) information provided by the scanning arrangement of FIG. 3.

Description Of The Preferred Embodiments
Referring now to the drawings and more particularly to FIG. 1, an enrolled fingerprint image is generated in accordance with known techniques, as shown, for example, in United States Patent No. 4,322,163, issued March 30, 1982. Two reference segment subfields 10 and 12 are extracted from the full field of the fingerprint image. Reference segments 10 and 12 are stored, in a conventional manner, as a series of ones and zeros pixels representing respectively light and dark pixels of the fingerprint image, which, in turn, correspond to ridge and valley locations of the fingerprint. In one embodiment, each reference segment 10 and 12 is a 32 x 32 pixel square having a center to center spacing of about seventy pixels. The reference segments 10 and 12 are identified in the storage system by their one and zero values and further by their location in the enrolled fingerprint image. The reference file thus consists of the segments 10 and 12 for each enrolled individual.

During enrollment, a pre-use verification analysis is made to confirm that the reference file will be effective in confirming the correct individual during a later access or verification use. The individual being enrolled is requested to apply his or her finger to the input platen for the system a number of times, for example four times. The first application of the finger generates the reference file from which the reference segments 10 and 12 are extracted. Each subsequent application of the finger is then treated as a verification procedure in which the reference segments are correlated against the input image, exactly as would occur in a later verification routine, to determine whether or not the system properly positively identifies the individual involved. If this pre-use verification analysis identifies the individual in three successive verification routines, then the reference segments 10 and 12 are stored in the access control system as part of the reference file therein.
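A minimal sketch of this pre-use enrollment check is given below. The callables `capture_image`, `extract_segments` and `verify` are hypothetical stand-ins for the scanner, the segment extraction and the full verification routine; the patent describes the procedure rather than an API.

```python
def enroll_finger(capture_image, extract_segments, verify):
    """Pre-use verification analysis during enrollment (sketch).

    The first finger application generates the reference segments; the next
    three applications are run as ordinary verification attempts.  The
    reference segments are kept only if all three trial verifications
    positively identify the individual.
    """
    reference_segments = extract_segments(capture_image())
    for _ in range(3):
        if not verify(capture_image(), reference_segments):
            return None               # reference file rejected; re-enroll
    return reference_segments         # stored in the access control system
```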

Each time a previously enrolled individual desires access to the system a verification fingerprint scan is made. The enrolled individual identifies him or herself by some additional means at the time that the verification scan is made to inform the access control system which of its stored reference segments to use in the verification process.

In verification, two domains 20 and 22 within the input image are captured. The center of each domain 20 and 22 in the input image nominally corresponds to the center of one of the reference segments 10, 12, respectively from the enrollment file.

The domains 20 and 22 must be substantially larger in size than the corresponding reference segments 10, 12 to assure that somewhere within the domain there will be a segment which correlates highly with the reference segment. The problem which gives rise to this requirement is the fact that each time the individual's finger is placed on a platen, there is likely to be a displacement of the fingerprint image relative to other times when the finger is placed on the platen. This relative displacement requires that a reference segment be scanned across a substantially larger domain in order to determine that position of the reference segment in which the greatest correlation occurs between the reference file segment and the same size segment portion of the corresponding domain from the input image.

In one embodiment, each reference segment 10, 12 is a 32 x 32 pixel square while the corresponding domain 20 or 22 is a 91 x 91 pixel square.

Determining Positions and Values of Maximum Correlation.

Each reference segment 10, 12 of the reference file is scanned across the corresponding domain 20, 22 respectively and four significant positions are determined. The four significant positions are: the position of maximum correlation of ridge zones in each reference-domain pair and the position of maximum correlation of valley zones in each reference-domain pair. FIG. 3 indicates the sequence of steps taken to determine these four significant positions and the correlation values at each position.

As indicated in FIG. 2, the two significant positions of each reference segment are determined by scanning that reference segment across the associated domain in a predetermined manner. The reference segment 10 is first located at position "1", in the lower left hand corner of the corresponding domain 20. A determination is made as to the number of coincidences between ones pixels in the reference segment and the underlying 32 x 32 pixel portion of the domain. A second determination is made of the number of coincidences between the zeros pixels in the reference segment 10 and the underlying portion of the domain 20. These two coincidence scores are retained and will be referred to herein as correlation values (CVs). The reference segment 10 is then moved one pixel up to position "2" and the two correlation values determined. This is continued by moving the reference segment one pixel at a time up from position "1" to position "60". The reference segment then returns to the bottom of the domain but is located one pixel to the right of position "1" so that position "61" overlaps position "1" as shown. The sequence is followed with the reference segment being positioned in the particular sequence shown by being displaced one pixel at a time, moving up.

In this fashion the entire domain 20 is covered, in a predetermined sequence, with the reference segment 10 thereby being scanned over each of 3,600 overlapping test segments of the domain with which it is associated.

At each of these test segment positions, the two correlation values (CVs) indicated above are determined. The highest CVs are maintained in memory together with the location of the center of the reference segment where the high correlation values are obtained. At the end of the 3,600 test positions, there is in memory (a) a high ones correlation value together with the indication of the position of the center of the reference segment in the domain where that high ones value is obtained and (b) a high zeros correlation value together with the center position of the reference segment in the domain where that high zeros value is found. The reference segment position associated with the high ones correlation value may or may not be the same as the reference segment position associated with the high zeros correlation value.

If there is more than one high CV, the last to be found is retained and the prior one discarded. Hence it is important that the test procedure be in a predetermined sequence.

Each of the four high CV positions is assigned x and y coordinates in terms of the position of the associated reference segment within its respective domain. These four positions are referenced to the center of the reference segment 10, 12 within the domain 20, 22 where the high correlation score is obtained. Since the position is referenced to the center of these reference segments, it can be appreciated that these x, y coordinates all appear within a zone of 60 x 60 pixels around the center pixel of the domain. For example, the center of the reference segment 10 in position "1" is deemed to have the coordinates 1, 1. The center of the reference segment in location 60 has the coordinates 1, 60. The coordinates 31, 31 designate the center of the domain and the coordinates 60, 60 designate the center of the reference segment location 3,600.

It must be kept in mind that the coordinate system used for the first domain is replicated in the second domain because what we are concerned with is the relationship of the reference segment to its corresponding domain. Thus the substantial center of the first domain and the substantial center of the second domain will both have the coordinates 31, 31 although these two substantial centers are spaced apart from one another by about 71 pixels in the image plane. Thus the two sets of x and y coordinates are transformed to the same coordinate plane.
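The scan just described can be sketched as follows. This is a NumPy illustration under stated assumptions, not the patent's implementation: the segment and domain are 0/1 arrays, rows are indexed from the bottom of the domain as in FIG. 2, and ties between equal correlation values are resolved in favour of the later position in the scan order, as noted above.

```python
import numpy as np

def scan_segment(segment: np.ndarray, domain: np.ndarray):
    """Sweep a 32 x 32 reference segment over a 91 x 91 domain (sketch).

    Returns two records, one for the maximum ones correlation and one for the
    maximum zeros correlation, each holding the (x, y) position of the segment
    center in the 60 x 60 coordinate window together with both correlation
    values (CVs) found at that position.
    """
    seg_h, seg_w = segment.shape            # 32 x 32
    dom_h, dom_w = domain.shape             # 91 x 91
    best_ones = {"cv1": -1, "cv0": -1, "pos": None}
    best_zeros = {"cv1": -1, "cv0": -1, "pos": None}

    # Column-major scan, bottom to top: position "1" is the lower left corner,
    # "60" the top of the first column, "61" starts the next column, and so on
    # up to position 3,600.
    for x in range(1, dom_w - seg_w + 2):        # 1 .. 60
        for y in range(1, dom_h - seg_h + 2):    # 1 .. 60
            window = domain[dom_h - seg_h - (y - 1): dom_h - (y - 1),
                            x - 1: x - 1 + seg_w]
            cv1 = int(np.sum((segment == 1) & (window == 1)))   # ones CV
            cv0 = int(np.sum((segment == 0) & (window == 0)))   # zeros CV
            if cv1 >= best_ones["cv1"]:          # ">=" keeps the last maximum
                best_ones = {"cv1": cv1, "cv0": cv0, "pos": (x, y)}
            if cv0 >= best_zeros["cv0"]:
                best_zeros = {"cv1": cv1, "cv0": cv0, "pos": (x, y)}
    return best_ones, best_zeros
```

Running this once per reference segment/domain pair yields the positions "A" through "D" and the stored correlation values described next.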

FIG. 3 illustrates the manner in which the comparison of reference segment to image domain is made so as to provide these four positions and the correlation values for these four positions. The enrollment file 30 contains the two reference segments 10 and 12. The image file 32 contains the two domains 20 and 22. From these two files the corresponding segment and domain are selected as indicated in blocks 34 and 36. A comparator 38 compares the reference segment against each of the 3,600 same size segment locations on the corresponding domain. Accordingly, a computer controlled means 40 is provided for the serial selection of these segment size sub-fields in the domain 36 for serial application as one of the two inputs to the comparator 38.

The value and position of the zeros correlation and of the ones correlation for each sub-field/segment comparison are held in temporary stores 42 and 44 respectively to permit comparison with prior CV and position figures in the stores 46, 48 respectively. This permits updating the stores 46, 48 so that they hold only the maximum CV for the zeros and for the ones. Specifically, in the scan of a single segment, such as the segment 10, over its corresponding domain 20, only the maximum correlation value for the zeros and the maximum correlation value for the ones is given import for purposes of the subsequent screening process. Accordingly, the update store 46 holds the maximum running ones correlation value and zeros correlation value during each scan.
Correspondingly a position update store 48 holds the position (in x and y coordinates) of the maximum correlation values. Accordingly the storage units 42 and 44 are only for the purpose of permitting comparison of the most recent correlation value and its position against the magnitudes in the storage units 46 and 48 for the purpose of determining if those values have to be updated.

At the end of the scan of the first segment over the first domain, two positions are determined representing the position of the maximum ones correlation value and the position of the maximum zeros correlation value. In addition, the correlation values of both the ones and zeros at each of those two positions are retained in storage. A similar scan of the second segment 12 over the domain 22 will produce another two maximum correlation value positions and corresponding correlation values. In FIG. 3 these values are represented by twelve storage boxes along the right hand margin of the figure. Specifically these values which are stored at the end of the scanning operation are:

Position "A" is represented by the x and y coordinates for the position of the maximum ones correlation in the first domain.

Position "B" is represented by the x and y coordinates for the position of the maximum zeros correlation in the first domain.

Position "C" is represented by the x and y coordinates for the position of the maximum ones correlation in the second domainO

Position "D" is represented by the x and y coordinates for the position of the maximum zeros correlation in the second domain~

CV-1A is the correlation value of the ones pixels at position A.

CV-0A is the correlation value of the zeros pixels at position A.

Similarly, the rest of the stored correlation values are the correlation values of the ones pixels and the zeros pixels at the positions "B", "C" and "D".

There are a number of screening stages to verify the input fingerprint image against the corresponding enrollment file. These stages employ the positions of the two maximum ones correlations and the two maximum zeros correlations as well as the correlation values at those positions. At each stage of the screening, if the input image data fails to meet the criteria, then verification is negatived and access is denied. These stages, employing the positions and correlation values set forth above, are taken in sequence and are as follows.

A First Screening Stage (Positional Closeness).

The first stage of screening involves a determination that the x and y coordinates for the positions where the four maximum correlations are found have a certain closeness to one another in accordance with a particular criterion.

More specifically, the first screening test requires that any three of the positions "A", "B", "C"
and "D" be within eight coordinate units from each other. This distance is measured on a diagonal. If more than one subset of three points meets the closeness criteria, the subset which is most tightly clustered (i.e., has the smallest aggregate of the distances between each two of the three points) comprises the points which will form the basis for the further screening tests.

If there are not three of these positions within eight units from each other, then access is denied. If there are at least three such positions, then the second screening stage is undertaken.
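A minimal sketch of this first screening stage follows. It assumes a dict mapping the labels "A" through "D" to the (x, y) coordinates found above, and it reads "measured on a diagonal" as ordinary Euclidean distance, which is an interpretation rather than something the patent states explicitly.

```python
import math
from itertools import combinations

def first_screening(positions):
    """Positional closeness screen (sketch).

    A subset of three of the four positions passes if every pair within it is
    no more than eight coordinate units apart.  If more than one subset
    passes, the most tightly clustered one (smallest sum of its three pairwise
    distances) is returned; None means access is denied.
    """
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    candidates = []
    for trio in combinations("ABCD", 3):
        pair_dists = [dist(positions[a], positions[b])
                      for a, b in combinations(trio, 2)]
        if all(d <= 8 for d in pair_dists):
            candidates.append((sum(pair_dists), trio))
    if not candidates:
        return None                    # fewer than three close positions
    return list(min(candidates)[1])    # e.g. ["A", "B", "D"]
```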

A Second Screening Stage (Border Edit).

The second screening stage is based on the observation that a large number of the false positive identifications which pass the first screening stage have the positions "A", "B", "C" and "D" located away from the center of the domains 20, 22.

More particularly, referring back to FIG. 2, it should be remembered that the 32 x 32 pixel reference segment is sequentially positioned in each of 3,600 overlapping positions in the 91 x 91 pixel domain. The center point of each of those 3,600 reference segment positions lies within a window of 60 x 60 pixels about the domain center. Accordingly each ones and zeros correlation value is referenced to one of the points in this 60 x 60 window. That 60 x 60 window is the basis for the coordinates that identify points "A", "B", "C" and "D" as defined above. The lower left hand corner of that window has the coordinates x = 1 and y = 1 and the upper right hand corner of that window has the coordinates x = 60, y = 60.

With that geometric image in mind, the second screening step can be readily understood. The criterion requires looking at the coordinates of each of the three positions that constitute the group that is within eight coordinate units of each other. These are the three positions identified at the first screening stage.
If any one of the six x or y coordinates representing those three points has a value of less than four or greater than fifty-seven, then the verification is negatived and access is denied.

In effect, this says that if any one of the three positions is within a border zone having a thickness of three pixels, the verification will be negatived.
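In code, the border edit reduces to a bounds check on the trio that survived the first stage, sketched here with the same hypothetical data layout:

```python
def second_screening(trio, positions, low=4, high=57):
    """Border edit screen (sketch).

    Every x and y coordinate of the three surviving positions must lie between
    4 and 57 inclusive, i.e. outside a three-pixel border of the 60 x 60
    coordinate window; otherwise verification is negatived.
    """
    return all(low <= coord <= high
               for label in trio
               for coord in positions[label])
```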

A Third Screening Stage (Deviant Point Correlation Value).

The third screening stage looks to the fourth of the four positions "A", "B", "C" and "D". For example, if the three positions found in the first screening stage (which are within a distance of eight of one another) are the positions "A", "B" and "D", then the fourth position is "C".

The maximum correlation value (CV) that determines that fourth position is noted. If the fourth position is "C", then the ones correlation value for position "C" is noted. That correlation value is compared with the comparable correlation value of the other point from that domain. In the example of "C" being the fourth position, the ones correlation value for "C" is compared with the ones correlation value for the position "D". If these two ones correlation values differ by more than 10%, then the identification is negatived and access denied. Specifically, with reference to FIG. 3, the criterion is that if (CV-1C) - (CV-1D) > 0.1 (CV-1C), then access is denied.

It should be kept in mind that if the fourth position were, for example, "B", then it is the zeros correlation value for "B" that is compared with the zeros correlation value for "A". If the fourth position were "A", then the ones correlation values of "A" and "B" would be compared. Similarly if the fourth position were "D", then the zeros correlation values for "C" and "D" would be compared.
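A sketch of this deviant-point check follows, assuming a dict `cv` that maps each label to its stored ones and zeros correlation values (e.g. cv["C"] == {"cv1": 610, "cv0": 395}, figures invented for illustration):

```python
def third_screening(trio, cv, tolerance=0.10):
    """Deviant point correlation value screen (sketch).

    The position left out of the trio is compared against the other position
    from the same domain ("A" with "B", "C" with "D"), using the pixel value
    that defined the left-out position: the ones CV for "A" or "C", the zeros
    CV for "B" or "D".  Access is denied when the left-out CV exceeds its
    partner's by more than 10% of the left-out CV.
    """
    fourth = (set("ABCD") - set(trio)).pop()
    partner = {"A": "B", "B": "A", "C": "D", "D": "C"}[fourth]
    key = "cv1" if fourth in ("A", "C") else "cv0"
    return (cv[fourth][key] - cv[partner][key]) <= tolerance * cv[fourth][key]
```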

In this fashion the fourth maximum correlation value has an effect on the identification.

If all four positions were required to be within a relatively tight distance of one another in stage one, that would be a much tighter screen. But in part because of the sequential technique in stage one of comparing the reference segment against overlapping domain positions, there is the risk that one of the highest correlation values will be a substantial distance from the others even though there is a correlation value near the other three which is just slightly under the correlation value of the fourth, further removed point. Accordingly, it has been found preferable to employ only three of the positions for the first stage screening. But because that tends to be too loose a screen, this third stage provides a check to make sure that the fourth correlation value is not too different from the correlation value of the more closely associated positions. If the correlation value associated with the fourth position is very much different, then it suggests that there was not in fact a fourth point close to the other three points with a correlation value within hailing distance of their correlation values. This indicates that the correspondence between the input image and the reference file image is not very good.

Thus, if the maximum CV associated with the fourth point (the ones correlation for C in this case) is greater than the CV for the same pixel value for the comparison point (the ones correlation for D in this case) by enough so that the difference between these two CVs is more than 10% of the ones CV for point C, then access is denied.

A Fourth Screening Stage (Summed CV - First Threshold).
The fourth screening stage looks to the value of the correlations at the positions "A", "B", "C", and "D" and requires that certain of these correlation values meet a first threshold relative to perfect correlation.

If correlation were perfect at any one position, the sum of the ones correlation value and zeros correlation value at that position would be 1,024 because that is the number of pixels in a segment 10 or 12. A threshold value as a percent of this perfect correlation of 1,024 is established as a criterion. If this first threshold is 62%, then the threshold value for the sum of the ones and zeros correlations is 635.

In a currently tested embodiment, this fourth screening stage is applied only to the three positions identified in the first screening stage that are within a distance of eight of one another. As to these positions, for example "A", "B", and "D", this criterion requires that there be at least one summed correlation value (CV) in each domain equal to 635 or greater. In the example set forth above, if the three are "A", "B" and "D", this requires that the summed CV for "D" be 635 or greater and that the summed CV for either "A" or "B" be 635 or greater.

If the summed CV for one of the points in domain one (for example, CV-1B plus CV-0B) and the summed CV for one of the points in domain two (for example, CV-1D plus CV-0D) are each greater than 634, then the screening proceeds to the next step. Otherwise, access is denied. Thus access is denied, even if the summed CVs for points "A" and "B" exceed 634, if the summed CV for "D" is less than 635.
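The fourth stage then reduces to a summed-CV threshold per domain, sketched with the same hypothetical `cv` mapping; the threshold of 635 corresponds to 62% of the perfect score of 1,024 as described above.

```python
def fourth_screening(trio, cv, threshold=635):
    """Summed CV screen, first threshold (sketch).

    At least one trio position in the first domain ("A" or "B") and at least
    one in the second domain ("C" or "D") must have a ones-plus-zeros
    correlation sum of 635 or more.
    """
    def summed(label):
        return cv[label]["cv1"] + cv[label]["cv0"]

    domain_one = [p for p in trio if p in ("A", "B")]
    domain_two = [p for p in trio if p in ("C", "D")]
    return (any(summed(p) >= threshold for p in domain_one) and
            any(summed(p) >= threshold for p in domain_two))
```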

A Fifth Screening Stage (Summed CV - Second Threshold).
This fifth screening stage further requires that one of the two summed CVs that meet the first threshold of the fourth screening stage also meet a second threshold, which second threshold is higher than the first threshold. For example, if the second higher threshold is 68%, this would mean a summed correlation value of 697 or greater (68% of 1,024 = 696.32).
Accordingly, in the example given, if the summed correlation values for "D" and "B" were both 635 or greater, then one of those two must be 697 or greater.

If the summed CV for either "D" or "B" exceeds 696, access is granted. Otherwise, access is denied.
By virtue of the fourth and fifth screening stages, the magnitude of the total correlation at these maximum correlation positions has an impact on identification.
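The fifth stage can be sketched the same way; for simplicity this version takes the best summed CV in each domain as the qualifying value, matching the example in the text.

```python
def fifth_screening(trio, cv, first=635, second=697):
    """Summed CV screen, second threshold (sketch).

    Both domains must still clear the first threshold, and at least one of the
    two qualifying summed CVs must also reach 697, roughly 68% of the perfect
    score of 1,024.  Passing this stage grants access.
    """
    def summed(label):
        return cv[label]["cv1"] + cv[label]["cv0"]

    best_one = max((summed(p) for p in trio if p in ("A", "B")), default=0)
    best_two = max((summed(p) for p in trio if p in ("C", "D")), default=0)
    return (best_one >= first and best_two >= first
            and max(best_one, best_two) >= second)
```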

Position of Reference Segments 10, 12.

The position of the reference segments 10, 12 and of the domains 20, 22 in the field of pixels which constitutes the fingerprint image plane is predetermined as an established address in relation to the image buffer. The image buffer, in turn, has a set relationship to the platen to which the fingerprint is applied and which determines the fingerprint image.

This pre-determined position of the segments and domains is constrained by the requirement that these segments and domains appear within the fingerprint of as large a population of individuals as possible. In particular, because some fingerprints are quite narrow compared to others, it is important that the segments and domains be predetermined, as shown in FIG. 1, toward the right of the fingerprint image because, in the embodiment involved, the individual's fingerprint is constrained to be against the right edge of the platen.

Accordingly, in an embodiment of this invention where the image plane is 192 pixels wide (the x axis in FIG. 1) and 25~ pixels high (the y axis in FIG. 1), it was determined that a desirable location is to have the second domain 22 start at least 33 pixels from the lowest pixel line in the image plane and for the two domains to be eleven pixels in from the right margin of the image plane. The two domains 20, 22 overlap by nineteen pixels in order to make sure that the upper end of the first domain 20 is always within a useful portion of the image.
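The placement just described can be written out as a small bookkeeping routine. The image height and the lower-left-corner convention below are assumptions for illustration (the height digit is not legible in this copy); the 33-pixel bottom offset, 11-pixel right margin and 19-pixel overlap are taken from the text.

```python
def domain_positions(image_width=192, image_height=256, domain_size=91,
                     bottom_offset=33, right_margin=11, overlap=19):
    """Nominal lower-left corners of the two 91 x 91 domains (sketch).

    Coordinates are (x, y) with y measured up from the lowest pixel line of
    the image plane.  image_height=256 is an assumed value.
    """
    x = image_width - right_margin - domain_size
    lower = (x, bottom_offset)                            # domain 22
    upper = (x, bottom_offset + domain_size - overlap)    # domain 20
    # The upper domain must still fit inside the image plane.
    assert bottom_offset + 2 * domain_size - overlap <= image_height
    return {"domain_20": upper, "domain_22": lower}
```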

As with all the other specific numerical criteria set forth in this disclosure, these numbers can be varied somewhat as a function of considerations such as the nature and the size of the population involved and the sizes of the reference segments 10, 12 and domains 20, 22.

Reduction Of Type II Error.

The above system of criteria has a very low type I error and yet is a relatively simple system that does not require the selection, identification and correlation of fingerprint minutiae. However, as a system it has a type II error (that is, false positive error) which may be unacceptable in some situations. For example, depending on the situation, it might have a type II error as large as five percent (5%).

A simple technique of substantially reducing this type II error without adding complexity to the system is to incorporate the requirement that at least two fingers of the individual seeking admission must be successfully identified in the system. Because the system operates so fast, two finger cascading is acceptable. If the type II error with a single finger is in the range of 5%, then it might be expected that the type II error in a system which requires that two separate fingers must be separately identified would be in the range of one four-hundredth, or one quarter of one percent (0.25%).

Of course, this requirement that two separate fingers be individually identified to obtain access increases the type I error. But this requirement simply makes the very low type I error additive so that the type I error remains low.

However, type I error can be brought down appreciably by permitting the individual to go through the routine of applying the two fingers in succession either a second time or even a third time. If there is a positive identification at any one of the three sets of two finger applications, then the individual is admitted. This will substantially reduce type I error while not being additive with respect to type II error.

A cascading of two separate fingers requires that both finger A and finger B be positively identified.

Further, by permitting access if any one of three attempts is successful, the effect on the overall type I error rate is to substantially reduce that error rate while having only minor effect on a type II error rate, which has been substantially reduced because of the two finger cascading requirement.
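The two-finger cascade with repeated attempts amounts to the following control flow; `verify_finger_a` and `verify_finger_b` are hypothetical callables standing in for the full screening pipeline run against each finger's reference file.

```python
def admit(verify_finger_a, verify_finger_b, max_attempts=3):
    """Two-finger cascading with up to three attempts (sketch).

    Each attempt requires both fingers to be positively verified; access is
    granted if any one of the attempts succeeds.
    """
    for _ in range(max_attempts):
        if verify_finger_a() and verify_finger_b():
            return True
    return False
```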

Thus, two finger cascading in combination with multiple alternative attempts effects a significant improvement in both type I and type II errors.

Further, training of and practice by the individual in placing his or her finger carefully on the platen will greatly improve the false rejection (type I) error.

Although the invention has been described in connection with a specific embodiment, it should be remembered that there are various modifications that can be made which are encompassed by the claims.

For example, the particular values of various criteria can be modified to obtain a range of results.
Thus type I error might be reduced by increasing the value of the critical distance (that is, the eight coordinate units) between the three locations in the first screening stage. Type II error might be reduced by decreasing that critical value. Similarly, changes in the second and third screening stage criteria would also tend to affect both types of errors, a specific change generally increasing one type of error and decreasing the other type of error. Perhaps more importantly, there is a trade-off between the criteria of these three screening stages so that a change in the critical value of one of these stages might call for a change in the critical value in another one of the stages. Both experience and the requirements of a particular installation will dictate such modifications to the specific example provided.

In addition, and more generally, it will be possible to modify the nature of these criteria by establishing other criteria for the relationship between the maximum correlation points "A", "B", "C" and "D" to obtain comparable results.

Claims (32)

Claims:
1. The method of fingerprint verification by comparison of a reference fingerprint image against an input fingerprint image wherein each image is composed of a field of pixels having first and second values representing respectively fingerprint ridge and valley pixels comprising the steps of:
providing at least first and second reference segment images representing subfields from a reference fingerprint, providing at least first and second domain images representing subfields from an input fingerprint, said domain images being substantially larger than said reference segment images, scanning said first reference segment image across said first domain image and scanning said second reference segment image across said second domain image to determine significant positions, said significant positions constituting locations of maximum correlation for each reference segment image/domain image pair between pixels having said first value and also locations of maximum correlation for each reference segment image/domain image pair between pixels having said second value, and verifying the input fingerprint from which said domain images are extracted as corresponding to the reference fingerprint from which said reference segment images are extracted by subjecting at least one of said significant locations to pre-determined positional criteria.
2. The method of claim 1 wherein said significant positions include:
a first position being the position of said first reference segment image in said first domain image where the maximum correlation is found between pixels having said first value, a second position being the position of said first reference segment image in said first domain image where the maximum correlation is found between pixels having said second value, a third position being the position of said second reference segment image in said second domain image where the maximum correlation is found between pixels having said first value, and, a fourth position being the position of said second reference segment image in said second domain image where the maximum correlation is found between pixels having said second value.
3. The method of claim 2 wherein:
the centers of said first and second reference segment images have substantially the same positional location relative to the reference fingerprint as do the centers of said first and second domain images to the input fingerprint being verified.
4. The method of claim 2 wherein said step of determining said four significant positions is performed by the steps of:
systematically scanning each reference segment image across the corresponding domain image in a series of predetermined overlapping test locations, determining in each test location a score of ones pixel correlation and a score of zeros pixel correlation, and storing for reference the positions of maximum ones pixel correlation and maximum zeros pixel correlation for each reference segment image/domain image pair.
5. The method of claim 4 wherein said four significant positions are given x and y coordinates in terms of their locations within their respective domain images, said four sets of coordinates being referenced to the same coordinate plane.
6. The method of claim 5 further comprising the step of:
storing for reference the magnitudes of the ones correlation values and the zeros correlation values at each of said four positions.
7. The method of claim 5 wherein said step of verifying requires that three of said four significant positions be positioned in said coordinate plane within a pre-determined distance of one another.
8. The method of claim 7 wherein each step of verifying further requires that said three closely associated significant positions be a pre-determined distance away from the border of said coordinate plane based on the dimension of said domain images.
9. The method of claim 7 wherein said step of verifying further requires that said maximum correlation value of said fourth significant position be within a predetermined magnitude or ratio of the correlation value of the same pixel value of the other significant position in the domain of said fourth position.
10. The method of claim 8 wherein said step of verifying further requires that said maximum correlation value of said fourth significant position be within a predetermined magnitude or ratio of the correlation value of the same pixel value of the other significant position in the domain of said fourth position.
11. The method of claim 7 wherein said step of verifying further requires that one of said three of said significant positions within said predetermined distance which is in said first domain has a correlation sum that exceed a first threshold and that one of said three significant locations within said predetermined distance that is in said second domain also exceeds said first threshold.
12. The method of claim 8 wherein said step of verifying further requires that one of said three of said significant positions within said predetermined distance which is in said first domain has a correlation sum that exceed a first threshold and that one of said three significant locations within said predetermined distance that is in said second domain also exceeds said first threshold.
13. The method of claim 9 wherein said step of verifying further requires that one of said three of said significant positions within said predetermined distance which is in said first domain has a correlation sum that exceed a first threshold and that one of said three significant locations within said predetermined distance that is in said second domain also exceeds said first threshold.
14. The method of claim 10 wherein said step of verifying further requires that one of said three of said significant positions within said predetermined distance which is in said first domain has a correlation sum that exceed a first threshold and that one of said three significant locations within said predetermined distance that is in said second domain also exceeds said first threshold.
15. The method of claim 11 where said step of verifying further requires that the correlation value of one of the two positions that exceeds said first threshold also exceeds a second threshold greater than said first threshold.
16. The method of claim 13 wherein said step of verifying further requires that the correlation value of one of the two positions that exceeds said first threshold also exceeds a second threshold greater than said first threshold.
17. The method of claim 14 where said step of verifying further requires that the correlation value of one of the two positions that exceeds said first threshold also exceeds a second threshold greater than said first threshold.
18. The method of claim 1 wherein the subfield of the domain image is approximately an order of magnitude greater in area than the subfield of the reference segment image.
19. The method of claim 7 wherein the subfield of the domain image is approximately an order of magnitude greater in area than the subfield of the reference segment image.
20. The method of claim 1 wherein said first and second domain images overlap.
21. The method of claim 7 wherein said first and second domain images overlap.
22. The method of claim 4 wherein each of said reference segment images is scanned across the respective domain image in a predetermined pattern and in case of coincident maximum correlation values, the last in sequence is deemed to be one of said significant locations.
23. The method of claim 7 wherein each of said reference segment images is scanned across the respective domain image in a predetermined pattern and in case of coincident maximum correlation values, the last in sequence is deemed to be one of said significant locations.
24. The method of claim 1 further comprising the steps of:
establishing said reference segment images by first establishing an initial set of reference segment images for a subject finger, then verifying repeated applications of said subject finger in accordance with the method of claim 1, and retaining said initial set in a reference file only if positive verification is obtained in a predetermined number of said repeated applications.
25. The method of claim 7 further comprising the steps of:
establishing said reference segment images by first establishing an initial set of reference segment images for a subject finger, then verifying repeated applications of subject finger in accordance with the method of claim 1, and retaining said initial set in a reference file only if positive verification is obtained in a predetermined number of said repeated applications.
26. The method of claim 1 further comprising the steps of:
applying the steps of claim 1 to verification of first and second input fingerprints taken from different first and second fingers, and providing identification of the subject having said first and second fingerprints if and only if said step of verifying is successfully completed in connection with both of the fingerprint images.
27. The method of claim 7 further comprising the steps of:
applying the steps of claim 7 to verification of first and second input fingerprints taken from different first and second fingers, and providing identification of the subject having said first and second fingerprints if and only if said step of verifying is successfully completed in connection with both of the fingerprint images.
28. The method of claim 1 further comprising the steps of:
applying the steps of claim 1 to separate applications of the subject's input fingerprint, and providing identification if any one of the applications is verified by said step of verifying.
29. The method of claim 7 further comprising the steps of:
applying the steps of claim 7 to separate applications of the subject's input fingerprint, and providing identification if any one of the applications is verified by said step of verifying.
30. The system of verifying an input fingerprint image by comparison of a reference fingerprint image against an input fingerprint image wherein each image is composed of a field of pixels having first and second values representing respectively fingerprint ridge and valley pixels comprising:
means to establish at least first and second reference segment images representing subfields from a reference fingerprint, means to establish at least first and second domain images representing subfields from an input fingerprint, said domain images being substantially larger than said reference segment images, scanning means for scanning said first reference segment image across said first domain image and scanning said second reference segment image across said second domain image to determine significant positions, said significant positions constituting locations of maximum correlation for each reference segment image/domain image pair between pixels having said first value and also locations of maximum correlation for each reference segment image/domain image pair between pixels having said second value, and means to verify the input fingerprint from which said domain images are extracted as corresponding to the reference fingerprint from which said reference segment images are extracted by subjecting at least one of said significant locations to pre-determined positional criteria.
31. The combination of claim 30 further comprising:
first storage means for storing the maximum ones correlation position and maximum zeros correlation position for each of said reference segment image/domain image pair.
32. The combination of claim 31 further comprising:
second storage means for storing the magnitude of the correlation values for each position stored in said first storage means.
CA000452862A 1983-04-27 1984-04-26 Fingerprint verification method Expired CA1216946A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US48924983A 1983-04-27 1983-04-27
US489,249 1983-04-27
US06/531,766 US4581760A (en) 1983-04-27 1983-09-13 Fingerprint verification method
US531,766 1983-09-13

Publications (1)

Publication Number Publication Date
CA1216946A true CA1216946A (en) 1987-01-20

Family

ID=27049645

Family Applications (1)

Application Number Title Priority Date Filing Date
CA000452862A Expired CA1216946A (en) 1983-04-27 1984-04-26 Fingerprint verification method

Country Status (4)

Country Link
US (1) US4581760A (en)
EP (1) EP0125532A3 (en)
CA (1) CA1216946A (en)
IL (1) IL71663A (en)

Families Citing this family (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0159037B1 (en) * 1984-04-18 1993-02-10 Nec Corporation Identification system employing verification of fingerprints
US4641350A (en) * 1984-05-17 1987-02-03 Bunn Robert F Fingerprint identification system
EP0216902A4 (en) * 1985-04-02 1989-02-20 Fingermatrix Inc Matcher.
GB2174831B (en) * 1985-04-22 1988-12-14 Quantum Fund Ltd The Skin-pattern recognition method and device
FR2585153A1 (en) * 1985-07-17 1987-01-23 Desgorces Jean Method of control of sequential operations by presentation of fingerprints, and its application to strongrooms
FR2585152A1 (en) * 1985-07-17 1987-01-23 Desgorces Jean Method of coding a fingerprint and its application to identity checking
US4696046A (en) * 1985-08-02 1987-09-22 Fingermatrix, Inc. Matcher
ATE64484T1 (en) * 1986-05-06 1991-06-15 Siemens Ag ARRANGEMENT AND PROCEDURE FOR DETERMINING THE AUTHORIZATION OF INDIVIDUALS BY VERIFYING THEIR FINGERPRINTS.
US5067162A (en) * 1986-06-30 1991-11-19 Identix Incorporated Method and apparatus for verifying identity using image correlation
GB8723299D0 (en) * 1987-10-05 1987-11-11 Imagepack Ltd Identity verification
US5040223A (en) * 1988-02-17 1991-08-13 Nippondenso Co., Ltd. Fingerprint verification method employing plural correlation judgement levels and sequential judgement stages
DE68905237T2 (en) * 1988-05-24 1993-07-29 Nec Corp METHOD AND DEVICE FOR COMPARING FINGERPRINTS.
US4925300A (en) * 1988-08-02 1990-05-15 Rachlin Daniel J Optical fingerprint imaging device
US5210797A (en) * 1989-10-30 1993-05-11 Kokusan Kinzoku Kogyo Kabushiki Kaisha Adaptive dictionary for a fingerprint recognizer
US5109432A (en) * 1989-12-27 1992-04-28 Fujitsu Limited Character recognition method
US5481623A (en) * 1990-04-19 1996-01-02 Fuji Photo Film Co., Ltd. Apparatus for determining an image position on imaging media
EP0470530B1 (en) * 1990-08-07 1997-01-22 Yozan Inc. Fingerprint verification method
US5261008A (en) * 1990-08-07 1993-11-09 Yozan, Inc. Fingerprint verification method
US5633947A (en) * 1991-03-21 1997-05-27 Thorn Emi Plc Method and apparatus for fingerprint characterization and recognition using auto correlation pattern
EP0521507A3 (en) * 1991-07-04 1993-12-15 Ezel Inc Fingerprint data processing method
WO1993023816A1 (en) * 1992-05-18 1993-11-25 Silicon Engines Inc. System and method for cross correlation with application to video motion vector estimation
US6002787A (en) * 1992-10-27 1999-12-14 Jasper Consulting, Inc. Fingerprint analyzing and encoding system
CA2119327A1 (en) * 1993-07-19 1995-01-20 David Crawford Gibbon Method and means for detecting people in image sequences
JP2821348B2 (en) * 1993-10-21 1998-11-05 日本電気ソフトウェア株式会社 Fingerprint collation device
US5495537A (en) * 1994-06-01 1996-02-27 Cognex Corporation Methods and apparatus for machine vision template matching of images predominantly having generally diagonal and elongate features
US5509083A (en) * 1994-06-15 1996-04-16 Nooral S. Abtahi Method and apparatus for confirming the identity of an individual presenting an identification card
JPH0962840A (en) * 1995-08-30 1997-03-07 Sony Corp Picture collating method and device therefor
US5809171A (en) * 1996-01-05 1998-09-15 Mcdonnell Douglas Corporation Image processing method and apparatus for correlating a test image with a template
US5841888A (en) * 1996-01-23 1998-11-24 Harris Corporation Method for fingerprint indexing and searching
US5956415A (en) * 1996-01-26 1999-09-21 Harris Corporation Enhanced security fingerprint sensor package and related methods
US5828773A (en) * 1996-01-26 1998-10-27 Harris Corporation Fingerprint sensing method with finger position indication
US5963679A (en) * 1996-01-26 1999-10-05 Harris Corporation Electric field fingerprint sensor apparatus and related methods
US5796858A (en) * 1996-05-10 1998-08-18 Digital Persona, Inc. Fingerprint sensing system using a sheet prism
JPH10105707A (en) * 1996-09-25 1998-04-24 Sony Corp Image collation device
JP3744620B2 (en) * 1996-09-25 2006-02-15 ソニー株式会社 Image collation apparatus and image collation method
US6038334A (en) * 1997-02-21 2000-03-14 Dew Engineering And Development Limited Method of gathering biometric information
US6072891A (en) * 1997-02-21 2000-06-06 Dew Engineering And Development Limited Method of gathering biometric information
JP3770344B2 (en) 1996-12-26 2006-04-26 ソニー株式会社 Image collation apparatus and image collation method
JP3082837B2 (en) * 1997-03-19 2000-08-28 日本電気株式会社 Pattern matching encoding device, decoding device, and recording medium
US5982913A (en) * 1997-03-25 1999-11-09 The United States Of America As Represented By The National Security Agency Method of verification using a subset of claimant's fingerprint
US6125192A (en) * 1997-04-21 2000-09-26 Digital Persona, Inc. Fingerprint recognition system
US6023522A (en) * 1997-05-05 2000-02-08 Draganoff; Georgi H. Inexpensive adaptive fingerprint image acquisition framegrabber
AU7228598A (en) * 1997-05-07 1998-11-27 Georgi H. Draganoff Sliding yardsticks fingerprint enrollment and verification system
US6075876A (en) 1997-05-07 2000-06-13 Draganoff; Georgi Hristoff Sliding yardsticks fingerprint enrollment and verification system and method
US5917928A (en) * 1997-07-14 1999-06-29 Bes Systems, Inc. System and method for automatically verifying identity of a subject
US6122737A (en) * 1997-11-14 2000-09-19 Digital Persona, Inc. Method for using fingerprints to distribute information over a network
US6035398A (en) * 1997-11-14 2000-03-07 Digitalpersona, Inc. Cryptographic key generation using biometric data
US6324310B1 (en) 1998-06-02 2001-11-27 Digital Persona, Inc. Method and apparatus for scanning a fingerprint using a linear sensor
US6188781B1 (en) 1998-07-28 2001-02-13 Digital Persona, Inc. Method and apparatus for illuminating a fingerprint through side illumination of a platen
US6771264B1 (en) * 1998-08-20 2004-08-03 Apple Computer, Inc. Method and apparatus for performing tangent space lighting and bump mapping in a deferred shading graphics processor
US6288730B1 (en) * 1998-08-20 2001-09-11 Apple Computer, Inc. Method and apparatus for generating texture
US6950539B2 (en) * 1998-09-16 2005-09-27 Digital Persona Configurable multi-function touchpad device
EP1017008B1 (en) * 1998-12-28 2007-04-04 Casio Computer Co., Ltd. Apparatus and method for collating image
US6097035A (en) * 1999-02-22 2000-08-01 Digital Persona, Inc. Fingerprint detection apparatus with partial fingerprint images
EP1054340B1 (en) * 1999-05-17 2008-05-28 Nippon Telegraph and Telephone Corporation Surface shape recognition apparatus and method
SE9902990L (en) * 1999-08-24 2001-01-08 Fingerprint Cards Ab Method and apparatus for recording and verifying fingerprint information
JP2001117579A (en) 1999-10-21 2001-04-27 Casio Comput Co Ltd Device and method for voice collating and storage medium having voice collating process program stored therein
JP3742279B2 (en) * 2000-06-09 2006-02-01 日本電信電話株式会社 Image collation apparatus, image collation method, and recording medium recording image collation program
US6898301B2 (en) * 2000-07-10 2005-05-24 Casio Computer Co., Ltd. Authentication system based on fingerprint and electronic device employed for the system
JP3780830B2 (en) * 2000-07-28 2006-05-31 日本電気株式会社 Fingerprint identification method and apparatus
US7627145B2 (en) * 2000-09-06 2009-12-01 Hitachi, Ltd. Personal identification device and method
JP3558025B2 (en) * 2000-09-06 2004-08-25 株式会社日立製作所 Personal authentication device and method
US20020196963A1 (en) * 2001-02-23 2002-12-26 Biometric Security Card, Inc. Biometric identification system using a magnetic stripe and associated methods
US7046829B2 (en) * 2001-05-30 2006-05-16 International Business Machines Corporation Fingerprint verification
KR100432491B1 (en) 2001-08-31 2004-05-22 (주)니트 젠 Method of extracting fingerprint features using ridge direction model
ATE367618T1 (en) * 2002-02-18 2007-08-15 Precise Biometrics Ab METHOD AND DEVICE FOR CHECKING FINGERPRINTS
JP4030829B2 (en) * 2002-08-13 2008-01-09 日本電気株式会社 Striped pattern image matching apparatus and striped pattern image matching method
JP4032241B2 (en) * 2003-02-28 2008-01-16 日本電気株式会社 Fingerprint verification apparatus and method
US7599044B2 (en) 2005-06-23 2009-10-06 Apple Inc. Method and apparatus for remotely detecting presence
US7085673B2 (en) * 2004-08-31 2006-08-01 Hewlett-Packard Development Company, L.P. Displacement estimation system and method
US7242169B2 (en) * 2005-03-01 2007-07-10 Apple Inc. Method and apparatus for voltage compensation for parasitic impedance
US7577930B2 (en) 2005-06-23 2009-08-18 Apple Inc. Method and apparatus for analyzing integrated circuit operations
US9298311B2 (en) * 2005-06-23 2016-03-29 Apple Inc. Trackpad sensitivity compensation
US7433191B2 (en) * 2005-09-30 2008-10-07 Apple Inc. Thermal contact arrangement
US7598711B2 (en) * 2005-11-23 2009-10-06 Apple Inc. Power source switchover apparatus and method
US9077537B2 (en) 2008-11-13 2015-07-07 International Business Machines Corporation Generating secure private keys for use in a public key communications environment

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2262834B1 (en) * 1973-04-09 1977-10-21 Calspan Corp
US3944978A (en) * 1974-09-09 1976-03-16 Recognition Systems, Inc. Electro-optical method and apparatus for making identifications
US4185270A (en) * 1976-07-19 1980-01-22 Fingermatrix, Inc. Fingerprint identification method and apparatus
US4047154A (en) * 1976-09-10 1977-09-06 Rockwell International Corporation Operator interactive pattern processing system
US4135147A (en) * 1976-09-10 1979-01-16 Rockwell International Corporation Minutiae pattern matcher
US4246568A (en) * 1978-12-08 1981-01-20 Peterson Vernon L Apparatus and method of personal identification by fingerprint comparison
JPS6012674B2 (en) * 1979-04-02 1985-04-02 日本電気株式会社 Pattern feature extraction device
JPS5923467B2 (en) * 1979-04-16 1984-06-02 株式会社日立製作所 Position detection method

Also Published As

Publication number Publication date
IL71663A0 (en) 1984-07-31
EP0125532A3 (en) 1987-05-06
EP0125532A2 (en) 1984-11-21
IL71663A (en) 1987-09-16
US4581760A (en) 1986-04-08

Similar Documents

Publication Publication Date Title
CA1216946A (en) Fingerprint verification method
EP0251504B1 (en) Method and apparatus for verifying identity using image correlation
US5917928A (en) System and method for automatically verifying identity of a subject
EP0329166B1 (en) Fingerprint verification method employing plural correlation judgement levels and sequential judgement stages
US6181807B1 (en) Methods and related apparatus for fingerprint indexing and searching
US4827527A (en) Pre-processing system for pre-processing an image signal succession prior to identification
US4752966A (en) Fingerprint identification system
EP1467308B1 (en) Image identification system
US5239590A (en) Fingerprint verification method
US7120280B2 (en) Fingerprint template generation, verification and identification system
US20080273770A1 (en) Fast Fingerprint Identification And Verification By Minutiae Pair Indexing
WO2000068873A1 (en) Method and apparatus for creating a composite fingerprint image
US5261008A (en) Fingerprint verification method
US6047079A (en) Method of and an apparatus for pre-selecting fingerprint cards
JPS6321233B2 (en)
EP0090377B1 (en) Finger-print identification system
US6785408B1 (en) Fingerprint segment area processing method and associated apparatus
JPH04320583A (en) Method for updating registered finger print feature point
GB2271657A (en) Signature verification
JPH0498370A (en) Fingerprint identification device
JPH07114640A (en) Individual authenticating device
JP2964199B2 (en) Fingerprint matching method
JP2868909B2 (en) Fingerprint collation device
JPH06208611A (en) Device for authenticating individual person
JPH0668241A (en) Fingerprint identifying device

Legal Events

Date Code Title Description
MKEX Expiry