WO2007103701A1 - A standoff iris recognition system - Google Patents


Info

Publication number
WO2007103701A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
iris
pupil
border
curve
Prior art date
Application number
PCT/US2007/063024
Other languages
French (fr)
Inventor
Rida Hamza
Original Assignee
Honeywell International Inc.
Priority date
Filing date
Publication date
Application filed by Honeywell International Inc. filed Critical Honeywell International Inc.
Priority to GB0815928A priority Critical patent/GB2450026B/en
Priority to AU2007223574A priority patent/AU2007223574B2/en
Publication of WO2007103701A1 publication Critical patent/WO2007103701A1/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/20 — Image preprocessing
    • G06V10/22 — Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G06V10/225 — Image preprocessing by selection of a specific region, based on a marking or identifier characterising the area
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 — Eye characteristics, e.g. of the iris
    • G06V40/19 — Sensors therefor
    • G06V40/193 — Preprocessing; Feature extraction

Definitions

  • the present invention pertains to recognition systems and particularly to biometric recognition systems. More particularly, the invention pertains to iris recognition systems.
  • Related applications may include U.S. Patent Application No. 10/979,129, filed
  • the present invention is a standoff iris recognition system.
  • Figure 1 is a diagram of an overall structure of the standoff iris recognition system
  • Figure 2 is a diagram of a pupil processing mechanism
  • Figures 3, 4 and 5 are diagrams showing a basis for pupil border analysis, curve fitting and portion substitution
  • Figure 6 is a diagram of an approach for an iris outer border analysis, curve fitting and portion removal or substitution
  • Figure 7 is a diagram of a polar segmentation subroutine mechanism
  • Figures 8a and 8b are diagrams illustrating an approach for estimating eyelash/lid curve detection
  • Figure 9 is an illustration showing an eye having eyelash/lid obscuration
  • Figure 10 is a diagram of pupil and iris centers
  • Figures 11 and 12 are diagrams of iris quadrants and masking
  • Figures 13-18 are diagrams of various kinds of masking for noisy and informational areas of the eye.
  • the uniqueness of irises may make iris recognition technology a reliable person identification tool.
  • irises may have a uniqueness unlike other biometric features, such as face-prints and fingerprints.
  • Irises may be unique to a person and even among genetically twin individuals. Although the striking visual similarity of identical twins reveals the genetic penetrance of facial appearance, a comparison of genetically identical irises reveals just the opposite for iris patterns.
  • iris patterns are unalterable without significant duress.
  • the iris may be considered a unique internal organ that is nonetheless externally visible and can be measured non-invasively. It is in a protected environment but still visible.
  • the present system and approach address the real-time operational requirements of a standoff iris recognition system and may be regarded as an "on-the-fly" iris recognition system.
  • POSE — polar segmentation
  • the present iris recognition system is well suited for high-security access control or "at-a-distance biometrics" applications with little or no control exercised on subject positioning or orientations.
  • the iris recognition operation may include subjects captured at various ranges from the acquisition device or include subjects that may not have their eye directly aligned with the imaging equipment.
  • it may be difficult to implement a level of control required by most of the existing art to enable reliable iris recognition operations.
  • the present approach of iris recognition may cope with asymmetry in acquired iris imaging, and it can operate under uncontrolled conditions as long as some of the iris annulus is visible.
  • the present system may provide an accurate segmentation technique and hence identify good iris patterns, which may be regarded as signatures.
  • the present system may take the analysis of edges into polar domain and use local patterns to detect iris features using an enhanced version of POSE technique disclosed in U.S. Patent Application No. 11/275,703. This technique may detect curves of the iris borders of any irregular shapes.
  • a detection algorithm may robustly detect the inner and outer borders of the eye iris for the purpose of human or animal recognition.
  • the present approach may begin by mapping the analysis immediately into the polar domain with respect to a centered point in the pupil region. The centered point need not be the exact center of the pupil but may be any identified point within the pupil region.
  • 1D POSE — one-dimensional polar segmentation
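  • The polar-domain mapping about a pupil point can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name `to_polar`, the nearest-neighbor sampling, and the grid sizes are assumptions.

```python
import numpy as np

def to_polar(image, center, n_radii=64, n_angles=180):
    """Sample an image into a polar grid (radius x angle) about a center point.

    A hedged sketch of mapping the analysis into the polar domain with
    respect to a point in the pupil region; nearest-neighbor sampling is
    used for simplicity (the source does not prescribe an interpolator).
    """
    cy, cx = center
    radii = np.arange(n_radii)
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    r, t = np.meshgrid(radii, angles, indexing="ij")
    ys = np.clip(np.round(cy + r * np.sin(t)).astype(int), 0, image.shape[0] - 1)
    xs = np.clip(np.round(cx + r * np.cos(t)).astype(int), 0, image.shape[1] - 1)
    return image[ys, xs]  # shape: (n_radii, n_angles)
```

Each row of the result is one radius; each column is one ray, so per-angle border detection reduces to 1D peak finding along columns.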
  • the patterns may then be matched against multiple codes within a database and are given weights based upon the pattern visibility and exposure to the camera system.
  • the present system and approach may include the following items.
  • Iris inner border detection may be achieved using the estimated edges of POSE or any other active contour technique that provides a way to analyze each edge at each angle separately to determine whether the resulting edge is a valid border edge or an invalid edge.
  • a valid edge may be defined as an edge that was detected within a predefined range. Any edge point that falls out of range or at the extreme points of the gradient signal segment may represent a leaked peak and is treated as an invalid edge.
  • a predefined regular or irregular model shape may be used to fit the resulting edges. The depicted model shape may be used to fill in any missing edges within the contour of the pupil, replacing the non-valid points with the estimated points from the model shape.
  • the analysis may be offset with a predefined minimum possible width of an iris as the starting point for the iris outer border analysis.
  • Boundary edges may be extracted using POSE.
  • a median filter may be run to smooth the resulting outcome of POSE.
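  • The median smoothing of the per-angle POSE radius estimates can be sketched as below. This is an illustrative reading, not the patent's code; the circular window and the default width of 5 (the "filter bandwidth") are assumptions.

```python
import numpy as np

def median_smooth(edges, width=5):
    """Run a sliding 1D median over per-angle radius estimates.

    The window wraps around because border estimates are periodic in angle;
    isolated spikes (leaked peaks, noise) are replaced by the local median.
    """
    edges = np.asarray(edges, dtype=float)
    n = len(edges)
    half = width // 2
    out = np.empty(n)
    for i in range(n):
        window = edges[np.arange(i - half, i + half + 1) % n]
        out[i] = np.median(window)
    return out
```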
  • the boundary edge points may be clustered into two categories: 1) sclera and iris boundary points; and 2) iris and eyelid boundary points, to be analyzed differently.
  • the valid sclera and iris boundary points may be extracted.
  • These edge points may be fitted into a predefined regular model shape. The regular model shape may be used for guidance of the analysis and does not represent the final outcome of the edge estimates.
  • the area between the estimated eyelid-eyelash curve and the pupil curve (inner border) may be measured. Weights may be assigned based upon significance of the area between the curves. In some approaches, one may choose to assign zero to the weights to discard the entire region given the significance of the occlusions.
  • the spacing between the inner and outer curves may be scaled based upon the position of the outer curve within the regular shape.
  • the actual edge points detected by POSE may be used to be the actual edges of the iris borders and not the fitted model shapes.
  • Any pixel that lies within the outer border of the iris and the fitting model shape may be masked. Any pixel that lies outside the fitting shape may be discarded.
  • the pixels may be mapped into an iris pattern map.
  • Virtually any encoding scheme may be used to compress the image into a few bits while covering the entire angular range using a predefined angular resolution and radius resolution.
  • a similarity of information metric may be used to measure the similarity among the barcode of the templates for matching while weighing the pixels that come from valid edges with higher values and weighing pixels associated with invalid edges or obscuration with smaller or zero values.
  • the present approach may be for performing iris recognition under suboptimal image acquisition conditions.
  • the approach may be for iris segmentation to detect all boundaries (inner, outer, eyelid, sclera and horizon) of the iris image simultaneously.
  • the overall structure of the standoff iris recognition system 10 is shown in Figure 1.
  • One may start an analysis by mapping 12 a located eye image 11 into a polar domain at the start with respect to a centered point within the pupil region of the eye image.
  • An approach to estimate a point within the pupil may be straightforward in that it can use thresholding or summation over the x-axis and the y-axis to localize the darkest contrast within the eye image to locate the pupil region.
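  • The "thresholding or summation over the x-axis and the y-axis" idea can be sketched as follows. The threshold rule (10% above the image minimum) and the function name are assumptions for illustration.

```python
import numpy as np

def locate_pupil_point(image):
    """Estimate a point inside the pupil by localizing the darkest contrast.

    Threshold near the darkest intensities, then sum the resulting mask
    along the x-axis (rows) and y-axis (columns); the peaks of those sums
    localize the pupil region.
    """
    thresh = image.min() + 0.1 * (image.max() - image.min())
    mask = image <= thresh
    row_sums = mask.sum(axis=1)   # summation over the x-axis
    col_sums = mask.sum(axis=0)   # summation over the y-axis
    cy = int(np.argmax(row_sums))
    cx = int(np.argmax(col_sums))
    return cy, cx
```

The returned point need not be the exact pupil center; per the text, any point within the pupil region suffices to start the polar analysis.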
  • the eye finder approach which is discussed in U.S. Patent Application No. 11/672,108, filed February 7, 2007, may be used to estimate a pupil point.
  • There may be an iris inner curve estimation 13 and outer curve estimation 14.
  • a feature extraction 15 may proceed, leading to an iris signature map 16.
  • the iris signature 17 may be compressed.
  • An enroll and/or match 18 may occur with iris signature data flowing to and from a storage 19 in the form of bar codes.
  • FIG. 2 is a diagram of a pupil processing mechanism 20.
  • An image 21 having an eye may go to an eye finder 22 which is discussed in U.S. Patent Application No. 11/672,108, filed February 7, 2007. From the eye finder, the result may enter a filter 30 having a median filter 23 and then a smooth low pass filter 24 for noise removal.
  • An input kernel (pupil) module 69 may define a specific kernel or matrix of pixels covering just the pupil from the eye image for analysis. The edges of the pupil may include the most significant peaks, sufficient for detection.
  • An output image of the pupil with certain edges smoothed out may go from the filter 24 to a POSE-1D segmentation 25.
  • Constraint evaluation is where a peak may be detected within a range. Edge detection may be limited to a certain range. A rough center location and an approximate size of the pupil may be attained. The edges of the pupil are detected as peaks within the 1D signal along the radial axis and are said to be valid if they are detected within the radial range; one may then validate the pupil by testing the pupil profile against the estimates of the edges. The new edges may yield a better estimate of the pupil center, sufficient for analysis.
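  • The per-ray validity test can be sketched as below. This is a hedged reading of the constraint evaluation: the peak of the 1D gradient is accepted only if it falls inside a radial range and away from the ends of the signal (a "leaked peak" is invalid). The function name and the simple `diff`-based gradient are assumptions.

```python
import numpy as np

def detect_border_peak(radial_profile, r_min, r_max):
    """Find a border candidate as the strongest gradient peak along one ray.

    Returns (peak_index, valid); a peak at the signal extremes or outside
    [r_min, r_max] is flagged invalid, mirroring the leaked-peak test.
    """
    grad = np.abs(np.diff(radial_profile.astype(float)))
    peak = int(np.argmax(grad))
    valid = (r_min <= peak <= r_max) and (0 < peak < len(grad) - 1)
    return peak, valid
```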
  • a median filter 23 may be applied to eliminate salt and pepper noise due to the system acquisition of background noise.
  • the image may be a kernel, i.e., a block of pixels of a pupil for analysis.
  • the image 21 may be passed through a low pass filter 24 to smooth the variation within the pupil region while preserving the apparent contrast change at the edge of the pupil and the iris.
  • the POSE-1D segmentation 25 may be applied.
  • the validity of the edges at step or stage 51, indicated by a diamond symbol, may be determined by checking whether the peaks in the contrast changes are leaked to the edges of the gradient of the contrast change signal. The leaking may indicate several cases.
  • a constraint may include that the pixels of the edge be within a set range.
  • the actual edge of the pupil may be too close to the signal edge and therefore the detected edge might not reflect the actual edge of the gradient. There may not be enough contrast to determine whether there is a pupil edge. There may be a presence of obstacles that are obscuring the pupil edges. Obstacles may include skin of an eye, eyelashes due to eye closure, an eyeglass frame, a contact lens, optics, and the like. In any of these cases, the peak may be deemed an invalid peak or pupil edge. One may then fit only the valid points into a predefined model shape, i.e., elliptic fitting 52, just for guidance. Two alternatives may then be proposed.
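  • Fitting a model shape to the valid points only can be sketched as below. A least-squares circle fit (the Kasa method) is shown as the simplest such model shape standing in for the elliptic fitting; the choice of a circle and the function name are assumptions, not the patent's prescription.

```python
import numpy as np

def fit_circle(points):
    """Least-squares circle fit (Kasa method) to valid edge points.

    Solves x^2 + y^2 = 2*cx*x + 2*cy*y + c in the least-squares sense,
    then recovers the radius. Returns (cx, cy, radius). Only valid edge
    points should be passed in; invalid points can later be replaced by
    points sampled from this fitted shape.
    """
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return float(cx), float(cy), float(r)
```

Per the two alternatives in the text, the fitted shape can either replace the whole border or only the invalid portions.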
  • in a first alternative, the estimated shape (i.e., the ellipse 48 from fitting block 52) may be used as the entire pupil border 56.
  • in the second alternative, the actual edges may be kept as a final outcome using the POSE technique, and only the invalid edges may be replaced by points from the estimated shape (i.e., the estimated ellipse).
  • An offset 90 of Figure 9 may vary from zero to some value depending on the visibility of the pupil within the eye image during image acquisition. For instance, one offset may vary dependent on a scoring and/or a validation of a pupil profile being captured. Relative to a closed or highly obscured eye, an offset may be at a minimum or zero. For an open eye with no obscuration and having a high score and/or validation of a pupil profile, the offset may be large. The offset may vary depending on the areas or angular segments of the eye that are visible.
  • Offset may vary according to the border type.
  • the iris/sclera border may warrant a significant offset, and the offset for the iris/eyelash-lid border may be low, minimal or zero.
  • the pupil border analysis is illustrated, at least partially, in the diagram of Figure 3.
  • Figure 3 shows a pupil 31 of which a portion of an edge 38 is within a range 32 of a circle 33 having a radius 34 about an approximate center 35. It may be noted that there may be a first reflection 36 and a first center estimate 37. However, an approximate center 35 is noted for subsequent use.
  • the range 32 can have a set amount of deviation that the edge 38 of pupil 31 may have and yet be regarded as valid. It may be noted that the edge 38 could, but does not, extend beyond the outer circumference of range 32; however, edge 38 does appear at points 41, 42 and 45 to be inside of a circumference 39 showing an inner limit of range 32. Points 43, 44 and 46 appear within the range 32 and thus may be deemed to be valid.
  • the edge 38 of the pupil 31 may not be within the range 32 at points 41, 42 and 45 because of the eyelashes, eyelid and/or noise 47 at the bottom and top of the pupil.
  • Other factors of pupil 31 may include a blob fitting (BF) and a coverage fitting (CF).
  • the validity of the edge 38 may be determined at symbol 51 of Figure 2.
  • the input may be an output from the segmentation stage or block 25. Also, an output from block 25 may go to a snake plus elliptic curve (or the like module) block 53.
  • the output of the valid edge determination diamond symbol 51 may go to a pruning block 40 where abrupt changes of the edge 38 may be smoothed or reduced in their extension out from the edge curve. Then, the edge 38 may go to a predefined model shape (such as elliptic fitting) block 52.
  • a model shape curve 48 may result (as an example, an elliptic shape is used as the fitting model, shown as a thick line in Figure 4).
  • the entire edge 38, including the invalid and valid portions, may be replaced with the elliptic fitting 48 in a first approach (elliptic or like module 54). Only the valid portions of the edge 38 are incorporated in determining the elliptic fitting curve 48, as indicated by block 54.
  • the elliptic fitting 48 may be used to do a final estimate of the pupil center 35.
  • a non-linear fitting may be done as shown in Figure 5.
  • the model fitting 48 may be kept for only the non-valid portion or points 41, 42 and 45, but the actual valid edges or points 43, 44 and 46 may be kept, as indicated by block 53.
  • An output of elliptic fitting block 52 may go to a diamond 55 which asks whether the actual contour 38 or the model fitting 48 should be used.
  • the model fitting or curve 48 should always be used for the non-valid portions of the curve or contour 38.
  • the approach is not affected by any reflection within the pupil; as shown in Figure 3, the analysis goes around the reflection, which can thus be neglected without adding any preprocessing for its elimination. Besides reflections, a partially closed eye, eyelashes or lids, noise, and the like may be well treated using this segmentation approach.
  • the output of block 54 may be a pupil border 56 as shown in image 58. If the answer is yes at diamond 55, then a "snake" (an active contour, that is, an estimate of the actual edge 38 rather than the ellipse approximation 48) is used for the valid portions of edge 38.
  • the output of block 53 may be a pupil border 57 as shown in image 59.
  • arrows 62 may repeat elliptic fitting data sent to blocks 53 and 54 for effecting an elliptic curve fit.
  • An enhancement to elliptic fitting may be added as a part of the elliptic fitting box 52.
  • This enhancement may be a pruning of the pupil edge before doing a model fitting at block or module 52 ( Figure 2).
  • the pruning may be used to smooth the curve edges and eliminate any mismatches of extraneous edges.
  • outliers are replaced with the most likely edge within a predefined angular segment.
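  • The pruning step can be sketched as below. The outlier rule (median-absolute-deviation test within each angular segment) and the segment width are assumptions; the text only specifies that outliers are replaced with the likely edge of their segment.

```python
import numpy as np

def prune_edges(radii, segment=15, k=2.5):
    """Replace outlier radius estimates with the likely edge of their segment.

    Within each angular segment, any radius farther than k median-absolute-
    deviations from the segment median is treated as an extraneous edge and
    replaced by that median, smoothing the curve before model fitting.
    """
    radii = np.asarray(radii, dtype=float).copy()
    for start in range(0, len(radii), segment):
        seg = radii[start:start + segment]
        med = np.median(seg)
        mad = np.median(np.abs(seg - med)) + 1e-9  # avoid zero scale
        seg[np.abs(seg - med) > k * mad] = med
        radii[start:start + segment] = seg
    return radii
```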
  • FIG. 6 is a diagram of an approach for an iris outer border analysis, curve fitting and portion removal or substitution.
  • An eye image 21 may be processed through the median filter 23 and the low pass filter 24, respectively, as noted herein.
  • a kernel 91 which may be a matrix or block of pixels of the iris of the image 21, can be processed.
  • a resulting image 93 for analysis may proceed to a cluster angular range module 92.
  • the eye symmetry, as shown by inset 93, may proceed on to a POSE+ (illustrated in Figure 7) segmentation module 94.
  • Figure 7 reveals more detail (i.e., the 1D POSE+ subroutine) of the segmentation module 94.
  • Two major portions of the eye image 93 go to module 94 for segmentation concerning sclera borders and eyelash borders.
  • Input 96 for sclera borders may go to a 1D POSE segmentation submodule 98 and input 97 for eyelash borders may go to a 1D POSE segmentation submodule 99.
  • Information 67 of the pupil model fitting and center may be input to the submodules 98 and 99.
  • An output of segmentation submodule 98 may go to a get max peak submodule 60 which in turn provides an output to a 1D median filter 102.
  • Also input to median filter 102 may be a filter bandwidth 68.
  • An output from segmentation submodule 99 may go to a get max peak submodule 101 which in turn provides an output to a 1D median filter 63.
  • a filter bandwidth signal 68 may be provided to filter 63.
  • An output 64 from median filter 102 of module 94 may go to a (dr/dθ) module 71 for sclera borders.
  • An output 65 from median filter 63 may go to a (dr/dθ) module 72 for eyelash/lid borders.
  • Modules 71 and 72 may be of a border module 103.
  • An output from module 71 may go to a count module 73, and an output from module 72 may go to a count module 74.
  • Modules 73 and 74 may be of a count module 104. If the count at module 73 is not less than a threshold, then there is not a valid eye image 75. If the count is less than the threshold, then a circular, elliptic, or like fitting may be placed on the iris outer sclera borders at module 76.
  • if the count at module 74 is not greater than the threshold, then the eyelash edges may be extracted at module 77; this may involve 1D POSE+. If the count at module 74 is greater than the threshold, then the eyelashes may be masked at module 78; this may involve 1D POSE.
  • the threshold may be a number indicating the number of hits, or places where a curve discontinues. The range of the threshold may be around 3 or 4; under certain circumstances of more tolerance, it may be set to 5 or greater.
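  • The count modules can be sketched as below. The definition of a "hit" as an angle-to-angle radius jump above a pixel threshold is an assumed reading; the text specifies only a count of places where the curve discontinues, compared against a small threshold (around 3 to 5).

```python
import numpy as np

def count_discontinuities(border, jump=5.0):
    """Count places where a per-angle border curve discontinues.

    A 'hit' is any step between adjacent angular samples larger than
    `jump` pixels; a count at or above the validity threshold would
    reject the border (not a valid eye image).
    """
    border = np.asarray(border, dtype=float)
    steps = np.abs(np.diff(border))
    return int(np.count_nonzero(steps > jump))
```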
  • a combined output 66 from the 1D median filters 102 and 63 may go to a map analysis center 81. Also, outputs from the circular fitting module 76, the extract eyelash edges module 77 and the mask eyelashes module 78 may go to a center 81 for a map analysis.
  • the preprocessing may include the filter or combination 30 of a median 23 and low pass filter 24 of Figure 6 to smooth the iris texture while preserving the strong edge of the contrast change at the outer border of the iris.
  • One may then cluster the angular range into two categories. Boundary points may be clustered. With the occlusion of the iris by the eyelids and eyelashes, there may be two groups of boundary points around the outer bounds of the iris that may be treated differently in the present analysis. The groups may be iris sclera boundaries and iris eyelid boundaries. The two classes of points may be treated according to the expected distributions of edge pixels. To cluster the points into these two classes, one may use the symmetry method in POSE+ (see U.S. Patent Application No. 11/275,703, filed January 25, 2006), where pixels are placed symmetrically relative to each other in terms of curvature with smooth continuous edges.
  • the lowermost edge points of the upper eyelid edge may be fit to a straight line, and the uppermost of the lower eyelid edge points may be fit to a straight line crossing the detected iris outer border curve (the original curve detected by POSE).
  • the intersection of these two straight lines and the curve may define a good estimate of the trapezoid contour of the eye socket.
  • the intersection of these lines and the pre-estimated shape may define these boundary points.
  • the POSE+ subroutine is shown with a diagram in Figure 7.
  • Eyelid detection may be noted. With the nature of eye closure under nominal conditions, there may be two possibilities for eye positioning. One is a wide-open eye and another partially open. In either case, one might only consider points of observable edges of iris in the curve fitting. To estimate the eyelid edges, one may track the lowermost points of the lowermost curve 82 ( Figures 8a and 8b) of the upper eyelid 87 edge, and track the uppermost points of the upper curve 84 of the lower eyelid 88 edges.
  • Figures 8a and 8b are graphs illustrating an approach for estimating eyelid curve detection.
  • a piece-wise linear fitting 83 of the local minima of the curve 82 may be done for the upper eyelid 87.
  • a piece-wise linear fitting 85 of the local maxima of the curve 84 may be done for the lower eyelid 88.
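  • The eyelid line estimates of Figures 8a and 8b can be sketched as follows. A single least-squares line through the local extrema stands in for the piece-wise linear fitting; the function name and the simple three-point extremum test are assumptions.

```python
import numpy as np

def eyelid_line(xs, curve, upper=True):
    """Fit a straight line through the extreme points of an eyelid curve.

    For the upper eyelid, take local minima (lowermost points) of its
    detected curve; for the lower eyelid, local maxima (uppermost points).
    Returns (slope, intercept) of the fitted line.
    """
    xs = np.asarray(xs, dtype=float)
    y = np.asarray(curve, dtype=float)
    interior = np.arange(1, len(y) - 1)
    if upper:
        keep = interior[(y[interior] <= y[interior - 1]) & (y[interior] <= y[interior + 1])]
    else:
        keep = interior[(y[interior] >= y[interior - 1]) & (y[interior] >= y[interior + 1])]
    slope, intercept = np.polyfit(xs[keep], y[keep], 1)
    return float(slope), float(intercept)
```

Per the text, the intersections of these lines with the detected iris outer border curve bound the trapezoid contour of the eye socket.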
  • Figure 9 relates to eyelid detection and shows a picture of an eye 86 with an obscuration by an upper eyelid 87 and possible obscuration with a lower eyelid 88.
  • This figure illustrates a resulting output of the following process.
  • a weighting scheme may also be introduced to assess the obscuration amount of the eyelids, eyelashes or other manner of obscuration such as glass, a frame, and so forth.
  • the obscuration may be assessed by computing the integral of the area between the eyelid curve and pupil boundary with the following equation,
  • A = ∫_{θ ∈ Θ₁} ( r_lid(θ) − r(θ) ) dθ , where r_lid(θ) denotes the estimated eyelash/eyelid boundary curve,
  • Θ₁ represents the angles associated with the boundary curve of the eyelash/eyelid, and
  • r(θ) is the estimated pupil radius at angle θ.
  • the integral may be evaluated over the angular range covered by eyelashes (and/or eyelids), and the assessment may be based upon the value of the integral with respect to a pre-estimated threshold.
  • a weighting factor may be assigned to these angular segments to be used in the matching function.
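  • The area-based weighting can be sketched as below. The exact weighting rule is an assumption (the text only says weights are assigned by the significance of the area, and may be zeroed to discard a heavily occluded segment); the curve names and the saturation rule are illustrative.

```python
import numpy as np

def obscuration_weight(theta, r_lid, r_pupil, area_threshold):
    """Weight an angular segment by the area between eyelid and pupil curves.

    The trapezoidal sum approximates the integral of (r_lid - r_pupil)
    over the occluded angles; below the threshold the segment is discarded
    (weight zero), otherwise it gets a visibility weight capped at one.
    """
    gap = np.clip(np.asarray(r_lid, float) - np.asarray(r_pupil, float), 0.0, None)
    dtheta = np.diff(np.asarray(theta, float))
    area = float(np.sum(0.5 * (gap[1:] + gap[:-1]) * dtheta))  # trapezoid rule
    if area < area_threshold:
        return 0.0
    return float(min(1.0, area / (2.0 * area_threshold)))
```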
  • the next stage may be to extract the valid sclera and iris boundary points and fit these edge points into a predefined regular shape, e.g., a circular shape. It is important to note that these regular shapes are generally not used as the final outcome of the detection.
  • the regular shapes may be used for guiding the present normalization process and to keep the actual detected edges of the active contour that POSE has identified.
  • the normalization is crucial to iris processing to address dimensional changes of the iris shapes. These dimensional inconsistencies may be mainly due to iris stretches and pupil dilation under different environmental lighting, as well as imaging distance variations.
  • the regular shape is not meant to be the final outcome of the present estimates.
  • the curve detected by the present active contour approach as an ensemble of all edges detected by POSE may be the final estimate of the iris outer border edges.
  • the predefined shape may be used to scale back the curve shape into a common scaling for normalization purposes as well as an approach to identify areas that do not belong to the iris map and ought to be masked from the analysis.
  • the regular shape may define the actual scaling needed to bring uniformity among all the captured images and templates in the database.
  • ŝ_θ(r) = s_θ(r) u(R_e − r) + E[s_θ(r)] γ_r u(r − R_e) ,   (3)
  • s_θ(r) represents the pixel values at a radius r and angle θ,
  • the function γ(r) may represent the elements of the scaled vector that is used to map the iris pixels into the normalized iris pattern map (also referred to as a rubber sheet), and
  • u(r) may be used to denote the step function.
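  • A literal reading of equation (3), as assembled from the surrounding definitions, can be sketched for one radial signal. Treating u(R_e − r) as true for r < R_e is an assumed convention, as is the function name.

```python
import numpy as np

def normalize_ray(s, R_e, gamma):
    """Scale one radial signal per equation (3).

    Measured pixels are kept inside the regular-shape radius R_e; beyond
    R_e the signal is extended with its mean E[s], scaled by the per-radius
    elements gamma_r of the normalization vector.
    """
    s = np.asarray(s, dtype=float)
    r = np.arange(len(s))
    inside = (r < R_e).astype(float)   # u(R_e - r)
    outside = 1.0 - inside             # u(r - R_e)
    return s * inside + s.mean() * np.asarray(gamma, dtype=float) * outside
```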
  • a challenge in building the standoff iris recognition system may lie in how to extract and segment the boundaries of an iris, and not necessarily the compression approach to encode the barcode of the extracted map.
  • iris encoding may usually be used to compress the iris map into fewer bits in a barcode to be stored or matched against other barcodes stored in a database.
  • the iris encoding may be processed on the iris map to extract the pattern texture variations. The particular type of encoding or algorithm may be irrelevant here, as there are many COTS approaches to encode a digital image.
  • One may make use of Gabor filters to encode the iris map image to its minimum possible number of bits so that metrics can be used to give a range of values when comparing templates with captured maps.
  • any similarity metrics may be used to measure the information similarity among templates.
  • One metric in particular that may be used is the weighted hamming distance (WHD).
  • WHD weighted hamming distance
  • the WHD may give more weight to the pixels associated with valid edges and less weight to the pixels that are associated with non-valid edges.
  • the masked pixels may of course be zeroed out during the matching process.
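  • The weighted hamming distance can be sketched as below. The normalization by the total weight is an assumed convention so the result lies in [0, 1]; masked pixels simply carry zero weight, as the text describes.

```python
import numpy as np

def weighted_hamming(code_a, code_b, weights):
    """Weighted Hamming distance (WHD) between two iris barcodes.

    Pixels from valid edges carry higher weights; pixels from invalid or
    obscured edges carry smaller or zero weights and so contribute little
    or nothing to the match score.
    """
    a = np.asarray(code_a, dtype=bool)
    b = np.asarray(code_b, dtype=bool)
    w = np.asarray(weights, dtype=float)
    total = w.sum()
    if total == 0:
        return 0.0  # nothing comparable
    return float(np.sum(w * (a ^ b)) / total)
```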
  • the present system provides a solution to an issue of eye gazing where an individual subject is looking off angle and not straight to the camera system. Gazing effects on iris segmentation may be dramatic and trying to quantify the amount of eye gazing to correct for it may be regarded by many as challenging.
  • a correction process may involve many geometrical models and assumptions that are not general and image specific. The model complexity and its analysis might not only reduce the robustness of the gaze detection estimations but also often introduce errors into the estimates.
  • the present system does not require any gaze detection in that it is designed to deal with all image perspectives.
  • the angle θ is with respect to a center 111 of a pupil 114, and θ + Δθ is with respect to the iris center 112, as shown in Figure 10.
  • the edge point 113 may be on the outside border of the iris 115.
  • the map pixels are constructed using an interpolation scheme to sample a predefined number of pixels at each angle along the ray that passes from the inner edge 117 to the outer edge 113 with respect to the analysis center 111.
  • Figure 11 is a diagram of angular clustering where a focus is on the sclera, that is, the side portions 121 and 122 of the iris 142. One may start at an estimated edge and end up at a new edge. To start, the sclera portions 121 and 122 may appear symmetrical but probably will not end up as such in actuality.
  • Each angle of the quadrants or portions may have a distinct value.
  • the noisy portions at the top 123 and the bottom 124 may be treated differently than the side sclera portions 121 and 122. If the upper and lower portions 123 and 124, respectively, are too discontinuous or noisy, then they may be masked down through the iris 142 to the center of the pupil 141, as shown in Figure 12.
  • Figure 13 is a mapping 131 showing the noisy upper 123 and lower 124 portions relative to pupil 141 and iris 142.
  • in a mapping 132 of Figure 14, one may attempt to use only the information in the iris 142 within a radius 133 that does not extend into the portions 123 and 124.
  • the mapping 151 of Figure 15 shows a masking 145 and 146 that is complete from portions 123 and 124, respectively, through the iris 142 to the center of the pupil 141, as shown in Figure 12. Since much information in the iris 142 may not be available as shown by the masking of Figures 12 and 15, a partial masking 147 and 148 of portions 123 and 124 may be done according to a mapping 152 as shown in Figure 16. Masking could be applied right at the edges of the noisy pixels, thereby masking only those pixels that represent portions 123 and 124. Mapping 152 may make more iris information available.
  • Figure 17 is a masking 161 of iris 142 showing the masking out of only the portions 123 and 124, plus some other minor noise, with zeros.
  • Figure 18 shows a masking 162 showing various masking schemes of noisy or obscured areas of the iris 142, such as a reflection 163, blurriness or obscuration 164, and other iris non-information spots near portions 123 and 124.
  • the ones and zeros are merely approximations of example masks (for instance, the ones can be replaced with weights based upon the segmentation analysis as explained herein) as they are for illustrative purposes.

Abstract

An iris recognition system having pupil and iris border conditioning prior to iris mapping and analysis. The system may obtain and filter an image of an eye. A pupil of the image may be selected and segmented. Portions of the pupil border can be evaluated and pruned. A curve may be fitted on at least the invalid portions of the pupil border. The iris of the eye, with an acceptable border of the pupil as an inside border of the iris, may be selected from the image. The iris outside border, having sclera and eyelash/lid boundaries, may be grouped using a cluster angular range based on eye symmetry. The sclera boundaries may be fitted with a curve. The eyelash/lid boundaries may be extracted or masked. The iris may be segmented, mapped and analyzed.

Description

A STANDOFF IRIS RECOGNITION SYSTEM
This application is a continuation-in-part of U.S. Patent Application No. 11/275,703, filed January 25, 2006, which claims the benefit of U.S. Provisional Application No. 60/647,270, filed January 26, 2005.
This application is a continuation-in-part of U.S. Patent Application No. 11/043,366, filed January 26, 2005.
This application is a continuation-in-part of U.S. Patent Application No. 11/372,854, filed March 10, 2006.
This application is a continuation-in-part of U.S. Patent Application No. 11/672,108, filed February 7, 2007.
This application claims the benefit of U.S. Provisional Application No. 60/778,770, filed March 3, 2006. The government may have rights in the present invention.
Background
The present invention pertains to recognition systems and particularly to biometric recognition systems. More particularly, the invention pertains to iris recognition systems. Related applications may include U.S. Patent Application No. 10/979,129, filed November 3, 2004, which is a continuation-in-part of U.S. Patent Application No. 10/655,124, filed September 5, 2003; and U.S. Patent Application No. 11/672,108, filed February 7, 2007.
U.S. Patent Application No. 11/275,703, filed January 25, 2006, is hereby incorporated by reference.
U.S. Provisional Application No. 60/647,270, filed January 26, 2005, is hereby incorporated by reference.
U.S. Patent Application No. 11/043,366, filed January 26, 2005, is hereby incorporated by reference. U.S. Patent Application No. 11/372,854, filed March 10, 2006, is hereby incorporated by reference.
U.S. Provisional Application No. 60/778,770, filed March 3, 2006, is hereby incorporated by reference. U.S. Patent Application No. 11/672,108, filed February 7, 2007, is hereby incorporated by reference.
Summary
The present invention is a standoff iris recognition system.
Brief Description of the Drawing
Figure 1 is a diagram of an overall structure of the standoff iris recognition system; Figure 2 is a diagram of a pupil processing mechanism;
Figures 3, 4 and 5 are diagrams showing a basis for pupil border analysis, curve fitting and portion substitution;
Figure 6 is a diagram of an approach for an iris outer border analysis, curve fitting and portion removal or substitution; Figure 7 is a diagram of a polar segmentation subroutine mechanism;
Figures 8a and 8b are diagrams illustrating an approach for estimating eyelash/lid curve detection;
Figure 9 is an illustration showing an eye having eyelash/lid obscuration; Figure 10 is a diagram of pupil and iris centers; Figures 11 and 12 are diagrams of iris quadrants and masking; and
Figures 13-18 are diagrams of various kinds of masking for noisy and informational areas of the eye.
Description
Various noted properties of irises may make iris recognition technology a reliable person identification tool. For instance, irises may have uniqueness unlike other biometric modalities, such as face-prints and fingerprints. Irises may be unique to a person, even between genetically identical twins. Although the striking visual similarity of identical twins reveals the genetic penetrance of facial appearance, a comparison of genetically identical irises reveals just the opposite for iris patterns.
Further, there appears to be no aging effect; that is, iris features are stable over a lifetime. The physical characteristics of iris patterns are unalterable without significant duress. The iris may be considered a unique internal organ that is nonetheless externally visible and can be measured non-invasively; it sits in a protected environment but remains visible. The present system and approach address the real-time operational requirements of a standoff iris recognition system and may be regarded as an "on-the-fly" iris recognition system. Unlike other approaches, which are mostly based on the brute force of a Hough transform to fit the iris edges into circular or regular shapes, one may employ an efficient and robust enhancement approach built around the polar segmentation (POSE) technique of the present assignee disclosed in U.S. Patent Application No. 11/043,366, filed January 26, 2005. Present improvements made to the POSE segmentation technique contribute to robust, computationally efficient and accurate real-time iris recognition. The present iris recognition system is well suited for high-security access control or "at-a-distance biometrics" applications with little or no control exercised on subject positioning or orientation. The iris recognition operation may include subjects captured at various ranges from the acquisition device, or subjects that may not have their eye directly aligned with the imaging equipment. Usually, for such applications, it may be difficult to implement the level of control required by most of the existing art to enable reliable iris recognition operations. The present approach to iris recognition may cope with asymmetry in acquired iris imaging and can operate under uncontrolled conditions as long as some of the iris annulus is visible.
The present system may provide an accurate segmentation technique and hence identify good iris patterns, which may be regarded as signatures. The present system may take the analysis of edges into the polar domain and use local patterns to detect iris features using an enhanced version of the POSE technique disclosed in U.S. Patent Application No. 11/275,703. This technique may detect curves of the iris borders of any irregular shape. A detection algorithm may robustly detect the inner and outer borders of the eye iris for the purpose of human or animal recognition. The present approach may begin by mapping the analysis immediately into the polar domain with respect to a centered point in the pupil region. The centered point need not be the exact center of the pupil, but may be any identified point within the pupil region. One may then detect edges of the inner and outer borders of the iris based upon a one-dimensional polar segmentation (1D POSE) technique, and detect the irregular shape of the iris curves using additional rules introduced on the POSE technique to cluster the edge points separately into two groups that represent edges at the sclera and edges at the borders of the eyelids. One may extract the iris signature using a guided analysis to correctly normalize the stretching and compression of the patterns and bring uniformity into the interpretation of the patterns. In addition, one may cluster obscured pixels and affected areas to be either weighted with low weights or masked out of the analysis. The patterns may then be matched against multiple codes within a database and given weights based upon pattern visibility and exposure to the camera system.
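The per-angle edge detection underlying 1D POSE may be illustrated with a short sketch. This is a minimal Python approximation, not the disclosed implementation: it samples a 1D radial signal at each angle around a pupil point and takes the strongest gradient peak as the border edge. The function name, parameters and synthetic image are hypothetical.

```python
import numpy as np

def radial_edge_estimates(image, center, n_angles=64, max_radius=40):
    """For each angle, sample a 1D radial signal outward from the center
    point and return the radius of the strongest dark-to-bright contrast
    change (the maximum gradient peak) as the border edge estimate."""
    cy, cx = center
    radii = np.arange(1, max_radius)
    edges = np.zeros(n_angles)
    for i, theta in enumerate(np.linspace(0, 2 * np.pi, n_angles, endpoint=False)):
        ys = np.clip((cy + radii * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
        xs = np.clip((cx + radii * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
        signal = image[ys, xs].astype(float)   # 1D signal along this ray
        gradient = np.diff(signal)             # contrast-change signal
        edges[i] = radii[np.argmax(gradient)]  # strongest peak = edge radius
    return edges

# Synthetic eye: a dark pupil of radius 12 on a brighter iris background.
img = np.full((100, 100), 150.0)
yy, xx = np.mgrid[:100, :100]
img[(yy - 50) ** 2 + (xx - 50) ** 2 <= 12 ** 2] = 20.0
est = radial_edge_estimates(img, (50, 50))   # estimates cluster near 12
```

Each returned radius is one of the per-angle edge points that the later validity, pruning and fitting stages operate on.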
The present system and approach may include the following items. There may be a map analysis at an earlier stage to conduct segmentation in the polar domain. Iris inner border detection may be achieved using the estimated edges of POSE or any other active contour technique that provides a way to analyze each edge at each angle separately to determine whether the resulting edge is a valid border edge or an invalid edge. A valid edge may be defined as an edge that was detected within a predefined range. Any edge point that results out of range or at the extreme points of the gradient signal segment may represent a leaked peak and is treated as an invalid edge. A predefined regular or irregular model shape may be used to fit the resulting edges. The depicted model shape may be used to fill in any missing edges within the contour of the pupil, replacing the non-valid points with the estimated points from the model shape. The analysis may be offset with a predefined minimum possible width of an iris as the starting point for the iris outer border analysis. Boundary edges may be extracted using POSE. A median filter may be run to smooth the resulting outcome of POSE. The boundary edge points may be clustered into several categories: 1) sclera and iris boundary points; and 2) iris and eyelid boundary points, to be analyzed differently. The valid sclera and iris boundary points may be extracted. These edge points may be fitted into a predefined regular model shape. The regular model shape may be used for guidance of the analysis and will not represent the final outcome of the edge estimates.
One may track the lowermost points of the lowermost curve of the upper eyelid edge, and track the uppermost points of the upper curve of the lower eyelid edges. Then one may interpolate among these samples to replace the entire angular range corresponding to the eyelid obscurations. The area between the estimated eyelid-eyelash curve and the pupil curve (inner border) may be measured. Weights may be assigned based upon the significance of the area between the curves. In some approaches, one may choose to assign zero to the weights to discard the entire region, given the significance of the occlusions. The spacing between the inner and outer curves may be scaled based upon the position of the outer curve within the regular shape. The actual edge points detected by POSE may be used as the actual edges of the iris borders, rather than the fitted model shapes. Any pixel that lies between the outer border of the iris and the fitting model shape may be masked. Any pixel that lies outside the fitting shape may be discarded. The pixels may be mapped into an iris pattern map. Virtually any encoding scheme may be used to compress the image into a few bits while covering the entire angular range using a predefined angular resolution and radius resolution. A similarity-of-information metric may be used to measure the similarity among the barcodes of the templates for matching, while weighing the pixels that come from valid edges with higher values and weighing pixels associated with invalid edges or obscuration with smaller or zero values.
The present approach may be for performing iris recognition under suboptimal image acquisition conditions. The approach may be for iris segmentation to detect all boundaries (inner, outer, eyelid, sclera, and horizon) of the iris image simultaneously.
The overall structure of the standoff iris recognition system 10 is shown in Figure 1. One may start an analysis by mapping 12 a located eye image 11 into a polar domain at the start with respect to a centered point within the pupil region of the eye image. An approach to estimate a point within the pupil may be straightforward in that it can use thresholding or summation over the x-axis and the y-axis to localize the darkest contrast within the eye image and locate the pupil region. The eye finder approach, which is discussed in U.S. Patent Application No. 11/672,108, filed February 7, 2007, may be used to estimate a pupil point. There may be an iris inner curve estimation 13 and outer curve estimation 14. A feature extraction 15 may proceed, leading to an iris signature map 16. The iris signature 17 may be compressed. An enroll and/or match 18 may occur with iris signature data flowing to and from a storage 19 in the form of bar codes.
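The thresholding and axis-summation idea for localizing a dark pupil point may be sketched as follows. The threshold rule, names and synthetic image are illustrative assumptions, not the eye finder of U.S. Patent Application No. 11/672,108.

```python
import numpy as np

def locate_pupil_point(image):
    """Estimate a point inside the pupil by thresholding the darkest
    contrast and summing the binary mask over the x- and y-axes; the
    strongest row/column projections localize the dark pupil region."""
    lo, hi = float(image.min()), float(image.max())
    dark = image <= lo + 0.25 * (hi - lo)   # darkest-contrast threshold
    row_sums = dark.sum(axis=1)             # summation over the x-axis
    col_sums = dark.sum(axis=0)             # summation over the y-axis
    return int(np.argmax(row_sums)), int(np.argmax(col_sums))  # (y, x)

# Synthetic image: a dark pupil disc centered at row 70, column 90.
img = np.full((120, 160), 180.0)
yy, xx = np.mgrid[:120, :160]
img[(yy - 70) ** 2 + (xx - 90) ** 2 <= 15 ** 2] = 10.0
cy, cx = locate_pupil_point(img)
```

The returned point need only fall within the pupil region; as the text notes, it need not be the exact pupil center.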
Figure 2 is a diagram of a pupil processing mechanism 20. An image 21 having an eye may go to an eye finder 22, which is discussed in U.S. Patent Application No. 11/672,108, filed February 7, 2007. From the eye finder, the result may enter a filter 30 having a median filter 23 and then a smooth low pass filter 24 for noise removal. One does not want an actual feature on the pupil to interfere with the actual edge detection. An input kernel (pupil) module 69 may define a specific kernel or matrix of pixels covering just the pupil from the eye image for analysis. The edges of the pupil may include the most significant peaks, sufficient for detection. An output image of the pupil with certain edges smoothed out may go from the filter 24 to a POSE-1D segmentation 25.
Constraint evaluation is where a peak may be detected within a range. Edge detection may be limited to within a certain range. A rough center location and an approximate size of the pupil may be attained. The edges of the pupil are detected as peaks within the 1D signal along the radial axis and are said to be valid if they were detected within the radial range; one may then validate the pupil by testing the pupil profile, that is, the estimates of the edges. The new edges may yield a better estimate of the pupil center, sufficient for analysis.
A median filter 23 may be applied to eliminate salt-and-pepper noise due to the system's acquisition of background noise. At this point, the image may be a kernel, i.e., a block of pixels of a pupil for analysis. The image 21 may be passed through a low pass filter 24 to smooth the variation within the pupil region while preserving the apparent contrast change at the edge of the pupil and the iris. Next, the POSE-1D segmentation 25 may be applied. The validity of the edges at step or stage 51, indicated by a diamond symbol, may be determined by checking whether the peaks in the contrast changes are leaked to the edges of the gradient of the contrast change signal. The leaking may indicate several cases. A constraint may include that the pixels of the edge be within a set range. First, the actual edge of the pupil may be too close to the signal edge and therefore the detected edge might not reflect the actual edge of the gradient. Second, there may not be enough contrast to determine whether there is a pupil edge. Third, there may be a presence of obstacles obscuring the pupil edges. Obstacles may include skin of an eye, eyelashes due to eye closure, an eyeglass frame, a contact lens, optics, and the like. In any of these cases, the peak may be deemed an invalid peak or pupil edge. One may then fit only the valid points into a predefined model shape, i.e., elliptic fitting 52, just for guidance. Two alternatives may then be proposed. In an approach 54, one may actually use the estimated shape 56, 52, 48 (i.e., ellipse) that replaces the actual edges as an approximation to the pupil edges (which may also be referred to as an inner bound of the iris). In another approach 53, the actual active contour edge 57 may be kept as the final outcome using the POSE technique, and only the invalid edges will be replaced by points from the estimated shape (i.e., the estimated ellipse).
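The valid/invalid edge classification and model-based replacement may be sketched as below. For brevity, a mean-radius (circular) stand-in is used in place of the elliptic model fitting; all names and example values are hypothetical.

```python
import numpy as np

def validate_and_fill(edges, r_min, r_max):
    """Mark radial edge estimates outside [r_min, r_max] as invalid
    (leaked peaks), then replace the invalid points with the mean radius
    of the valid points -- a circular stand-in for the elliptic model."""
    edges = np.asarray(edges, dtype=float)
    valid = (edges >= r_min) & (edges <= r_max)
    if not valid.any():
        raise ValueError("no valid pupil edge points within range")
    filled = edges.copy()
    filled[~valid] = edges[valid].mean()   # model estimate for invalid points
    return filled, valid

# Eight per-angle edge radii; two peaks leaked out of the valid range.
edges = [12, 12, 13, 2, 12, 30, 13, 12]
filled, valid = validate_and_fill(edges, r_min=8, r_max=18)
```

Keeping `filled[valid]` as the original values corresponds to approach 53 (actual contour kept, invalid points replaced); replacing every point with the model corresponds to approach 54.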
Once the iris inner border at the pupil is estimated, one may move outward from the pupil with some margin that represents the least possible width of an iris. That width offset may then be used as the starting point of the iris outer border analysis. An offset 90 of Figure 9 may vary from zero to some value depending on the visibility of the pupil within the eye image during image acquisition. For instance, an offset may vary depending on a scoring and/or a validation of the captured pupil profile. Relative to a closed or highly obscured eye, an offset may be at a minimum or zero. For an open eye with no obscuration and having a high score and/or validation of a pupil profile, the offset may be large. The offset may vary depending on the areas or angular segments of the eye that are visible. The offset may also vary according to the border type. For example, the iris/sclera border may warrant a significant offset, while the offset for the iris/eyelash-lid border may be low, minimal, or zero. The iris outer border analysis is illustrated, at least partially, in the diagram of Figure 3.
Figure 3 shows a pupil 31 of which a portion of an edge 38 is within a range 32 of a circle 33 having a radius 34 about an approximate center 35. It may be noted that there may be a first reflection 36 and a first center estimate 37. However, an approximate center 35 is noted for subsequent use. The range 32 can have a set amount of deviation that the edge 38 of pupil 31 may have and yet be regarded as valid. It may be noted that the edge 38 could, but does not, extend beyond the outer circumference of range 32; however, edge 38 does appear at points 41, 42 and 45 to be inside of a circumference 39 showing an inner limit of range 32. Points 43, 44 and 46 appear within the range 32 and thus may be deemed to be valid. The edge 38 of the pupil 31 may not be within the range 32 at points 41, 42 and 45 because of the eyelashes, eyelid and/or noise 47 at the bottom and top of the pupil. Other factors of pupil 31 may include a blob fitting (BF) and a coverage fitting (CF). An example set of percentages may be BF=78% and CF=92%, which appear to be an acceptable indication of an actual pupil. The validity of the edge 38 may be determined at symbol 51 of Figure 2. The input may be an output from the segmentation stage or block 25. Also, an output from block 25 may go to a snake plus elliptic curve (or the like) module, block 53.
The output of the valid edge determination diamond symbol 51 may go to a pruning block 40 where prompt changes of the edge 38 may be smoothed or reduced in their extension out from the edge curve. Then, the edge 38 may go to a predefined model shape (such as elliptic fitting) block 52. Here, the edge 38 of pupil 31 is fitted with a model shape curve 48 (as an example, an elliptic shape may serve as the fitting model, shown as a thick line in Figure 4). The entire edge 38, including the invalid and valid portions, may be replaced with the elliptic fitting 48 in a first approach (elliptic or like module) 54. Only the valid portions of the edge 38 are incorporated in determining the elliptic fitting curve 48, as indicated by block 54. The elliptic fitting 48 may be used to make a final estimate of the pupil center 35. In a second approach, a non-linear fitting may be done, as shown in Figure 5. The model fitting 48 may be kept for only the non-valid portions or points 41, 42 and 45, while the actual valid edges or points 43, 44 and 46 may be kept, as indicated by block 53.
An output of elliptic fitting block 52 may go to a diamond 55 which asks whether the actual contour 38 or the model fitting 48 should be used. One may note that in either case, the model fitting or curve 48 should always be used for the non-valid portions of the curve or contour 38. The approach is not affected by any reflection within the pupil; as shown in Figure 3, the analysis goes around the reflection, which is thus neglected without having to add any preprocessing for its elimination. Besides reflections, a partially closed eye, eyelashes or lids, noise, and the like may be well treated using this segmentation method.
If the answer at diamond 55 is no, then the model curve 48 is used in place of both the valid and non-valid portions of pupil edge 38. The output of block 54 may be a pupil border 56 as shown in image 58. If the answer is yes at diamond 55, then a "snake", which is an active contour, that is, an estimate of the actual edge 38, rather than the ellipse approximation 48, is used for the valid portions of edge 38. The output of block 53 may be a pupil border 57 as shown in image 59. One may note two reflections 61 in the pupil of images 58 and 59. These reflections may be a pattern of the light used for analytical purposes, so that the reflection on the pupil may be found and identified. Also, arrows 62 may represent elliptic fitting data sent to blocks 53 and 54 for effecting an elliptic curve fit.
An enhancement to elliptic fitting may be added as part of the elliptic fitting box 52. This enhancement may be a pruning of the pupil edge before doing a model fitting at block or module 52 (Figure 2). The pruning may be used to smooth the curve edges and eliminate any mismatches of extraneous edges. In pruning, outliers are replaced with the most likely edge within a predefined angular segment.
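The pruning step may be approximated as follows, replacing outliers within each predefined angular segment by the segment's median edge. The outlier rule shown (a median-absolute-deviation test) and all names are illustrative assumptions, not the disclosed method.

```python
import numpy as np

def prune_edges(edges, segment=8, k=2.5):
    """Within each predefined angular segment, replace outlier radii
    (those far from the segment median) with the segment median."""
    edges = np.asarray(edges, dtype=float)
    pruned = edges.copy()
    for start in range(0, len(edges), segment):
        seg = edges[start:start + segment]
        med = np.median(seg)
        spread = max(float(np.median(np.abs(seg - med))), 1.0)  # robust spread
        outliers = np.abs(seg - med) > k * spread
        pruned[start:start + segment][outliers] = med           # replace outliers
    return pruned

noisy = np.array([10, 10, 11, 40, 10, 11, 10, 10], dtype=float)
clean = prune_edges(noisy, segment=8)   # the spike at 40 is pulled to the median
```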
Figure 6 is a diagram of an approach for an iris outer border analysis, curve fitting, and portion removal or substitution. An eye image 21 may be processed through the median and low pass filters 23 and 24, respectively, which are noted herein. A kernel 91, which may be a matrix or block of pixels of the iris of the image 21, can be processed. A resulting image 93 for analysis may proceed to a cluster angular range module 92. The eye symmetry, as shown by inset 93, may proceed on to a POSE+ (illustrated in Figure 7) segmentation module 94.
Figure 7 reveals more detail (i.e., the 1D POSE+ subroutine) of the segmentation module 94. Two major portions of the eye image 93 go to module 94 for segmentation concerning sclera borders and eyelash borders. Input 96 for sclera borders may go to a 1D POSE segmentation submodule 98 and input 97 for eyelash borders may go to a 1D POSE segmentation submodule 99. Information 67 of the pupil model fitting and center may be input to the submodules 98 and 99. An output of segmentation submodule 98 may go to a get max peak submodule 60 which in turn provides an output to a 1D median filter 102. Also input to median filter 102 may be a filter bandwidth 68. An output from segmentation submodule 99 may go to a get max peak submodule 101 which in turn provides an output to a 1D median filter 63. A filter bandwidth signal 68 may be provided to filter 63. An output 64 from median filter 102 of module 94 may go to a (dr/dθ) module 71 for sclera borders, as shown in Figure 6. An output 65 from median filter 63 may go to a (dr/dθ) module 72 for eyelash/lid borders. Modules 71 and 72 may be of a border module 103. An output from module 71 may go to a count module 73, and an output from module 72 may go to a count module 74. Modules 73 and 74 may be of a count module 104. If the count at module 73 is not less than λ, where λ is a threshold, then there is not a valid eye image 75. If the count is less than λ, then a circular, elliptic, or like fitting may be placed on the iris outer sclera borders at module 76. If the count at module 74 is not greater than λ, then the eyelash edges may be extracted at module 77. This may involve 1D POSE+. If the count at module 74 is greater than λ, then the eyelashes may be masked at module 78. This may involve POSE 1D. λ may be a number indicating a number of hits or places where a curve discontinues. The range of λ may be around 3 or 4. Under certain circumstances of more tolerance, λ may be set to 5 or greater.
A combined output 66 from the 1D median filters 102 and 63 may go to a map analysis center 81. Also, outputs from the circular fitting module 76, the extract eyelash edges module 77 and the mask eyelashes module 78 may go to a center 81 for a map analysis.
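The λ-threshold discontinuity test of the count modules may be sketched as follows. The jump criterion and the λ value are illustrative assumptions consistent with the 3-5 range suggested above; the names and example curves are hypothetical.

```python
import numpy as np

def count_discontinuities(radii, jump=5.0):
    """Count the places where the border-radius curve changes abruptly
    between neighboring angles -- a discrete stand-in for thresholding
    the (dr/dtheta) output of modules 71 and 72."""
    diffs = np.abs(np.diff(np.asarray(radii, dtype=float)))
    return int(np.count_nonzero(diffs > jump))

LAMBDA = 4  # threshold; the text above suggests values around 3 to 5

sclera_border = [30, 30, 31, 30, 29, 30, 31, 30]   # smooth, continuous
eyelash_border = [30, 12, 31, 9, 30, 14, 29, 8]    # jagged, noisy

fit_sclera = count_discontinuities(sclera_border) < LAMBDA    # place a fitting
mask_lashes = count_discontinuities(eyelash_border) > LAMBDA  # mask instead
```

A smooth sclera border (count below λ) gets a circular or elliptic fitting; a heavily discontinuous eyelash border (count above λ) is masked rather than extracted.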
The preprocessing may include the filter or combination 30 of a median filter 23 and low pass filter 24 of Figure 6 to smooth the iris texture while preserving the strong edge of the contrast change at the outer border of the iris. One may then cluster the angular range into two categories. Boundary points may be clustered. With the occlusion of the iris by the eyelids and eyelashes, there may be two groups of boundary points around the outer bounds of the iris that may be treated differently in the present analysis. The groups may be iris-sclera boundaries and iris-eyelid boundaries. The two classes of points may be treated according to the expected distributions of edge pixels. To cluster the points into these two classes, one may use the symmetry method in POSE+ (see U.S. Patent Application No. 11/275,703, filed January 25, 2006), where pixels are placed symmetrically relative to each other in terms of curvature with smooth continuous edges.
In another approach, one may estimate the limits where the symmetry ends by conducting the following steps. The lowermost edge points of the upper eyelid edge may be fit into a straight line, and the uppermost of the lower eyelid edge points may be fit into a straight line crossing the detected iris outer border curve (the original curve detected by POSE). The intersection of these two straight lines and the curve may define a good estimate of the trapezoid contour of the eye socket. The intersection of these lines and the pre-estimated shape may define these boundary points. The POSE+ subroutine is shown with a diagram in Figure 7.
Eyelid detection may be noted. With the nature of eye closure under nominal conditions, there may be two possibilities for eye positioning. One is a wide-open eye and another a partially open eye. In either case, one might only consider points of observable edges of the iris in the curve fitting. To estimate the eyelid edges, one may track the lowermost points of the lowermost curve 82 (Figures 8a and 8b) of the upper eyelid 87 edge, and track the uppermost points of the upper curve 84 of the lower eyelid 88 edges. Figures 8a and 8b are graphs illustrating an approach for estimating eyelid curve detection. A piece-wise linear fitting 83 of the local minima of the curve 82 may be done for the upper eyelid 87. A piece-wise linear fitting 85 of the local maxima of the curve 84 may be done for the lower eyelid 88.
One may interpolate among these samples to cover the entire angular range corresponding to the eyelid segments, L = |θ_2 − θ_1|. Thus, for each pair (x_k, x_{k−1}) in the sample sequence, let Δx = (x_k − x_{k−1}) and Δf = (f(x_k) − f(x_{k−1}))/Δx, so that for all x_{k−1} < x < x_k, f(x) = f(x_{k−1}) + (x − x_{k−1})Δf. One may limit the sampling space to a predefined angular range φ, so the next sampling point is determined using the following minimization equation, x_k = min(x_{k−1} + φ, x_k). Figures 8a and 8b illustrate a technical approach for estimating the eyelid curve detections.
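The piece-wise linear interpolation above may be sketched in code. The sample points are hypothetical tracked eyelid-curve points, and the integer-grid evaluation is an assumption for illustration.

```python
def interpolate_eyelid(samples):
    """Piece-wise linear interpolation between tracked eyelid sample
    points: for x_{k-1} < x < x_k,
    f(x) = f(x_{k-1}) + (x - x_{k-1}) * df, with
    df = (f(x_k) - f(x_{k-1})) / (x_k - x_{k-1})."""
    xs, fs = zip(*samples)
    out = {}
    for k in range(1, len(xs)):
        dx = xs[k] - xs[k - 1]
        df = (fs[k] - fs[k - 1]) / dx
        for x in range(xs[k - 1], xs[k]):
            out[x] = fs[k - 1] + (x - xs[k - 1]) * df
    out[xs[-1]] = fs[-1]
    return out

# Hypothetical lowermost points tracked on an upper-eyelid curve.
curve = interpolate_eyelid([(0, 20.0), (4, 28.0), (8, 24.0)])
```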
Figure 9 relates to eyelid detection and shows a picture of an eye 86 with an obscuration by an upper eyelid 87 and possible obscuration with a lower eyelid 88. This figure illustrates a resulting output of the following process.
A weighting scheme may also be introduced to assess the obscuration amount of the eyelids, eyelashes or other manner of obscuration such as glass, a frame, and so forth. The obscuration may be assessed by computing the integral of the area between the eyelid curve and pupil boundary with the following equation,
A = ∫_{θ∈θ_l} (f(θ) − r(θ)) dθ,
where θ_l represents the angles associated with the boundary curve of the eyelash/eyelid, f(θ) is the estimated eyelash/eyelid curve, and r(θ) is the estimated pupil radius at angle θ. The integral may be evaluated over the angular range covered by eyelashes (and/or eyelids) and be based upon the value of the integral with respect to a pre-estimated threshold. A weighting factor may be assigned to these angular segments to be used in the matching function.
Once the iris region is successfully segmented using the POSE technique, the next stage may be to extract the valid sclera and iris boundary points and fit these edge points into a predefined regular shape, e.g., a circular shape. It is important to note that these regular shapes are generally not used as the final outcome of the detection. The regular shapes may be used for guiding the present normalization process and to keep the actual detected edges of the active contour that POSE has identified.
The normalization is crucial to iris processing to address dimensional changes of the iris shapes. These dimensional inconsistencies may be mainly due to iris stretching and pupil dilation, which usually occur under different environmental lighting as well as imaging distance variations. The regular shape is not meant to be the final outcome of the present estimates. The curve detected by the present active contour approach, as an ensemble of all edges detected by POSE, may be the final estimate of the iris outer border edges. The predefined shape may be used to scale back the curve shape into a common scaling for normalization purposes, as well as an approach to identify areas that do not belong to the iris map and ought to be masked from the analysis. The regular shape may define the actual scaling needed to bring uniformity among all the captured images and templates in the database. The analytical formula for computing the scaled signal vector of the pixels along the radius variable is shown in the following,
ŝ_θ(r) = s_θ(r)u(R_e − r) + E[s_θ(r)]u(r − R_e), (3)
where s_θ(r) represents the pixel values at a radius r and angle θ. The function ŝ_θ(r) may represent the elements of the scaled vector that is used to map the iris pixels into the normalized iris pattern map (also referred to as a rubber sheet). One may use u(r) to denote the step function. The expected value of the signal function shown in equation (3) represents the expected edge value based upon the fitting model. For a circular model, E[s_θ(r)] = R_e (circular radius).
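A discrete sketch of the scaling in equation (3) follows. Substituting the mean of the inside-edge samples for E[s_θ(r)] and the fixed resampling resolution are assumptions made for illustration; names are hypothetical.

```python
import numpy as np

def scale_radial_signal(signal, r_edge, n_samples):
    """Keep actual pixel values inside the detected edge r_edge (the
    u(Re - r) term of equation (3)), substitute an expected value beyond
    it (the u(r - Re) term), and resample to a fixed radial resolution."""
    signal = np.asarray(signal, dtype=float)
    r = np.arange(len(signal))
    expected = signal[:r_edge].mean()              # stand-in for E[s_theta(r)]
    scaled = np.where(r < r_edge, signal, expected)
    src = np.linspace(0, len(scaled) - 1, n_samples)
    return np.interp(src, r, scaled)               # common radial scaling

ray = np.array([50, 52, 51, 49, 200, 210, 220])    # iris pixels, then sclera
norm = scale_radial_signal(ray, r_edge=4, n_samples=8)
```

Applying this per angle yields fixed-length radial vectors, i.e., one row of the normalized rubber-sheet map per angle.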
A challenge in building the standoff iris recognition system may lie in how to extract and segment the boundaries of an iris, and not necessarily in the compression approach used to encode the barcode of the extracted map. To complete the iris recognition process, iris encoding may usually be used to compress the iris map into fewer bits in a barcode to be stored or matched against other barcodes stored in a database. The iris encoding may be processed on the iris map to extract the pattern texture variations. The particular type of encoding algorithm may be irrelevant here, as there are many COTS approaches to encode a digital image. One may make use of Gabor filters to encode the iris map image into its minimum possible number of bits so that metrics can be used to give a range of values when comparing templates with captured maps. Similarly, any similarity metric may be used to measure the information similarity among templates. One metric in particular that may be used is the weighted hamming distance (WHD). The WHD may give more weight to the pixels associated with valid edges and less weight to the pixels that are associated with non-valid pixels. The masked pixels may of course be zeroed out during the matching process. The present system provides a solution to an issue of eye gazing, where an individual subject is looking off-angle and not straight at the camera system. Gazing effects on iris segmentation may be dramatic, and trying to quantify the amount of eye gazing to correct for it may be regarded by many as challenging. A correction process may involve many geometrical models and assumptions that are not general and are image specific. The model complexity and its analysis might not only reduce the robustness of the gaze detection estimations but also often introduce errors into the estimates. The present system does not require any gaze detection in that it is designed to deal with all image perspectives.
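The weighted hamming distance may be sketched as follows. The example codes and weight values are hypothetical; only the weighting principle (valid edges count more, masked bits drop out) is taken from the text.

```python
import numpy as np

def weighted_hamming(code_a, code_b, weights):
    """Weighted hamming distance: bit disagreements at valid-edge pixels
    carry full weight, uncertain pixels carry reduced weight, and masked
    pixels (weight zero) drop out of the match entirely."""
    a, b, w = (np.asarray(v) for v in (code_a, code_b, weights))
    total = float(w.sum())
    if total == 0.0:
        raise ValueError("all bits are masked")
    return float(np.sum(w * (a != b)) / total)

a = np.array([1, 0, 1, 1, 0, 1])
b = np.array([1, 1, 1, 0, 0, 1])
w = np.array([1.0, 0.5, 1.0, 0.0, 1.0, 1.0])   # bit 3 masked out
dist = weighted_hamming(a, b, w)               # only bit 1's mismatch counts
```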
In iris feature extraction analysis, for instance, θ is with respect to a center 111 of a pupil 114, and θ+Δθ is with respect to the iris center 112, as shown in Figure 10. The edge point 113 may be on the outside border of the iris 115. One usually needs the iris center to read relative to a corresponding angle. One may measure a distance from the center of the pupil to the edge of the iris. For a point 113 on the iris edge, at each angle, the map pixels are constructed using an interpolation scheme to sample a predefined number of pixels at each angle that passes from the inner edge 117 to the outer edge 113 with respect to the analysis center 111. The above analysis is applicable whether the fitting model is a circle, an ellipse, or a non-linear fitting that may be parametrized (e.g., as a polynomial). One may select fixed size sample vectors from the pupil edge to the iris edge. Or, one may take samples from the pupil edge to the iris at a number of points. Figure 11 is a diagram of angular clustering where the focus is on the sclera, that is, the side portions 121 and 122 of the iris 142. One may start at an estimated edge and end up at a new edge. To start, the sclera portions 121 and 122 may appear symmetrical but probably will not end up as such in actuality. Each angle of the quadrants or portions may have a distinct value. The noisy portions at the top 123 and the bottom 124 may be treated differently than the side sclera portions 121 and 122. If the upper and lower portions 123 and 124, respectively, are too discontinuous or noisy, then they may be masked down through the iris 142 to the center of the pupil 141, as shown in Figure 12. Figure 13 is a mapping 131 showing the noisy upper 123 and lower 124 portions relative to pupil 141 and iris 142. In a mapping 132 of Figure 14, one may attempt to use the information in the iris 142 within a radius 133 of the iris 142 that does not extend into the portions 123 and 124.
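For illustration only, the rubber-sheet construction described above, sampling a fixed number of pixels from the inner edge to the outer edge at each angle, may be sketched as follows; the nearest-neighbour sampling and all parameter names are assumptions of this sketch (a real implementation would interpolate bilinearly):

```python
import numpy as np

def unwrap_iris(image, center, r_inner, r_outer, n_angles=64, n_radii=16):
    """Sketch of the rubber-sheet mapping: at each angle, a fixed-size
    sample vector is taken from the inner (pupil) edge to the outer
    (iris) edge, measured from the analysis center.  `r_inner` and
    `r_outer` are per-angle edge radii (length n_angles), so the fitting
    model may be circular, elliptic, or any parametrized curve."""
    cy, cx = center
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    sheet = np.zeros((n_radii, n_angles))
    for j, (theta, ri, ro) in enumerate(zip(angles, r_inner, r_outer)):
        for i, t in enumerate(np.linspace(0.0, 1.0, n_radii)):
            r = ri + t * (ro - ri)               # fixed-size sample vector
            y = int(round(cy + r * np.sin(theta)))
            x = int(round(cx + r * np.cos(theta)))
            if 0 <= y < image.shape[0] and 0 <= x < image.shape[1]:
                sheet[i, j] = image[y, x]
    return sheet
```

Because the edge radii are supplied per angle, the same routine serves a circular, elliptic, or polynomial fitting model.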
The mapping 151 of Figure 15 shows a masking 145 and 146 that is complete from portions 123 and 124, respectively, through the iris 142 to the center of the pupil 141, as shown in Figure 12. Since much information in the iris 142 may not be available, as shown by the masking of Figures 12 and 15, a partial masking 147 and 148 of portions 123 and 124 may be done according to a mapping 152 as shown in Figure 16. Masking could be applied right at the edges of the noisy pixels, thereby masking only those pixels that represent portions 123 and 124. Mapping 152 may make more iris information available. Figure 17 is a masking 161 of iris 142 showing the masking out of only the portions 123 and 124, plus some other minor noise, with zeros. Ones represent areas of iris information. Figure 18 shows a masking 162 showing various masking schemes of noisy or obscured areas of the iris 142, such as a reflection 163, blurriness or obscuration 164, and other iris non-information spots near portions 123 and 124. The ones and zeros are merely approximations of example masks (for instance, the ones can be replaced with weights based upon the segmentation analysis as explained herein), as they are for illustrative purposes.
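For illustration only, the masking schemes of Figures 12 through 18 may be sketched as an element-wise product of the unwrapped iris map with a mask of zeros over noisy eyelash/eyelid portions, reflections, or obscurations and ones (or fractional weights from the segmentation analysis) over valid iris pixels; the function name is an assumption of this sketch:

```python
import numpy as np

def apply_iris_mask(iris_map, mask):
    """Sketch of mask application: zeros in `mask` drop noisy or obscured
    pixels (the upper/lower portions, reflections, blur) out of any later
    matching, while ones -- or fractional weights derived from the
    segmentation analysis -- keep valid iris information."""
    assert iris_map.shape == mask.shape
    return iris_map * mask
```

Carrying fractional weights instead of hard ones lets a weighted distance measure downgrade, rather than discard, marginal pixels.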
In the present specification, some of the matter may be of a hypothetical or prophetic nature although stated in another manner or tense. Although the invention has been described with respect to at least one illustrative example, many variations and modifications will become apparent to those skilled in the art upon reading the present specification. It is therefore the intention that the appended claims be interpreted as broadly as possible in view of the prior art to include all such variations and modifications.

Claims

What is claimed is:
1. An iris recognition system comprising: an eyefinder; a filter connected to the eyefinder; a segmenter connected to the filter; an edge validity evaluator connected to the segmenter; and a curve fitter connected to the edge validity evaluator.
2. The system of claim 1, wherein: the eyefinder is for providing a valid eye image; and the filter is for smoothing out edges in the eye image.
3. The system of claim 1, further comprising a kernel module, for selecting an image of a pupil in the eye image, connected to the filter.
4. The system of claim 3, further comprising: a curve implementation module connected to the curve fitter; and wherein the curve implementation module is for establishing a border for the pupil in the image of the pupil, based on a curve from the curve fitter.
5. The system of claim 1, further comprising: an active contour plus a curve implementation module connected to the curve fitter and to the segmenter; and wherein the active contour plus a curve implementation module is for establishing a border in place of invalid edges of the pupil in the image of the pupil, based on a curve from the curve fitter, and for establishing a border from valid edges of the pupil.
6. The system of claim 5, further comprising a contour selector connected to the active contour plus a curve implementation module and connected to the curve fitter.
7. The system of claim 1, further comprising: an edge pruner connected between the evaluator and the curve fitter; and wherein the segmenter is a one dimensional polar segmenter.
8. An iris recognition system comprising: an eyefinder; a filter connected to the eyefinder; a range module connected to the filter; a segmenter connected to the range module; a border module connected to the segmenter; a count module connected to the border module; and a curve fitter connected to the count module.
9. The system of claim 8, wherein: the eyefinder is for providing a valid eye image having a processed pupil border; and the filter is for smoothing out edges in the eye image.
10. The system of claim 8, further comprising a kernel module, for selecting an image of an iris in the eye image, connected to the filter.
11. The system of claim 8, further comprising a map analysis module connected to the count module, the curve fitter and the segmenter.
12. The system of claim 11, further comprising: an extract eyelash/lid module connected to the count module and the map analysis module; and a mask eyelash/lid module connected to the count module and the map analysis module.
13. The system of claim 8, wherein: the range module is for setting a cluster angular range; and the segmenter is a one dimensional polar plus segmentation module.
14. The system of claim 9, wherein: the border module comprises a sclera border module and an eyelash/lid border module; the count module is for determining a number of discontinuities in sclera borders; the count module is for determining a number of discontinuities in the eyelash/lid borders; if the number of discontinuities in the sclera borders is less than a first threshold, then the curve fitter is activated for curve fitting the sclera borders; and if the number of discontinuities in the sclera borders is not less than the first threshold, then the eye image is invalid.
15. The system of claim 14, wherein: if the number of discontinuities in the eyelash/lid borders is greater than a second threshold, then the eyelash/lid borders are masked; and if the number of discontinuities in the eyelash/lid borders is not greater than the second threshold, then the eyelash/lid borders are extracted.
16. The system of claim 13, wherein the segmenter comprises: a first one dimensional polar segmenter, for sclera borders, connected to the range module; a second one dimensional polar segmenter, for eyelash/lid borders, connected to the range module; a first get max peak module connected to the first one dimensional polar segmenter; a second get max peak module connected to the second one dimensional polar segmenter; a first one dimensional median filter connected to the first get max peak module and to the border module; and a second one dimensional median filter connected to the second get max peak module and to the border module.
17. A method for iris recognition comprising: providing an image of an eye; selecting a pupil in the image; segmenting the pupil; determining a validity of portions of a border of the pupil; and fitting a curve on at least invalid portions of the border of the pupil to form a resulting border of the pupil.
18. The method of claim 17, further comprising: selecting an iris with the pupil having the resulting border from the image of the eye; and clustering iris sclera boundaries and the eyelash/lid boundaries of the iris into first and second groups of boundaries, respectively.
19. The method of claim 18, further comprising: determining a first number of discontinuities of the first group of boundaries; and wherein: if the first number is less than a first threshold, then the first group of boundaries is fitted with a curve fitting model; and if the first number is not less than the first threshold, then the eye image is invalid.
20. The method of claim 19, further comprising: determining a second number of discontinuities of the second group of boundaries; and wherein: if the second number is not greater than a second threshold, then the second group of boundaries is extracted; if the second number is not greater than the second threshold and an area between outer borders of the second group of boundaries and an inner border of the iris is less than a third threshold, then the second group of boundaries are weighted accordingly; and if the second number is greater than the second threshold, then the second group of boundaries is masked; and further comprising mapping the iris.
21. The method of claim 19, wherein an iris map is constructed based upon actual inner and outer edge estimates with respect to fitting models.
22. The method of claim 21, wherein pixels of the iris map are extracted based upon an interpolation of image pixels within inner fitting model and outer fitting model edges at nearly all angles, deemed to be valid, with respect to a pupil center.
23. The method of claim 21, wherein nearly any pixel that lies within an outer border of the iris and a fitting model shape may be masked.
24. The method of claim 21, wherein nearly any pixel that lies outside an outer fitting model shape may be discarded.
PCT/US2007/063024 2006-03-03 2007-03-01 A standoff iris recognition system WO2007103701A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB0815928A GB2450026B (en) 2006-03-03 2007-03-01 A standoff iris recognition system
AU2007223574A AU2007223574B2 (en) 2006-03-03 2007-03-01 A standoff iris recognition system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US77877006P 2006-03-03 2006-03-03
US60/778,770 2006-03-03
US11/675,424 2007-02-15
US11/675,424 US8098901B2 (en) 2005-01-26 2007-02-15 Standoff iris recognition system

Publications (1)

Publication Number Publication Date
WO2007103701A1 true WO2007103701A1 (en) 2007-09-13

Family

ID=38164427

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/063024 WO2007103701A1 (en) 2006-03-03 2007-03-01 A standoff iris recognition system

Country Status (4)

Country Link
US (1) US8098901B2 (en)
AU (1) AU2007223574B2 (en)
GB (1) GB2450026B (en)
WO (1) WO2007103701A1 (en)




JP2001005948A (en) 1999-06-17 2001-01-12 Matsushita Electric Ind Co Ltd Iris imaging device
JP3317928B2 (en) 1999-06-17 2002-08-26 松下電器産業株式会社 Hotel management equipment
JP2001004909A (en) 1999-06-18 2001-01-12 Olympus Optical Co Ltd Camera having automatic focusing device
US6438752B1 (en) 1999-06-22 2002-08-20 Mediaone Group, Inc. Method and system for selecting television programs based on the past selection history of an identified user
WO2001006448A1 (en) 1999-07-14 2001-01-25 Veridicom, Inc. Ultra-rugged i.c. sensor and method of making the same
JP2001034754A (en) 1999-07-19 2001-02-09 Sony Corp Iris authentication device
US6553494B1 (en) 1999-07-21 2003-04-22 Sensar, Inc. Method and apparatus for applying and verifying a biometric-based digital signature to an electronic document
US6516078B1 (en) 1999-07-29 2003-02-04 Hewlett-Packard Company Multi-level detection and deterrence of counterfeiting of documents with reduced false detection
US6120461A (en) 1999-08-09 2000-09-19 The United States Of America As Represented By The Secretary Of The Army Apparatus for tracking the human eye with a retinal scanning display, and method thereof
JP2001067399A (en) 1999-08-25 2001-03-16 Oki Electric Ind Co Ltd Electronic money transaction system
AU7346800A (en) 1999-09-02 2001-03-26 Automated Business Companies Communication and proximity authorization systems
US6370260B1 (en) 1999-09-03 2002-04-09 Honeywell International Inc. Near-IR human detector
US7076088B2 (en) 1999-09-03 2006-07-11 Honeywell International Inc. Near-infrared disguise detection
US6282475B1 (en) 1999-09-20 2001-08-28 Valdemar L. Washington System for automatically adjustable devices in an automotive vehicle
US7391865B2 (en) 1999-09-20 2008-06-24 Security First Corporation Secure data parser method and system
US6674367B2 (en) 1999-09-28 2004-01-06 Clifford Sweatte Method and system for airport and building security
JP2001101429A (en) 1999-09-28 2001-04-13 Omron Corp Method and device for observing face, and recording medium for face observing processing
JP2001118103A (en) 1999-10-15 2001-04-27 Oki Electric Ind Co Ltd Gate managing device
JP2001135422A (en) 1999-11-08 2001-05-18 Yazaki Corp Electric shield connector directly attached to apparatus
AU4137601A (en) 1999-11-30 2001-06-12 Barry Johnson Methods, systems, and apparatuses for secure interactions
US6505193B1 (en) 1999-12-01 2003-01-07 Iridian Technologies, Inc. System and method of fast biometric database searching using digital certificates
US6711562B1 (en) 1999-12-01 2004-03-23 The Trustees Of Columbia University In The City Of New York Cache sensitive search (CSS) tree indexing system and method
US6775774B1 (en) 1999-12-06 2004-08-10 Bsi 2000, Inc. Optical card based system for individualized tracking and record keeping
JP2001167275A (en) 1999-12-13 2001-06-22 Oki Electric Ind Co Ltd Individual identifying device
US6516087B1 (en) 2000-01-10 2003-02-04 Sensar, Inc. Method for real time correlation of stereo images
US6446045B1 (en) 2000-01-10 2002-09-03 Lucinda Stone Method for using computers to facilitate and control the creating of a plurality of functions
US6494363B1 (en) 2000-01-13 2002-12-17 Ncr Corporation Self-service terminal
JP2001222661A (en) 2000-02-08 2001-08-17 Oki Software Kk Automatic transaction device
GB0004287D0 (en) 2000-02-23 2000-04-12 Leeper Kim System and method for authenticating electronic documents
JP2001236324A (en) 2000-02-24 2001-08-31 Fujitsu Ltd Portable electronic device with individual authenticating function by biometric information
AU2001243673A1 (en) 2000-03-15 2001-09-24 Emedicalfiles, Inc. Web-hosted healthcare medical information management system
JP3631655B2 (en) 2000-03-22 2005-03-23 シャープ株式会社 Solid-state imaging device
JP3825222B2 (en) 2000-03-24 2006-09-27 松下電器産業株式会社 Personal authentication device, personal authentication system, and electronic payment system
US20030159051A1 (en) 2000-03-27 2003-08-21 Wilhelm Hollnagel Method for generating electronic signatures
GB0007360D0 (en) 2000-03-28 2000-05-17 Ncr Int Inc Electronic wallet
US6580821B1 (en) 2000-03-30 2003-06-17 Nec Corporation Method for computing the location and orientation of an object in three dimensional space
US6968457B2 (en) 2000-03-31 2005-11-22 Joseph Wing On Tam Method for making secured personal identity card and procedures for validation and obtaining secure personal information
US6299306B1 (en) 2000-03-31 2001-10-09 Sensar, Inc. Method and apparatus for positioning subjects using a holographic optical element
US6540392B1 (en) 2000-03-31 2003-04-01 Sensar, Inc. Micro-illuminator for use with image recognition system
US6441482B1 (en) 2000-04-11 2002-08-27 Omnivision Technologies, Inc. Biometric device with integrated CMOS image sensor
JP2001292981A (en) 2000-04-12 2001-10-23 Oki Electric Ind Co Ltd Iris recognition apparatus
JP2001297177A (en) 2000-04-17 2001-10-26 Oki Electric Ind Co Ltd Accounting system
US7280984B2 (en) 2000-05-08 2007-10-09 Phelan Iii Frank Money card system, method and apparatus
US20010051924A1 (en) 2000-05-09 2001-12-13 James Uberti On-line based financial services method and system utilizing biometrically secured transactions for issuing credit
JP4693329B2 (en) 2000-05-16 2011-06-01 スイスコム・アクチエンゲゼルシヤフト Command input method and terminal device
ES2241598T3 (en) 2000-05-16 2005-11-01 Swisscom Mobile Ag BIOMETRIC PROCEDURE OF IDENTIFICATION AND AUTHENTICATION.
US6493669B1 (en) 2000-05-16 2002-12-10 Delphi Technologies, Inc. Speech recognition driven system with selectable speech models
GB0012840D0 (en) 2000-05-25 2000-07-19 Thirdphase Limited Method and system for collection and verification of data from plural sites
JP2003535559A (en) 2000-06-02 2003-11-25 キネティック サイエンシーズ インコーポレイテッド Email biometric encryption method
US6323761B1 (en) 2000-06-03 2001-11-27 Sam Mog Son Vehicular security access system
US7599847B2 (en) 2000-06-09 2009-10-06 Airport America Automated internet based interactive travel planning and management system
EP1162645A3 (en) * 2000-06-09 2007-09-26 Jeol Ltd. Specimen inspection instrument
JP4228520B2 (en) 2000-06-12 2009-02-25 沖電気工業株式会社 Iris photography device
US6836554B1 (en) 2000-06-16 2004-12-28 International Business Machines Corporation System and method for distorting a biometric for transactions with enhanced security and privacy
US7120607B2 (en) 2000-06-16 2006-10-10 Lenovo (Singapore) Pte. Ltd. Business system and method using a distorted biometrics
US20020194131A1 (en) 2001-06-18 2002-12-19 Dick Richard S. Method and system for electronically transmitting authorization to release medical information
EP1294440B1 (en) 2000-06-23 2005-06-01 Medtronic, Inc. Portable extender for data transmission within a medical device communication system
MY134895A (en) 2000-06-29 2007-12-31 Multimedia Glory Sdn Bhd Biometric verification for electronic transactions over the web
KR100373850B1 (en) 2000-10-07 2003-02-26 주식회사 큐리텍 Identification system and method using iris, and media that can record computer program sources thereof
FR2811843B1 (en) 2000-07-13 2002-12-06 France Telecom ACTIVATION OF AN INTERACTIVE MULTIMEDIA TERMINAL
EP1172771B1 (en) 2000-07-14 2006-04-19 Voice.Trust Ag Process and system for authorising a commercial transaction
AUPQ896000A0 (en) 2000-07-24 2000-08-17 Seeing Machines Pty Ltd Facial image processing system
US7552333B2 (en) 2000-08-04 2009-06-23 First Data Corporation Trusted authentication digital signature (tads) system
BE1013637A6 (en) 2000-08-07 2002-05-07 Smet Francis De Method for searching for information on the Internet
WO2002013130A1 (en) 2000-08-07 2002-02-14 Skidata Ag Reader device for bar-coded authorisation cards
US20040025053A1 (en) 2000-08-09 2004-02-05 Hayward Philip John Personal data device and protection system and method for storing and protecting personal data
JP4469476B2 (en) 2000-08-09 2010-05-26 パナソニック株式会社 Eye position detection method and eye position detection apparatus
FR2829272A1 (en) 2000-08-22 2003-03-07 France Telecom DEVICE FOR AUTOMATED CONTROL OF ELECTRICAL EQUIPMENT BY A PERSON
US20030191949A1 (en) 2000-08-30 2003-10-09 Akihiro Odagawa Authentication system, authentication request device, validating device and service medium
WO2002023308A2 (en) 2000-09-12 2002-03-21 Viaken Systems, Inc. Techniques for providing and obtaining research and development information technology on remote computing resources
JP4529263B2 (en) 2000-09-18 2010-08-25 沖電気工業株式会社 Iris recognition device
US6750435B2 (en) 2000-09-22 2004-06-15 Eastman Kodak Company Lens focusing device, system and method for use with multiple light wavelengths
GB0024274D0 (en) 2000-10-04 2000-11-15 Harris John W Vehicle security and occupant safety system
US7277561B2 (en) 2000-10-07 2007-10-02 Qritek Co., Ltd. Iris identification
US6819219B1 (en) 2000-10-13 2004-11-16 International Business Machines Corporation Method for biometric-based authentication in wireless communication for access control
JP2002122899A (en) 2000-10-16 2002-04-26 Matsushita Electric Ind Co Ltd Iris image pickup apparatus
JP2002125142A (en) 2000-10-16 2002-04-26 Matsushita Electric Ind Co Ltd Iris image pickup device
JP4487405B2 (en) 2000-10-16 2010-06-23 沖電気工業株式会社 Iris circle detection device and iris pattern encoding device using the same
JP4649035B2 (en) 2000-10-18 2011-03-09 株式会社トプコン Eye characteristics measuring device
JP2002122778A (en) 2000-10-19 2002-04-26 Fuji Electric Co Ltd Automatic focusing unit and electronic imaging unit
JP2002133415A (en) 2000-10-27 2002-05-10 Oki Electric Ind Co Ltd Individual identifying device
US6754640B2 (en) 2000-10-30 2004-06-22 William O. Bozeman Universal positive pay match, authentication, authorization, settlement and clearing system
WO2002044868A2 (en) 2000-11-10 2002-06-06 Medidata Solutions, Inc. Method and apparatus of assuring informed consent while conducting secure clinical trials
KR100649303B1 (en) 2000-11-16 2006-11-24 엘지전자 주식회사 Apparatus of taking pictures in iris recognition system based on both of eyes's images
GB2369205B (en) 2000-11-17 2005-02-02 Personal Data Prot System Ltd Personal data device and protection system and method for storing and protecting personal data
JP2002153445A (en) 2000-11-21 2002-05-28 Oki Electric Ind Co Ltd Iris recognition device
JP2002153444A (en) 2000-11-21 2002-05-28 Oki Electric Ind Co Ltd Individual authentication system
US20020062280A1 (en) 2000-11-21 2002-05-23 John Zachariassen System and method for transmitting goods, remuneration, and information
EP1213638A1 (en) 2000-12-06 2002-06-12 Siemens Aktiengesellschaft Enabling devices
US7125335B2 (en) 2000-12-08 2006-10-24 Igt Casino gambling system with biometric access control
US7114080B2 (en) 2000-12-14 2006-09-26 Matsushita Electric Industrial Co., Ltd. Architecture for secure remote access and transmission using a generalized password scheme with biometric features
US20030182182A1 (en) 2000-12-18 2003-09-25 Kocher Robert W. Biometrics-based voting
US6920237B2 (en) 2000-12-19 2005-07-19 Eastman Kodak Company Digital image processing method and computer program product for detecting human irises in an image
US6792134B2 (en) 2000-12-19 2004-09-14 Eastman Kodak Company Multi-mode digital image processing method for detecting eyes
US6930707B2 (en) 2000-12-22 2005-08-16 International Business Machines Corporation Digital camera apparatus with biometric capability
US6867683B2 (en) 2000-12-28 2005-03-15 Unisys Corporation High security identification system for entry to multiple zones
US7028009B2 (en) 2001-01-17 2006-04-11 ContentGuard Holdings, Inc. Method and apparatus for distributing enforceable property rights
US7215797B2 (en) 2001-02-02 2007-05-08 Lg Electronics Inc. Iris recognition system
US20020142844A1 (en) 2001-02-06 2002-10-03 Kerr Michael A. Biometric broadband gaming system and method
KR20020065248A (en) * 2001-02-06 2002-08-13 이승재 Preprocessing of Human Iris Verification
US20020112177A1 (en) 2001-02-12 2002-08-15 Voltmer William H. Anonymous biometric authentication
US6732278B2 (en) 2001-02-12 2004-05-04 Baird, Iii Leemon C. Apparatus and method for authenticating access to a network resource
JP4453209B2 (en) 2001-02-27 2010-04-21 沖電気工業株式会社 Automatic transaction equipment
JP3586431B2 (en) 2001-02-28 2004-11-10 松下電器産業株式会社 Personal authentication method and device
KR100374708B1 (en) * 2001-03-06 2003-03-04 에버미디어 주식회사 Non-contact type human iris recognition method by correction of rotated iris image
KR100374707B1 (en) 2001-03-06 2003-03-04 에버미디어 주식회사 Method of recognizing human iris using daubechies wavelet transform
JP2002271689A (en) 2001-03-12 2002-09-20 Sony Corp Imaging device, and iris control method for the imaging device
US6845479B2 (en) 2001-03-14 2005-01-18 Tality Uk Limited Method for testing for the presence of faults in digital circuits
US7095901B2 (en) 2001-03-15 2006-08-22 Lg Electronics, Inc. Apparatus and method for adjusting focus position in iris recognition system
US7271839B2 (en) 2001-03-15 2007-09-18 Lg Electronics Inc. Display device of focal angle and focal distance in iris recognition system
JP2002286650A (en) 2001-03-27 2002-10-03 Nippon Steel Corp Apparatus and method for acquiring data for adjustment of formation as well as computer program and computer- readable storage medium
GB0107689D0 (en) 2001-03-28 2001-05-16 Ncr Int Inc Self service terminal
GB2373943A (en) 2001-03-28 2002-10-02 Hewlett Packard Co Visible and infrared imaging camera
US20020186131A1 (en) 2001-04-03 2002-12-12 Brad Fettis Card security device
JP2002312772A (en) 2001-04-13 2002-10-25 Oki Electric Ind Co Ltd Individual identification device and eye forgery judgment method
US7331667B2 (en) 2001-04-27 2008-02-19 Bausch & Lomb Incorporated Iris pattern recognition and alignment
JP2002329204A (en) 2001-04-27 2002-11-15 Oki Electric Ind Co Ltd System and program for individual authentication
US20020158750A1 (en) 2001-04-30 2002-10-31 Almalik Mansour Saleh System, method and portable device for biometric identification
JP2002334325A (en) 2001-05-11 2002-11-22 Matsushita Electric Ind Co Ltd Method and device for picking up image to be authenticated
JP2002341406A (en) 2001-05-11 2002-11-27 Matsushita Electric Ind Co Ltd Method and device for imaging object to be authenticated
US20040193893A1 (en) 2001-05-18 2004-09-30 Michael Braithwaite Application-specific biometric templates
US20020175182A1 (en) 2001-05-23 2002-11-28 Matthews Shaun Kerry Self contained dispenser incorporating a user monitoring system
US20020194128A1 (en) 2001-06-14 2002-12-19 Michael Maritzen System and method for secure reverse payment
JP2003006628A (en) 2001-06-20 2003-01-10 Oki Electric Ind Co Ltd Iris recognizing device
US20020198731A1 (en) 2001-06-26 2002-12-26 Barnes Jessica M. Method and apparatus for processing an international passenger
JP2003016435A (en) 2001-06-27 2003-01-17 Matsushita Electric Ind Co Ltd Device and method for individual authentication
JP2003016434A (en) 2001-06-27 2003-01-17 Matsushita Electric Ind Co Ltd Individual authenticating device
US20040233038A1 (en) 2001-07-10 2004-11-25 American Express Travel Related Services Company, Inc. Method and system for retinal scan recognition biometrics on a fob
US6523165B2 (en) 2001-07-13 2003-02-18 Numerical Technologies, Inc. Alternating phase shift mask design conflict resolution
JP2003030659A (en) 2001-07-16 2003-01-31 Matsushita Electric Ind Co Ltd Iris authentication device and iris image pickup device
JP2003036434A (en) 2001-07-23 2003-02-07 Nagano Kogaku Kenkyusho:Kk Optical system for portable individual authenticating device
JP2003037766A (en) 2001-07-24 2003-02-07 Matsushita Electric Ind Co Ltd Iris imager
JP2003037765A (en) 2001-07-24 2003-02-07 Matsushita Electric Ind Co Ltd Iris imager device
CA2354614C (en) 2001-07-30 2002-12-10 Silvano Pregara Autofocus sensor
US20030169334A1 (en) 2001-08-06 2003-09-11 Michael Braithwaite Iris capture device having expanded capture volume
US20030112326A1 (en) 2001-08-17 2003-06-19 Byoungyi Yoon Method and system for transmitting or storing stereoscopic images and photographing ratios for the images
WO2003019447A1 (en) 2001-08-21 2003-03-06 Diebold, Incorporated Atm deposit verification system and method
US20030046228A1 (en) 2001-08-28 2003-03-06 Jean-Marc Berney User-wearable functional jewelry with biometrics and smartcard to remotely sign and/or authenticate to e-services
JP2003085084A (en) 2001-09-12 2003-03-20 Sony Corp Contents delivery system and method, portable terminal, delivery server, and recording medium
US6690997B2 (en) 2001-09-13 2004-02-10 M.A. Rivalto, Inc. System for automated package-pick up and delivery
US7058209B2 (en) 2001-09-20 2006-06-06 Eastman Kodak Company Method and computer program product for locating facial features
JP2003099693A (en) 2001-09-20 2003-04-04 Fujitsu Ltd Electronic settlement method
US7269737B2 (en) 2001-09-21 2007-09-11 Pay By Touch Checking Resources, Inc. System and method for biometric authorization for financial transactions
US7203343B2 (en) 2001-09-21 2007-04-10 Hewlett-Packard Development Company, L.P. System and method for determining likely identity in a biometric database
JP2003108720A (en) 2001-09-26 2003-04-11 Ricoh Co Ltd Work flow support system, its method, work flow support program, and computer readable recording medium with the program recorded thereon
JP2003108983A (en) 2001-09-28 2003-04-11 Matsushita Electric Ind Co Ltd Eye image pickup device, iris authentication device, and portable terminal device with iris authentication function
US20030065626A1 (en) 2001-09-28 2003-04-03 Allen Karl H. User verification for conducting health-related transactions
US20030073499A1 (en) 2001-10-02 2003-04-17 Kenneth Reece Network gaming device and method for allowing a player to participate in a live game over a network
JP2003111749A (en) 2001-10-09 2003-04-15 Bmf:Kk Device for discriminating human
US20030071743A1 (en) 2001-10-12 2003-04-17 Singapore Technologies Electronics Limited Aircraft monitoring and incident management system
US20030074317A1 (en) 2001-10-15 2003-04-17 Eyal Hofi Device, method and system for authorizing transactions
US20030074326A1 (en) 2001-10-17 2003-04-17 Byers James T. Method and apparatus for providing biometric information as a signature to a contract
FR2831302A1 (en) 2001-10-19 2003-04-25 St Microelectronics Sa CODING OF CONCENTRIC INFORMATION
AU2002363055A1 (en) 2001-10-19 2003-05-06 Bank Of America Corporation System and method for interactive advertising
FR2831351A1 (en) 2001-10-19 2003-04-25 St Microelectronics Sa Bandpass filter for detecting texture of iris, has passband which is real, bi-dimensional, oriented along phase axis and resulting from product of pair of identical one-dimensional Hamming windows having specified transfer function
US20030080194A1 (en) 2001-10-25 2003-05-01 O'hara Sean M. Biometric water mixing valve
JP2003132355A (en) 2001-10-25 2003-05-09 Matsushita Electric Ind Co Ltd Iris authenticating method and its device
WO2003063102A2 (en) 2001-11-06 2003-07-31 Radian Inc. Physiomagnetometric inspection and surveillance system and method
US6598971B2 (en) 2001-11-08 2003-07-29 Lc Technologies, Inc. Method and system for accommodating pupil non-concentricity in eyetracker systems
US20030092489A1 (en) 2001-11-09 2003-05-15 Veradej Annusorn Andy Interactive gaming with biometric verification
JP2003150942A (en) 2001-11-16 2003-05-23 Kiyomi Nakamura Eye position tracing method
JP3904903B2 (en) 2001-11-21 2007-04-11 三菱電機株式会社 Personal authentication device and personal authentication method
US20030099379A1 (en) 2001-11-26 2003-05-29 Monk Bruce C. Validation and verification apparatus and method
JP4068334B2 (en) 2001-11-26 2008-03-26 日本電気株式会社 Fingerprint authentication method, fingerprint authentication system, and biometric authentication system
SG124246A1 (en) 2001-11-26 2006-08-30 Inventio Ag System for security control and/or transportation of persons with an elevator installation, method of operating this system, and method of retro-fitting an elevator installation with this system
WO2003046777A2 (en) 2001-11-26 2003-06-05 Ball, Ronald, H. Portable messaging device adapted to perform financial transactions
US20030210139A1 (en) 2001-12-03 2003-11-13 Stephen Brooks Method and system for improved security
KR100453943B1 (en) 2001-12-03 2004-10-20 주식회사 세넥스테크놀로지 Iris image processing recognizing method and system for personal identification
KR100456619B1 (en) 2001-12-05 2004-11-10 한국전자통신연구원 A system for registering and authenticating human face using support vector machines and method thereof
US7239726B2 (en) 2001-12-12 2007-07-03 Sony Corporation System and method for effectively extracting facial feature information
US20030115148A1 (en) 2001-12-13 2003-06-19 Takhar Harinder Singh Method and apparatus for processing a secure transaction
US7003669B2 (en) 2001-12-17 2006-02-21 Monk Bruce C Document and bearer verification system
KR100464081B1 (en) 2001-12-20 2004-12-30 엘지전자 주식회사 Automatic upgrade method of iris code database for iris recognition system
US20030116630A1 (en) 2001-12-21 2003-06-26 Kba-Giori S.A. Encrypted biometric encoded security documents
US20030125057A1 (en) 2001-12-27 2003-07-03 Pesola Troy Raymond System and method for automatic synchronization of managed data
KR100854890B1 (en) 2001-12-28 2008-08-28 엘지전자 주식회사 Iris recording and recognition method using of several led for iris recognition system
US20030126560A1 (en) 2001-12-28 2003-07-03 Koninklijke Philips Electronics N.V. Adaptive bookmarking of often-visited web sites
US20030131245A1 (en) 2002-01-04 2003-07-10 Michael Linderman Communication security system
US7506172B2 (en) 2002-01-07 2009-03-17 Igt Gaming device with biometric system
WO2003060814A1 (en) 2002-01-16 2003-07-24 Iritech, Inc. System and method for iris identification using stereoscopic face recognition
US20030158762A1 (en) 2002-01-18 2003-08-21 Jiang Wu System and method for airplane security / service / maintenance management
JP4275344B2 (en) 2002-01-22 2009-06-10 富士フイルム株式会社 Imaging apparatus, imaging method, and program
US20030140928A1 (en) 2002-01-29 2003-07-31 Tuan Bui Medical treatment verification system and method
US20030149881A1 (en) 2002-01-31 2003-08-07 Digital Security Inc. Apparatus and method for securing information transmitted on computer networks
US20030141411A1 (en) 2002-01-31 2003-07-31 Ashish Pandya Novel method to secure airline travel
EP1335329B1 (en) 2002-02-05 2020-05-27 Panasonic Intellectual Property Management Co., Ltd. Personal authentication method, personal authentication apparatus and image capturing device
EP1495386A2 (en) 2002-02-07 2005-01-12 Eyeticket Corporation System and method for automated biometric data collection
US20030158821A1 (en) 2002-02-15 2003-08-21 Maia Regis Pires Equivalence and data verification system
JP2003242125A (en) 2002-02-18 2003-08-29 Canon Inc Portable information terminal, authentication auxiliary terminal and individual authentication method
US20030225711A1 (en) 2002-02-20 2003-12-04 Martin Paping Method and apparatus for postal user identification and billing
KR100842501B1 (en) 2002-02-21 2008-07-01 엘지전자 주식회사 Indicated device of iris location for iris recognition system
US7206431B2 (en) 2002-02-22 2007-04-17 Symbol Technologies, Inc. System and method for generating and verifying a self-authenticating document
US20030182151A1 (en) 2002-02-26 2003-09-25 Neal Taslitz Method of using biometric measurements as a legal seal for authenticating real estate deeds and mortgages
US6905411B2 (en) 2002-02-27 2005-06-14 Igt Player authentication for cashless gaming machine instruments
US20030163739A1 (en) 2002-02-28 2003-08-28 Armington John Phillip Robust multi-factor authentication for secure application environments
JP2003271940A (en) 2002-03-13 2003-09-26 Oki Electric Ind Co Ltd Iris recognition device
JP2003271565A (en) 2002-03-15 2003-09-26 Matsushita Electric Ind Co Ltd Individual authentication system, individual authentication terminal, reader and individual authentication method
US7204425B2 (en) 2002-03-18 2007-04-17 Precision Dynamics Corporation Enhanced identification appliance
US20030174049A1 (en) 2002-03-18 2003-09-18 Precision Dynamics Corporation Wearable identification appliance that communicates with a wireless communications network such as bluetooth
US8086867B2 (en) 2002-03-26 2011-12-27 Northrop Grumman Systems Corporation Secure identity and privilege system
AUPS140502A0 (en) 2002-03-27 2002-05-09 Seeing Machines Pty Ltd Method for automatic detection of facial features
US20030189481A1 (en) 2002-04-04 2003-10-09 Laurence Hamid Remote actuation system, device and method
US20030189480A1 (en) 2002-04-04 2003-10-09 Laurence Hamid Remote actuation system, device and method
US20040052418A1 (en) 2002-04-05 2004-03-18 Bruno Delean Method and apparatus for probabilistic image analysis
JP2003308522A (en) 2002-04-15 2003-10-31 Oki Electric Ind Co Ltd Iris recognizer
JP4154917B2 (en) 2002-04-15 2008-09-24 沖電気工業株式会社 Iris photography device
KR20030082128A (en) 2002-04-16 2003-10-22 엘지전자 주식회사 System of mouse include iris recognition of pc
US7145457B2 (en) 2002-04-18 2006-12-05 Computer Associates Think, Inc. Integrated visualization of security information for an individual
US20030236120A1 (en) 2002-04-19 2003-12-25 Kenneth Reece Method and device for determining the physical location and identity of a user
KR100438841B1 (en) 2002-04-23 2004-07-05 삼성전자주식회사 Method for verifying users and updating the data base, and face verification system using thereof
JP4008282B2 (en) 2002-04-24 2007-11-14 沖電気工業株式会社 Pupil / iris circle detector
JP2003323607A (en) 2002-04-30 2003-11-14 Matsushita Electric Ind Co Ltd Iris image pickup device
US6745520B2 (en) 2002-05-10 2004-06-08 John L. Puskaric Integrated rapid access entry/egress system
JP2003331265A (en) 2002-05-14 2003-11-21 Matsushita Electric Ind Co Ltd Eye image imaging device and iris authentication device
AUPS254302A0 (en) 2002-05-24 2002-06-13 Resmed Limited A sleepiness test
US7519819B2 (en) 2002-05-29 2009-04-14 Digimarc Corporation Layered security in digital watermarking
US7472283B2 (en) 2002-05-30 2008-12-30 Hewlett-Packard Development Company, L.P. Method and apparatus for secured digital video and access tracking
JP2004005167A (en) 2002-05-31 2004-01-08 Matsushita Electric Ind Co Ltd Eye position specification method and device
JP2004021406A (en) 2002-06-13 2004-01-22 Matsushita Electric Ind Co Ltd Eye position specifying method, and authentication apparatus
JP2004023733A (en) 2002-06-20 2004-01-22 Canon Inc Image photographing device and its control method
JP4174244B2 (en) 2002-06-20 2008-10-29 キヤノン株式会社 Image capturing apparatus and control method thereof
US20040005078A1 (en) 2002-06-21 2004-01-08 Spectra Systems Corporation Method and apparatus for digitally watermarking images created with a mobile imaging device
US7898385B2 (en) 2002-06-26 2011-03-01 Robert William Kocher Personnel and vehicle identification system using three factors of authentication
US7177449B2 (en) 2002-06-26 2007-02-13 Hewlett-Packard Development Company, L.P. Image correction system and method
JP2004030334A (en) 2002-06-26 2004-01-29 Nec Soft Ltd Method, system and program for biometrics authentication service
JP2004038305A (en) 2002-06-28 2004-02-05 Matsushita Electric Ind Co Ltd Individual identification device
US7406184B2 (en) 2002-07-03 2008-07-29 Equinox Corporation Method and apparatus for using thermal infrared for face recognition
US8412623B2 (en) 2002-07-15 2013-04-02 Citicorp Credit Services, Inc. Method and system for a multi-purpose transactional platform
TW588243B (en) 2002-07-31 2004-05-21 Trek 2000 Int Ltd System and method for authentication
GB2391681B (en) 2002-08-01 2005-09-21 Ncr Int Inc Self-service terminal
US7169052B2 (en) 2002-08-05 2007-01-30 Igt Personalized gaming apparatus and gaming method
US7333798B2 (en) 2002-08-08 2008-02-19 Value Added Communications, Inc. Telecommunication call management and monitoring system
JP3779247B2 (en) 2002-08-08 2006-05-24 株式会社リコー Imaging device
DE10237132A1 (en) 2002-08-13 2004-02-26 BSH Bosch und Siemens Hausgeräte GmbH Household appliance with biometric identification for control of access by activation and deactivation of a locking mechanism for the appliance door
US20040037450A1 (en) 2002-08-22 2004-02-26 Bradski Gary R. Method, apparatus and system for using computer vision to identify facial characteristics
KR20040017978A (en) 2002-08-23 2004-03-02 삼성전자주식회사 Refrigerator and control method thereof
US6853444B2 (en) 2002-08-30 2005-02-08 Waleed S. Haddad Non-contact optical imaging system for biometric identification
US20040042641A1 (en) 2002-08-30 2004-03-04 Jakubowski Peter Joel Personnel identity verification system
JP2004094575A (en) 2002-08-30 2004-03-25 Megane Center:Kk Customer identification system by iris
CN100345163C (en) 2002-09-13 2007-10-24 松下电器产业株式会社 Iris coding method, personal identification method, iris code registration device, iris identification device, and iris identification program
US20040059590A1 (en) 2002-09-13 2004-03-25 Dwayne Mercredi Credential promotion
US20040050930A1 (en) 2002-09-17 2004-03-18 Bernard Rowe Smart card with onboard authentication facility
JP4272863B2 (en) 2002-09-20 2009-06-03 キヤノン株式会社 Camera and camera system
US20040059953A1 (en) 2002-09-24 2004-03-25 Arinc Methods and systems for identity management
US20040117636A1 (en) 2002-09-25 2004-06-17 David Cheng System, method and apparatus for secure two-tier backup and retrieval of authentication information
US7084904B2 (en) 2002-09-30 2006-08-01 Microsoft Corporation Foveated wide-angle imaging system and method for capturing and viewing wide-angle images in real time
AU2003282943A1 (en) 2002-10-11 2004-05-04 Digimarc Corporation Systems and methods for recognition of individuals using multiple biometric searches
JP2004152046A (en) 2002-10-31 2004-05-27 Oki Electric Ind Co Ltd User authentication method and biological information recording device, user authentication device, user authentication system, and ticket issuing device
CN100337248C (en) 2002-11-07 2007-09-12 Matsushita Electric Industrial Co., Ltd. Personal identification method, iris registration device, iris identification device, and personal identification program
JP2004163356A (en) 2002-11-15 2004-06-10 Matsushita Electric Ind Co Ltd Eyeglasses detecting apparatus and authentication system
JP2004164483A (en) 2002-11-15 2004-06-10 Matsushita Electric Ind Co Ltd Eye image certification device, and access control system and information processor using it
JP2004171350A (en) 2002-11-21 2004-06-17 Matsushita Electric Ind Co Ltd Eye imaging device and information device using the same
US6820059B2 (en) 2003-04-08 2004-11-16 Richard Glee Wood Method for reducing fraud in government benefit programs using a smart card
JP2004178141A (en) 2002-11-26 2004-06-24 Hitachi Ltd Ic card with illicit use preventing function
US7804982B2 (en) 2002-11-26 2010-09-28 L-1 Secure Credentialing, Inc. Systems and methods for managing and detecting fraud in image databases used with identification documents
US7130452B2 (en) 2002-12-03 2006-10-31 International Business Machines Corporation System and method for multi-party validation, authentication and/or authorization via biometrics
JP2004206444A (en) 2002-12-25 2004-07-22 Matsushita Electric Ind Co Ltd Individual authentication method and iris authentication device
JP2004212431A (en) 2002-12-27 2004-07-29 Casio Comput Co Ltd Autofocus apparatus and autofocus method
KR20040062247A (en) 2003-01-02 2004-07-07 엘지전자 주식회사 Structure of iris recognition camera and using method of the same
TWI349204B (en) 2003-01-10 2011-09-21 Panasonic Corp Group admission system and server and client therefor
US7542945B2 (en) 2003-01-15 2009-06-02 Sanmina-Sci Corporation Authentication device, system and methods
JP2004220376A (en) 2003-01-15 2004-08-05 Sanyo Electric Co Ltd Security management method and system, program, and recording medium
KR100543699B1 (en) 2003-01-21 2006-01-20 삼성전자주식회사 Method and Apparatus for user authentication
TWI224287B (en) 2003-01-23 2004-11-21 Ind Tech Res Inst Iris extraction method
US7404086B2 (en) 2003-01-24 2008-07-22 Ac Technology, Inc. Method and apparatus for biometric authentication
JP4128570B2 (en) 2003-01-28 2008-07-30 Fujitsu Limited Biometric information verification device
US7173348B2 (en) 2003-03-03 2007-02-06 Startech Automotive Anti Theft Systems Ltd. Device, system and method for preventing vehicle theft
JP2004261515A (en) 2003-03-04 2004-09-24 Matsushita Electric Ind Co Ltd Iris image pickup device
US20030177051A1 (en) 2003-03-13 2003-09-18 Robin Driscoll Method and system for managing worker resources
JP3598109B2 (en) 2003-03-13 2004-12-08 Matsushita Electric Industrial Co., Ltd. Iris code generation method, personal authentication method, iris code registration device, and personal authentication device
US7184577B2 (en) 2003-03-14 2007-02-27 Intelitrac, Inc. Image indexing search system and method
JP2004280547A (en) 2003-03-17 2004-10-07 Matsushita Electric Ind Co Ltd Individual identification device
JP2004287621A (en) 2003-03-19 2004-10-14 Matsushita Electric Ind Co Ltd Iris recognition system
US7436986B2 (en) 2003-03-25 2008-10-14 Bausch & Lomb Incorporated Positive patient identification
JP2004318248A (en) 2003-04-11 2004-11-11 Matsushita Electric Ind Co Ltd Iris authentication system, iris authentication method, and iris authentication program
JP2004315127A (en) 2003-04-14 2004-11-11 Mitsubishi Electric Corp Elevator controller
TWI314304B (en) 2003-05-05 2009-09-01 Inventio Ag System for security checking or transport of persons by a lift installation and a method for operating this system
US8171304B2 (en) 2003-05-15 2012-05-01 Activcard Ireland Limited Method, system and computer program product for multiple biometric template screening
JP2005004181A (en) 2003-05-21 2005-01-06 Fujinon Corp Visible light/infrared light photographing lens system
US7421097B2 (en) 2003-05-27 2008-09-02 Honeywell International Inc. Face identification verification using 3 dimensional modeling
GB2402840A (en) 2003-06-10 2004-12-15 Guy Frank Howard Walker Mobile with wireless key entry system
US6992562B2 (en) 2003-06-10 2006-01-31 Visteon Global Technologies, Inc. Biometric keyless entry system
EP1486904A1 (en) 2003-06-10 2004-12-15 STMicroelectronics S.A. Generation of a template reconstituted from a set of several images which represent the same element
JP2005004524A (en) 2003-06-12 2005-01-06 Oki Electric Ind Co Ltd Identifying system, and personal authenticating system
JP2005010826A (en) 2003-06-16 2005-01-13 Fujitsu Ltd Authentication terminal device, biometrics information authentication system and biometrics information acquisition system
JP2005011207A (en) 2003-06-20 2005-01-13 Matsushita Electric Ind Co Ltd Ic card, biometrics authentication system and method for authenticating biometrics
JP2005025577A (en) 2003-07-03 2005-01-27 Matsushita Electric Ind Co Ltd Ic card, biometrics authentication system, and biometrics authentication method
US20050012817A1 (en) 2003-07-15 2005-01-20 International Business Machines Corporation Selective surveillance system with active sensor management policies
JP4462863B2 (en) 2003-07-16 2010-05-12 NTT Data Corporation Personal authentication device, biometric feature information update method, and program
US7731093B2 (en) 2003-08-04 2010-06-08 Canon Kabushiki Kaisha Image reading/forming apparatus and method
JP3918788B2 (en) 2003-08-06 2007-05-23 Konica Minolta Photo Imaging, Inc. Imaging apparatus and program
JP3802892B2 (en) 2003-08-08 2006-07-26 Shigematsu Co., Ltd. Iris authentication device
US20050206502A1 (en) 2003-08-27 2005-09-22 Georg Bernitz Method and apparatus for releasing a vehicle for a user
JP2005096744A (en) 2003-09-01 2005-04-14 Matsushita Electric Ind Co Ltd Occupant certifying system
JP2007504562A (en) 2003-09-04 2007-03-01 Sarnoff Corporation Method and apparatus for performing iris authentication from a single image
US7183895B2 (en) 2003-09-05 2007-02-27 Honeywell International Inc. System and method for dynamic stand-off biometric verification
US7593550B2 (en) 2005-01-26 2009-09-22 Honeywell International Inc. Distance iris recognition
US8705808B2 (en) 2003-09-05 2014-04-22 Honeywell International Inc. Combined face and iris recognition system
US8064647B2 (en) 2006-03-03 2011-11-22 Honeywell International Inc. System for iris detection tracking and recognition at a distance
US8050463B2 (en) 2005-01-26 2011-11-01 Honeywell International Inc. Iris recognition system having image quality metrics
US8442276B2 (en) 2006-03-03 2013-05-14 Honeywell International Inc. Invariant radial iris segmentation
US8090157B2 (en) 2005-01-26 2012-01-03 Honeywell International Inc. Approaches and apparatus for eye detection in a digital image
US7362210B2 (en) 2003-09-05 2008-04-22 Honeywell International Inc. System and method for gate access control
KR20050025927A (en) * 2003-09-08 2005-03-14 유웅덕 Pupil detection method and shape descriptor extraction method for iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using the same
JP2005100063A (en) 2003-09-24 2005-04-14 Sanyo Electric Co Ltd Authentication device and method
NO319858B1 (en) 2003-09-26 2005-09-26 Tandberg Telecom As Identification procedure
JP4615272B2 (en) 2003-09-29 2011-01-19 Fujifilm Corporation Authentication system, program, and building
JP2005148883A (en) 2003-11-12 2005-06-09 Dainippon Printing Co Ltd Pin change and closure release method for ic card
US7447911B2 (en) 2003-11-28 2008-11-04 Lightuning Tech. Inc. Electronic identification key with portable application programs and identified by biometrics authentication
US7338167B2 (en) 2003-12-10 2008-03-04 Joslin Diabetes Center, Inc. Retinal imaging system
EP1542182A3 (en) 2003-12-11 2006-01-25 NCR International, Inc. Automated teller machine
US20050129286A1 (en) 2003-12-16 2005-06-16 Hekimian Christopher D. Technique using eye position and state of closure for increasing the effectiveness of iris recognition authentication systems
US20050138387A1 (en) 2003-12-19 2005-06-23 Lam Wai T. System and method for authorizing software use
US7443441B2 (en) 2004-01-05 2008-10-28 Canon Kabushiki Kaisha Lens apparatus and image-taking system with multiple focus modes
JP2005195893A (en) 2004-01-07 2005-07-21 Canon Inc Imaging apparatus, its control method and program
JP2005242677A (en) 2004-02-26 2005-09-08 Ntt Comware Corp Composite authentication system and method, and program for the same
JP2005248445A (en) 2004-03-01 2005-09-15 Matsushita Electric Ind Co Ltd Coordination authenticating device
US20050199703A1 (en) 2004-03-09 2005-09-15 Friedman Lawrence J. Method and system for a host based smart card
GB2411980A (en) 2004-03-10 2005-09-14 Giga Byte Tech Co Ltd Computer booting using biometrics
JP3727323B2 (en) 2004-03-11 2005-12-14 Oki Electric Industry Co., Ltd. Passbook handling device
US20050206501A1 (en) 2004-03-16 2005-09-22 Michael Farhat Labor management system and method using a biometric sensing device
US20050210267A1 (en) 2004-03-18 2005-09-22 Jun Sugano User authentication method and system, information terminal device and service providing server, subject identification method and system, correspondence confirmation method and system, object confirmation method and system, and program products for them
US20050210270A1 (en) 2004-03-19 2005-09-22 Ceelox, Inc. Method for authenticating a user profile for providing user access to restricted information based upon biometric confirmation
US7336806B2 (en) * 2004-03-22 2008-02-26 Microsoft Corporation Iris-based biometric identification
JP2005304809A (en) 2004-04-22 2005-11-04 Matsushita Electric Ind Co Ltd Eye image pickup device with lighting system
US8918900B2 (en) 2004-04-26 2014-12-23 Ivi Holdings Ltd. Smart card for passport, electronic passport, and method, system, and apparatus for authenticating person holding smart card or electronic passport
CN100498837C (en) 2004-05-10 2009-06-10 Matsushita Electric Industrial Co., Ltd. Iris registration method, iris registration apparatus
US20050255840A1 (en) 2004-05-13 2005-11-17 Markham Thomas R Authenticating wireless phone system
US7518651B2 (en) 2004-05-28 2009-04-14 Aptina Imaging Corporation Multiple image autofocus
JP4686153B2 (en) 2004-09-10 2011-05-18 Hitachi-Omron Terminal Solutions, Corp. Information processing apparatus, fraud detection method, and automatic teller machine
WO2006034135A2 (en) 2004-09-17 2006-03-30 Proximex Adaptive multi-modal integrated biometric identification detection and surveillance system
JP4225501B2 (en) 2004-11-15 2009-02-18 Takashi Sawaguchi Portable personal authentication device and electronic system to which access is permitted by the device
US7298873B2 (en) 2004-11-16 2007-11-20 Imageware Systems, Inc. Multimodal biometric platform
CN101111748B (en) 2004-12-03 2014-12-17 弗卢克公司 Visible light and ir combined image camera with a laser pointer
IL165586A0 (en) 2004-12-06 2006-01-15 Daphna Palti Wasserman Multivariate dynamic biometrics system
US7418115B2 (en) 2004-12-07 2008-08-26 Aoptix Technologies, Inc. Iris imaging using reflection from the eye
CA2600388C (en) 2005-03-17 2017-08-08 Imageware Systems, Inc. Multimodal biometric analysis
WO2006132686A2 (en) 2005-06-03 2006-12-14 Sarnoff Corporation Method and apparatus for designing iris biometric systems for use in minimally
EP2428413B1 (en) 2005-07-11 2013-03-27 Volvo Technology Corporation Methods and arrangement for performing driver identity verification
WO2007025258A2 (en) 2005-08-25 2007-03-01 Sarnoff Corporation Methods and systems for biometric identification
US7471451B2 (en) 2005-10-14 2008-12-30 Flir Systems, Inc. Multiple field of view optical system
US20070160266A1 (en) 2006-01-11 2007-07-12 Jones Michael J Method for extracting features of irises in images using difference of sum filters
EP1991947B1 (en) 2006-03-03 2020-04-29 Gentex Corporation Indexing and database search system
JP2009529197A (en) 2006-03-03 2009-08-13 ハネウェル・インターナショナル・インコーポレーテッド Module biometrics collection system architecture
GB2448653B (en) 2006-03-03 2011-03-23 Honeywell Int Inc Single lens splitter camera
GB2450023B (en) 2006-03-03 2011-06-08 Honeywell Int Inc An iris image encoding method
US7580620B2 (en) 2006-05-08 2009-08-25 Mitsubishi Electric Research Laboratories, Inc. Method for deblurring images using optimized temporal coding patterns
US7756407B2 (en) 2006-05-08 2010-07-13 Mitsubishi Electric Research Laboratories, Inc. Method and apparatus for deblurring images
JP4207980B2 (en) 2006-06-09 2009-01-14 Sony Corporation Imaging device, imaging device control method, and computer program
US20080005578A1 (en) 2006-06-29 2008-01-03 Innovya Research & Development Ltd. System and method for traceless biometric identification
US7722461B2 (en) 2006-07-12 2010-05-25 Igt Method and system for time gaming with skill wagering opportunities
US20080148030A1 (en) 2006-12-14 2008-06-19 General Instrument Corporation Method and System for Configuring Electronic Communication Device
GB0700468D0 (en) 2007-01-10 2007-02-21 Mitsubishi Electric Inf Tech Improved image identification
US8025399B2 (en) 2007-01-26 2011-09-27 Aoptix Technologies, Inc. Combined iris imager and wavefront sensor
US20080211347A1 (en) 2007-03-02 2008-09-04 Joshua Isaac Wright Circuit System With Supply Voltage For Driving An Electromechanical Switch
US8063889B2 (en) 2007-04-25 2011-11-22 Honeywell International Inc. Biometric data collection system
US20090092283A1 (en) 2007-10-09 2009-04-09 Honeywell International Inc. Surveillance and monitoring system
US8436907B2 (en) 2008-05-09 2013-05-07 Honeywell International Inc. Heterogeneous video capturing system
US8213782B2 (en) 2008-08-07 2012-07-03 Honeywell International Inc. Predictive autofocusing system
US8090246B2 (en) 2008-08-08 2012-01-03 Honeywell International Inc. Image acquisition system
US8280119B2 (en) 2008-12-05 2012-10-02 Honeywell International Inc. Iris recognition system using quality metrics

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006081209A2 (en) * 2005-01-26 2006-08-03 Honeywell International Inc. Iris recognition system and method

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Daugman, J.: "How iris recognition works", IEEE Transactions on Circuits and Systems for Video Technology, IEEE Service Center, Piscataway, NJ, US, vol. 14, no. 1, January 2004 (2004-01-01), pages 21-30, XP001186967, ISSN: 1051-8215 *
Du et al.: "A One-Dimensional Approach for Iris Identification", Proceedings of the SPIE, SPIE, Bellingham, WA, US, vol. 5404, August 2004 (2004-08-01), pages 237-247, XP002340990, ISSN: 0277-786X *
Trucco, Emanuele et al.: "Robust iris location in close-up images of the eye", Pattern Analysis and Applications, Springer-Verlag, London, vol. 8, no. 3, 1 December 2005 (2005-12-01), pages 247-255, XP019381497, ISSN: 1433-755X *
Kawaguchi, T. et al.: "Detection of eyes from human faces by Hough transform and separability filter", Image Processing, 2000. Proceedings. 2000 International Conference on, September 10-13, 2000, Piscataway, NJ, USA, IEEE, 10 September 2000 (2000-09-10), pages 49-52, XP010530547, ISBN: 0-7803-6297-7 *
Kong, W.-K. et al.: "Detecting eyelash and reflection for accurate iris segmentation", International Journal of Pattern Recognition and Artificial Intelligence, World Scientific Publishing, Singapore, vol. 17, no. 6, September 2003 (2003-09-01), pages 1025-1034, XP001171781, ISSN: 0218-0014 *
Suzaki, M. et al.: "A horse identification system using biometrics", Systems & Computers in Japan, Wiley, Hoboken, NJ, US, vol. 32, no. 14, December 2001 (2001-12-01), pages 12-23, XP001116204, ISSN: 0882-1666 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103310186A (en) * 2012-02-29 2013-09-18 Samsung Electronics Co., Ltd. Method for correcting user's gaze direction in image, machine-readable storage medium and communication terminal

Also Published As

Publication number Publication date
GB2450026A (en) 2008-12-10
AU2007223574B2 (en) 2011-09-08
US8098901B2 (en) 2012-01-17
AU2007223574A1 (en) 2007-09-13
US20070140531A1 (en) 2007-06-21
GB2450026B (en) 2011-06-22
GB0815928D0 (en) 2008-10-08

Similar Documents

Publication Publication Date Title
AU2007223574B2 (en) A standoff iris recognition system
US10789465B2 (en) Feature extraction and matching for biometric authentication
JP5107045B2 (en) Method for identifying a pixel representing an iris in an image acquired for the eye
JP4767971B2 (en) Distance iris recognition system
Rankin et al. Iris recognition failure over time: The effects of texture
JP2009523265A (en) Method for extracting iris features in an image
JP2007188504A (en) Method for filtering pixel intensity in image
WO2013087026A1 (en) Locating method and locating device for iris
Hilal et al. Hough transform and active contour for enhanced iris segmentation
WO2009029638A1 (en) Iris recognition
CN101241550B (en) Iris image quality judgment method
Gupta et al. Iris recognition system using biometric template matching technology
Jin et al. Iris image segmentation based on K-means cluster
KR100794361B1 (en) The eyelid detection and eyelash interpolation method for the performance enhancement of iris recognition
Annapoorani et al. Accurate and fast iris segmentation
Joshi et al. A novel approach implementation of eyelid detection in biometric applications
Chai et al. Vote-based iris detection system
Crihalmeanu et al. Multispectral ocular biometrics
Talebi et al. A novel iris segmentation method based on balloon active contour
Munemoto et al. "Hallucinating Irises" - Dealing with Partial & Occluded Iris Regions
Sharma et al. Iris Recognition-An Effective Human Identification
Hasan Iris Recognition Method for Non-cooperative Images
Agarwal Multi-impression enhancement of fingerprint images
Salami et al. Enhancement of the Segmentation Framework for Rotated Iris Images
CN111950320A (en) Iris identification method based on relative total variation and probability collaborative representation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 07757679
Country of ref document: EP
Kind code of ref document: A1

ENP Entry into the national phase
Ref document number: 0815928
Country of ref document: GB
Kind code of ref document: A
Free format text: PCT FILING DATE = 20070301

WWE Wipo information: entry into national phase
Ref document number: 2007223574
Country of ref document: AU
Ref document number: 0815928.7
Country of ref document: GB

NENP Non-entry into the national phase
Ref country code: DE

ENP Entry into the national phase
Ref document number: 2007223574
Country of ref document: AU
Date of ref document: 20070301
Kind code of ref document: A

122 Ep: pct application non-entry in european phase
Ref document number: 07757679
Country of ref document: EP
Kind code of ref document: A1