WO1998008439A1 - Apparatus for the iris acquiring images

Info

Publication number: WO1998008439A1
Application number: PCT/US1997/014873
Authority: WIPO (PCT)
Prior art keywords: light, illuminator, camera, iris, image
Other languages: French (fr)
Inventors: Michael Negin; Thomas A. Chmielewski, Jr.; Robin Sainsbury; Marcos Salganicoff; Keith James Hanna; Robert Mandelbaum; Deepam Mishra
Original Assignees: Sensar, Inc.; The Sarnoff Corporation
Application filed by Sensar, Inc. and The Sarnoff Corporation
Priority applications: AU43282/97A (AU727389B2); JP51177498A (JP2002514098A); EP97941356A (EP0959769A1); CA002264029A (CA2264029A1)

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A61B3/1216 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes for diagnostics of the iris
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/117 Identification of persons
    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07C TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00 Individual registration on entry or exit
    • G07C9/30 Individual registration on entry or exit not involving the use of a pass
    • G07C9/32 Individual registration on entry or exit not involving the use of a pass in combination with an identity check
    • G07C9/37 Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 Arrangements specially adapted for eye photography
    • A61B3/145 Arrangements specially adapted for eye photography by video means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/117 Identification of persons
    • A61B5/1171 Identification of persons based on the shapes or appearances of their bodies or parts thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor

Abstract

At least one wide field of view camera (3) with an associated illuminator (6, 10) obtains sufficient images of a person to be identified so that x, y, z coordinates can be established for the expected position of that person's eye. The coordinates are used to direct a narrow field of view camera (16) and associated illuminators (21-23) to take an image of the eye (76, 78) that can be used to identify the person using iris verification and recognition algorithms. These illuminators are positioned and illuminated to eliminate or minimize specularities and reflections that obscure the iris (76).

Description

TITLE
APPARATUS FOR THE IRIS ACQUIRING IMAGES
BACKGROUND OF THE INVENTION
1. Field of the Invention
The invention relates to a method and apparatus for illuminating the eye
to obtain an image of the iris.
2. Background of the Invention
There are several methods known as biometrics for recognizing or
identifying an individual. These methods include analyzing a signature, obtaining and
analyzing an image of a fingerprint and imaging and analyzing the retinal vascular
patterns of a human eye. Recently the art has used the iris of the eye which contains a
highly detailed pattern that is unique for each individual and stable over many years
as a non-contact, non-obtrusive biometric. This technique is described in United States
Patent No. 4,641,349 to Flom et al. and United States Patent No. 5,291,560 to
Daugman. The systems described in these references require the person being
identified to hold at least one of their eyes in a fixed position with respect to an imaging
camera which takes a picture of the iris. While this procedure is satisfactory for some
applications, it is not satisfactory for quick transactional activities such as using an
automated teller machine, unobtrusive access control or automated dispensing. Other
examples are immigration control, point of sale verification, welfare check dispensing,
internet banking, bank loan or account opening and other financial transactions.
The iris identification techniques disclosed by Flom and Daugman
require a clear, well-focused image of the iris portion of the eye. Once that image is
obtained a comparison of that image with a coded file image of the iris of the person to
be identified can be accomplished quite rapidly. However, prior to the present
invention there has not been an optical system which could rapidly acquire a
sufficiently clear image of an iris of the person to be identified unless that person
positioned his eye in a fixed position relatively close to an imaging camera. There is a
need for a system which will rapidly obtain a clear picture of the iris of a person or
animal remotely from the optical system and in an uncertain position. This system
would be particularly useful to identify users of automated teller machines as well as
individuals seeking access to a restricted area or facility or other applications requiring
user identification. The system could also be used to identify patients, criminal
suspects and others who are unable or unwilling to be otherwise identified.
Automated teller machines, often called ATMs, are widely used for
banking transactions. Users are accustomed to receiving relatively fast verification of
their identity after inserting their identification card and entering an identification
number. However, anyone who knows the identification number associated with a
given card can use that card. Should a robber learn the identification number by
watching the owner use the card, finding the number written on the card or otherwise,
he can easily draw funds from the owner's account. Consequently, banks have been
searching for other more reliable ways of verifying the identity of ATM users.
Since the iris identification methods disclosed by Flom et al. have
proved to be very reliable, the use of iris identification to verify the identity of ATM
users and other remote user recognition or verification applications has been proposed.
However, for such use to be commercially available, there must be a rapid, reliable and
unobtrusive way to obtain iris images of sufficient resolution to permit verification and
recognition from an ATM user standing in front of the teller machine. To require the
user to position his head at a predetermined distance from the camera, whether by using an
eyepiece or other fixture or without one, is impractical. Thus, there is a need for a
system which rapidly locates the iris of an ATM user and obtains a quality image of the
iris that can be used for verification and identification. This system should be suitable
for use in combination with an access card or without such a card. The system should
also be able to obtain such an image from users who are wearing eyeglasses or contact
lenses or ski masks or other occluding apparel.
SUMMARY OF THE INVENTION
We provide a method and apparatus which can obtain a clear image of
an iris of a person to be identified whose head is located in front of the portion of our
optical system which receives light reflected from the iris. The system includes at least
one camera, with or without ambient illumination, and preferably one or more
illuminators. We also prefer to provide a pan/tilt mirror or gimbal device and at least
one lens. Light reflected from the subject is captured by the gimbaled camera or mirror
and directed through the lens to the camera. In a preferred embodiment, a narrow field
of view (NFOV) camera receives the light reflected from the pan/tilt mirror through the lens, or directly via a gimbal-mounted camera. A second camera and preferably a
third camera are provided to obtain a wide field of view (WFOV) image of the subject.
In some cases, the WFOV cameras may be superfluous, if the user is always known to
be in the field of view of the NFOV camera or could be located by moving the NFOV
camera. Images from these WFOV cameras are processed to determine the coordinates
of the specific location of interest, such as the head and shoulders and the iris of a
person to be identified. Based upon an analysis of those images the pan/tilt mirror or
gimbal is adjusted to receive light reflected from the iris or other area of interest and
direct that reflected light to a narrow field of view camera. That camera produces an
image of sufficient quality to permit iris identification.
The preferred embodiment contains a wide field of view illuminator
which illuminates the face of the person to be identified. The illuminator preferably
contains a plurality of infrared light emitting diodes positioned around the lens of the
wide field of view camera or cameras.
We also prefer to provide two or more narrow field of view illuminators
each comprised of an array of light emitting diodes. These arrays are mounted so as to
be rotatable about both a horizontal axis and a vertical axis. By using at least two
arrays we are able to compensate for specular reflection and reflection from eyeglasses
or contact lenses or other artifacts which obscure portions of the iris.
We further prefer to construct the arrays so that one set of light emitting
diodes has centerlines normal to the base, a second set has centerlines at an acute
angle to the base, and a third set has centerlines at an obtuse angle relative to the base. This gives the array a wider field
of illumination. We further prefer to provide a control system which enables us to
separately illuminate each group of light emitting diodes. The control system may
permit selective activation of individual diodes. An alternative is to provide a single
illumination that is directed in a coordinated manner with the image steering device.
An image processor is provided to analyze the images from the wide
field of view camera and thereby specify the location of a point or area of interest on
the object or person being identified. A preferred technique for identifying the position
of the user is stereographic image analysis. Alternatively, visible or non-visible range
imaging or distance finding devices such as ultrasonic, radar, spread spectrum
microwave or thermal imaging or sensing or other optical means could be used.
The present system is particularly useful for verifying the identity of
users of automated teller machines. The system can be readily combined with most
conventional automated teller machines and many other financial transaction machines.
Image acquisition and identification can generally be accomplished in less than five
seconds and in less than two seconds in many cases.
Other configurations of cameras such as one NFOV camera, one WFOV
and one NFOV camera, two NFOV cameras, multiple NFOV cameras and multiple
WFOV cameras can be utilized for other special purpose applications such as more or
less restricted movement or position scenarios. Examples include iris imaging in a
telephone booth or for hands-free telephone use, multiple iris imaging in a crowd of people, iris imaging of people in a moving or stationary vehicle, iris imaging of a race
horse, and use at a point of sale site.
Other objects and advantages will become apparent from a description of
certain present preferred embodiments shown in the drawings.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 is a front view of a present preferred embodiment of our device
for obtaining images of irises.
Figure 2 is a side view of the embodiment of Figure 1.
Figure 3 is a top plan view of a first present preferred embodiment of our
narrow field of view illuminator.
Figure 4 is a side view of the illuminator of Figure 3 with the area of
illumination shown in chainline.
Figure 5 is a side view similar to Figure 4 of a second present preferred
embodiment of our narrow field of view illuminator.
Figure 6 is a top view of a first present preferred illuminator bracket.
Figure 7 is a side view showing the illuminator bracket of Figure 6 with
the position of the illuminator shown in chainline.
Figure 8 is a top view of a second present preferred illuminator bracket.
Figure 9 is a side view showing the illuminator bracket of Figure 8 with
the position of the illuminator shown in chainline.
Figure 10 is a block diagram showing a preferred control architecture for
the embodiment of Figure 1.
Figure 11 is a front view showing the eyes and glasses of the person to
be identified on which a reflection of the wide field of view illuminator appears.
Figure 12 is a front view showing the eyes and glasses of the person to
be identified on which a reflection from the narrow field of view illuminators appears.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring to Figures 1 and 2 a present preferred embodiment of our
device is contained in housing 1 which is 14.5 inches wide, 15 inches high, and 18
inches deep. A housing of this size can be easily placed within or near an automated
teller machine or other limited access machine or entry place. When incorporated
within or placed near the housing of an automated teller machine our unit is located
behind a light transmissive bezel. Typically, the bezel would be smoked or other
visually opaque glass or a comparable plastic which obscures our device from being
easily seen. Our device would be positioned so as to be about eye level of most users.
In Figure 2 we show the head of a person to be identified standing in front of our device.
In the preferred embodiment of our device shown in Figures 1 and 2, we
provide two wide field of view (WFOV) cameras 3, having lenses 2 and 4 respectively. The
orientation and placement of lens 2 and 4 and other components may be changed to
accommodate available space or other applications. A wide field of view illuminator 6
surrounds the lens. Hoods 5 and 7 are provided around the lens 2 and 4 to prevent light
emitted from the illuminator 6 from passing directly into the camera. The wide field of
view illuminator is comprised of sets of light emitting diodes 10. For ease of construction, these sets of diodes may be mounted on small circuit boards 12. These
boards are then mounted on a housing 13 which surrounds the lens 2 and 4. A
sufficient number of LED containing circuit boards 12 are provided and positioned to
illuminate the head of the person to be identified who is standing in front of our device
as shown in Figure 2. Consequently, we prefer that the wide field of view illuminator
provide a field of illumination which encompasses a region of about two feet in
diameter at a distance of about one foot from the illuminator. This field of illumination
is indicated by the solid lines extending from the wide field of view illuminator 6 in
Figure 2.
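The stated field (about two feet in diameter at a distance of about one foot) fixes the beam spread the WFOV illuminator must deliver. A quick check of the geometry; the arithmetic below is our own illustration, not a figure from the patent:

```python
import math

# Field of illumination from the text: ~2 ft diameter at ~1 ft range.
radius_ft = 2.0 / 2  # half the field diameter
range_ft = 1.0

# Half-angle of the illumination cone, measured from the optical axis.
half_angle_deg = math.degrees(math.atan(radius_ft / range_ft))
full_cone_deg = 2 * half_angle_deg

print(full_cone_deg)  # 90.0 degrees of total spread
```

A total spread of roughly 90 degrees is far wider than a single LED's typical beam, which is consistent with mounting many diodes on boards angled around the lens housing.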
Portions of the wide field of view illuminator are positioned around the
WFOV camera to provide nearly on-axis illumination. On axis illumination, nearly on
axis illumination, and oblique illumination allow surface features to be imaged with
minimal shadows which can generate false edges or other artifacts. Such illumination
of the eye produces a shadow free image with good surface features. This type of
lighting may cause the pupil to be bright making the iris easier to locate although this
feature can be disabled if desired. Any shadows produced by other light sources and
camera angles are minimized or washed out. Illumination control of this type can be
used to stimulate parasympathetic autonomic nervous system reflexes that cause eye
blinks or pupil variation or other reactions. These changes may be useful to determine
subject awareness to establish certain life signs and to reduce pupil size for improving
imaging resolution.
Light from the wide field of view illuminator is reflected from the user's
face into the wide field of view camera lenses 2 and 4. This enables the
WFOV cameras to create images from which an x, y, z coordinate location can be
determined for one of the user's eyes. The WFOV cameras may also be used to provide
security video images for monitoring transactions or other activities. We take an image
of the right eye. However, either the left eye or the right eye or both eyes can be
selected. The WFOV cameras could utilize a number of techniques such as stereo,
stereo with structured light, and depth from focus with or without structured light to
determine both the x-y location and distance to the object of interest. We prefer to use
stereo processing techniques which compare at least a portion of the images from the
two wide field of view cameras to determine the x, y, z coordinate locations. We can
also provide a gaze director 9 which assists in identifying the location and motion of the
eye, by attracting the attention of the user.
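The stereo processing step described above can be sketched as classical triangulation between two parallel, horizontally separated cameras. The baseline, focal length and pixel coordinates below are illustrative values chosen for the sketch, not figures from the patent:

```python
def stereo_xyz(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Triangulate (x, y, z) from matched pixel coordinates in two
    horizontally separated, parallel cameras (pinhole model)."""
    disparity = u_left - u_right           # horizontal shift in pixels
    if disparity <= 0:
        raise ValueError("point must be in front of both cameras")
    z = focal_px * baseline_m / disparity  # depth along the optical axis
    x = (u_left - cx) * z / focal_px       # lateral offset
    y = (v - cy) * z / focal_px            # vertical offset
    return x, y, z

# Example: 12 cm baseline, 800 px focal length, 192 px disparity.
x, y, z = stereo_xyz(u_left=500, u_right=308, v=260, focal_px=800.0,
                     baseline_m=0.12, cx=320.0, cy=240.0)
print(round(z, 2))  # 0.5 (metres): the eye is about half a metre away
```

Larger disparities correspond to closer subjects, which is why a short baseline suffices for a user standing at arm's length from the device.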
After we know the position of the selected eye we must illuminate the
iris portion of the eye and obtain an iris image which can be used for iris verification
and recognition. We prefer to be able to obtain an image having approximately 200
pixels across the iris portion of the image so that the image recognition algorithms used
to perform identification and verification can operate reliably.
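The 200-pixel requirement determines the magnification the NFOV optics must deliver. Assuming a typical human iris diameter of about 12 mm and an illustrative sensor pixel pitch (neither figure is stated in the patent):

```python
iris_mm = 12.0         # assumed physical iris diameter (typical value)
pixels_required = 200  # from the text: ~200 pixels across the iris
pixel_pitch_um = 10.0  # illustrative sensor pixel pitch

# Size the iris image must occupy on the sensor, and the magnification.
image_mm = pixels_required * pixel_pitch_um / 1000.0
magnification = image_mm / iris_mm
print(image_mm, round(magnification, 3))  # 2.0 mm on sensor, m ≈ 0.167
```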
The iris recognition algorithms compare features of the iris in the image
with the same features in a file image. Verification is considered to have been made
when there is a match of a predetermined number of the compared features. For some
situations a match of at least 75% of the compared features may be required. It is therefore important that there be no specularities, spurious light reflections or dark
shadows covering a significant portion of the image. When the user is wearing glasses,
this can easily occur. To overcome this problem we provide at least two spaced apart
illuminators for use in conjunction with the NFOV camera. These illuminators in
conjunction with the WFOV camera may be selectively switched to an orchestrated
combination to enhance these image structures. For example, a structured specularity
can be created on eyeglasses to make eye finding easier, and then disabled for iris
image acquisition.
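The benefit of two spaced-apart NFOV illuminators is that a specular reflection on eyeglasses moves when the light source moves, so the controller can pick the source whose glint does not land on the iris. A minimal sketch of that selection logic; the coordinates and the fallback behavior are our own illustration:

```python
def choose_illuminator(glint_positions, iris_center, iris_radius):
    """Given the predicted specularity (glint) position for each
    illuminator, return the index of one whose glint misses the iris."""
    for idx, (gx, gy) in enumerate(glint_positions):
        dx, dy = gx - iris_center[0], gy - iris_center[1]
        if dx * dx + dy * dy > iris_radius * iris_radius:
            return idx  # this source's reflection falls off the iris
    return None  # every source glints on the iris; fuse images instead

# Illuminator 0's glint lands on the iris; illuminator 1's does not.
pick = choose_illuminator([(102, 98), (160, 130)],
                          iris_center=(100, 100), iris_radius=25)
print(pick)  # 1
```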
In the embodiment shown in Figure 1 a single NFOV camera 16 is
positioned behind mirror 18. Light emitted from one or more of the NFOV
illuminators 21, 22 or 23 is reflected from the selected eye to an optical subsystem 30
which directs the reflected light to the NFOV camera. We can provide a sensor 14
which senses the level of ambient light surrounding the person to be identified.
Information from this sensor can be used to determine what, if any, illumination must
be provided by the illuminators 21, 22 and 23. Alternatively, any one or a plurality of
cameras themselves may be used for such light sensing.
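The ambient light sensor's role can be expressed as a simple thresholding policy on the illuminator drive level. The thresholds and the linear ramp below are invented for illustration; the patent only says the sensor determines what illumination, if any, must be provided:

```python
def illumination_level(ambient_lux, low=50.0, high=400.0):
    """Map a sensed ambient light level to a drive level for the NFOV
    illuminators: full drive in the dark, none in bright ambient light,
    and a linear ramp in between (thresholds are illustrative)."""
    if ambient_lux <= low:
        return 1.0
    if ambient_lux >= high:
        return 0.0
    return (high - ambient_lux) / (high - low)

print(illumination_level(20))             # 1.0 (dark: full drive)
print(illumination_level(500))            # 0.0 (bright: ambient suffices)
print(round(illumination_level(225), 2))  # 0.5 (halfway up the ramp)
```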
The optical subsystem 30 in the embodiment shown in Figure 1 contains
a pan/tilt mirror attached to rod 33 which extends from motor 34. This enables the
pan/tilt mirror to be rotated about a tilt axis corresponding to a centerline through rod
33. Motor 34 is mounted on arm 35 which is pivotably attached to base 36 by rod 37.
This arm 35 can be moved around a pan axis corresponding to a centerline through rod
37. Light emitted from any of the NFOV illuminators 21, 22, or 23 is reflected from
light to mirror 18 from which the light is reflected to NFOV camera 16. The lens of the
NFOV camera can be moved to change the focus or zoom, and the aperture of the
camera is adjustable. One can also mount NFOV camera 16 on a movable platform so
that the NFOV camera can be turned toward the eye. Then, the portions of the optical
subsystem shown in Figure 1 may not be needed. Our preferred optical subsystem has
five degrees of freedom: the pan axis, the tilt axis, the focus axis, the aperture axis and the zoom axis. A system with fewer degrees of freedom could also be used. The pan
and tilt axes are used to position the pan/tilt mirror 32 so that the correct narrow field is
imaged onto the sensing array of NFOV camera 16. The ability to control movements
along the focus axis, aperture axis and zoom axis allows us to be certain that the imaged
object is in focus. In some cases, only the NFOV camera is needed and the WFOV
camera need not be used.
The design of the optics (resolution, magnification, focusing and size of
the imager) dictates the distance between the camera and the lens and the distance between
the lens and the object to be imaged. The size of the imager is of paramount importance in
defining the distance from lens to imager and contributes to the depth of focus. Those
versed in the art will recognize that NFOV camera 16 may be solid state or of vidicon
nature and that the sensing array size can vary from generic industry sizes of 1/4, 1/3,
1/2, 2/3 or 1 inch diagonal measurement. In an optical system, the introduction of a
mirror in an optical path allows the path to be redirected without affecting the optical
path length. The use of these mirrors allows the optical path to be folded back on itself thus reducing the overall required physical length needed to implement the optical
design. Those skilled in the art will recognize that a gimbaled camera can also be used
to perform image steering.
In the embodiment of Figure 1 the illuminators are positioned to provide
a field of illumination illustrated by the dotted lines in Figure 2. These fields must be
directed and sized so that light will be reflected from the eye of the user to the pan/tilt
mirror 32. To achieve this result we provide three illuminators 21, 22 and 23 placed at
different locations on the housing 1. These illuminators are oriented to direct light to
the areas where the user's eye is most likely to be based upon information received from
the WFOV cameras or other position detectors that could be used. The illuminators
may be mounted in a permanent location and orientation or placed on manually
adjustable or motorized brackets such as those shown in Figures 6, 7, 8 and 9. The
illuminators could also be attached to a sliding mechanism for translation along an axis.
The light source preferably is a light emitting diode or other device that
emits infrared or near infrared light or a combination of these. A lens and diffuser (not
shown) can be used to guarantee uniform illumination. We have found infrared light to
be particularly useful because it penetrates eyeglasses and sunglasses more easily than
visible light or colored light within the visible spectrum. Infrared light is also invisible
to the user and extremely unobtrusive. Optical filters may be placed in the light path in
front of the camera to reduce any undesirable ambient light wavelengths that corrupt
the desired images. Different wavelength priority filters may be used to permit
different wavelengths to be used to optimize each camera's performance. For example, longer wave IR could be used for WFOV camera imaging and shorter IR could be used
for NFOV camera imaging. Then the NFOV cameras would not respond to WFOV
illumination. This "speed of light" processing can be used to great advantage. If
desired, the LED light source could be strobed. Strobing provides the capability to
freeze motion. Strobing also provides the capability to overwhelm ambient light by
using a high intensity source for a brief period of time, and exposing the camera
accordingly to wash out the background ambient illumination which would otherwise
cause interference. Strobing NFOV and WFOV illuminators, perhaps at different
times, and allowing the cameras to integrate photons over appropriate, possibly
disparate, time slices permits optimum usage of the specific spatial and time
characteristics of each device.
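The benefit of exposing the camera only while the strobe fires can be seen from a simple energy ratio. The quantities and numbers below are illustrative bookkeeping, not values from the patent.

```python
def strobe_to_ambient_ratio(strobe_irradiance, ambient_irradiance,
                            strobe_ms, exposure_ms):
    """Ratio of strobe-delivered to ambient-delivered light energy at
    the sensor when a strobe of length strobe_ms fires within an
    exposure of length exposure_ms.  Gating the exposure to the pulse
    lets a brief high-intensity source wash out ambient light."""
    strobe_energy = strobe_irradiance * strobe_ms
    ambient_energy = ambient_irradiance * exposure_ms
    return strobe_energy / ambient_energy

# same 1 ms strobe: full 33 ms frame exposure vs exposure gated to the pulse
long_exposure = strobe_to_ambient_ratio(500.0, 5.0, 1.0, 33.0)
short_exposure = strobe_to_ambient_ratio(500.0, 5.0, 1.0, 1.0)
```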
As shown in Figure 1, 3, 4 and 5, the NFOV illuminators are preferably
comprised of a 6 x 6 array of light emitting diodes 20 mounted on a circuit board 24. If
the light emitting diodes are mounted normal to the circuit board the illuminator will
illuminate an area of illumination having some diameter b as indicated in Figure 4. We
have discovered that the area of illumination can be increased using the exact same
components as are used for the illuminator of Figure 4 by repositioning or otherwise
reconfiguring the light emitting diodes. This is shown in the embodiment of Figure 5
by larger diameter a. In that illuminator the top two rows of diodes are positioned at an
acute angle relative to the board 24 and the lower two rows of diodes are mounted at an
obtuse angle. Circular ring illuminators or other shapes may also be used which could
be placed around the NFOV lens. Other optical elements such as polarizers or birefringent analysis devices may be used to minimize artifacts or enhance other
biologically relevant characteristics such as corneal curvature, eye separation, iris
diameter, skin reflectance and scleral vasculature.
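The widening of the illuminated area from Figure 4 to Figure 5 follows from simple beam geometry. The sketch below assumes a flat target and treats the array as small relative to the working distance; the half-angle and tilt values are illustrative.

```python
import math

def footprint_diameter(distance, led_half_angle_deg, row_tilt_deg=0.0):
    """Diameter of the illuminated area produced by a row of LEDs with
    the given beam half-angle at the given distance.  Tilting the outer
    rows outward by row_tilt_deg pushes the beam edge further out,
    enlarging the footprint (diameter a versus diameter b)."""
    edge = distance * math.tan(math.radians(led_half_angle_deg + row_tilt_deg))
    return 2.0 * edge

b = footprint_diameter(0.5, 10.0)         # all diodes normal to the board
a = footprint_diameter(0.5, 10.0, 15.0)   # outer rows tilted outward
```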
In Figures 6 and 7 we show a bracket which we have used to attach the
NFOV illuminators 21, 22, and 23 to the housing 1. That bracket 30 has a U-shaped
base 31 having a hole 32 through which a screw attaches the bracket to the housing. A
first pair of gripper arms 33 and 34 with attached pin 42 are pivotably attached to one
upright of the base. A similar second pair of gripper arms 35 and 36 are pivotably
connected to the opposite upright through collar 37. The illuminator indicated in chain
line in Figure 7 is held between the gripper arms by screws or pins 38. A series of
holes (not visible) are provided along the uprights so that the gripper arms can be
positioned at any selected one of several positions. A locking tab 39 extends from the
collar 37 into an adjacent slot in the upright to prevent rotation of the collar. Set screw
38 is tightened against the pin 42 extending from gripper arms 35 and 36 to prevent
rotation of the gripper arms.
A second bracket 50 which we have used to attach the NFOV illuminators 21, 22, and 23 to the housing 1 is shown in Figures 8 and 9. That bracket
50 has a base 51 which attaches to the housing. A rod 52 extends upward from the
base 51. Collar 53 slides along rod 52 and can be held at any desired location on the
rod by set screw 54. Rod 55 extends from collar 53 and holds carrier 56. This carrier is
slidably attached to the rod 55 in the same manner as collar 53. The illuminator
indicated in chain line in Figure 9 is attached to the carrier 56 by fasteners or snap fit on pins 57 extending from the carrier. Both this bracket 50 and the other illustrated
bracket 30 permit the attached illuminator to be repositioned or adjusted along a pan
axis and a tilt axis.
We prefer to connect each array of light emitting diodes through a
distribution board 60 to an illumination controller 62 as shown in Figure 10. Since there
can be one or more illuminator arrays these arrays are designated as ILLUMINATOR 1
through ILLUMINATOR X in the drawing. The distribution board 60 and illumination
controller 62 enable us to selectively light the WFOV illuminator 6 and the NFOV
illuminators 21, 22 and 23 in the embodiment of Figure 1. Furthermore, we can
selectively illuminate sets of light emitting diodes within each array or selectively light
individual diodes.
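The selective-lighting capability can be sketched as a small controller model. The class and its methods are hypothetical; the patent describes the capability (lighting whole arrays, sets, or individual diodes), not a software interface.

```python
class IlluminationController:
    """Tracks which LEDs are lit in each illuminator array and allows
    whole arrays, subsets, or individual diodes to be switched."""

    def __init__(self, arrays):
        # arrays: {illuminator_name: number_of_leds}; all LEDs start off
        self.state = {name: [False] * n for name, n in arrays.items()}

    def light(self, name, indices=None):
        """Light every diode in the array, or only those at `indices`."""
        leds = self.state[name]
        for i in (range(len(leds)) if indices is None else indices):
            leds[i] = True

    def extinguish(self, name):
        self.state[name] = [False] * len(self.state[name])

    def lit_count(self, name):
        return sum(self.state[name])

# hypothetical configuration: three 36-diode NFOV arrays, one WFOV ring
ctrl = IlluminationController({"NFOV-21": 36, "NFOV-22": 36,
                               "NFOV-23": 36, "WFOV-6": 12})
ctrl.light("NFOV-21", indices=[0, 1, 2])  # only a few diodes lit
ctrl.light("WFOV-6")                      # whole ring lit
```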
Each set of light emitting diodes may emit a different wavelength of
light. It has been noted that the irises of different people respond better to certain
wavelengths and worse to other wavelengths of light. Exploiting this could be accomplished
by using illuminators with different wavelength LEDs or by populating a single illuminator
with different wavelength LEDs next to each other. These could be strobed and the
better image selected. Additionally, LEDs of different beam widths could be mounted
side by side or in different illuminators for illumination intensity control or for
specularity control - the tighter the beamwidth, the smaller the
specularity.
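Strobing different-wavelength LEDs and selecting the better image implies some image-quality measure. The sketch below uses intensity variance as a stand-in contrast metric; the patent does not specify how the better image is chosen.

```python
def best_image(images):
    """Return the candidate image (a list of pixel rows) with the
    highest contrast, measured as variance of pixel intensities."""
    def variance(img):
        flat = [p for row in img for p in row]
        mean = sum(flat) / len(flat)
        return sum((p - mean) ** 2 for p in flat) / len(flat)
    return max(images, key=variance)

# toy 2x2 captures under two wavelengths: one flat, one well modulated
low_contrast = [[100, 101], [99, 100]]
high_contrast = [[20, 200], [180, 40]]
```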
We can also control the duration and the intensity of the light which is
emitted. To prevent burnout of the illuminators caused by prolonged illumination we can provide timers 63 for each illuminator as indicated by the dotted blocks labeled "T"
in Figure 10. The timers will cut the power to the array after a predetermined period of
illumination. The WFOV cameras 3 provide images to an image processor 64 which
we call the PV-I. That processor 64 tells the computer 65 the x, y, z coordinates of the
selected eye or eyes of the person to be identified. The image processor may also
assess the quality of the image and contain algorithms that compensate for motion of
the subject. This processor may also perform image enhancement. The PC 65 has
access to information from the ambient light level detector 69 and the NFOV camera.
These data can be used to modify illumination strategies. The x, y, z coordinates for
the expected position of the eye enable the computer to direct the illumination
controller as to which illuminators should be lighted and to direct the pan/tilt controller
66 to properly position the pan/tilt unit 67 so that a useful image of the iris can be
obtained. These functions may be selected by results from the WFOV camera
processing. Commands are sent from the computer 65 to the motors to change the
location of the pan/tilt axes or to adjust focus. In the simplest case, one may consider
that a WFOV image is acquired, the data is processed and then passed through the
image processor 64 and computer 65 to the pan/tilt controller 66 or a gimbaled
controller. In order to minimize motion time and control settling time, there can be
simultaneous motion in the optical subsystem along all five axes.
The pan/tilt controller 66 accepts macro level commands from the
computer and generates the proper set points and/or commands for use by the
illumination control or each axis supervisor. The intermediate continuous path set points for the axis are generated here and then sent to each axis supervisory controller.
A command interpreter decodes the commands from the image analysis and formats
responses using positioning information from the optical devices. A real time interrupt
produces a known clock signal every n milliseconds. This signal is a requirement for
the implementation of a sampled data system for the position controller of each axis
and allows synchronization via the supervisory controller for continuous path motion.
A diagnostic subsystem performs health checks for the control system.
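A sampled-data position controller clocked by the periodic interrupt can be sketched as a discrete proportional loop. The gain, tick count, and plant dynamics are illustrative only.

```python
def run_axis(setpoint, position=0.0, gain=0.4, ticks=50):
    """One axis of a sampled-data position controller: at each clock
    interrupt the supervisor computes the error and commands a
    proportional correction.  Returns the final position and the
    per-tick trajectory."""
    trajectory = []
    for _ in range(ticks):
        error = setpoint - position
        position += gain * error   # correction applied this sample
        trajectory.append(position)
    return position, trajectory

final, trajectory = run_axis(setpoint=10.0)
```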
Besides the choreography of the five axes, the microprocessor controller
must also provide illumination control. The illumination controller will accept
commands similar to the commands associated with motion control to timely activate,
or to synchronously activate with the camera frame taking, selected illuminators.
Images from the WFOV are transmitted as analog signals to the image
processor 64. The image processor preferably contains two pyramid processors, a
memory capable of storing at least two frames, a look-up table (LUT)/arithmetic logic unit (ALU) device, a digitizer
which digitizes the analog video signal, a Texas Instruments TMS320C31 or TMS320C32
processor and a serial/parallel processor. The image is processed using the pyramid
processors as described in United States Patent No. 5,359,574 to van der Wal. The
Texas Instruments processor computes disparities between images. The WFOV images
define a region or point in the field of view of the WFOV cameras where the subject's
right eye or left eye or both are located. Using stereo processing techniques on the
disparities will result in x, y, z coordinates for points on the subject relative to the
WFOV cameras. That information is then further processed to define an area of interest such as the head or an eye. The coordinates of the area of interest are used to direct the
NFOV optical system. These position coordinates are transferred from the image
processor to a NFOV image and iris image processor 65. This unit 65 contains a 486,
PENTIUM or other microprocessor system and associated memory. In the memory are
programs and algorithms for directing the optical platform and doing iris identification.
Additionally, WFOV video images can be stored as a security video record.
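The stereo step can be illustrated with the standard parallel-camera triangulation relation z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. The parameter values in the example are illustrative, not the system's actual calibration.

```python
def triangulate_depth(disparity_px, focal_px, baseline_m):
    """Depth of a point seen by two parallel cameras: z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def pixel_to_xyz(u_px, v_px, disparity_px, focal_px, baseline_m):
    """Back-project an image point (u, v relative to the optical
    centre) into camera-frame x, y, z coordinates."""
    z = triangulate_depth(disparity_px, focal_px, baseline_m)
    return (u_px * z / focal_px, v_px * z / focal_px, z)
```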
The focus axes of the NFOV system may be controlled in an open loop
fashion. In this case, the x,y,z coordinate from stereo processing defines, via table look-up
or analytic computation, the focus axis position so that the lens properly focuses the
NFOV camera on the object of interest. A closed loop focus method could also be
used. In this case, NFOV video would be processed by image processor 64 to obtain a
figure of merit indicating whether the image was in focus. From the figure of merit the axis could
be commanded forward or backward and then a new image acquired. The process
would continue in a closed loop form until the image is in focus. Other information
such as iris size and location as measured in camera units, and eye separation from
WFOV camera images can be combined with stereo and other focus information into
multivariate features that can be used to refine range information by fusion of direct or
derived sensory information. This sensor fusion may encompass other information as
well.
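The closed-loop focus method amounts to hill-climbing on the figure of merit. The sketch below commands the focus axis forward or backward, keeps moves that improve a sharpness score, and refines the step after an overshoot; the search strategy and the simulated sharpness curve are illustrative.

```python
def autofocus(measure_merit, start=0, step=4, max_iters=50):
    """Closed-loop focus search: step the focus axis, acquire a new
    image, and evaluate its figure of merit.  Keep moves that improve
    the merit; on a worse reading, reverse direction and halve the
    step."""
    pos = start
    best = measure_merit(pos)
    direction = 1
    for _ in range(max_iters):
        trial = pos + direction * step
        merit = measure_merit(trial)
        if merit > best:
            pos, best = trial, merit
        else:
            direction = -direction
            step = max(1, step // 2)
    return pos

# simulated sharpness curve peaking at focus position 30
focus_pos = autofocus(lambda p: -(p - 30) ** 2)
```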
Since the object of interest, namely the eye, may be moving, there is a
requirement that the NFOV camera track the trajectory seen by the WFOV. When
motion ceases to blur the image, a quality image may be acquired via the NFOV camera and optics. By tracking the eye, the optics directing light to the NFOV camera are
aligned so that when it is desired to obtain a quality iris image little or no additional
motion may be required.
In this case, the x,y,z coordinates from analysis of the WFOV images are
sent to the NFOV controller at some uniform sample rate (such as every 100 ms). A
continuous path algorithm such as described in Robotic Engineering An Integrated
Approach, by Klafter, Chmielewski and Negin (Prentice Hall, 1989) would be used to
provide intermediate sets of {p,t,f,a,z} set points to the axis so that the axes remain in
motion during the tracking phase. To define the last end position, either a macro level
command can be given or the same {p,t,f,a,z} can be continually sent at the sample
periods.
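The intermediate {p,t,f,a,z} set points can be generated per sample period by interpolating between the current axis positions and the target. A linear profile is used below for brevity; Klafter et al. describe richer velocity-profiled trajectories.

```python
def continuous_path(current, target, samples):
    """Return one {p,t,f,a,z} set point per sample period, moving all
    five axes in coordinated fashion from `current` to `target`."""
    keys = ("p", "t", "f", "a", "z")
    path = []
    for i in range(1, samples + 1):
        frac = i / samples
        path.append({k: current[k] + frac * (target[k] - current[k])
                     for k in keys})
    return path

start = {"p": 0.0, "t": 0.0, "f": 10.0, "a": 2.0, "z": 1.0}
goal = {"p": 8.0, "t": 4.0, "f": 12.0, "a": 2.0, "z": 3.0}
set_points = continuous_path(start, goal, samples=4)
```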
It is important to recognize that as the NFOV axes move, the associated
imager may not have sufficient time to perform the required integration to get a non-
blurred image. Additionally, depending on the camera used (interlaced or progressive
scan) there may be field to field displacement or horizontal displacement of the image
all of which can be wholly or partially corrected by computation. Thus, it is easily seen
why the WFOV camera provides the information necessary for directing the NFOV
stage. It should be noted, that certain eye tracking algorithms (such as those based on
specularity or iris configuration or pattern matching) may be capable of providing
sufficient information (even if the image is slightly blurred due to focus or exhibits
some blur caused by motion) to provide a reasonable estimate of the eye location in the
NFOV camera. Hence, it is conceptually possible to use the WFOV data for coarse movement and the processed NFOV data (during motion) as additional information for
finer resolution. This fusion of data can provide a better estimate than one WFOV
camera image alone in positioning the NFOV image to acquire a quality iris image.
To acquire a quality iris image, the NFOV axes must settle to a point
where the residual motion is less than that which can be detected by the imager. Once
this occurs, any remaining images must be purged from the imager (typically there is a
delay between image integration and readout via RS-170) and the proper
integration time allowed to acquire a non blurred image. See Robotic Engineering An
Integrated Approach for a timing scenario. This can be accomplished in a number of
ways, the simplest being a time delay which occurs after the cessation of motion until a
good quality RS-170 image is captured. Multiple iris images which may be partially
obscured may be collected and fused into a single composite, less obscured iris image
using normalization and fusion methods.
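The fusion of partially obscured iris images can be sketched as mask-weighted averaging: each frame contributes only its unobscured pixels. Real normalization (aligning iris position and dilation) is omitted here, and the occlusion masks are assumed given.

```python
def fuse_iris_images(images, masks):
    """Combine frames into one composite by averaging, at each pixel,
    only the frames whose mask marks that pixel as unobscured."""
    rows, cols = len(images[0]), len(images[0][0])
    fused = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [img[r][c] for img, m in zip(images, masks) if m[r][c]]
            fused[r][c] = sum(vals) / len(vals) if vals else 0.0
    return fused

# two 1x3 frames; each has one obscured pixel covered by the other frame
img_a, mask_a = [[10, 99, 30]], [[1, 0, 1]]
img_b, mask_b = [[10, 20, 77]], [[1, 1, 0]]
composite = fuse_iris_images([img_a, img_b], [mask_a, mask_b])
```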
We have found that any light source will cause a reflection on eyeglasses
of the person to be identified. The eyes as seen from the WFOV cameras 3 are shown
in Figure 11. There is a reflection 70 from the WFOV illuminator 6 on both lenses 72
of the person's eyeglasses 74. The reflection 70 partially covers the iris 76 of the
person's eye making iris identification difficult if not impossible. To overcome this
problem we use illuminators 21, 22 and 23 located off-axis from the optical axis of the
NFOV camera 16. By carefully positioning and sometimes using only some of the
light emitting diodes we can achieve adequate illumination without creating an
obscuring reflection. This result is shown in Figure 12 where light from only a few light emitting diodes in the NFOV illuminators 21, 22 and 23 has created a reflection
80 that appears in the image. That reflection does not cover any part of the iris 76. In
some cases especially with glasses, the WFOV specularity makes finding the head and
the eye more expeditious.
Multiple illuminators also enable us to determine the shape of eyeglasses
worn by the subject in the images. We illuminate the eyeglasses sequentially using two
spaced apart illuminators. The specularity will be in one position during the first
illumination and in a different position during the second illumination. The amount of
specularity change is then calculated to determine appropriate eyeglass shape. From
that information we can determine the minimum movement of illumination required to
move the specularity off of the iris.
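The two-illuminator measurement can be sketched with simple pixel geometry: the specularity's displacement between the two illuminations is measured directly, and the distance it must still travel to clear the iris follows from the iris circle. The point-specularity approximation and function names are ours, not the patent's.

```python
import math

def specularity_shift(spec_1, spec_2):
    """Displacement (pixels) of the eyeglass specularity between the
    first and second sequential illuminations."""
    return (spec_2[0] - spec_1[0], spec_2[1] - spec_1[1])

def shift_to_clear_iris(spec, iris_center, iris_radius):
    """Minimum further displacement, directed away from the iris
    centre, for a point specularity to stop overlapping the iris."""
    return max(0.0, iris_radius - math.dist(spec, iris_center))
```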
A calibration procedure must be used to correlate the center of the
NFOV camera's field of view with pan/tilt and focus axis positions for a series of
coordinates in 3 dimensional space as defined by the wide field of view. Given a set of
WFOV coordinates {x,y,z} defining the position of a user's eye somewhere in the
working volume in front of the cameras, a transformation or table look up can be used
to define the coordinates of the pan, tilt and focus {p,t,f} axes that make the center of
the NFOV camera's field of view coincident with x,y coordinates and in focus on the z
plane. We prefer to use a series of targets to assist in calibration. These targets have
partially filled circles corresponding to iris positions at known locations on a page. The
targets are placed a known distance from the housing and the device is activated to
attempt to find an iris and produce a calibration image. When NFOV and WFOV cameras are used, they must be calibrated
together. This may be accomplished by manual or automatic procedures; optical targets
or projected targets may be used. An automatic procedure would require a calibration
object to be automatically recognized by the computational support equipment
operating on the camera images. Another possibility is to use the NFOV camera
motion capabilities or other motion generation capability to project a target into the
calibration volume, and then recognize the target.
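The calibration correspondence can be held as a table of measured pairs and queried at run time. Nearest-neighbour look-up keeps the sketch short; a deployed system would interpolate between entries or fit an analytic transformation, as the text allows.

```python
def lookup_ptf(table, xyz):
    """table: list of ((x, y, z), (p, t, f)) pairs measured with the
    calibration targets.  Returns the pan/tilt/focus set point of the
    calibrated point nearest the queried eye coordinate."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(table, key=lambda entry: dist2(entry[0], xyz))[1]

# illustrative calibration pairs (metres -> axis units)
table = [
    ((0.0, 0.0, 1.0), (0.0, 0.0, 40.0)),
    ((0.2, 0.0, 1.0), (5.0, 0.0, 40.0)),
    ((0.0, 0.1, 2.0), (0.0, 3.0, 55.0)),
]
```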
Although we have shown certain present preferred embodiments of our
compact image steering and focusing device and methods of using that device, it should
be distinctly understood that our invention is not limited thereto but may be variously
embodied within the scope of the following claims.

Claims

We Claim:
1. An apparatus for acquiring images of irises comprising:
at least one camera positioned to take an image of an eye so that the
image will contain a representation of the iris which is of sufficient resolution to be
used for iris verification and identification; and
at least one illuminator positioned to illuminate the iris and comprised
of a plurality of light emitting elements which can be selectively illuminated and
concurrently illuminated.
2. The apparatus of claim 1 also comprising a second illuminator
positioned to illuminate the iris, the second illuminator being positioned apart from the
first illuminator.
3. The apparatus of claim 2 wherein the second illuminator is comprised
of a plurality of light emitting elements which can be selectively illuminated.
4. The apparatus of claim 1 wherein light is reflected from the iris to the
at least one camera along a camera axis and the light travels from the at least one
illuminator along a path which intersects the camera axis.
5. The apparatus of claim 1 wherein the at least one illuminator is
comprised of at least one array of light emitting diodes mounted on a base.
6. The apparatus of claim 5 wherein the array of light emitting diodes is
comprised of:
a first set of light emitting diodes which is attached to the base in a
manner to emit light along a first path; and
a second set of light emitting diodes which is attached to the base in a
manner to emit light along a second path which is not parallel to the first path.
7. The apparatus of claim 6 wherein the first set of light emitting diodes
and the second set of light emitting diodes can be separately illuminated.
8. The apparatus of claim 5 also comprising an array bracket to which
the base of the at least one array of light emitting diodes is pivotally attached for
rotation about an array axis.
9. The apparatus of claim 8 also comprising a second bracket to which
the array bracket is movably attached in a manner to permit movement of the array
along a line normal to the array axis.
10. The apparatus of claim 5 wherein at least some of the light emitting
diodes emit light of a wavelength which is different from light wavelengths emitted
from other light emitting diodes.
11. The apparatus of claim 1 wherein the at least one illuminator can
emit at least one of infrared light, visible light, near infrared light, a select band of light
frequencies, and both visible light and infrared light.
12. The apparatus of claim 1 wherein the at least one illuminator can emit light
of varying intensity.
13. The apparatus of claim 12 also comprising a power source
connected to the at least one illuminator which can emit light of varying intensity and a
controller connected to the power source for changing power output to the at least one
illuminator thereby changing intensity of the light which is emitted by that illuminator.
14. The apparatus of claim 1 wherein at least one of the illuminators
can emit different wavelengths of light.
15. The apparatus of claim 2 also comprising a controller connected to
the at least one illuminator and the second illuminator, the controller containing a program for selectively illuminating the illuminators according to a dynamically
predetermined pattern.
16. The apparatus of claim 1 also comprising a wide field of view
camera positioned to take an image which includes the eye.
17. The apparatus of claim 16 also comprising a wide field of view
illuminator.
18. The apparatus of claim 17 wherein the wide field of view
illuminator is comprised of a ring of light emitting elements arranged around the wide
field of view camera.
19. The apparatus of claim 17 also comprising a hood surrounding the
wide field of view camera to prevent light from the illuminators from directly entering
the camera.
20. The apparatus of claim 1 also comprising a power source connected
to the at least one illuminator and a timer connected to the power source which
timer turns off the power source after a selected period of time.
21. The apparatus of claim 1 also comprising an ambient light sensor
and a controller connected to the ambient light sensor and the at least one illuminator,
the controller containing a program for shutting off the illuminators when the ambient
light is at a predetermined level.
22. The apparatus of claim 1 also comprising a motor connected to the
at least one illuminator.
23. The apparatus of claim 1 also comprising an optical system which
receives light reflected from the iris and directs the light to the at least one camera.
24. The apparatus of claim 1 wherein the at least one camera is movable.
25. A method for acquiring an image of an iris comprising the steps of:
locating a three dimensional coordinate position of an eye containing the
iris to be imaged;
positioning a camera so that at least some light reflected from the iris
will be reflected to the camera;
illuminating the eye with at least one illuminator so that light is reflected
from the iris to the camera along a camera axis and the light travels from the at least
one illuminator along at least one path which intersects the camera axis wherein the at least one illuminator is comprised of a plurality of light emitting elements which can be
selectively illuminated; and
creating at least one image of the iris with the camera during
illumination which image is of sufficient resolution to be used for iris verification and
identification.
26. The method of claim 25 wherein the at least one illuminator is
comprised of a first illuminator and a second illuminator which are illuminated
sequentially and the camera creates a first image and a second image during the
sequential illumination which images are used to create the image of sufficient
resolution to distinguish among identifying features within the iris.
27. The method of claim 25 wherein the at least one illuminator is
comprised of two sets each set containing a plurality of light emitting elements which
sets are selectively and sequentially illuminated.
28. The method of claim 25 wherein the at least one illuminator can emit at
least one of infrared light, visible light, near infrared light, a select band of light
frequencies, and both visible light and infrared light.
29. The method of claim 25 wherein the image is comprised of a
plurality of pixels and that portion of the image which contains the iris is comprised of
at least 200 pixels.
30. The method of claim 25 wherein the three dimensional coordinate
position of the eye is located by:
a. using a first wide field of view camera to create a first image
of a region in which the eye is believed to be located;
b. using a second wide field of view camera spaced apart from
the first wide field of view camera to create a second image of a region in which
the eye is believed to be located; and
c. combining the first image and the second image in a manner
to establish the three dimensional coordinate position of the eye.
31. The method of claim 30 wherein the images are combined using
stereographic image analysis.
32. The method of claim 30 also comprising the step of illuminating the
region using nearly on axis illumination.
33. The method of claim 25 wherein the camera is mounted within an
optical subsystem containing a pan/tilt mirror from which reflected light is directed to the camera and also comprising the step of adjusting the pan/tilt mirror toward
the three dimensional coordinate position of the eye to direct light reflected from
the iris to the camera.
34. The method of claim 25 wherein the camera is gimbal mounted and
also comprising the step of adjusting the camera toward the three dimensional
coordinate position of the eye to direct light reflected from the iris to the camera.
US11642025B2 (en) 2018-07-16 2023-05-09 Verily Life Sciences Llc Retinal camera with light baffle and dynamic illuminator for expanding eyebox
WO2023150239A3 (en) * 2022-02-03 2023-10-19 Meta Platforms Technologies, Llc Techniques for producing glints and iris illumination for eye tracking

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100434370B1 (en) * 2001-05-12 2004-06-04 엘지전자 주식회사 Focusing distance measurement in iris recognition system
KR100447403B1 (en) * 2001-05-12 2004-09-04 엘지전자 주식회사 Focusing angle and distance display in iris recognition system
KR100924271B1 (en) * 2002-05-20 2009-11-03 신성복 Identification system and method using a iris, and media that can record computer program sources thereof
FR2864290B1 (en) * 2003-12-18 2006-05-26 Sagem METHOD AND DEVICE FOR RECOGNIZING IRIS
JP5184087B2 (en) * 2004-09-22 2013-04-17 トリパス イメージング, インコーポレイテッド Methods and computer program products for analyzing and optimizing marker candidates for cancer prognosis
JP2010535014A (en) * 2007-07-30 2010-11-18 オンコセラピー・サイエンス株式会社 Cancer-related gene LY6K
US20100030040A1 (en) 2008-08-04 2010-02-04 Masimo Laboratories, Inc. Multi-stream data collection system for noninvasive measurement of blood constituents
US8577431B2 (en) 2008-07-03 2013-11-05 Cercacor Laboratories, Inc. Noise shielding for a noninvasive device
KR20180133076A (en) 2017-06-05 2018-12-13 삼성전자주식회사 Image sensor and electronic apparatus including the same
KR102372809B1 (en) 2017-07-04 2022-03-15 삼성전자주식회사 Imaging sensor assembly having tilting structure and electronic device including the same
EP3474328B1 (en) 2017-10-20 2021-09-29 Samsung Electronics Co., Ltd. Combination sensors and electronic devices
WO2023032051A1 (en) 2021-08-31 2023-03-09 日本電気株式会社 Illumination control device, illumination control method, recording medium, and imaging system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4641349A (en) 1985-02-20 1987-02-03 Leonard Flom Iris recognition system
EP0240336A2 (en) * 1986-04-04 1987-10-07 Applied Science Group Inc. Method and system for generating a description of the distribution of looking time as people watch television commercials
WO1990001291A1 (en) * 1988-08-05 1990-02-22 Mario Angi Infrared videorefractometer particularly for application in pediatric ophthalmology
US5291560A (en) 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
WO1996007978A1 (en) * 1994-09-02 1996-03-14 David Sarnoff Research Center, Inc. Automated, non-invasive iris recognition system and method

Cited By (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6289113B1 (en) 1998-11-25 2001-09-11 Iridian Technologies, Inc. Handheld iris imaging apparatus and method
US6377699B1 (en) 1998-11-25 2002-04-23 Iridian Technologies, Inc. Iris imaging telephone security module and method
US6483930B1 (en) 1998-11-25 2002-11-19 Iridian Technologies, Inc. Iris imaging telephone security module and method
US6532298B1 (en) 1998-11-25 2003-03-11 Iridian Technologies, Inc. Portable authentication device and method using iris patterns
EP1041522A2 (en) * 1999-04-01 2000-10-04 Ncr International Inc. Self service terminal
EP1041522A3 (en) * 1999-04-01 2002-05-08 Ncr International Inc. Self service terminal
US6583864B1 (en) 1999-04-01 2003-06-24 Ncr Corporation Self service terminal
WO2001088857A1 (en) * 2000-05-16 2001-11-22 Swisscom Mobile Ag Biometric method for identification and authorisation
US7346195B2 (en) 2000-05-16 2008-03-18 Swisscom Mobile Ag Biometric identification and authentication method
US7630524B2 (en) 2000-05-16 2009-12-08 Swisscom Mobile Ag Biometric identification and authentication method
EP1374144A1 (en) * 2001-03-06 2004-01-02 Evermedia Co., Ltd. Non-contact type human iris recognition method by correction of rotated iris image
EP1374144A4 (en) * 2001-03-06 2007-02-07 Evermedia Co Ltd Non-contact type human iris recognition method by correction of rotated iris image
EP1387314A1 (en) * 2001-05-11 2004-02-04 Matsushita Electric Industrial Co., Ltd. Method and apparatus for picking up image of object being authenticated
EP1387314A4 (en) * 2001-05-11 2004-08-11 Matsushita Electric Ind Co Ltd Method and apparatus for picking up image of object being authenticated
EP1335329A3 (en) * 2002-02-05 2004-09-22 Matsushita Electric Industrial Co., Ltd. Personal authentication method, personal authentication apparatus and image capturing device
EP1600898A2 (en) * 2002-02-05 2005-11-30 Matsushita Electric Industrial Co., Ltd. Personal authentication method, personal authentication apparatus and image capturing device
EP1600898A3 (en) * 2002-02-05 2006-06-14 Matsushita Electric Industrial Co., Ltd. Personal authentication method, personal authentication apparatus and image capturing device
US7155035B2 (en) 2002-02-05 2006-12-26 Matsushita Electric Industrial Co., Ltd. Personal authentication method, personal authentication apparatus and image capturing device
EP1335329A2 (en) * 2002-02-05 2003-08-13 Matsushita Electric Industrial Co., Ltd. Personal authentication method, personal authentication apparatus and image capturing device
EP2345432A2 (en) 2002-06-11 2011-07-20 Basf Se Method for producing esters from polyalcohols
EP2345431A2 (en) 2002-06-11 2011-07-20 Basf Se Method for producing esters from polyalcohols
EP1671258A2 (en) * 2003-09-04 2006-06-21 Sarnoff Corporation Method and apparatus for performing iris recognition from an image
EP1671258A4 (en) * 2003-09-04 2008-03-19 Sarnoff Corp Method and apparatus for performing iris recognition from an image
WO2005024698A2 (en) 2003-09-04 2005-03-17 Sarnoff Corporation Method and apparatus for performing iris recognition from an image
US7657127B2 (en) 2005-04-11 2010-02-02 Sarnoff Corporation Method and apparatus for providing strobed image capture
US7542628B2 (en) 2005-04-11 2009-06-02 Sarnoff Corporation Method and apparatus for providing strobed image capture
US7925059B2 (en) 2005-06-03 2011-04-12 Sri International Method and apparatus for iris biometric systems for use in an entryway
KR100728657B1 (en) 2006-04-27 2007-06-14 서울통신기술 주식회사 Unmanned system and method for controlling entrance and exit using of face recognition with multiple infrared cameras
US7634114B2 (en) 2006-09-01 2009-12-15 Sarnoff Corporation Method and apparatus for iris biometric systems for use in an entryway
US8189879B2 (en) 2008-02-14 2012-05-29 Iristrac, Llc System and method for animal identification using IRIS images
US8315440B2 (en) 2008-02-14 2012-11-20 Iristrac, Llc System and method for animal identification using iris images
WO2010104870A3 (en) * 2009-03-11 2010-11-04 Harris Corporation A method for reconstructing iris scans through novel inpainting techniques and mosaicing of partial collections
US8306288B2 (en) 2009-08-19 2012-11-06 Harris Corporation Automatic identification of fingerprint inpainting target areas
US8787624B2 (en) 2011-03-08 2014-07-22 Fujitsu Limited Biometric-information processing device, method of processing biometric information, and computer-readable recording medium storing biometric-information processing program
US9412014B2 (en) 2011-05-30 2016-08-09 Fujitsu Limited Biometric information process device, biometric information process method, and computer readable medium
GB2495323A (en) * 2011-10-07 2013-04-10 Irisguard Inc Method of capturing an iris image free from specularities caused by spectacles
US9008375B2 (en) 2011-10-07 2015-04-14 Irisguard Inc. Security improvements for iris recognition systems
US9002053B2 (en) 2011-10-07 2015-04-07 Irisguard Inc. Iris recognition systems
GB2495323B (en) * 2011-10-07 2018-05-30 Irisguard Inc Improvements for iris recognition systems
US9858490B2 (en) 2011-12-15 2018-01-02 Fujitsu Limited Vein authentication method, image processing method, and vein authentication device
US9208391B2 (en) 2012-03-28 2015-12-08 Fujitsu Limited Biometric authentication device, biometric authentication method, and computer readable, non-transitory medium
US9336426B2 (en) 2012-03-28 2016-05-10 Fujitsu Limited Biometric authentication device, biometric authentication method, and computer readable, non-transitory medium
US9336427B2 (en) 2013-01-15 2016-05-10 Fujitsu Limited Biometric information image-capturing device, biometric authentication apparatus and manufacturing method of biometric information image-capturing device
US10018804B2 (en) 2013-06-18 2018-07-10 Delta Id, Inc. Apparatus and method for multiple mode image acquisition for iris imaging
EP3011495A4 (en) * 2013-06-18 2017-01-25 Delta ID Inc. Multiple mode image acquisition for iris imaging
WO2014205021A1 (en) 2013-06-18 2014-12-24 Delta ID Inc. Multiple mode image acquisition for iris imaging
US10425814B2 (en) 2014-09-24 2019-09-24 Princeton Identity, Inc. Control of wireless communication device capability in a mobile device with a biometric key
US10484584B2 (en) 2014-12-03 2019-11-19 Princeton Identity, Inc. System and method for mobile device biometric add-on
US10643088B2 (en) 2016-01-12 2020-05-05 Princeton Identity, Inc. Systems and methods of biometric analysis with a specularity characteristic
US10452936B2 (en) 2016-01-12 2019-10-22 Princeton Identity Systems and methods of biometric analysis with a spectral discriminator
US10762367B2 (en) 2016-01-12 2020-09-01 Princeton Identity Systems and methods of biometric analysis to determine natural reflectivity
US10643087B2 (en) 2016-01-12 2020-05-05 Princeton Identity, Inc. Systems and methods of biometric analysis to determine a live subject
US10943138B2 (en) 2016-01-12 2021-03-09 Princeton Identity, Inc. Systems and methods of biometric analysis to determine lack of three-dimensionality
US10373008B2 (en) 2016-03-31 2019-08-06 Princeton Identity, Inc. Systems and methods of biometric analysis with adaptive trigger
US10366296B2 (en) 2016-03-31 2019-07-30 Princeton Identity, Inc. Biometric enrollment systems and methods
CN106407964A (en) * 2016-11-15 2017-02-15 刘霁中 Device and method using visible light source to collect iris and terminal device
CN106407964B (en) * 2016-11-15 2023-11-07 刘霁中 Device, method and terminal equipment for acquiring iris by using visible light source
US10607096B2 (en) 2017-04-04 2020-03-31 Princeton Identity, Inc. Z-dimension user feedback biometric system
US10902104B2 (en) 2017-07-26 2021-01-26 Princeton Identity, Inc. Biometric security systems and methods
US11642025B2 (en) 2018-07-16 2023-05-09 Verily Life Sciences Llc Retinal camera with light baffle and dynamic illuminator for expanding eyebox
CN111126145A (en) * 2018-10-18 2020-05-08 天目爱视(北京)科技有限公司 Iris 3D information acquisition system capable of avoiding light source image influence
WO2020149981A1 (en) * 2019-01-17 2020-07-23 Gentex Corporation Alignment system
US11151399B2 (en) 2019-01-17 2021-10-19 Gentex Corporation Alignment system
DE102019124127A1 (en) * 2019-09-09 2021-03-11 Bundesdruckerei Gmbh DEVICE AND METHOD FOR DETERMINING BIOMETRIC CHARACTERISTICS OF A FACE OF A PERSON
WO2022066816A3 (en) * 2020-09-25 2022-04-28 Sterling Labs Llc Pose optimization in biometric authentication systems
US20230062777A1 (en) * 2021-08-25 2023-03-02 Tools for Humanity Corporation Controlling a two-dimensional mirror gimbal for purposes of iris scanning
WO2023028242A1 (en) * 2021-08-25 2023-03-02 Tools for Humanity Corporation Controlling a two-dimensional mirror gimbal for purposes of iris scanning
US11895404B2 (en) 2021-08-25 2024-02-06 Worldcoin Foundation Controlling a two-dimensional mirror gimbal for purposes of iris scanning
WO2023063861A1 (en) * 2021-10-13 2023-04-20 Fingerprint Cards Anacatum Ip Ab A method and a system configured to reduce impact of impairment data in captured iris images
WO2023150239A3 (en) * 2022-02-03 2023-10-19 Meta Platforms Technologies, Llc Techniques for producing glints and iris illumination for eye tracking

Also Published As

Publication number Publication date
KR100342159B1 (en) 2002-06-27
JP2002514098A (en) 2002-05-14
EP0959769A1 (en) 1999-12-01
CA2264029A1 (en) 1998-03-05
AU727389B2 (en) 2000-12-14
AU4328297A (en) 1998-03-19
KR20000035840A (en) 2000-06-26
JP2004073880A (en) 2004-03-11

Similar Documents

Publication Publication Date Title
AU727389B2 (en) Apparatus for the iris acquiring images
US6320610B1 (en) Compact imaging device incorporating rotatably mounted cameras
US8077914B1 (en) Optical tracking apparatus using six degrees of freedom
US5717512A (en) Compact image steering and focusing device
US8983146B2 (en) Multimodal ocular biometric system
US6296358B1 (en) Ocular fundus auto imager
US7025459B2 (en) Ocular fundus auto imager
US8170293B2 (en) Multimodal ocular biometric system and methods
US6299306B1 (en) Method and apparatus for positioning subjects using a holographic optical element
US6064752A (en) Method and apparatus for positioning subjects before a single camera
JP5297486B2 (en) Device for detecting and tracking the eye and its gaze direction
JP2008104628A (en) Conjunctiva and sclera imaging apparatus
US20110170060A1 (en) Gaze Tracking Using Polarized Light
WO1999038121A1 (en) Method and apparatus for removal of bright or dark spots by the fusion of multiple images
WO2000004820A1 (en) Acquiring, analyzing and imaging three-dimensional retinal data
US20220148218A1 (en) System and method for eye tracking
CN109964230A (en) Method and apparatus for eyes measurement acquisition
JP2006318374A (en) Glasses determination device, authentication device, and glasses determination method
WO2023187780A1 (en) Eye tracking device
JP3342810B2 (en) Iris image acquisition device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH KE LS MW SD SZ UG ZW AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref document number: 2264029

Country of ref document: CA

Ref country code: CA

Ref document number: 2264029

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 1019997001528

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 1997941356

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 1997941356

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1019997001528

Country of ref document: KR

WWG Wipo information: grant in national office

Ref document number: 1019997001528

Country of ref document: KR

WWW Wipo information: withdrawn in national office

Ref document number: 1997941356

Country of ref document: EP