US20060274918A1 - Method and apparatus for designing iris biometric systems for use in minimally constrained settings - Google Patents

Method and apparatus for designing iris biometric systems for use in minimally constrained settings

Info

Publication number
US20060274918A1
Authority
US
United States
Prior art keywords
iris
wavelengthnm
design
camera
subjects
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/364,300
Inventor
Robert Amantea
James Bergen
Dominick Loiacono
James Matey
Oleg Naroditsky
Michael Tinker
Thomas Zappia
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sarnoff Corp
Original Assignee
Sarnoff Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sarnoff Corp filed Critical Sarnoff Corp
Priority to US11/364,300 priority Critical patent/US20060274918A1/en
Assigned to SARNOFF CORPORATION reassignment SARNOFF CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: AMANTEA, ROBERT, BERGEN, JAMES R., LOIACONO, DOMINICK, MATEY, JAMES R., NARODITSKY, OLEG, TINKER, MICHAEL, ZAPPIA, THOMAS
Publication of US20060274918A1 publication Critical patent/US20060274918A1/en
Priority to US12/576,644 priority patent/US7925059B2/en
Status: Abandoned


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/40 Software arrangements specially adapted for pattern recognition, e.g. user interfaces or toolboxes therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/94 Hardware or software architectures specially adapted for image or video understanding
    • G06V10/945 User interactive design; Environments; Toolboxes

Definitions

  • the exemplary embodiment of the present invention of the iris acquisition model shown in FIGS. 9A-E uses design principles embodied in that model to generate a set of inequality constraints on the system design parameters to be satisfied. For example, in order to obtain a specified depth of capture volume, the difference (Far Focus Limit minus Near Focus Limit) must be greater than this specified value.
  • the Near Focus Limit is given by the model as a numerical expression relating Camera Standoff Distance, Hyperfocal Distance, and Focal Length. Hyperfocal Distance in turn is computed in terms of Focal Length, F number, and Circle of Confusion. Each of these quantities is either specified directly or computed from design constraints.
  • the capture volume is thus translated into a constraint on system design parameters, allowing the engineering tradeoffs needed to achieve application requirements to be evaluated; a sketch of these focus computations follows below.
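  • by way of illustration, these focus computations can be transcribed directly from the model; the following VBA-style sketch (illustrative names, not the patent's actual spreadsheet macros) uses the standard photographic forms of the quantities named above:

    ' Hyperfocal distance: H = f^2 / (N * c) + f, with focal length f,
    ' F number N, and circle of confusion c (all lengths in meters).
    Public Function HyperfocalDistance(focalM As Double, fNumber As Double, cocM As Double) As Double
        HyperfocalDistance = focalM * focalM / (fNumber * cocM) + focalM
    End Function

    ' Near and far focus limits for a camera focused at the standoff distance.
    Public Function NearFocusLimit(standoffM As Double, focalM As Double, fNumber As Double, cocM As Double) As Double
        Dim h As Double: h = HyperfocalDistance(focalM, fNumber, cocM)
        NearFocusLimit = standoffM * (h - focalM) / (h + standoffM - 2 * focalM)
    End Function

    Public Function FarFocusLimit(standoffM As Double, focalM As Double, fNumber As Double, cocM As Double) As Double
        Dim h As Double: h = HyperfocalDistance(focalM, fNumber, cocM)
        FarFocusLimit = standoffM * (h - focalM) / (h - standoffM) ' valid for standoffM < h
    End Function

    ' Design constraint: FarFocusLimit - NearFocusLimit must exceed the
    ' specified depth of the capture volume.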
  • the exemplary iris acquisition model shown in FIGS. 9 A-E includes the following model components: controlled illumination, illumination standoff, ambient illumination, subject, camera standoff, camera lens, camera sensor, and iris image.
  • the controlled illumination is the illumination supplied by the iris acquisition system.
  • the illumination standoff is the distance between the controlled illumination and the subject including the properties of the intervening medium.
  • the ambient illumination is the illumination not controlled by the iris acquisition system.
  • the subject is the iris being measured.
  • the camera standoff is the distance between the front of the camera lens and the subject.
  • the camera lens is the optical elements of the camera.
  • the camera sensor is the device that converts photons into electronic signals.
  • the iris image is the output of the sensor.
  • Wavelength is the wavelength of the illumination source.
  • Radiance is the brightness of the illumination source: the areal density of radiant intensity, in watts per steradian per unit area of source.
  • Radiance TLV is the threshold limit value of radiance for eye safe operation. It depends on wavelength, exposure time, and the angular subtense of the source as seen from the subject's iris, as well as the pulse width and repetition rate if the illumination source is not continuous. The TLV is defined by the ACGIH.
  • macros in a spreadsheet compute the TLVs. (Table 14 shows exemplary code for these macros.)
  • the area of a single emitter is the area in meters squared of one of an array of emitters.
  • the number of emitters is a number of emitters in an array of emitters.
  • the duty cycle is the fraction of the time the source is on.
  • the pulse width is the width of pulses for a pulsed source.
  • the repetition rate is the frequency of pulses for a pulsed source.
  • the duration is the duration of exposure, which encompasses multiple pulses for a pulsed source.
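  • as an illustration, the pulsed-source quantities above are related by simple products; a minimal sketch (illustrative names):

    ' Duty cycle: fraction of time the source is on.
    Public Function DutyCycle(pulseWidthS As Double, repetitionRateHz As Double) As Double
        DutyCycle = pulseWidthS * repetitionRateHz
    End Function

    ' Average power of a pulsed source with a given peak power.
    Public Function AveragePowerW(peakPowerW As Double, pulseWidthS As Double, repetitionRateHz As Double) As Double
        AveragePowerW = peakPowerW * DutyCycle(pulseWidthS, repetitionRateHz)
    End Function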
  • Table 6 lists exemplary parameters associated with controlled illumination standoff.
  • Distance is the distance from the source to the iris.
  • Distance/size ratio is the ratio of the source distance to the largest transverse dimension of the source. In one exemplary embodiment of the present invention, ratios larger than about 10 are considered valid for the small angle approximations used in a model, such as the model in FIGS. 9A-E.
  • Angular subtense is the largest subtended angle of the source as seen from the iris.
  • the optical properties of the intervening medium are absorption, refraction, dispersion, and scattering.
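  • under the small angle approximation noted above, the angular subtense can be computed directly; a one-line sketch (illustrative names):

    ' Angular subtense in radians; the small angle approximation is treated
    ' as valid when distanceM / sourceSizeM is larger than about 10.
    Public Function AngularSubtenseRad(sourceSizeM As Double, distanceM As Double) As Double
        AngularSubtenseRad = sourceSizeM / distanceM
    End Function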
  • Table 7 lists exemplary parameters for ambient illumination.
  • the wavelength is the wavelength of the illumination source.
  • the subject irradiance is the irradiance measured at the subject due to ambient illumination.
    TABLE 7
    Exemplary Parameters for Ambient Illumination
    Parameter            Input    SI Units
    Wavelength           X        m
    Subject irradiance   X        W/m2
  • the subject irradiance is the irradiance at the subject due to the controlled illumination source.
  • the irradiance TLV, non laser is the TLV for incoherent, non-laser light
  • the irradiance TLV, laser is the TLV for laser light.
  • the iris albedo is the measured average albedo of a prototype iris.
  • the iris albedo root mean square (RMS) is the measured RMS variation of albedo about the average.
  • the skin albedo is the measured average skin albedo.
  • the sclera albedo is the measured average sclera albedo.
  • the pupil albedo is the measured average pupil albedo.
  • the iris diameter is the measured iris diameter of a prototype iris.
  • the iris average exitance is the light reflected from the iris.
  • the iris RMS exitance is the variation in light reflected from the iris.
  • for the exemplary model, iris albedo is measured against a reflectance standard, such as the type provided by Labsphere, North Sutton, N.H., or Ocean Optics, Dunedin, Fla., by imaging the iris together with the standard under the same illumination and performing the following method.
  • the iris albedo is the ratio of the pixel value on the iris to that on the standard. If the iris albedo is low, capture one image as described above and then increase the illumination level by a known factor and recapture. Rescale the pixel values of the iris in the original image by the illumination level ratio of the two images and proceed to compare those values with the iris pixel values in the new image.
  • a series of images may be taken with increasing intensity to plot the pixel values for the various structures of interest (e.g., reference, skin, iris, sclera) as a function of the intensity. The data usually fit straight lines.
  • the ratios of the slopes of the iris, sclera, and skin to that of the reference then give the albedos for those structures.
  • the variation of albedo is estimated within a structure by extracting a region of pixels from the structure and computing the average and standard deviation of the pixels.
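  • a minimal sketch of this slope-ratio computation (illustrative names; the fitting could equally be done with the spreadsheet's built-in functions):

    ' Least-squares slope of y against x (pixel value vs. illumination intensity).
    Public Function LeastSquaresSlope(x() As Double, y() As Double) As Double
        Dim i As Long, n As Long, mx As Double, my As Double, num As Double, den As Double
        n = UBound(x) - LBound(x) + 1
        For i = LBound(x) To UBound(x)
            mx = mx + x(i): my = my + y(i)
        Next i
        mx = mx / n: my = my / n
        For i = LBound(x) To UBound(x)
            num = num + (x(i) - mx) * (y(i) - my)
            den = den + (x(i) - mx) * (x(i) - mx)
        Next i
        LeastSquaresSlope = num / den
    End Function

    ' Albedo of a structure (iris, sclera, skin) = slope of its line divided
    ' by the slope of the reference standard's line, scaled by the standard's
    ' known albedo if that is not 1.0.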
  • Table 9 lists the exemplary parameters associated with the camera standoff.
  • the distance is the distance from the front of the camera to the subject iris.
  • the optical properties of the intervening medium are absorption, refraction, dispersion, and scattering.
    TABLE 9
    Exemplary Parameters for the Camera Standoff
    Parameter                                   Input    Symbol      SI Units
    Distance                                    X        d_camera    m
    Optical properties of intervening medium    X
  • the magnification is the iris radius at the sensor divided by the iris radius at the subject.
  • the focal length is the distance along the optical axis from the lens to the focus (or focal point).
  • the lens F# is the ratio of the lens focal length to the lens diameter.
  • the transmission is the fraction of light transmitted by the lens.
  • the effective lens diameter determines the area of the lens available for light capture.
  • the lens capture efficiency is the fraction of light from the subject that is captured by the lens.
  • the total lens efficiency is the fraction of light from the subject that arrives at the sensor.
  • the circle of confusion (at the sensor) is the maximum diameter of a point source as imaged on the sensor that is below the resolution limit of the sensor.
  • the hyperfocal distance is the focus point at which the depth of field extends from half the hyperfocal distance to infinity.
  • the near focus limit is the near edge of the depth of field.
  • the far focus limit is the far edge of the depth of field.
  • the depth of field is the region of space within which a point is imaged to a circle smaller than the circle of confusion.
  • the quantum efficiency is the ratio of interacting photons to incident photons.
  • the camera gain J for interacting photons and the camera gain K for electrons are taken from the well-known photon transfer curve.
  • the well depth is the depth of the sensor well in electrons.
  • the read noise is the noise in the absence of photons.
  • the shot noise is the square root of the sensor average signal.
  • the pixel width is set by the pitch and fill factor.
  • the pixel height is the same as width for square pixels.
  • the pixel pitch horizontal is the pixel spacing, which is not the same as the pixel width.
  • the pixel pitch vertical is the same as the horizontal pitch, for square pixels.
  • the pixel area is the pixel width times the pixel height.
  • the fill factor is the size of the light sensitive photodiode relative to the surface of the pixel.
  • the sensor width and height are measured in pixels and meters.
  • the shutter time is the exposure time.
  • Photon energy = Planck's constant × speed of light / source wavelength.
  • Sensor average irradiance = iris average exitance × total lens efficiency.
  • Sensor RMS irradiance = iris RMS exitance × total lens efficiency.
  • Sensor signal photons = sensor average irradiance × pixel area × camera shutter time / photon energy.
  • Sensor average signal = quantum efficiency × sensor signal photons.
  • Sensor RMS signal = sensor average signal × iris albedo variation.
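  • a minimal VBA sketch of this signal chain (illustrative names, not the patent's actual spreadsheet):

    Public Function PhotonEnergyJ(wavelengthM As Double) As Double
        Const PlanckJs As Double = 6.626E-34      ' Planck's constant, J*s
        Const LightSpeedMps As Double = 2.998E+08 ' speed of light, m/s
        PhotonEnergyJ = PlanckJs * LightSpeedMps / wavelengthM
    End Function

    ' Detected electrons per pixel, following the chain:
    ' irradiance -> signal photons -> average signal.
    Public Function SensorAverageSignal(irisExitanceWPerM2 As Double, totalLensEff As Double, _
            pixelAreaM2 As Double, shutterS As Double, quantumEff As Double, wavelengthM As Double) As Double
        Dim irradiance As Double, photons As Double
        irradiance = irisExitanceWPerM2 * totalLensEff
        photons = irradiance * pixelAreaM2 * shutterS / PhotonEnergyJ(wavelengthM)
        SensorAverageSignal = quantumEff * photons
    End Function

    ' Shot noise is the square root of the average signal, as defined above.
    Public Function ShotNoise(averageSignalElectrons As Double) As Double
        ShotNoise = Sqr(averageSignalElectrons)
    End Function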
  • Table 12 lists the exemplary parameters for iris image.
  • the pixels across the iris is the desired pixel resolution at the subject.
  • the resolution is the resolution at the subject.
  • the contrast is the contrast at the subject.
  • the excess resolution factor is used to determine the optical resolution needed for recognition.
  • the minimum required resolution at the subject is the minimum needed for recognition.
  • the albedo of a surface is the ratio of its radiant exitance to its irradiance.
  • the deposition of energy to the cornea depends on the corneal irradiance E_c.
  • the deposition of energy in the retina depends on the retinal irradiance E_r.
  • the corneal irradiance tells little about the retinal irradiance, because the light entering the cornea is focused onto the retina and the details of the focused image depend on the geometry of the source rather than on the corneal irradiance alone.
  • f_e is the focal length of the eye lens and τ is the transmission of the lens and the intraocular fluids. Because τ and f_e are reasonably constant across the population, the eye safety limits are expressed in terms of the source radiance. There are complicating factors. The size of the image on the retina has an impact on the rate at which energy is diffused from the image, and this interacts with the exposure time, so that the maximum permissible source radiance for safety depends on the angular subtense of the source, the source wavelength, and the duration of the exposure.
  • the size of the image on the retina is determined by diffraction and aberration in the eye, rather than by the size of the source.
  • the corneal irradiance is a better indicator of retinal irradiance than the source radiance.
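  • for reference, a standard radiometric relation consistent with these definitions (a reconstruction for illustration; the pupil diameter d_p is introduced here and does not appear in the extracted text) links the retinal irradiance E_r to the source radiance L:

    E_r = (π · τ · d_p² / (4 · f_e²)) · L

  • since τ and f_e are roughly constant across the population and d_p can be bounded by its dilated worst case, limiting the source radiance L bounds the retinal irradiance; this is why the TLVs are expressed in terms of radiance.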
  • TLV Macros:

    Private Const CM2perM2 As Single = 10000 ' cm^2 per m^2 conversion factor

    Public Function retinalThermalHazardMultiplier(wavelengthNM As Single) As Single
        ' returns dimensionless; from ACGIH 2004 TLVs and BEIs, page 153.
    End Function
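  • in the near IR, the retinal thermal hazard function has a standard closed form (as given, e.g., in IEC 62471); an illustrative, runnable reconstruction of the macro body under that assumption (visible-range table values omitted) is:

    Public Function retinalThermalHazardMultiplierNIR(wavelengthNM As Single) As Single
        ' R(lambda): 10^((700 - lambda) / 500) for 700-1049 nm,
        ' 0.2 for 1050-1400 nm; zero returned outside this sketch's range.
        If wavelengthNM >= 700 And wavelengthNM < 1050 Then
            retinalThermalHazardMultiplierNIR = 10 ^ ((700 - wavelengthNM) / 500)
        ElseIf wavelengthNM >= 1050 And wavelengthNM <= 1400 Then
            retinalThermalHazardMultiplierNIR = 0.2
        Else
            retinalThermalHazardMultiplierNIR = 0
        End If
    End Function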

Abstract

A method and apparatus for designing an iris biometrics system that operates in minimally constrained settings. The image acquisition system has fewer constraints on subjects than traditional methods by extending standoff distance and capture volume. The method receives design parameters and provides derived quantities that are useful in designing an image acquisition system having a specific set of performance requirements. Exemplary scenarios of minimally constrained settings are provided, such as a high volume security checkpoint, an office, an aircraft boarding bridge, a wide corridor, and an automobile.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 60/687,106 filed on Jun. 3, 2005, which is incorporated herein by reference. In addition, this application is related to co-pending U.S. Application “Method and Apparatus for Providing Strobed Video Capture”, Attorney Docket No. SAR/15245.
  • GOVERNMENT RIGHTS IN THIS INVENTION
  • This invention was made with U.S. government support under contract number NMA401-02-9-2001-0041. The U.S. government has certain rights in this invention.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The invention relates generally to biometric techniques. More specifically, the invention relates to iris-based biometric techniques.
  • 2. Description of the Related Art
  • Iris based biometric techniques are useful for recognition, verification, or tracking of individuals. Iris based biometric methods can provide high accuracy identification and other functions with relatively low system cost. Because of the availability of very efficient indexing techniques, iris-based biometrics can also be used when a large number of people must be screened and rapidly matched to a database of millions of individuals.
  • However, the widespread use of iris-based biometrics has been limited by the conditions imposed by the requirement that high resolution, high contrast images of the iris be obtained. This requirement was only met reliably by the careful positioning of a single, cooperative, stationary subject within the limited field of view of a suitable illumination and image capture device. Typical existing systems limit this capture volume to a small region of space within a few tens of centimeters of the sensor. For example, the LG3000 system manufactured by LG Electronics requires a subject to remain stationary for 3-10 seconds at a standoff distance of roughly 10 cm and provides a capture volume of roughly 10×2×2 cm, or 0.04 liters. These limitations are workable in constrained settings such as security checkpoints, bank teller machines, or information system access points, but severely limit the applicability of iris biometrics in minimally constrained settings, such as screening in airports, subway systems, or at entrances to otherwise uncontrolled buildings or facilities.
  • In designing a practical iris-based biometric system having a large standoff distance, many variables must be addressed including the standoff distance for both the camera and illumination source, the iris illumination power (pulsed and continuous), the lens focal length, the capture volume, and so on. To design a system to fulfill specific design parameters for a given minimally constrained environment for the system, extensive trial and error “tuning” is required. Such trial and error tuning is time consuming and costly.
  • Therefore, there is a need in the art for a method and apparatus for designing iris-based biometric systems for use in minimally constrained settings.
  • SUMMARY OF THE DISCLOSURE
  • The deficiencies of the prior art are addressed by various exemplary embodiments of the present invention of a method and apparatus for designing iris biometric systems for use in minimally constrained settings. Design constraints for a minimally constrained setting and input parameter values are received. Using these inputs, system parameters are calculated to produce a design for an iris biometrics system, using synchronized stroboscopic illumination, that operates within the design constraints. The resulting design is safe for a subject's eyes yet provides sufficient illumination to enable iris recognition over long distances.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The teachings of the present invention can be readily understood by considering the following detailed description in conjunction with the accompanying drawings, in which:
  • FIG. 1 depicts a computer system that executes iris-based biometric system design software of the present invention;
  • FIG. 2 depicts a flow diagram of a method of the present invention;
  • FIG. 3 depicts a block diagram of an iris-based biometric system having parameters that are designed by the present invention;
  • FIG. 4 is a graph of an exemplary retinal thermal hazard function;
  • FIG. 5 is a graph of an exemplary hazard function;
  • FIGS. 6 and 7 are graphs of exemplary radiance threshold limit value (TLV) coefficient functions;
  • FIG. 8 is a flow diagram showing an exemplary embodiment of a method for providing iris biometrics system design in minimally constrained settings, according to the present invention; and
  • FIG. 9, which includes FIGS. 9A, 9B, 9C, 9D, and 9E, is a table of an exemplary parametric model of an iris recognition system.
  • To facilitate understanding, identical reference numerals have been used, where possible, to designate identical elements that are common to the figures.
  • DETAILED DESCRIPTION
  • The invention will be primarily described within the general context of exemplary embodiment of the present invention of a method and apparatus for defining an iris biometrics system for operation in minimally constrained settings.
  • The present invention defines systems having fewer constraints on subjects than traditional methods by extending standoff distance and capture volume. The standoff distance is the distance between the image acquisition system and the subject. In some cases, there may be two standoff distances, the camera-subject distance and the illumination-subject distance. The capture volume is a volume in four dimensions (i.e., space and time) within which an iris image can be captured with a high probability of generating an acceptable iris template.
  • An input to the system comprises a definition of the environment in which iris biometrics are to be captured. The invention processes these environmental constraints to derive the parameters of an iris biometric system that enables iris imaging using large standoff distances and optimal, safe iris illumination.
  • Exemplary embodiments of systems designed using the present invention may be used in many different scenarios, including the following exemplary scenarios: (1) a high volume security checkpoint, (2) an office, (3) an aircraft boarding bridge, (4) a wide corridor, and (5) an automobile. In a high volume security checkpoint, subjects must be imaged without stopping or actively cooperating in the identification method. In an office, multiple subjects must be imaged while moving through a large volume, changing direction and orientation. In an aircraft boarding bridge, which is loading or unloading, subjects must be imaged with only the constraints on position and speed of motion imposed by the width of the space. In a wide corridor, subjects may move in different directions or change direction and the capture volume must be large enough that all persons passing through the corridor are imaged successfully. In an automobile, subjects must be imaged at a range of distances from an outside camera and illuminator.
  • These scenarios differ in the degree of constraint. For example, the corridor is less constrained than the boarding bridge and, hence, likely to have a lower acquisition rate. Also, the automobile has the difficulty of a glass partition in front of the subject. Consequently, exemplary embodiments of the present invention include a method for designing systems to meet various combinations and degrees of constraint, such as a system for a particular scenario.
  • FIG. 1 depicts an exemplary system used to implement the present invention. The system 100 comprises a general purpose computer 102 that, when executing certain software, becomes a specific purpose computer that performs the present invention. The computer 102 comprises at least one central processing unit (CPU) 104, support circuits 106, and memory 108. The CPU 104 may be any one of the many microprocessors that are commercially available. The support circuits 106 comprise circuits that facilitate operation of the CPU 104 including clock circuits, cache, power supplies, input/output circuits and the like. The memory 108 may comprise one or more of read only memory, random access memory, disk drives, optical memory, removable memory and the like. The memory 108 stores an operating system 110 and iris-based biometric system design software 112. When the software 112 is executed, the computer processes input information 114 to define an optimal biometric system 116 that fulfills the given design constraints. As such, the system 100 processes input design constraints 114 to produce output system design parameters 116.
  • FIG. 2 depicts a flow diagram of the method 200 performed by the computer when executing the software 112. At step 202, the method 200 receives input constraints that define the environment in which the iris biometrics are to be captured. These constraints are generally entered into a table or spread sheet that defines the scenario. At step 204, these constraints are processed as described below to generate, at step 206, a set of parameters that define an iris-based biometric system that will operate in the given environment.
  • FIG. 3 depicts a block diagram of an iris biometrics system 300 that is designed using the present invention. The system 300 comprises an ambient illuminator 304, a controlled illuminator 306, a camera 308, and an iris image processor 312. The present invention generates the system parameters 310 including, but not limited to, an illumination standoff (the distance from the controlled illuminator to a subject 302), an illumination level, the camera standoff (a distance from the camera to the subject 302), and the camera settings (position, focal length, and the like). The scenario represents a multidimensional problem that includes numerous variable parameters.
  • The subject is illuminated by both ambient illumination and controlled illumination. Both illuminators 304 and 306 may be controlled, or only the controlled illuminator 306 may be controlled. The illumination levels received at the subject 302 from all sources of illumination must meet specific safety levels to avoid damage to the retina of the subject, yet provide sufficient illumination for the iris image to be processed. Thus, a safety assessment is performed that accounts for scenario constraints (e.g., camera and illuminator position), standoff distances, ambient illumination, maximum level of illumination from the controlled illuminator and so on.
  • Once the subject is illuminated, the camera 308 captures one or more images of the subject's iris, and the processor 312 segments and normalizes the images. Then, an iris template is created, which is matched against a database of templates to identify the subject.
  • In an office scenario, there are many possibilities for designing a biometric system. The subjects may be seated in chairs and all looking in the same direction, towards a desk of the occupant of the office, or they may be in various places around the room. Subjects may be entering or leaving the room and may be looking in any direction. For seated, non-moving subjects, the camera (or cameras) can be positioned on the desk or behind the person sitting at the desk. For moving subjects, the cameras may need to be positioned around the room, because, in effect, the entire room is the capture volume. There are different challenges for moving and non-moving subjects. In the case of non-moving subjects seated in front of the desk, one challenge is simply to find the subjects' eyes and irises and to associate the identification with a particular person. In most cases, the subjects are not concealing their irises, although the person behind a desk could ask them, for example, to remove sunglasses. With no movement or occlusion, exemplary embodiments of the present invention are able to capture essentially all the irises. For moving subjects, the challenge is harder. Subjects may be facing in any direction, may be moving, and may be entering and leaving the capture volume at any time. Additionally, their irises may be found at varying heights, depending on whether they are standing, seated, or in the act of sitting down or getting up. On the other hand, it can be anticipated that subjects will not be moving particularly fast—differentiating the office scenario from, for example, the airport scenario where subjects are in a hurry.
  • Table 1 shows design constraints for an exemplary office scenario. These constraints include specific values for the capture volume, the standoff distance, a number of persons in the capture volume that a biometric system is able to handle at one time, the acquisition rate, the speed at which the persons are moving, the cooperation of the subjects, and whether the exemplary embodiment of the present invention includes features for an attendant. In one exemplary embodiment of the present invention, these specific values and others are input to the method 200 for designing iris biometric systems in minimally constrained settings and the configuration for a particular iris biometric system is provided that operates within the input constraints for the office scenario.
    TABLE 1
    Exemplary Constraints for an Office Scenario
    Capture Volume 3 m × 3 m × 2.5 m
    Standoff Distance ≦3 m
    Number of subjects in capture volume ≦6
    Acquisition rate Seated 100%/Moving 75%
    Speed of subjects ≦1 m/s
    Cooperation Little or none
    Attended/Unattended Optional
  • In a boarding bridge scenario, passengers may be boarding or exiting an airplane through a boarding bridge. Typically, the space through which passengers pass is approximately 1.5 m wide and 2.1 m high. The depth of the capture volume is approximately 1 m. Most people in the capture volume are moving and facing in the same direction, but that direction changes depending on whether the people are boarding or exiting. People may be moving side-by-side and occluding persons behind them. One challenge is to find the eyes in the volume, capture the irises, and associate identification with a particular person. Persons on the bridge are rarely more than two abreast and are frequently single file. However, occlusion, which occurs when one person is directly between the camera and another person, is expected to occur occasionally. It is also expected that individuals may deliberately or inadvertently block their irises from the view of the camera, e.g., with near infrared (NIR) opaque glasses or contact lenses. Upon identification, a person for whom a good iris image has not been acquired may need to be taken aside for further screening. It may be desirable to screen for aberrant behavior, e.g., a person moving opposite to the general traffic flow. Upon detection, a person who is, for example, attempting to board while passengers are exiting may be checked against a list of authorized personnel.
  • Table 2 shows specific values of constraints for an exemplary boarding bridge scenario. In one exemplary embodiment of the present invention, these specific values and others are input to the method 200 for designing iris biometric systems in minimally constrained settings and the configuration for a particular iris biometric system is provided that operates within the input constraints for the boarding bridge scenario.
    TABLE 2
    Exemplary constraints for a Boarding Bridge Scenario
    Capture volume 1.5 m × 1 m × 1 m
    Standoff Distance 3 m-10 m
    Number of subjects in capture volume ≦3
    Acquisition rate 80%
    Speed of subjects ≦2 m/sec.
    Cooperation When asked
    Attended/Unattended Attended
  • While similar to the boarding bridge scenario, the capture volume of a corridor scenario may be larger and people may be moving in both directions at all times. For full coverage, therefore, it is desirable to have cameras pointing in both directions and across the corridor. It is expected that at times there may be many persons in the capture volume and that there may be multiple occlusions. One challenge is to capture as many iris images as possible, given the uncontrolled nature of the environment. Usually it is difficult to ask persons to undergo secondary screening, because this scenario is usually an uncontrolled passageway without specific security measures associated with it. It may be desirable to identify as many persons as possible in a database (possibly a watch list), given that some percentage of irises will not be captured.
  • Table 3 shows specific values of constraints for an exemplary corridor scenario. In one exemplary embodiment of the present invention, these specific values and others are input to the method 200 for designing iris biometric systems in minimally constrained settings and the configuration for a particular iris biometric system is provided that operates within the input constraints for the corridor scenario.
    TABLE 3
    Exemplary Constraints for a Corridor Scenario
    Capture Volume 3 m × 1 m × 1 m
    Standoff Distance ≦3 m
    Number of subjects in capture volume ≦10
    Acquisition rate 75%
    Speed of subjects ≦2 m/sec.
    Cooperation None
    Attended/Unattended Unattended
  • In an automobile scenario, the irises of the driver of a moving automobile are to be captured. Unlike the other scenarios, where multiple people may be in the capture volume, in the automobile scenario the subject is the single person seated in the front seat on the driver's side. In this scenario, the automobile is not moving very rapidly, e.g., it has slowed down for a barrier or a tollbooth or a drive-by window, such as an automated teller machine (ATM) or a food service window. One challenge is to find the iris and capture it through a glass partition, which may be either the front windshield or the side window. The window may have a large degree of unpredictable specularity and may be tinted so that image brightness is attenuated. Getting sufficient light on the subject under these circumstances might prove difficult. The transparency of different varieties of autoglass in the near infrared under various conditions may be determined. Sometimes it is desirable to immediately identify the driver of a car when the subject is in the database. This is useful for installations such as drive-through ATMs and tollbooths, security barriers, and border crossings. In the automobile scenario, failure to capture results in additional screening, much like a portal (constrained capture) scenario. For example, at a tollbooth, if capture through the windshield fails, the driver is required to stop, open the driver-side window, and peer at a device until recognized.
  • Table 4 shows specific values of constraints for an exemplary automobile scenario. In one exemplary embodiment of the present invention, these specific values and others are input to the method 200 for designing iris biometric systems in minimally constrained settings and the configuration for a particular iris biometric system is provided that operates within the input constraints for the automobile scenario. Of course, other constraints and other specific values for these and other scenarios are within the scope of the present invention.
    TABLE 4
    Exemplary Parameters for an Automobile Scenario
    Capture Volume 0.5 m × 0.5 m × 0.5 m
    Standoff Distance 3 m-10 m
    Number of subjects in capture volume 1
    Acquisition rate 75%
    Speed of subjects ≦1 m/sec.
    Cooperation None
    Attended/Unattended Optional
  • There are various outcomes for attempting to perform iris recognition. For example, the system may fail to recognize that a subject is within the capture volume, fail to acquire an iris template from a subject known to be within the capture volume, fail to recognize an acquired iris template, match an iris template against a watchlist, match an iris template against some other database (e.g., authorized users, passenger manifest, employee database), recognize some feature of an acquired template or iris image that triggers an alarm (e.g., an attempt to spoof the system, or a subject moving counter to expected flow for that type of subject), or produce a false match against a database. Exemplary embodiments of systems for performing iris recognition can be designed to respond differently to the outcome depending on the particular scenario and the needs and desires of the client. Some exemplary responses to failure to acquire an iris template from someone known to be in the capture volume include sounding an alarm, signaling a person, and collecting the failure as a statistic to measure system performance.
  • When designing an iris biometrics system, the method 200 must account for the possible retinal damage that could occur in any environment. FIGS. 4 and 5 are charts of exemplary hazard functions 400 and 500. These charts illustrate the relative hazard of thermal damage to the retina from visible or infra-red light as a function of the wavelength of that light. A hazard function according to these charts is used in an exemplary embodiment of the present invention of an iris acquisition model (see FIGS. 9A-E) to determine eye-safe levels of absorption.
  • The American Conference of Governmental Industrial Hygienists (ACGIH) 2004 handbook for threshold limit values (TLVs) and biological exposure indices (BEIs) recognizes three regimes for retinal injury from visible and near IR radiation: retinal photochemical injury from chronic blue light exposure; retinal thermal injury from visible or near IR exposure; and retinal photochemical injury from chronic blue light exposure in workers whose eye lens has been removed, or who have undergone cataract surgery in which the lens is not replaced with a UV absorbing lens (called the aphakic hazard). The handbook provides three retinal hazard spectral weighting functions corresponding to these three hazard regimes: A(λ), B(λ) and R(λ) for the aphakic, blue and thermal hazards respectively. These functions are shown in FIG. 5. They are not analytic; they are presented in the handbook as tables. They are based on analysis of empirical data. These functions are used to weight other functions that depend on exposure time and angular subtense of the source. The other functions are not analytic and cover multiple cases. FIG. 5 also provides an assessment of how the three hazards vary with wavelength. The values of the three weighting functions cannot be compared between hazards. The TLV value is set by the product of the weighting function with a hazard-specific function of exposure and subtense.
  • For the near IR, the TLV for a single wavelength is expressed in the form
    $\mathrm{radianceTLV} = f(t) / (R(\lambda)\,\alpha)$
    where t is the exposure time, R(λ) is the retinal thermal hazard function of FIG. 7, α is the angular subtense of the source as seen by the eye, and f(t) is the radiance TLV coefficient, a function of exposure time presented in FIGS. 6 and 7.
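  • For reference, the exemplary macros of Table 14 encode this TLV for the near-IR band (770 nm ≤ λ ≤ 1400 nm) in the piecewise form

    $\mathrm{radianceTLV} = \dfrac{0.6}{R(\lambda)\,\alpha}$ for $t \ge 10$ s, and $\mathrm{radianceTLV} = \dfrac{5}{R(\lambda)\,\alpha\,t^{1/4}}$ for $t < 10$ s

    in W/cm2-sr, which the macros convert to W/m2-sr. This is a restatement of the macro logic of Table 14, not an additional limit.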
  • FIGS. 6 and 7 are graphs of exemplary radiance threshold limit value (TLV) coefficient functions 600 and 700. The TLV describes the level of exposure to some hazard (in this case infrared illumination) that is deemed to be the maximum acceptable to avoid risk of injury. This TLV coefficient gives the relative effectiveness of an infra-red illumination dose as a function of the length of time over which that dose is absorbed. The potential for damage is greater when very intense radiation is absorbed over a short period of time than when the same total dose is applied as low intensity radiation over a longer time.
  • FIG. 8 shows an exemplary embodiment of the present invention of a method 800 for providing iris biometrics system design in minimally constrained settings. The method 800 receives design constraints and provides derived design parameters that are useful in designing a system having a specific set of performance requirements. In this example, the method 800 receives particular values for the following design parameters: a lens focal length 802, an illuminator power 804, a camera standoff 806, an illuminator standoff 808, a camera resolution 810, a camera sensitivity 812, and an illuminator wavelength 814. In response, the method 800 provides an amount of radiation absorbed by the iris 816, a captured image contrast 818, and a depth of field 820. With this provided information, an exemplary embodiment of the present invention of a system can be designed meeting particular performance requirements to perform in a minimally constrained setting. The method 800 may be implemented using the information in FIGS. 9A-E to generate a set of inequality constraints on the system design parameters to be satisfied. One exemplary embodiment of the present invention is a system designed in accordance with method 800 using the information in FIGS. 9A-E. Of course, other exemplary embodiments of the present invention include methods that are not limited by any particular values or selection of system components, parameters, units or formulas, such as those in FIGS. 9A-E.
  • FIG. 9, which includes FIGS. 9A-9E, is a table defining an exemplary iris acquisition model. This model is implemented as a spreadsheet with macros (see Table 14). In this table, each system component has associated parameters, each with a sample value, units, symbol, and formula. Values and units are also provided in the international system of units (SI). In this exemplary model, the following system components are defined: illumination, illumination standoff, ambient illumination, subject, camera standoff, camera lens, camera sensor, iris image, and constants. Of course, the present invention contemplates various similar models and is not limited to any particular values shown in FIGS. 9A-E.
  • This exemplary model is implementable as a system design and analysis tool using software, such as a Microsoft® Excel® spreadsheet program. Design parameters (such as lens focal length, illuminator power, camera standoff, illuminator standoff, camera resolution, camera sensitivity, illuminator wavelength, and so forth) are entered and derived quantities are output. The derived quantities may comprise the amount of radiation absorbed by the iris, the captured image contrast, and the depth of field for the particular constraints. This model is capable of generating many different iris biometrics systems based on different combinations of parameters, such as capture volume, camera placement, speed of motion, and other factors.
  • In designing a system for a specific set of performance parameters (e.g., camera standoff greater than three meters and capture volume greater than x by y by z), the exemplary embodiment of the present invention of the iris acquisition model shown in FIGS. 9A-E uses design principles embodied in that model to generate a set of inequality constraints on the system design parameters to be satisfied. For example, in order to obtain a specified depth of capture volume, the Far Focus Limit minus the Near Focus Limit must be greater than this specified value. The Near Focus Limit is given by the model as a numerical expression relating Camera Standoff Distance, Hyperfocal Distance, and Focal Length. Hyperfocal Distance in turn is computed in terms of Focal Length, F number, and Circle of Confusion. Each of these quantities is either specified directly or computed from design constraints. Thus, the capture volume is translated into a constraint on system design parameters, allowing the engineering tradeoffs required to meet application requirements to be evaluated; a sketch of this computation follows.
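  • As an illustration of this constraint chain, the following sketch (in the same Visual Basic dialect as Table 14) computes the hyperfocal distance, the near and far focus limits, and the resulting depth-of-field check using one common thin-lens formulation. The function names and the particular formulation are illustrative and are not necessarily the exact formulas of FIGS. 9A-E.

    ' Sketch (illustrative): translating a required capture-volume depth
    ' into a pass/fail check on lens design parameters.
    Public Function HyperfocalDistance(focalLengthM As Single, fNumber As Single, cocM As Single) As Single
        ' common approximation: H = f^2 / (N * c) + f
        HyperfocalDistance = focalLengthM ^ 2 / (fNumber * cocM) + focalLengthM
    End Function

    Public Function NearFocusLimit(standoffM As Single, hM As Single, fM As Single) As Single
        NearFocusLimit = standoffM * (hM - fM) / (hM + standoffM - 2 * fM)
    End Function

    Public Function FarFocusLimit(standoffM As Single, hM As Single, fM As Single) As Single
        If standoffM >= hM Then
            FarFocusLimit = 3.4E+38 ' effectively infinite
        Else
            FarFocusLimit = standoffM * (hM - fM) / (hM - standoffM)
        End If
    End Function

    Public Function MeetsCaptureDepth(standoffM As Single, fM As Single, fNumber As Single, cocM As Single, requiredDepthM As Single) As Boolean
        Dim h As Single
        h = HyperfocalDistance(fM, fNumber, cocM)
        MeetsCaptureDepth = (FarFocusLimit(standoffM, h, fM) - NearFocusLimit(standoffM, h, fM)) >= requiredDepthM
    End Function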
  • The exemplary iris acquisition model shown in FIGS. 9A-E includes the following model components: controlled illumination, illumination standoff, ambient illumination, subject, camera standoff, camera lens, camera sensor, and iris image. The controlled illumination is the illumination supplied by the iris acquisition system. The illumination standoff is the distance between the controlled illumination and the subject, including the properties of the intervening medium. The ambient illumination is the illumination not controlled by the iris acquisition system. The subject is the iris being measured. The camera standoff is the distance between the front of the camera lens and the subject. The camera lens comprises the optical elements of the camera. The camera sensor is the device that converts photons into electronic signals. The iris image is the output of the sensor.
  • Table 5 lists exemplary parameters associated with controlled illumination. Wavelength is the wavelength of the illumination source. Radiance is the brightness of the illumination source: the areal density of radiant intensity, in watts per steradian per unit area of source. Radiance TLV is the threshold limit value of radiance for eye-safe operation. It depends on wavelength, exposure time, the angular subtense of the source as seen from the subject's iris, and the pulse width and repetition rate if the illumination source is not continuous. The TLV is defined by the ACGIH. In one exemplary embodiment of the present invention, macros in a spreadsheet compute the TLVs. (Table 14 shows exemplary code for these macros.) The area of a single emitter is the area in square meters of one emitter in an array of emitters; the number of emitters is the count of emitters in the array. The duty cycle is the fraction of the time the source is on. The pulse width is the width of pulses for a pulsed source. The repetition rate is the frequency of pulses for a pulsed source. The duration is the duration of exposure, which encompasses multiple pulses for a pulsed source. A sketch combining these parameters follows Table 5.
    TABLE 5
    Exemplary Parameters for Controlled Illumination
    Parameter Input Symbol SI Units
    Wavelength X λsource m
    Radiance X Lsource W/m2-sr
    Radiance TLV ITLV W/m2-sr
    Area of single emitter X Asource m2
    Number of Emitters X
    Radiant Intensity Isource W/sr
    Duty cycle X %
    Pulse width sec
    Repetition rate X Hz
    Duration X sec
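  • For illustration only, the following sketch (same Visual Basic dialect as Table 14) combines the Table 5 parameters: the array's radiant intensity as the per-emitter radiance times the emitter area, summed over the emitters, and a duty-cycle scaling for the time-averaged output of a pulsed source. These combining rules are assumptions for illustration and are not necessarily the exact formulas of FIGS. 9A-E.

    ' Sketch (illustrative): combining the Table 5 illumination parameters.
    Public Function ArrayRadiantIntensity(radianceWm2sr As Single, emitterAreaM2 As Single, numEmitters As Integer) As Single
        ' I = L * A per emitter, summed over the array (W/sr)
        ArrayRadiantIntensity = radianceWm2sr * emitterAreaM2 * numEmitters
    End Function

    Public Function AverageRadiance(peakRadiance As Single, dutyCyclePct As Single) As Single
        ' time-averaged radiance of a pulsed source (duty cycle in percent)
        AverageRadiance = peakRadiance * dutyCyclePct / 100
    End Function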

    Measurement of Radiance
  • The radiance of an LED is seldom specified by the manufacturer. Hence, for the exemplary iris acquisition model of FIGS. 9A-E, radiance is measured or computed from other parameters. Given a calibrated irradiance meter, the irradiance is measured as a function of distance from the source. Plotting the irradiance against the inverse square of the distance yields a line whose slope is the radiant intensity of the source. Radiance is then determined from the area of the source. The area is found by taking a picture of the source with the intensity turned down low enough that it does not bloom the camera; the dimensions of the active area of the LED are scaled off the image, and the active area and the radiance of the source are computed. In one example,
    $L = 0.028 / 1.2\times10^{-6} = 2.3\times10^{4}\ \mathrm{W/m^2\text{-}sr}$
    at 20 mA.
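  • The measurement just described reduces to a short calculation. The following sketch (same Visual Basic dialect as Table 14; function names illustrative) fits the measured irradiance against 1/r² by least squares to obtain the radiant intensity, then divides by the measured source area to obtain the radiance.

    ' Sketch (illustrative): radiance from irradiance-vs-distance data.
    Public Function RadiantIntensityFromFit(distancesM() As Single, irradianceWm2() As Single) As Single
        Dim i As Integer, n As Integer
        Dim x As Single, sumX As Single, sumY As Single, sumXX As Single, sumXY As Single
        n = UBound(distancesM) - LBound(distancesM) + 1
        For i = LBound(distancesM) To UBound(distancesM)
            x = 1 / distancesM(i) ^ 2           ' regress E against 1/r^2
            sumX = sumX + x
            sumY = sumY + irradianceWm2(i)
            sumXX = sumXX + x * x
            sumXY = sumXY + x * irradianceWm2(i)
        Next i
        ' slope of the least-squares line E = I * (1/r^2) + b is the radiant intensity (W/sr)
        RadiantIntensityFromFit = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX)
    End Function

    Public Function RadianceFromIntensity(radiantIntensityWsr As Single, sourceAreaM2 As Single) As Single
        ' e.g., 0.028 / 1.2E-6 = 2.3E4 W/m^2-sr at 20 mA
        RadianceFromIntensity = radiantIntensityWsr / sourceAreaM2
    End Function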
  • Table 6 lists exemplary parameters associated with controlled illumination standoff. Distance is the distance from the source to the iris. Distance/size ratio is the ratio of the source distance to the largest transverse dimension of the source; in one exemplary embodiment of the present invention, ratios larger than about 10 are considered large enough for the small-angle approximations used in a model, such as the model in FIGS. 9A-E, to be valid. Angular subtense is the largest subtended angle of the source as seen from the iris. The optical properties of the intervening medium are absorption, refraction, dispersion, and scattering.
    TABLE 6
    Exemplary Parameters for Controlled Illumination Standoff
    Parameter Input Symbol SI Units
    Distance X dsource m
    Distance/size ratio
    Angular subtense sr
    Optical properties X
  • Table 7 lists exemplary parameters for ambient illumination. The wavelength is the wavelength of the illumination source. The subject irradiance is the irradiance measured at the subject due to ambient illumination.
    TABLE 7
    Exemplary Parameters for Ambient Illumination
    Parameter Input Symbol SI Units
    Wavelength X m
    Subject irradiance X W/m2
  • Table 8 lists exemplary parameters for the subject. The subject irradiance is the irradiance at the subject due to the controlled illumination source. The irradiance TLV (non-laser) is the TLV for incoherent, non-laser light, while the irradiance TLV (laser) is the TLV for laser light. The iris albedo is the measured average albedo of a prototype iris. The iris albedo root mean square (RMS) variation is the measured RMS variation of albedo about the average. The skin albedo is the measured average skin albedo. The sclera albedo is the measured average sclera albedo. The pupil albedo is the measured average pupil albedo. The iris diameter is the measured iris diameter of a prototype iris. The iris average exitance is the light reflected from the iris; the iris RMS exitance is the variation in light reflected from the iris.
    TABLE 8
    Exemplary Parameters for the Subject
    Parameter Input Symbol SI Units
    Subject irradiance Esubject W/m2
    Irradiance TLV - non-laser W/m2
    Irradiance TLV - laser W/m2
    Iris albedo X
    Iris albedo RMS variation X %
    Skin albedo X
    Sclera albedo X
    Pupil albedo X
    Iris diameter X m
    Iris average exitance W/m2
    Iris RMS exitance W/m2

    Measurement of Albedo
  • Iris albedo data, and data on the variation of iris albedo, in the near IR do not seem to be readily available in the open literature. Hence, iris albedo is measured for the exemplary model using a reflectance standard, such as those provided by Labsphere, North Sutton, N.H., or Ocean Optics, Dunedin, Fla., according to the following method. Arrange a light source to illuminate the subject iris and a reflectance standard uniformly (i.e., with constant irradiance) at the desired wavelength. Image the subject and standard with a camera with gamma correction and automatic gain control (AGC) turned off. Adjust the illumination level so that pixel values on the standard are at approximately 90% of full scale. Capture image(s). Compare pixel values of the iris with those of the standard. If the standard has an albedo of about 1.00, the iris albedo is the ratio of the pixel value on the iris to that on the standard. If the iris albedo is low, capture one image as described above, then increase the illumination level by a known factor and recapture. Rescale the pixel values of the iris in the original image by the illumination-level ratio of the two images and compare those values with the iris pixel values in the new image. A series of images may be taken with increasing intensity to plot the pixel values of the various structures of interest (e.g., reference, skin, iris, sclera) as a function of intensity. The data usually fit straight lines; the ratios of the slopes of the iris, sclera, and skin lines to that of the reference then give the albedos of those structures. The variation of albedo within a structure is estimated by extracting a region of pixels from the structure and computing the average and standard deviation of those pixels.
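  • For illustration only, the albedo estimates in the procedure above reduce to simple ratios. The following sketch is in the same Visual Basic dialect as Table 14; the function names are illustrative.

    ' Sketch (illustrative): albedo from the pixel-ratio method above.
    Public Function AlbedoFromPixels(meanStructurePixel As Single, meanStandardPixel As Single, standardAlbedo As Single) As Single
        ' with gamma and AGC off, pixel value is proportional to reflected light
        AlbedoFromPixels = standardAlbedo * meanStructurePixel / meanStandardPixel
    End Function

    Public Function AlbedoFromSlopes(structureSlope As Single, referenceSlope As Single) As Single
        ' multi-intensity variant: ratio of fitted pixel-vs-intensity slopes
        AlbedoFromSlopes = structureSlope / referenceSlope
    End Function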
  • Table 9 lists the exemplary parameters associated with the camera standoff. The distance is the distance from the front of the camera to the subject iris. The optical properties of the intervening medium are absorption, refraction, dispersion, and scattering.
    TABLE 9
    Exemplary Parameters for the Camera Standoff
    Parameter Input Symbol SI Units
    Distance X dcamera m
    Optical properties of intervening medium X
  • Table 10 lists the exemplary parameters associated with the camera lens. The magnification is the iris radius at the sensor divided by the iris radius at the subject. The focal length is the distance along the optical axis from the lens to the focus (or focal point). The lens F# is the ratio of the lens focal length to the lens diameter. The transmission is the fraction of light transmitted by the lens. The effective lens diameter is the area of the lens available for light capture. The lens capture efficiency is the fraction of light from the subject that is captured by the lens. The total lens efficiency is the fraction of light from the subject that arrives at the sensor. The circle of confusion (at the sensor) is the maximum diameter of a point source as imaged on the sensor that is below the resolution limit of the sensor. The hyperfocal distance is the focus point at which the depth of field extends from half the hyperfocal distance to infinity. The near focus limit is the near edge of the depth of field; the far focus limit is the far edge of the depth of field. The depth of field is the region of space within which a point is imaged to a circle smaller than the circle of confusion.
    TABLE 10
    Exemplary Parameters for the Camera Lens
    Parameter Input Symbol SI Units
    Magnification
    Focal length f m
    F# X F#
    Transmission X Tlens %
    Effective Lens Diameter m
    Lens Capture Efficiency
    Total Lens Efficiency
    Circle of confusion (at sensor) m
    Hyperfocal distance m
    Near focus limit m
    Far focus limit m
    Depth of field m
  • Table 11 lists exemplary parameters for the camera sensor. The quantum efficiency is the ratio of interacting to incident photons. The camera gain J (for interacting photons) and the camera gain K (for electrons) are taken from the well-known photon transfer curve. The well depth is the depth of the sensor well in electrons. The read noise is the noise in the absence of photons. The shot noise is the square root of the sensor average signal. The pixel width is set by the pitch and fill factor; the pixel height equals the width for square pixels. The horizontal pixel pitch is the pixel spacing, which is not the same as the pixel width; the vertical pixel pitch equals the horizontal pitch for square pixels. The pixel area is the pixel width times the pixel height. The fill factor is the size of the light-sensitive photodiode relative to the surface of the pixel. The sensor width and height are given in pixels and in meters. The shutter time is the exposure time. The remaining rows are computed as follows:
    photon energy = Planck's constant × speed of light / source wavelength
    sensor average irradiance = iris average exitance × total lens efficiency
    sensor RMS irradiance = iris RMS exitance × total lens efficiency
    sensor signal photons = sensor average irradiance × pixel area × shutter time / photon energy
    sensor average signal = quantum efficiency × sensor signal photons
    sensor RMS signal = sensor average signal × iris albedo variation
    total sensor noise = read noise + shot noise
    dynamic range fraction = sensor average signal / well depth
    signal/noise = 20 log(sensor average signal / total sensor noise)
    contrast signal/noise = 20 log(sensor RMS signal / total sensor noise)
    A sketch implementing this signal chain follows Table 11.
    TABLE 11
    Exemplary Parameters for the Camera Sensor
    Parameter Input Symbol SI Units
    Quantum Efficiency X QE
    (interacting/incident) photons
    Camera Gain J (interacting photons) X photon/DN
    Camera Gain K (electrons) X e−/DN
    Well depth X e−
    Read Noise e−
    Shot Noise e−
    Pixel width X m
    Pixel height m
    Pixel pitch horizontal m
    Pixel pitch vertical m
    Pixel Area m2
    Fill factor X %
    Sensor Width, Pixels X
    Sensor Height, Pixels X
    Sensor Width m
    Sensor Height m
    Shutter time X sec
    Photon Energy joule
    Sensor average irradiance W/m2
    Sensor RMS irradiance W/m2
    Sensor Signal Photons photons
    Sensor Average Signal e−
    Sensor RMS Signal e−
    Total Sensor Noise e−
    Dynamic Range Fraction
    Signal/Noise dB
    Contrast Signal/Noise dB
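  • For illustration only, the Table 11 signal chain can be written as a single function in the same Visual Basic dialect as Table 14. The physical constants are standard values; the function name and argument list are illustrative.

    ' Sketch (illustrative): Table 11 signal chain, from sensor irradiance
    ' to contrast signal/noise in dB.
    Public Function ContrastSNRdB(sensorIrradianceWm2 As Double, pixelAreaM2 As Double, shutterSec As Double, wavelengthM As Double, quantumEff As Double, albedoRmsFrac As Double, readNoiseE As Double) As Double
        Const PlanckJs As Double = 6.626E-34       ' Planck's constant, J-s
        Const LightSpeedMs As Double = 299800000#  ' speed of light, m/s
        Dim photonEnergyJ As Double, signalPhotons As Double
        Dim avgSignalE As Double, rmsSignalE As Double, noiseE As Double
        photonEnergyJ = PlanckJs * LightSpeedMs / wavelengthM
        signalPhotons = sensorIrradianceWm2 * pixelAreaM2 * shutterSec / photonEnergyJ
        avgSignalE = quantumEff * signalPhotons          ' electrons
        rmsSignalE = avgSignalE * albedoRmsFrac          ' contrast signal
        noiseE = readNoiseE + Sqr(avgSignalE)            ' read noise + shot noise
        ContrastSNRdB = 20 * Log(rmsSignalE / noiseE) / Log(10#)
    End Function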
  • Table 12 lists exemplary parameters for the iris image. Pixels across iris is the desired pixel resolution across the iris at the subject. The resolution is the resolution at the subject. The contrast is the contrast at the subject. The excess resolution factor is used to determine the optical resolution needed for recognition. The minimum required resolution at the subject is the minimum resolution needed for recognition. A resolution check based on these parameters is sketched after the table.
    TABLE 12
    Exemplary Parameters for the Iris Image
    Parameter Input Symbol SI Units
    Pixels across iris X
    Resolution(at subject) pixels/m
    Contrast
    Excess Resolution Factor
    Minimum Required Resolution (at subject) X pixels/m
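  • For illustration only, one plausible reading of the Table 12 resolution requirement is the following check (same Visual Basic dialect as Table 14; the function name and the way the excess resolution factor scales the minimum are assumptions):

    ' Sketch (illustrative): achieved resolution at the subject must meet
    ' the minimum required resolution, scaled by the excess resolution factor.
    Public Function ResolutionOK(pixelsAcrossIris As Single, irisDiameterM As Single, minRequiredPixPerM As Single, excessResolutionFactor As Single) As Boolean
        ResolutionOK = (pixelsAcrossIris / irisDiameterM) >= (minRequiredPixPerM * excessResolutionFactor)
    End Function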

    Radiometric Physics
  • The parameters in Table 13 are general optics terms. Using subscripts s for source and t for target, with R the source-target distance and $A_s$ and $A_t$ the source and target areas respectively, the radiant intensity is related to the radiance and the source area by
    $I = L_s A_s$
    for a uniform extended source. A point source is modeled as $L_s \sim 1/A_s$ as $A_s$ goes to zero. The radiant flux incident on a target is the product of the solid angle subtended by the target and the radiant intensity:
    $\phi_t = I\,(4\pi A_t / 4\pi R^2) = I\,(A_t / R^2)$.
    The target irradiance is the ratio of the radiant flux to the target area:
    $E_t = \phi_t / A_t = I / R^2 = L_s A_s / R^2$.
    The albedo of a surface is the ratio of its radiant exitance to its irradiance. For an extended source of radiance $L_s$ that is large enough to fill the viewing angle θ of a detector, the measured irradiance is given by
    $E = \pi L_s \sin^2(\theta/2)$,
    independent of distance. Conceptually, as the distance increases, the detector sees more of the source, and the increased visible source area balances the $1/r^2$ falloff from each area element, dA.
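  • For illustration only, these two irradiance relations reduce to the following sketch (same Visual Basic dialect as Table 14; names illustrative):

    ' Sketch (illustrative): the two irradiance relations above.
    Public Function IrradianceSmallSource(radianceLs As Single, sourceAreaM2 As Single, distanceM As Single) As Single
        ' inverse-square case: E = Ls * As / R^2
        IrradianceSmallSource = radianceLs * sourceAreaM2 / distanceM ^ 2
    End Function

    Public Function IrradianceFillingSource(radianceLs As Single, viewAngleRad As Single) As Single
        ' detector-filling case: E = pi * Ls * sin^2(theta/2), distance-independent
        IrradianceFillingSource = 3.14159265 * radianceLs * Sin(viewAngleRad / 2) ^ 2
    End Function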
    TABLE 13
    Some general optics terms
    Parameter Symbol Units Used to measure
    radiant energy Q joule
    radiant power φ joule/sec (watt)
    radiance L W/m2-sr source brightness
    radiant intensity I W/sr light flux propagating in space
    irradiance E W/m2 light flux density impinging on a surface
    radiant exitance M W/m2 light flux density emitted from a surface
    albedo — — reflectance of a surface

    Relationship between Retinal Irradiance, Corneal Irradiance, Source Radiance, and Retinal Hazards
  • For corneal and lens damage to the eye, the deposition of energy in the cornea depends on the corneal irradiance, $E_c$. For retinal damage, the deposition of energy in the retina depends on the retinal irradiance, $E_r$. In many cases, the corneal irradiance tells little about the retinal irradiance, because the light passing through the cornea is focused onto the retina and the details of the focused image depend on the source rather than on the corneal irradiance. The retinal irradiance is given by
    $E_r = L_s \tau / f_e^2$
    where $f_e$ is the focal length of the eye lens and τ is the transmission of the lens and the intraocular fluids. Because τ and $f_e$ are reasonably constant across the population, eye-safety limits are expressed in terms of the source radiance. There are complicating factors. The size of the image on the retina affects the rate at which energy diffuses out of the image; this interacts with the exposure time, so the maximum permissible source radiance for safety depends on the angular subtense of the source, the source wavelength, and the duration of the exposure. For sources that are effectively very far away, so that the light impinging on the eye consists essentially of plane waves (e.g., laser light), or for true point sources, the size of the image on the retina is determined by diffraction and aberration in the eye rather than by the size of the source. In these cases, the corneal irradiance is a better indicator of retinal irradiance than the source radiance. These relationships have been used to develop eye-safety standards for corneal/lens and retinal damage.
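  • For a rough numeric sense of this relation, the following sketch (same Visual Basic dialect as Table 14) evaluates $E_r = L_s \tau / f_e^2$ with representative assumed values (τ ≈ 0.9, $f_e$ ≈ 17 mm). These constants, and the omission of a pupil-area factor that fuller treatments include, follow the relation as written above and are assumptions for illustration only.

    ' Sketch (illustrative): retinal irradiance from source radiance, per
    ' the relation above. Tau and the eye focal length are assumed values;
    ' a pupil-area factor is omitted to match the text.
    Public Function RetinalIrradiance(sourceRadiance As Single) As Single
        Const TauEye As Single = 0.9          ' assumed ocular transmission
        Const EyeFocalM As Single = 0.017     ' assumed eye focal length, m
        RetinalIrradiance = sourceRadiance * TauEye / EyeFocalM ^ 2
    End Function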
    TABLE 14
    Exemplary TLV Macros
    ' TLV Macros
    Option Explicit

    Private Const CM2perM2 = 10000 ' cm^2 per m^2 conversion factor

    Public Function OccularRadianceTLV(ByVal exposureTimeSec As Single, ByVal wavelengthNM As Single, ByVal angularSubtense As Single) As Single
        ' returns watts/sr-m^2
        ' from ACGIH 2004 TLVs and BEIs, page 154. Get a copy at www.acgih.org
        Dim myTLV As Single
        myTLV = 0 ' default in case where we do not know it or have not calculated it
        If (exposureTimeSec < 0.00001) Then exposureTimeSec = 0.00001
        If (wavelengthNM >= 770) And (wavelengthNM <= 1400) Then
            If (exposureTimeSec >= 10) Then
                myTLV = 0.6 / (angularSubtense * retinalThermalHazardMultiplier(wavelengthNM)) ' page 154 section 4b
            Else
                myTLV = 5 / (retinalThermalHazardMultiplier(wavelengthNM) * angularSubtense * exposureTimeSec ^ 0.25)
            End If
        End If
        OccularRadianceTLV = CM2perM2 * myTLV
    End Function
    Public Function OccularIrradianceTLV(exposureTimeSec As Single, wavelengthNM As Single) As Single
        ' returns watts/m^2
        ' from ACGIH 2004 TLVs and BEIs, page 154. Get a copy at www.acgih.org
        Dim myTLV As Single
        myTLV = 0 ' default in case where we do not know it or have not calculated it
        If (wavelengthNM >= 770) And (wavelengthNM <= 3000) Then
            If (exposureTimeSec >= 1000) Then
                myTLV = 0.01 * CM2perM2 ' page 154 section 4a
            Else
                myTLV = 1.8 * exposureTimeSec ^ (-3 / 4) * CM2perM2 ' page 154 section 4a
            End If
        End If
        OccularIrradianceTLV = myTLV
    End Function
    Public Function LaserOccularIrradianceTLV(exposureTimeSec As Single, wavelengthNM As Single) As Single
        ' returns watts/m^2
        ' from ACGIH 2004 TLVs and BEIs, page 137. Get a copy at www.acgih.org
        Dim myTLV As Single
        myTLV = 0 ' default in case where we do not know it or have not calculated it
        If (wavelengthNM < 700) Or (wavelengthNM > 1000000#) Then
            ' not yet implemented
        End If
        If (wavelengthNM >= 700) And (wavelengthNM <= 1400) Then
            ' 700 nm to 1400 nm -- IRA
            If (exposureTimeSec > 30000#) Then
                ' not yet implemented
            End If
            If (exposureTimeSec >= 1000) And (exposureTimeSec <= 30000#) Then
                myTLV = 320 * Ca(wavelengthNM) * Cc(wavelengthNM) ' microwatts/cm^2
                myTLV = 0.000001 * CM2perM2 * myTLV ' W/m^2
            End If
            If (exposureTimeSec < 1000) Then
                If (wavelengthNM >= 700) And (wavelengthNM < 1050) And (exposureTimeSec >= 0.0000000000001) And (exposureTimeSec < 0.00000000001) Then
                    myTLV = 1.5 * Ca(wavelengthNM) * 0.00000001 / exposureTimeSec
                    myTLV = myTLV * CM2perM2 ' W/m^2 -- 1.5 Ca 1e-8 J/cm^2
                End If
                If (wavelengthNM >= 700) And (wavelengthNM < 1050) And (exposureTimeSec >= 0.00000000001) And (exposureTimeSec < 0.000000001) Then
                    myTLV = 2.7 * Ca(wavelengthNM) / exposureTimeSec ^ 0.25
                    myTLV = myTLV * CM2perM2 ' W/m^2 -- 2.7 Ca t/t^0.25 J/cm^2
                End If
                If (wavelengthNM >= 700) And (wavelengthNM < 1050) And (exposureTimeSec >= 0.000000001) And (exposureTimeSec < 0.000018) Then
                    myTLV = 5 * Ca(wavelengthNM) * 0.0000001 / exposureTimeSec
                    myTLV = myTLV * CM2perM2 ' W/m^2 -- 5 Ca 1e-7 J/cm^2
                End If
                If (wavelengthNM >= 700) And (wavelengthNM < 1050) And (exposureTimeSec >= 0.000018) And (exposureTimeSec < 1000) Then
                    myTLV = 1.8 * Ca(wavelengthNM) / exposureTimeSec ^ 0.25
                    myTLV = myTLV * CM2perM2 * 0.001 ' W/m^2 -- 1.8 Ca (t/t^0.25) mJ/cm^2
                End If
                If (wavelengthNM >= 1050) And (wavelengthNM <= 1400) And (exposureTimeSec >= 0.0000000000001) And (exposureTimeSec < 0.00000000001) Then
                    myTLV = 1.5 * Cc(wavelengthNM) * 0.0000001 / exposureTimeSec
                    myTLV = myTLV * CM2perM2 ' W/m^2 -- 1.5 Cc 1e-7 J/cm^2
                End If
                If (wavelengthNM >= 1050) And (wavelengthNM <= 1400) And (exposureTimeSec >= 0.00000000001) And (exposureTimeSec < 0.000000001) Then
                    myTLV = 2.7 * Cc(wavelengthNM) / exposureTimeSec ^ 0.25
                    myTLV = myTLV * CM2perM2 ' W/m^2 -- 2.7 Cc (t/t^0.25) J/cm^2
                End If
                If (wavelengthNM >= 1050) And (wavelengthNM <= 1400) And (exposureTimeSec >= 0.000000001) And (exposureTimeSec < 0.00001) Then
                    myTLV = 5 * Cc(wavelengthNM) * 0.000001 / exposureTimeSec
                    myTLV = myTLV * CM2perM2 ' W/m^2 -- 5 Cc 1E-6 J/cm^2
                End If
                If (wavelengthNM >= 1050) And (wavelengthNM <= 1400) And (exposureTimeSec >= 0.00001) And (exposureTimeSec < 1000) Then
                    myTLV = 9 * Cc(wavelengthNM) / exposureTimeSec ^ 0.25
                    myTLV = myTLV * CM2perM2 * 0.001 ' W/m^2 -- 9 Cc t/t^0.25 mJ/cm^2
                End If
            End If
        End If
        If (wavelengthNM > 1400) And (wavelengthNM <= 1000000#) Then
            ' 1400 nm to 1000 microns -- IRB & IRC
            If (exposureTimeSec > 30000#) Then
                ' not yet implemented
            End If
            If (exposureTimeSec >= 10) And (exposureTimeSec <= 30000#) Then
                myTLV = 0.1 * CM2perM2 ' W/m^2 -- 0.1 W/cm^2 ~ 100 mW/cm^2
            End If
            If (exposureTimeSec < 10) Then
                If (wavelengthNM > 1400) And (wavelengthNM <= 1500) And (exposureTimeSec > 0.00000000000001) And (exposureTimeSec < 0.001) Then
                    myTLV = (0.1 / exposureTimeSec) * CM2perM2 ' W/m^2 -- 0.1 J/cm^2
                End If
                If (wavelengthNM > 1400) And (wavelengthNM <= 1500) And (exposureTimeSec >= 0.001) And (exposureTimeSec < 10) Then
                    myTLV = (0.56 * exposureTimeSec ^ 0.25 / exposureTimeSec) * CM2perM2 ' W/m^2 -- 0.56 * t^0.25 J/cm^2
                End If
                If (wavelengthNM > 1500) And (wavelengthNM <= 1800) And (exposureTimeSec >= 0.00000000000001) And (exposureTimeSec < 10) Then
                    myTLV = (1 / exposureTimeSec) * CM2perM2 ' W/m^2 -- 1 J/cm^2
                End If
                If (wavelengthNM > 1800) And (wavelengthNM <= 2600) And (exposureTimeSec >= 0.00000000000001) And (exposureTimeSec < 0.001) Then
                    myTLV = (0.1 / exposureTimeSec) * CM2perM2 ' W/m^2 -- 0.1 J/cm^2
                End If
                If (wavelengthNM > 1800) And (wavelengthNM <= 2600) And (exposureTimeSec >= 0.001) And (exposureTimeSec < 10) Then
                    myTLV = (0.56 * exposureTimeSec ^ 0.25 / exposureTimeSec) * CM2perM2 ' W/m^2 -- 0.56 * t^0.25 J/cm^2
                End If
                If (wavelengthNM > 2600) And (wavelengthNM <= 1000000#) And (exposureTimeSec >= 0.00000000000001) And (exposureTimeSec < 10000000#) Then
                    myTLV = (0.01 / exposureTimeSec) * CM2perM2 ' W/m^2 -- 0.01 J/cm^2
                End If
                If (wavelengthNM > 2600) And (wavelengthNM <= 1000000#) And (exposureTimeSec >= 0.0000001) And (exposureTimeSec < 10000000#) Then
                    myTLV = (0.56 * exposureTimeSec ^ 0.25 / exposureTimeSec) * CM2perM2 ' W/m^2 -- 0.56 * t^0.25 J/cm^2
                End If
            End If
        End If
        LaserOccularIrradianceTLV = myTLV
    End Function
    Public Function alphaMin(exposureTimeSec As Single) As Single
        ' returns radians
        ' from ACGIH 2004 TLVs and BEIs, page 133. Get a copy at www.acgih.org
        Dim a As Single
        a = 0
        If exposureTimeSec <= 0.7 Then a = 0.0015
        If (exposureTimeSec > 0.7) And (exposureTimeSec <= 10) Then
            a = 0.002 * exposureTimeSec ^ 0.75
        End If
        If exposureTimeSec > 10 Then a = 0.011
        alphaMin = a
    End Function
    Private Function Ca(wavelengthNM As Single) As Single
        Dim myCa As Single
        myCa = 0
        If (wavelengthNM < 700) Then
            ' out of range
        End If
        If (wavelengthNM > 1400) Then
            ' out of range
        End If
        If (wavelengthNM >= 700) And (wavelengthNM < 1050) Then
            myCa = 10 ^ (0.002 * (wavelengthNM - 700))
        End If
        If (wavelengthNM >= 1050) And (wavelengthNM <= 1400) Then
            myCa = 5
        End If
        Ca = myCa
    End Function
    Private Function Cc(wavelengthNM As Single) As Single
        Dim myCc As Single
        myCc = 0
        If (wavelengthNM < 700) Then
            ' out of range
        End If
        If (wavelengthNM >= 700) And (wavelengthNM <= 1150) Then myCc = 1
        If (wavelengthNM > 1150) And (wavelengthNM < 1200) Then
            myCc = 10 ^ (0.0181 * (wavelengthNM - 1150))
        End If
        If (wavelengthNM >= 1200) And (wavelengthNM <= 1400) Then myCc = 8
        Cc = myCc
    End Function
    Public Function Ce(exposureTimeSec As Single, alphaRadians As Single, wavelengthNM As Single) As Single
        ' returns dimensionless
        ' from ACGIH 2004 TLVs and BEIs, page 138. Get a copy at www.acgih.org
        Dim aMin As Single
        Dim c As Single
        aMin = alphaMin(exposureTimeSec)
        c = 1
        If (wavelengthNM >= 400) And (wavelengthNM <= 1400) Then
            If alphaRadians <= aMin Then c = 1#
            If (aMin < alphaRadians) And (alphaRadians <= 0.1) Then c = alphaRadians / aMin
            If (0.1 < alphaRadians) Then c = alphaRadians ^ 2 / (aMin * 0.1)
            ' This is not exactly what the book shows. I have used what I
            ' believe to be the correct result -- I believe the book has an
            ' error. JRM
        End If
        Ce = c
    End Function
    Public Function retinalThermalHazardMultiplier(wavelengthNM As Single) As Single
        ' returns dimensionless
        ' from ACGIH 2004 TLVs and BEIs, page 153. Get a copy at www.acgih.org
        Dim myValue As Single
        myValue = 10 ' set value at maximum to start
        If (wavelengthNM >= 500) And (wavelengthNM < 700) Then myValue = 1
        If (wavelengthNM >= 700) And (wavelengthNM < 1050) Then myValue = 10 ^ ((700 - wavelengthNM) / 500)
        If (wavelengthNM >= 1050) And (wavelengthNM < 1400) Then myValue = 0.2
        If (wavelengthNM >= 385) And (wavelengthNM < 390) Then myValue = 0.125
        If (wavelengthNM >= 390) And (wavelengthNM < 395) Then myValue = 0.25
        If (wavelengthNM >= 395) And (wavelengthNM < 400) Then myValue = 0.5
        If (wavelengthNM >= 400) And (wavelengthNM < 405) Then myValue = 1
        If (wavelengthNM >= 405) And (wavelengthNM < 410) Then myValue = 2
        If (wavelengthNM >= 410) And (wavelengthNM < 415) Then myValue = 4
        If (wavelengthNM >= 415) And (wavelengthNM < 420) Then myValue = 8
        If (wavelengthNM >= 420) And (wavelengthNM < 425) Then myValue = 9
        If (wavelengthNM >= 425) And (wavelengthNM < 430) Then myValue = 9.5
        If (wavelengthNM >= 430) And (wavelengthNM < 435) Then myValue = 9.8
        If (wavelengthNM >= 435) And (wavelengthNM < 440) Then myValue = 10
        If (wavelengthNM >= 440) And (wavelengthNM < 445) Then myValue = 10
        If (wavelengthNM >= 445) And (wavelengthNM < 450) Then myValue = 9.7
        If (wavelengthNM >= 450) And (wavelengthNM < 455) Then myValue = 9.4
        If (wavelengthNM >= 455) And (wavelengthNM < 460) Then myValue = 9
        If (wavelengthNM >= 460) And (wavelengthNM < 465) Then myValue = 8
        If (wavelengthNM >= 465) And (wavelengthNM < 470) Then myValue = 7
        If (wavelengthNM >= 470) And (wavelengthNM < 475) Then myValue = 6.2
        If (wavelengthNM >= 475) And (wavelengthNM < 480) Then myValue = 5.5
        If (wavelengthNM >= 480) And (wavelengthNM < 485) Then myValue = 4.5
        If (wavelengthNM >= 485) And (wavelengthNM < 490) Then myValue = 4
        If (wavelengthNM >= 490) And (wavelengthNM < 495) Then myValue = 2.2
        If (wavelengthNM >= 495) And (wavelengthNM < 500) Then myValue = 1.6
        retinalThermalHazardMultiplier = myValue
    End Function
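
    ' Usage sketch (illustrative): exercising the Table 14 macros for an
    ' 850 nm illuminator viewed for 10 s at an angular subtense of 0.1 rad.
    ' The sample values are illustrative only.
    Public Sub DemoTLVs()
        Dim radTLV As Single, irrTLV As Single
        radTLV = OccularRadianceTLV(10, 850, 0.1)   ' W/sr-m^2
        irrTLV = OccularIrradianceTLV(10, 850)      ' W/m^2
        Debug.Print "Radiance TLV: "; radTLV; " W/sr-m^2"
        Debug.Print "Irradiance TLV: "; irrTLV; " W/m^2"
    End Sub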
  • While the foregoing is directed to various embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof. As such, the appropriate scope of the invention is to be determined according to the claims, which follow.

Claims (19)

1. A method for designing an iris biometrics system, comprising:
receiving a plurality of design constraints for a minimally constrained environment;
calculating at least one calculated parameter; and
providing a design for an iris biometrics system that operates within the plurality of design constraints, the design being based on the at least one calculated parameter.
2. The method of claim 1, wherein the plurality of design constraints comprise at least one of: a capture volume, a standoff distance, a number of subjects in the capture volume, an acquisition rate, a speed of subjects, a cooperation indication, or an attended indication.
3. The method of claim 1, wherein a further plurality of design constraints comprise at least one of: a lens focal length, an illuminator power, a camera standoff, an illuminator standoff, a camera resolution, a camera sensitivity, and an illuminator wavelength.
4. The method of claim 1, wherein the at least one calculated parameter comprises at least one of: an amount of radiation absorbed by the iris, a captured image contrast, eye-safe levels of absorption, and a depth of field.
5. The method of claim 1, wherein the minimally constrained environment comprises at least one of: a security checkpoint, an office, a boarding bridge, a corridor, and an automobile.
6. The method of claim 1, wherein the iris biometrics system uses synchronized stroboscopic illumination.
7. A system for providing iris biometrics, comprising:
a user interface for receiving a plurality of design constraints for a minimally constrained environment; and
a processor for calculating at least one calculated parameter and for providing a design for an iris biometrics system that operates within the plurality of design constraints and safety constraints, the design being based on the plurality of design constraints, the at least one calculated parameter, and the safety constraints.
8. The system of claim 7, wherein the plurality of design constraints comprises at least one of: a capture volume, a standoff distance, a number of subjects in the capture volume, an acquisition rate, a speed of subjects, a cooperation indication, and an attended indication.
9. The system of claim 7, wherein the processor calculates an occular radiance threshold level value (TLV) based on an exposure time, a wavelength, and an angular subtense.
10. The system of claim 7, wherein the processor calculates an occular irradiance TLV based on an exposure time and a wavelength.
11. The system of claim 7, wherein the processor calculates a laser occular irradiance TLV based on an exposure time and a wavelength.
12. The system of claim 7, wherein the processor calculates an eye-safe level of absorption.
13. The system of claim 7, wherein the iris biometrics system uses synchronized stroboscopic illumination.
14. A computer-readable medium having stored thereon a plurality of instructions, the plurality of instructions including instructions which, when executed by a processor, cause the processor to perform the steps of a method for providing iris biometrics, comprising:
receiving a plurality of design constraints for a minimally constrained environment;
calculating at least one calculated parameter; and
providing a design for an iris biometrics system that operates within the plurality of design constraints, the design being based on the at least one calculated parameter.
15. The computer-readable medium of claim 14, wherein the plurality of design constraints comprise at least one of: a capture volume, a standoff distance, a number of subjects in the capture volume, an acquisition rate, a speed of subjects, a cooperation indication, or an attended indication.
16. The computer-readable medium of claim 14, wherein a further plurality of design constraints comprise at least one of: a lens focal length, an illuminator power, a camera standoff, an illuminator standoff, a camera resolution, a camera sensitivity, and an illuminator wavelength.
17. The computer-readable medium of claim 14, wherein the at least one calculated parameter comprises at least one of: an amount of radiation absorbed by the iris, a captured image contrast, eye-safe levels of absorption, and a depth of field.
18. The computer-readable medium of claim 14, wherein the minimally constrained environment comprises at least one of: a security checkpoint, an office, a boarding bridge, a corridor, and an automobile.
19. The computer-readable medium of claim 14, wherein the iris biometrics system uses synchronized stroboscopic illumination.
US11/364,300 2005-06-03 2006-02-28 Method and apparatus for designing iris biometric systems for use in minimally constrained settings Abandoned US20060274918A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/364,300 US20060274918A1 (en) 2005-06-03 2006-02-28 Method and apparatus for designing iris biometric systems for use in minimally constrained settings
US12/576,644 US7925059B2 (en) 2005-06-03 2009-10-09 Method and apparatus for iris biometric systems for use in an entryway

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US68710605P 2005-06-03 2005-06-03
US11/364,300 US20060274918A1 (en) 2005-06-03 2006-02-28 Method and apparatus for designing iris biometric systems for use in minimally constrained settings

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/334,968 Continuation-In-Part US7542628B2 (en) 2005-04-11 2006-01-19 Method and apparatus for providing strobed image capture

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/849,969 Continuation-In-Part US7634114B2 (en) 2005-06-03 2007-09-04 Method and apparatus for iris biometric systems for use in an entryway

Publications (1)

Publication Number Publication Date
US20060274918A1 true US20060274918A1 (en) 2006-12-07

Family

ID=37498881

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/364,300 Abandoned US20060274918A1 (en) 2005-06-03 2006-02-28 Method and apparatus for designing iris biometric systems for use in minimally constrained settings
US11/377,042 Active 2026-06-17 US7627147B2 (en) 2005-06-03 2006-03-16 Method and apparatus for obtaining iris biometric information from a moving subject

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/377,042 Active 2026-06-17 US7627147B2 (en) 2005-06-03 2006-03-16 Method and apparatus for obtaining iris biometric information from a moving subject

Country Status (2)

Country Link
US (2) US20060274918A1 (en)
WO (2) WO2006132686A2 (en)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080181467A1 (en) * 2006-09-01 2008-07-31 Zappia Thomas M Method and apparatus for iris biometric systems for use in an entryway
US20080199054A1 (en) * 2006-09-18 2008-08-21 Matey James R Iris recognition for a secure facility
WO2009023313A2 (en) * 2007-05-09 2009-02-19 Honeywell International Inc. Eye-safe near infra-red imaging illumination method and system
US20090274345A1 (en) * 2006-09-22 2009-11-05 Hanna Keith J Compact Biometric Acquisition System and Method
US20110007949A1 (en) * 2005-11-11 2011-01-13 Global Rainmakers, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US20120293643A1 (en) * 2011-05-17 2012-11-22 Eyelock Inc. Systems and methods for illuminating an iris with visible light for biometric acquisition
US8442277B1 (en) * 2008-10-31 2013-05-14 Bank Of America Corporation Identity authentication system for controlling egress of an individual visiting a facility
US20130130227A1 (en) * 2011-11-22 2013-05-23 The Boeing Company Infectious Disease Detection System
US20130279767A1 (en) * 2012-04-24 2013-10-24 Chih-Hsung Huang Method of managing visiting guests by face recognition
US9232592B2 (en) 2012-04-20 2016-01-05 Trilumina Corp. Addressable illuminator with eye-safety circuitry
US20160132463A1 (en) * 2014-11-12 2016-05-12 Kabushiki Kaisha Toshiba Computing apparatus, computing method, and computer program product
US20160291155A1 (en) * 2015-04-01 2016-10-06 Vayavision, Ltd. System and method for optimizing active measurements in 3-dimensional map generation
US20170243075A1 (en) * 2007-04-19 2017-08-24 Eyelock Llc Method and system for biometric recognition
US9773169B1 (en) * 2012-11-06 2017-09-26 Cross Match Technologies, Inc. System for capturing a biometric image in high ambient light environments
US9875551B2 (en) 2014-09-24 2018-01-23 Samsung Electronics Co., Ltd Method for performing user authentication and electronic device thereof
US10038304B2 (en) 2009-02-17 2018-07-31 Trilumina Corp. Laser arrays for variable optical properties
US10244181B2 (en) 2009-02-17 2019-03-26 Trilumina Corp. Compact multi-zone infrared laser illuminator
US10366296B2 (en) 2016-03-31 2019-07-30 Princeton Identity, Inc. Biometric enrollment systems and methods
US10373008B2 (en) 2016-03-31 2019-08-06 Princeton Identity, Inc. Systems and methods of biometric analysis with adaptive trigger
US20190259188A1 (en) * 2014-09-17 2019-08-22 Circonus, Inc. System and method for generating histograms
US10425814B2 (en) 2014-09-24 2019-09-24 Princeton Identity, Inc. Control of wireless communication device capability in a mobile device with a biometric key
US10452936B2 (en) 2016-01-12 2019-10-22 Princeton Identity Systems and methods of biometric analysis with a spectral discriminator
US10484584B2 (en) 2014-12-03 2019-11-19 Princeton Identity, Inc. System and method for mobile device biometric add-on
US10607096B2 (en) 2017-04-04 2020-03-31 Princeton Identity, Inc. Z-dimension user feedback biometric system
US10615871B2 (en) 2009-02-17 2020-04-07 Trilumina Corp. High speed free-space optical communications
US10902104B2 (en) 2017-07-26 2021-01-26 Princeton Identity, Inc. Biometric security systems and methods
US11095365B2 (en) 2011-08-26 2021-08-17 Lumentum Operations Llc Wide-angle illuminator module
US11402510B2 (en) 2020-07-21 2022-08-02 Leddartech Inc. Systems and methods for wide-angle LiDAR using non-uniform magnification optics
US11422266B2 (en) 2020-07-21 2022-08-23 Leddartech Inc. Beam-steering devices and methods for LIDAR applications
US11567179B2 (en) 2020-07-21 2023-01-31 Leddartech Inc. Beam-steering device particularly for LIDAR systems

Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8098901B2 (en) 2005-01-26 2012-01-17 Honeywell International Inc. Standoff iris recognition system
US7593550B2 (en) * 2005-01-26 2009-09-22 Honeywell International Inc. Distance iris recognition
US8090157B2 (en) 2005-01-26 2012-01-03 Honeywell International Inc. Approaches and apparatus for eye detection in a digital image
US8064647B2 (en) 2006-03-03 2011-11-22 Honeywell International Inc. System for iris detection tracking and recognition at a distance
US8442276B2 (en) 2006-03-03 2013-05-14 Honeywell International Inc. Invariant radial iris segmentation
US8705808B2 (en) 2003-09-05 2014-04-22 Honeywell International Inc. Combined face and iris recognition system
US7761453B2 (en) 2005-01-26 2010-07-20 Honeywell International Inc. Method and system for indexing and searching an iris image database
KR101299074B1 (en) 2006-03-03 2013-08-30 허니웰 인터내셔널 인코포레이티드 Iris encoding system
WO2008019168A2 (en) 2006-03-03 2008-02-14 Honeywell International, Inc. Modular biometrics collection system architecture
WO2007101275A1 (en) 2006-03-03 2007-09-07 Honeywell International, Inc. Camera with auto-focus capability
EP1991948B1 (en) 2006-03-03 2010-06-09 Honeywell International Inc. An iris recognition system having image quality metrics
WO2007101276A1 (en) 2006-03-03 2007-09-07 Honeywell International, Inc. Single lens splitter camera
US10037408B2 (en) 2013-03-15 2018-07-31 Jeffrey A. Matos Apparatus for preventing unauthorized access to computer files and for securing medical records
US8063889B2 (en) 2007-04-25 2011-11-22 Honeywell International Inc. Biometric data collection system
US8599306B2 (en) * 2008-08-20 2013-12-03 Matthew Rolston Photographer, Inc. Cosmetic package with operation for modifying visual perception
ITAV20070003U1 (en) * 2007-09-14 2007-12-14 Silvio Spiniello UTILITY MODEL FROM THE "SECUTITY" NAME FOR PREVENTION, REPRESSION, SAFETY, THE SEARCH FOR MISSING PEOPLE AND THE HISTORICAL REBUILDING OF ACTUALLY FACTED FACTS.
US8436907B2 (en) 2008-05-09 2013-05-07 Honeywell International Inc. Heterogeneous video capturing system
US8213782B2 (en) 2008-08-07 2012-07-03 Honeywell International Inc. Predictive autofocusing system
US8090246B2 (en) 2008-08-08 2012-01-03 Honeywell International Inc. Image acquisition system
US8280119B2 (en) 2008-12-05 2012-10-02 Honeywell International Inc. Iris recognition system using quality metrics
US7912252B2 (en) * 2009-02-06 2011-03-22 Robert Bosch Gmbh Time-of-flight sensor-assisted iris capture system and method
WO2010099475A1 (en) * 2009-02-26 2010-09-02 Kynen Llc User authentication system and method
US8472681B2 (en) 2009-06-15 2013-06-25 Honeywell International Inc. Iris and ocular recognition system using trace transforms
US8630464B2 (en) 2009-06-15 2014-01-14 Honeywell International Inc. Adaptive iris matching using database indexing
US8742887B2 (en) 2010-09-03 2014-06-03 Honeywell International Inc. Biometric visitor check system
CN102855471B (en) * 2012-08-01 2014-11-26 中国科学院自动化研究所 Remote iris intelligent imaging device and method
BR112015026910A2 (en) * 2013-04-23 2017-07-25 Procter & Gamble hair tool arrangement and method
US10042994B2 (en) 2013-10-08 2018-08-07 Princeton Identity, Inc. Validation of the right to access an object
US10038691B2 (en) 2013-10-08 2018-07-31 Princeton Identity, Inc. Authorization of a financial transaction
US10025982B2 (en) 2013-10-08 2018-07-17 Princeton Identity, Inc. Collecting and targeting marketing data and information based upon iris identification
EP3074924A4 (en) 2013-10-08 2017-11-22 Princeton Identity, Inc. Iris biometric recognition module and access control assembly
DE102013225283B4 (en) * 2013-12-09 2023-04-27 Rohde & Schwarz GmbH & Co. Kommanditgesellschaft Method and device for capturing an all-round view
WO2015108911A1 (en) * 2014-01-16 2015-07-23 Delta ID Inc. Method and apparatus for controlling intensity of illumination in eye based biometric systems
US20150324568A1 (en) * 2014-05-09 2015-11-12 Eyefluence, Inc. Systems and methods for using eye signals with secure mobile communications
BR112017004593A2 (en) * 2014-09-12 2017-12-05 Eyelock Llc methods and apparatus for directing a user's gaze on an iris recognition system
WO2018016837A1 (en) * 2016-07-18 2018-01-25 Samsung Electronics Co., Ltd. Method and apparatus for iris recognition
KR20180032947A (en) * 2016-09-23 2018-04-02 삼성전자주식회사 Method for controlling a sensor and electronic device thereof
US11288387B2 (en) 2018-12-21 2022-03-29 Verizon Patent And Licensing Inc. Method and system for self-sovereign information management
US10860874B2 (en) * 2018-12-21 2020-12-08 Oath Inc. Biometric based self-sovereign information management
US11062006B2 (en) 2018-12-21 2021-07-13 Verizon Media Inc. Biometric based self-sovereign information management
US11196740B2 (en) 2018-12-21 2021-12-07 Verizon Patent And Licensing Inc. Method and system for secure information validation
US11182608B2 (en) 2018-12-21 2021-11-23 Verizon Patent And Licensing Inc. Biometric based self-sovereign information management
US11281754B2 (en) 2018-12-21 2022-03-22 Verizon Patent And Licensing Inc. Biometric based self-sovereign information management
US11514177B2 (en) 2018-12-21 2022-11-29 Verizon Patent And Licensing Inc. Method and system for self-sovereign information management
US11288386B2 (en) 2018-12-21 2022-03-29 Verizon Patent And Licensing Inc. Method and system for self-sovereign information management
US20220270406A1 (en) * 2019-06-25 2022-08-25 Nec Corporation Iris recognition apparatus, iris recognition method, computer program and recording medium
US11601589B2 (en) 2020-09-15 2023-03-07 Micron Technology, Inc. Actuating an image sensor
US11219371B1 (en) 2020-11-09 2022-01-11 Micron Technology, Inc. Determining biometric data using an array of infrared illuminators


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5291560A (en) 1991-07-15 1994-03-01 Iri Scan Incorporated Biometric personal identification system based on iris analysis
US20030169334A1 (en) * 2001-08-06 2003-09-11 Michael Braithwaite Iris capture device having expanded capture volume
US7118042B2 (en) * 2002-01-18 2006-10-10 Microscan Systems Incorporated Method and apparatus for rapid image capture in an image system
EP1671258A4 (en) 2003-09-04 2008-03-19 Sarnoff Corp Method and apparatus for performing iris recognition from an image
US7428320B2 (en) * 2004-12-07 2008-09-23 Aoptix Technologies, Inc. Iris imaging using reflection from the eye

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5481622A (en) * 1994-03-01 1996-01-02 Rensselaer Polytechnic Institute Eye tracking apparatus and method employing grayscale threshold values
US6424727B1 (en) * 1998-11-25 2002-07-23 Iridian Technologies, Inc. System and method of animal identification and animal transaction authorization using iris patterns
US6320610B1 (en) * 1998-12-31 2001-11-20 Sensar, Inc. Compact imaging device incorporating rotatably mounted cameras
US6961599B2 (en) * 2001-01-09 2005-11-01 Childrens Hospital Los Angeles Identifying or measuring selected substances or toxins in a subject using resonant raman signals
US7095901B2 (en) * 2001-03-15 2006-08-22 Lg Electronics, Inc. Apparatus and method for adjusting focus position in iris recognition system

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100074478A1 (en) * 2005-06-03 2010-03-25 Hoyos Hector T Method and apparatus for iris biometric systems for use in an entryway
US7925059B2 (en) 2005-06-03 2011-04-12 Sri International Method and apparatus for iris biometric systems for use in an entryway
US20140072183A1 (en) * 2005-11-11 2014-03-13 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US9792499B2 (en) 2005-11-11 2017-10-17 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US8798333B2 (en) * 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8798334B2 (en) 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8798331B2 (en) 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US10102427B2 (en) 2005-11-11 2018-10-16 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US20110007949A1 (en) * 2005-11-11 2011-01-13 Global Rainmakers, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8818053B2 (en) 2005-11-11 2014-08-26 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8260008B2 (en) 2005-11-11 2012-09-04 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US8798330B2 (en) 2005-11-11 2014-08-05 Eyelock, Inc. Methods for performing biometric recognition of a human eye and corroboration of same
US9613281B2 (en) 2005-11-11 2017-04-04 Eyelock Llc Methods for performing biometric recognition of a human eye and corroboration of same
US20080181467A1 (en) * 2006-09-01 2008-07-31 Zappia Thomas M Method and apparatus for iris biometric systems for use in an entryway
US7634114B2 (en) * 2006-09-01 2009-12-15 Sarnoff Corporation Method and apparatus for iris biometric systems for use in an entryway
US20080199054A1 (en) * 2006-09-18 2008-08-21 Matey James R Iris recognition for a secure facility
US7574021B2 (en) 2006-09-18 2009-08-11 Sarnoff Corporation Iris recognition for a secure facility
US9626562B2 (en) 2006-09-22 2017-04-18 Eyelock, Llc Compact biometric acquisition system and method
US8965063B2 (en) * 2006-09-22 2015-02-24 Eyelock, Inc. Compact biometric acquisition system and method
US20090274345A1 (en) * 2006-09-22 2009-11-05 Hanna Keith J Compact Biometric Acquisition System and Method
US9959478B2 (en) * 2007-04-19 2018-05-01 Eyelock Llc Method and system for biometric recognition
US10395097B2 (en) * 2007-04-19 2019-08-27 Eyelock Llc Method and system for biometric recognition
US20170243075A1 (en) * 2007-04-19 2017-08-24 Eyelock Llc Method and system for biometric recognition
WO2009023313A2 (en) * 2007-05-09 2009-02-19 Honeywell International Inc. Eye-safe near infra-red imaging illumination method and system
WO2009023313A3 (en) * 2007-05-09 2009-06-04 Honeywell Int Inc Eye-safe near infra-red imaging illumination method and system
US8442277B1 (en) * 2008-10-31 2013-05-14 Bank Of America Corporation Identity authentication system for controlling egress of an individual visiting a facility
US10244181B2 (en) 2009-02-17 2019-03-26 Trilumina Corp. Compact multi-zone infrared laser illuminator
US10038304B2 (en) 2009-02-17 2018-07-31 Trilumina Corp. Laser arrays for variable optical properties
US11075695B2 (en) 2009-02-17 2021-07-27 Lumentum Operations Llc Eye-safe optical laser system
US11405105B2 (en) 2009-02-17 2022-08-02 Lumentum Operations Llc System for optical free-space transmission of a string of binary data
US10938476B2 (en) 2009-02-17 2021-03-02 Lumentum Operations Llc System for optical free-space transmission of a string of binary data
US10615871B2 (en) 2009-02-17 2020-04-07 Trilumina Corp. High speed free-space optical communications
US11121770B2 (en) 2009-02-17 2021-09-14 Lumentum Operations Llc Optical laser device
US9124798B2 (en) * 2011-05-17 2015-09-01 Eyelock Inc. Systems and methods for illuminating an iris with visible light for biometric acquisition
US20120293643A1 (en) * 2011-05-17 2012-11-22 Eyelock Inc. Systems and methods for illuminating an iris with visible light for biometric acquisition
US11095365B2 (en) 2011-08-26 2021-08-17 Lumentum Operations Llc Wide-angle illuminator module
US11451013B2 (en) 2011-08-26 2022-09-20 Lumentum Operations Llc Wide-angle illuminator module
US20130130227A1 (en) * 2011-11-22 2013-05-23 The Boeing Company Infectious Disease Detection System
US9175356B2 (en) * 2011-11-22 2015-11-03 The Boeing Company Infectious disease detection system
US20150125937A1 (en) * 2011-11-22 2015-05-07 The Boeing Company Infectious disease detection system
US8936944B2 (en) * 2011-11-22 2015-01-20 The Boeing Company Infectious disease detection system
US9232592B2 (en) 2012-04-20 2016-01-05 Trilumina Corp. Addressable illuminator with eye-safety circuitry
US8750576B2 (en) * 2012-04-24 2014-06-10 Taiwan Colour And Imaging Technology Corporation Method of managing visiting guests by face recognition
US20130279767A1 (en) * 2012-04-24 2013-10-24 Chih-Hsung Huang Method of managing visiting guests by face recognition
US9773169B1 (en) * 2012-11-06 2017-09-26 Cross Match Technologies, Inc. System for capturing a biometric image in high ambient light environments
US20190259188A1 (en) * 2014-09-17 2019-08-22 Circonus, Inc. System and method for generating histograms
US10425814B2 (en) 2014-09-24 2019-09-24 Princeton Identity, Inc. Control of wireless communication device capability in a mobile device with a biometric key
US9875551B2 (en) 2014-09-24 2018-01-23 Samsung Electronics Co., Ltd Method for performing user authentication and electronic device thereof
US9779062B2 (en) * 2014-11-12 2017-10-03 Kabushiki Kaisha Toshiba Apparatus, method, and computer program product for computing occurrence probability of vector
US20160132463A1 (en) * 2014-11-12 2016-05-12 Kabushiki Kaisha Toshiba Computing apparatus, computing method, and computer program product
US10484584B2 (en) 2014-12-03 2019-11-19 Princeton Identity, Inc. System and method for mobile device biometric add-on
US10024965B2 (en) 2015-04-01 2018-07-17 Vayavision, Ltd. Generating 3-dimensional maps of a scene using passive and active measurements
US10444357B2 (en) * 2015-04-01 2019-10-15 Vayavision Ltd. System and method for optimizing active measurements in 3-dimensional map generation
US11226413B2 (en) 2015-04-01 2022-01-18 Vayavision Sensing Ltd. Apparatus for acquiring 3-dimensional maps of a scene
US11725956B2 (en) 2015-04-01 2023-08-15 Vayavision Sensing Ltd. Apparatus for acquiring 3-dimensional maps of a scene
US20160291155A1 (en) * 2015-04-01 2016-10-06 Vayavision, Ltd. System and method for optimizing active measurements in 3-dimensional map generation
US11604277B2 (en) 2015-04-01 2023-03-14 Vayavision Sensing Ltd. Apparatus for acquiring 3-dimensional maps of a scene
US10643088B2 (en) 2016-01-12 2020-05-05 Princeton Identity, Inc. Systems and methods of biometric analysis with a specularity characteristic
US10943138B2 (en) 2016-01-12 2021-03-09 Princeton Identity, Inc. Systems and methods of biometric analysis to determine lack of three-dimensionality
US10762367B2 (en) 2016-01-12 2020-09-01 Princeton Identity Systems and methods of biometric analysis to determine natural reflectivity
US10643087B2 (en) 2016-01-12 2020-05-05 Princeton Identity, Inc. Systems and methods of biometric analysis to determine a live subject
US10452936B2 (en) 2016-01-12 2019-10-22 Princeton Identity Systems and methods of biometric analysis with a spectral discriminator
US10373008B2 (en) 2016-03-31 2019-08-06 Princeton Identity, Inc. Systems and methods of biometric analysis with adaptive trigger
US10366296B2 (en) 2016-03-31 2019-07-30 Princeton Identity, Inc. Biometric enrollment systems and methods
US10607096B2 (en) 2017-04-04 2020-03-31 Princeton Identity, Inc. Z-dimension user feedback biometric system
US10902104B2 (en) 2017-07-26 2021-01-26 Princeton Identity, Inc. Biometric security systems and methods
US11402510B2 (en) 2020-07-21 2022-08-02 Leddartech Inc. Systems and methods for wide-angle LiDAR using non-uniform magnification optics
US11422266B2 (en) 2020-07-21 2022-08-23 Leddartech Inc. Beam-steering devices and methods for LIDAR applications
US11474253B2 (en) 2020-07-21 2022-10-18 Leddartech Inc. Beam-steering devices and methods for LIDAR applications
US11543533B2 (en) 2020-07-21 2023-01-03 Leddartech Inc. Systems and methods for wide-angle LiDAR using non-uniform magnification optics
US11567179B2 (en) 2020-07-21 2023-01-31 Leddartech Inc. Beam-steering device particularly for LIDAR systems
US11828853B2 (en) 2020-07-21 2023-11-28 Leddartech Inc. Beam-steering device particularly for LIDAR systems

Also Published As

Publication number Publication date
WO2006132689A3 (en) 2007-12-27
WO2006132689A2 (en) 2006-12-14
US7627147B2 (en) 2009-12-01
US20060274919A1 (en) 2006-12-07
WO2006132686A3 (en) 2009-04-16
WO2006132686A2 (en) 2006-12-14

Similar Documents

Publication Title
US7925059B2 (en) Method and apparatus for iris biometric systems for use in an entryway
US20060274918A1 (en) Method and apparatus for designing iris biometric systems for use in minimally constrained settings
US10783388B2 (en) Spoof detection using multiple image acquisition devices
CA2744757C (en) Biometric authentication using the eye
US7174033B2 (en) Methods and systems for detecting and recognizing an object based on 3D image data
US10970574B2 (en) Spoof detection using dual-band near-infrared (NIR) imaging
US7257236B2 (en) Methods and systems for detecting and recognizing objects in a controlled wide area
US20100208207A1 (en) Automatic direct gaze detection based on pupil symmetry
JP2002514098A (en) Device for acquiring iris images
EP3673406B1 (en) Laser speckle analysis for biometric authentication
US9036872B2 (en) Biometric authentication using the eye
JPWO2019230306A1 (en) Identification device and identification method
WO2017062162A1 (en) Iris recognition
He et al. Key techniques and methods for imaging iris in focus
Matey et al. Iris recognition in less constrained environments
KR102274270B1 (en) System for acquiring iris images to enlarge the iris acquisition range
Matey et al. Iris recognition–beyond one meter
CN101221343A (en) Active-illumination imaging device resistant to ambient light interference
Lee et al. Multifeature-based fake iris detection method
CN201159818Y (en) Anti-interference imaging device with 940 nm infrared light-emitting diode active illumination
CN201159819Y (en) Anti-interference imaging device with laser active illumination
Lee et al. Fake iris detection method using Purkinje images based on gaze position
Choi et al. Image Scanning Method for Vascular Pattern Recognition
AU2014280908B2 (en) Biometric Authentication using the Eye
WO2022226478A1 (en) Thermal based presentation attack detection for biometric systems

Legal Events

Code Title Description
AS Assignment

Owner name: SARNOFF CORPORATION, NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AMANTEA, ROBERT;LOIACONO, DOMINICK;NARODITSKY, OLEG;AND OTHERS;REEL/FRAME:017642/0325;SIGNING DATES FROM 20060216 TO 20060217

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION