US20110300830A1 - Fingerprint scanning with optical navigation - Google Patents
- Publication number
- US20110300830A1 (application US 12/793,960)
- Authority
- US
- United States
- Prior art keywords
- processor
- optical navigation
- biometric
- image
- images
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/08—Network architectures or network communication protocols for network security for authentication of entities
- H04L63/0861—Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
Abstract
Description
- The present application relates generally to authentication for a computing device and, more specifically, to combining fingerprint scanning with optical navigation.
- As mobile telephones have received increasing amounts of computing power in successive generations, the mobile telephones have been termed “smart phones”. Along with increasing amounts of computing power, such smart phones have seen increases in storage capacity and, consequently, increased utility. Beyond telephone functions, smart phones may now send and receive digital messages, be they formatted to use e-mail standards, Short Messaging Service (SMS) standards, Instant Messaging standards or proprietary messaging systems. Smart phones may also store, read, edit and create documents, spreadsheets and presentations. Accordingly, there have been increasing demands for smart phones with enhanced authentication functions.
- Reference will now be made, by way of example, to the accompanying drawings which show example implementations; and in which:
- FIG. 1 illustrates an anterior side of a mobile communication device;
- FIG. 2 illustrates a posterior side of the mobile communication device of FIG. 1;
- FIG. 3 illustrates an example arrangement of internal components of the mobile communication device of FIG. 1, in accordance with an implementation of the present disclosure;
- FIG. 4 illustrates an example biometric and optical navigation subsystem for the mobile communication device of FIG. 1, in accordance with an implementation of the present disclosure;
- FIG. 5 illustrates an example image signal processor for the mobile communication device of FIG. 1, in accordance with an implementation of the present disclosure;
- FIG. 6 illustrates example steps in a method of switching between a navigation mode of operation and a fingerprint scanning mode of operation for the example biometric and optical navigation subsystem of FIG. 4, in accordance with an implementation of the present disclosure;
- FIG. 7A illustrates an example navigation mode window, in accordance with an implementation of the present disclosure;
- FIG. 7B illustrates an example fingerprint scanning mode window, in accordance with an implementation of the present disclosure;
- FIG. 8 illustrates example steps in a method of operation for the image signal processor of FIG. 5, in accordance with an implementation of the present disclosure; and
- FIG. 9 illustrates example steps in a method of processing images received at the biometric and optical navigation subsystem processor of FIG. 4, in accordance with an implementation of the present disclosure.
- An optical navigation subsystem may be used when obtaining a candidate fingerprint for a mobile communication device (e.g., for authentication purposes). To accommodate such use, the optical navigation subsystem may be adapted to automatically adjust a processed image sensor window from a first set of window dimensions (e.g., suitable for optical navigation) to a second set of window dimensions (e.g., suitable for fingerprint capture). Alternatively, a single, static set of window dimensions may be employed in conjunction with stitching algorithms, or other methods suitable for forming candidate fingerprint images by combining a plurality of images obtained using the static set of window dimensions.
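The window adjustment described above can be sketched as a small configuration model. This is a minimal illustration, not the patent's implementation: the class and method names are assumptions, while the example numbers (a 512×384 sensor, a 16×16 navigation window at 3000 frames/sec, a 512×64 fingerprint window at 60 frames/sec) are taken from later in the description.

```python
from dataclasses import dataclass

# Full sensor dimensions from the description (512 x 384 pixels at 500 dpi).
SENSOR_WIDTH, SENSOR_HEIGHT = 512, 384

@dataclass
class WindowingInstruction:
    """Hypothetical windowing instruction: size and placement of the
    processed window on the image sensor, plus a frame rate."""
    width: int
    height: int
    x_offset: int
    y_offset: int
    frames_per_sec: int

    def fits_sensor(self) -> bool:
        # The requested window must lie entirely on the sensor.
        return (0 <= self.x_offset and 0 <= self.y_offset
                and self.x_offset + self.width <= SENSOR_WIDTH
                and self.y_offset + self.height <= SENSOR_HEIGHT)

    def pixel_rate(self) -> int:
        # Pixels read out per second for this window at this frame rate.
        return self.width * self.height * self.frames_per_sec

# Example windows built from the dimensions and frame rates in the text;
# the placements (offsets) are illustrative assumptions.
navigation = WindowingInstruction(16, 16, 248, 184, 3000)   # navigation mode
fingerprint = WindowingInstruction(512, 64, 0, 160, 60)     # fingerprint mode
```

Comparing `pixel_rate()` for the two windows shows why distinct modes are useful: the wide fingerprint window reads out roughly two and a half times as many pixels per second as the small, fast navigation window, even at a far lower frame rate.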
- According to an aspect of the present disclosure, there is provided a method of handling images. The method comprises receiving a digital image of a finger from an image sensor, transmitting the digital image, processing the digital image in a context of a plurality of previously received digital images to determine a finger motion indication, and transmitting the finger motion indication. In other aspects of the present application, a biometric and optical navigation subsystem is provided for carrying out this method and a computer readable medium is provided for adapting a processor in a biometric and optical navigation subsystem to carry out this method.
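The processing step of this method, determining a finger motion indication from the current image in the context of previously received images, can be illustrated with a toy displacement search. This is a hedged sketch: the function `best_shift` and its sum-of-absolute-differences scoring are assumptions for illustration, not the patent's algorithm, and a real navigation sensor would use dedicated correlation hardware.

```python
import random

def best_shift(prev, curr, max_shift=4):
    """Search small (dx, dy) displacements and return the one that best
    aligns curr with prev, scored by mean absolute difference over the
    overlapping region."""
    h, w = len(prev), len(prev[0])
    best_score, best_dx, best_dy = None, 0, 0
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            total = count = 0
            for y in range(h):
                for x in range(w):
                    sy, sx = y + dy, x + dx
                    if 0 <= sy < h and 0 <= sx < w:
                        total += abs(prev[y][x] - curr[sy][sx])
                        count += 1
            score = total / count
            if best_score is None or score < best_score:
                best_score, best_dx, best_dy = score, dx, dy
    return best_dx, best_dy

# A 16x16 frame of pseudo-random "surface detail" and the same frame
# shifted right by 2 pixels, as two successive navigation images might be.
random.seed(0)
frame1 = [[random.randrange(256) for _ in range(16)] for _ in range(16)]
frame2 = [[frame1[y][x - 2] if x >= 2 else 0 for x in range(16)]
          for y in range(16)]
dx, dy = best_shift(frame1, frame2)   # estimated finger motion (delta x, delta y)
```

The returned pair corresponds to the Δx and Δy values that, per the description, make up the finger motion indication; a rate of change of position could be derived by dividing by the frame interval.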
- Other aspects and features of the present disclosure will become apparent to those of ordinary skill in the art upon review of the following description of specific implementations of the disclosure in conjunction with the accompanying figures.
-
FIG. 1 illustrates an anterior side of a mobile communication device 100. Many features of the anterior side of the mobile communication device 100 are mounted within a housing 101 and include a display 126, a keyboard 124 having a plurality of keys, a speaker 111 and a navigation lens 106.
- The mobile communication device 100 includes an input device (e.g., the keyboard 124) and an output device (e.g., the display 126), which may comprise a full graphic, or full color, Liquid Crystal Display (LCD). In some implementations, the display 126 may comprise a touchscreen display. In such touchscreen implementations, the keyboard 124 may comprise a virtual keyboard provided on the display 126. Other types of output devices may alternatively be utilized.
- The housing 101 may be elongated vertically, or may take on other sizes and shapes (including clamshell housing structures). In the case in which the keyboard 124 includes keys that are associated with at least one alphabetic character and at least one numeric character, the keyboard 124 may include a mode selection key, or other hardware or software, for switching between alphabetic entry and numeric entry. -
FIG. 2 illustrates a posterior side of the mobile communication device 100. Included on the posterior side are a posterior lens 203 and a Light Emitting Diode (LED) 207 for use as a flash when using the mobile communication device 100 to capture, through the posterior lens 203, a still photograph. The LED 207 may also be used as a torch to provide light when the mobile communication device 100 is used to capture, through the posterior lens 203, video in low ambient light. -
FIG. 3 illustrates an example arrangement of internal components of the mobile communication device 100. A processing device (a microprocessor 328) is shown schematically in FIG. 3 as coupled between the keyboard 124 and the display 126. The microprocessor 328 controls the operation of the display 126, as well as the overall operation of the mobile communication device 100, in part, responsive to actuation of the keys on the keyboard 124 by a user.
- In addition to the microprocessor 328, other parts of the mobile communication device 100 are shown schematically in FIG. 3. These may include a communications subsystem 302, a short-range communications subsystem 304, the keyboard 124 and the display 126. The mobile communication device 100 may further include other input/output devices, such as a set of auxiliary I/O devices 306, a serial port 308, the speaker 111 and a microphone 312. The mobile communication device 100 may further include memory devices, including a flash memory 316 and a Random Access Memory (RAM) 318, as well as various other device subsystems. The mobile communication device 100 may comprise a two-way, radio frequency (RF) communication device having voice and data communication capabilities. In addition, the mobile communication device 100 may have the capability to communicate with other computer systems via the Internet.
- Operating system software executed by the microprocessor 328 may be stored in a computer readable medium, such as the flash memory 316, but may also be stored in other types of memory devices, such as a read only memory (ROM) or similar storage element. In addition, system software, specific device applications, or parts thereof, may be temporarily loaded into a volatile store, such as the RAM 318. Communication signals received by the mobile device may also be stored to the RAM 318.
- The microprocessor 328, in addition to its operating system functions, enables execution of software applications on the mobile communication device 100. A predetermined set of software applications that control basic device operations, such as a voice communications module 330A and a data communications module 330B, may be installed on the mobile communication device 100 during manufacture. An authentication module 330C may also be installed on the mobile communication device 100 during manufacture, to implement aspects of the present disclosure. As well, additional software modules, illustrated as another software module 330N, which may be, for instance, a personal information manager (PIM) application, may be installed during manufacture. The PIM application may be capable of organizing and managing data items, such as e-mail messages, calendar events, voice mail messages, appointments and task items. The PIM application may also be capable of sending and receiving data items via a wireless carrier network 370 represented by a radio tower. The data items managed by the PIM application may be seamlessly integrated, synchronized and updated via the wireless carrier network 370 with the device user's corresponding data items stored on or associated with a host computer system.
- Communication functions, including data and voice communications, are performed through the
communication subsystem 302 and, possibly, through the short-range communications subsystem 304. The communication subsystem 302 includes a receiver 350, a transmitter 352 and one or more antennas, illustrated as a receive antenna 354 and a transmit antenna 356. In addition, the communication subsystem 302 also includes a processing module, such as a digital signal processor (DSP) 358, and local oscillators (LOs) 360. The specific design and implementation of the communication subsystem 302 is dependent upon the communication network in which the mobile communication device 100 is intended to operate. For example, the communication subsystem 302 of the mobile communication device 100 may be designed to operate with the Mobitex™, DataTAC™ or General Packet Radio Service (GPRS) mobile data communication networks and also designed to operate with any of a variety of voice communication networks, such as Advanced Mobile Phone Service (AMPS), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Personal Communications Service (PCS), Global System for Mobile Communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), Universal Mobile Telecommunications System (UMTS), Wideband Code Division Multiple Access (W-CDMA), High Speed Packet Access (HSPA), etc. Other types of data and voice networks, both separate and integrated, may also be utilized with the mobile communication device 100.
- Network access requirements vary depending upon the type of communication system. Typically, an identifier is associated with each mobile device that uniquely identifies the mobile device or subscriber to which the mobile device has been assigned. The identifier is unique within a specific network or network technology.
For example, in Mobitex™ networks, mobile devices are registered on the network using a Mobitex Access Number (MAN) associated with each device, and in DataTAC™ networks, mobile devices are registered on the network using a Logical Link Identifier (LLI) associated with each device. In GPRS networks, however, network access is associated with a subscriber or user of a device. A GPRS device therefore uses a subscriber identity module, commonly referred to as a Subscriber Identity Module (SIM) card, in order to operate on a GPRS network. Although the SIM identifies a subscriber, mobile devices within GSM/GPRS networks are also uniquely identified using an International Mobile Equipment Identity (IMEI) number.
- When required network registration or activation procedures have been completed, the
mobile communication device 100 may send and receive communication signals over the wireless carrier network 370. Signals received from the wireless carrier network 370 by the receive antenna 354 are routed to the receiver 350, which provides for signal amplification, frequency down conversion, filtering, channel selection, etc., and may also provide analog to digital conversion. Analog-to-digital conversion of the received signal allows the DSP 358 to perform more complex communication functions, such as demodulation and decoding. In a similar manner, signals to be transmitted to the wireless carrier network 370 are processed (e.g., modulated and encoded) by the DSP 358 and are then provided to the transmitter 352 for digital to analog conversion, frequency up conversion, filtering, amplification and transmission to the wireless carrier network 370 (or networks) via the transmit antenna 356. - In addition to processing communication signals, the
DSP 358 provides for control of the receiver 350 and the transmitter 352. For example, gains applied to communication signals in the receiver 350 and the transmitter 352 may be adaptively controlled through automatic gain control algorithms implemented in the DSP 358. - In a data communication mode, a received signal, such as a text message or web page download, is processed by the
communication subsystem 302 and is input to the microprocessor 328. The received signal is then further processed by the microprocessor 328 for output to the display 126, or alternatively to some auxiliary I/O devices 306. A device user may also compose data items, such as e-mail messages, using the keyboard 124 and/or some other auxiliary I/O device 306, such as the navigation lens 106, a touchpad, a rocker switch, a thumb-wheel, a trackball, a touchscreen, or some other type of input device. The composed data items may then be transmitted over the wireless carrier network 370 via the communication subsystem 302. - In a voice communication mode, overall operation of the device is substantially similar to the data communication mode, except that received signals are output to the
speaker 111, and signals for transmission are generated by a microphone 312. Alternative voice or audio I/O subsystems, such as a voice message recording subsystem, may also be implemented on the mobile communication device 100. In addition, the display 126 may also be utilized in voice communication mode, for example, to display the identity of a calling party, the duration of a voice call, or other voice call related information. - The short-range communications subsystem 304 enables communication between the
mobile communication device 100 and other proximate systems or devices, which need not necessarily be similar devices. For example, the short-range communications subsystem may include an infrared device and associated circuits and components, or a Bluetooth™ communication module to provide for communication with similarly-enabled systems and devices. - In one implementation, a biometric and
optical navigation subsystem 322 and a photography subsystem 320 connect to the microprocessor 328 via an Image Signal Processor (ISP) 321. Indeed, the biometric and optical navigation subsystem 322 and the photography subsystem 320 each include a communication interface (not shown) for managing communication with the ISP 321. Correspondingly, the ISP 321 includes a first communication interface (not shown) for managing communication with the biometric and optical navigation subsystem 322, a second communication interface (not shown) for managing communication with the photography subsystem 320 and a third communication interface (not shown) for managing communication with the microprocessor 328. In an alternative implementation (not shown), the biometric and optical navigation subsystem 322 connects to the microprocessor 328 over a communication channel that bypasses the ISP 321. - An example arrangement of components of the biometric and
optical navigation subsystem 322 is schematically illustrated in FIG. 4 as including the navigation lens 106, a windowing module 408, an image sensor 406, a light source 412, a memory 414 and a biometric and optical navigation subsystem processor 410. The biometric and optical navigation subsystem processor 410 exerts control, via the windowing module 408, over the window size recorded at the image sensor 406. The light source 412 is also under control of the biometric and optical navigation subsystem processor 410. A control connection between the biometric and optical navigation subsystem processor 410 and the ISP 321 may be accomplished using an Inter-Integrated Circuit (I2C) bus. An I2C bus is a multi-master, serial, single-ended computer bus that is used to attach low-speed peripherals to a motherboard, an embedded system or a cellular telephone. - A set of operating instructions may be installed in the
memory 414 during manufacture, to allow the biometric and optical navigation subsystem processor 410 to implement aspects of the present disclosure. - The
photography subsystem 320, which may be configured for capturing digital photographs and/or video of subjects that are greater than 10 cm away from the posterior lens 203, may include the posterior lens 203 with corresponding posterior shutter (not shown), the LED 207, an image sensor (not shown) and a posterior photography subsystem processor (not shown). A control connection between the photography subsystem 320 and the ISP 321 may be accomplished using an I2C bus. -
FIG. 5 illustrates an example of the components of the ISP 321 for the mobile communication device 100 of FIG. 1. In particular, the ISP 321 may include an ISP processor 510 and an ISP memory 514. The ISP memory 514 may be used to buffer received data and may also store computer readable instructions for use by the ISP processor 510. Notably, the ISP memory 514 is likely to be small and embedded within the ISP processor 510. The ISP memory 514 is shown separately merely for purposes of illustration and should not preclude the option that the ISP memory 514 is implemented in the ISP processor 510. In a manner similar to the manner in which the ISP 321 connects to the biometric and optical navigation subsystem processor 410, the ISP 321 may connect to the microprocessor 328 using an I2C bus. - In overview, the biometric and
optical navigation subsystem 322 may be used when obtaining a candidate fingerprint for the mobile communication device 100 (e.g., for authentication purposes). To accommodate such use of the biometric and optical navigation subsystem 322, the biometric and optical navigation subsystem 322 may be adapted, according to the present disclosure, to automatically adjust a processed image sensor window from a first set of window dimensions (e.g., 16 pixels by 16 pixels for optical navigation) to a second set of window dimensions (e.g., 512 pixels by 64 pixels for fingerprint capture). Alternatively, a single, static set of window dimensions may be employed in conjunction with stitching algorithms, or other methods suitable for forming candidate fingerprint images by combining a plurality of images obtained using the static set of window dimensions. - The
mobile communication device 100 may be arranged to have various security modes including, for example, a locked mode and an unlocked mode. - In locked mode, a user has limited access to the functions of the
mobile communication device 100. The mobile communication device 100 may be capable of receiving incoming telephone calls or may be capable of placing an emergency telephone call while in locked mode. Additionally, the mobile communication device 100 may be capable of receiving messages (e.g., e-mail messages, Short Messaging Service messages, instant messenger messages, etc.), but the messages may not be viewed by the user while the device is in locked mode. The mobile communication device 100 can do little else but provide a dialog indicating that the mobile communication device 100 is in locked mode and indicating the action that is to be taken by the user to change the mode of the mobile communication device 100 to unlocked mode. Upon changing the mode of the mobile communication device 100 to unlocked mode, the user may be provided much greater access to the functionality of the mobile communication device 100. - When the
mobile communication device 100 is in locked mode, there may exist a requirement that a user provide, to the mobile communication device 100, one or more forms of authentication input before the mobile communication device 100 will change over to unlocked mode. Such authentication input may take the form of input provided on the keyboard 124, where the input may be a password formed of alphanumeric characters and/or symbols. Alternatively or additionally, such authentication input may take the form of biometric input. Biometric input may include one or more fingerprints, retinal scans, face geometry scans, hand geometry scans, voice prints or speech prints, etc.
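A device-unlock gate that combines a password with a fingerprint degree of match, as the description later contemplates, might be sketched as follows. All names and the 0.8 score threshold are illustrative assumptions; the text only requires that one or more forms of authentication input be verified, and that a candidate fingerprint be compared to a stored template.

```python
def password_ok(entered: str, stored: str) -> bool:
    """Placeholder check; a real device would compare salted hashes,
    never plaintext passwords."""
    return entered == stored

def fingerprint_score(candidate: set, template: set) -> float:
    """Toy degree of match: the fraction of template features that were
    also detected in the candidate fingerprint."""
    return len(candidate & template) / len(template)

def may_unlock(entered, stored, candidate, template, threshold=0.8):
    # Require both forms of authentication input, as the unlock
    # dialog described in the text may demand.
    return (password_ok(entered, stored)
            and fingerprint_score(candidate, template) >= threshold)

# Hypothetical template features (e.g. minutiae labels) and two scans.
template = {"ridge_end_12_40", "bifurcation_33_87", "ridge_end_54_19", "core_64_64"}
good_scan = set(template)            # all template features detected
poor_scan = {"ridge_end_12_40"}      # too few features detected
```

With these inputs, only the combination of the correct password and the complete scan passes the gate; either a wrong password or a sparse scan leaves the device locked.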
- One example fingerprint sensor has a bar shape. A silicon sensor constructs a fingerprint as a user swipes a finger across the bar. Another example fingerprint sensor has a pad shape. A sensor constructs a fingerprint as a user holds a finger on the pad, which is designed with a size to accommodate an entire fingerprint.
- It is proposed herein to employ the biometric and
optical navigation subsystem 322 in place of a dedicated fingerprint sensor. In an adjustable window mode implementation, employment of the biometric andoptical navigation subsystem 322 uses a control system that can adjust a processed image sensor window from a first set of window dimensions (e.g., 16×16 pixels for optical navigation) to a second set of window dimensions (e.g., 512×64 pixels for fingerprint capture). In a small-window stitching implementation, a single, static set of window dimension may be employed in conjunction with stitching algorithms, or other methods suitable for forming candidate fingerprint images by combining a plurality of images obtained using the static set of window dimension. - In operation of the adjustable window mode implementation, in view of
FIG. 6, the biometric and optical navigation subsystem 322 may be arranged to normally operate (step 602) in navigation mode with the first set of window dimensions (e.g., 16×16 pixels). Periodically, the biometric and optical navigation subsystem processor 410 may determine (step 604) whether fingerprint scanning mode is to be activated. - The
image sensor 406 may, for example, have a resolution of 500 dots per inch (dpi). It follows then that, for a scanning area of approximately one square inch, the image sensor 406 may have dimensions defined as 512×384 pixels. The dimensions may be reduced through use of appropriate optics. - The
windowing module 408 allows for “windowing”, which is also called “selective area processing”. Accordingly, the windowing module 408 allows the ISP 321 to specify a window size and a window placement on the image sensor 406 at which images are captured. More particularly, the ISP 321 transmits a windowing instruction to the biometric and optical navigation subsystem processor 410, the windowing instruction specifying, for example, a window size and a window placement. Upon receipt of the instruction, the biometric and optical navigation subsystem processor 410 may instruct the windowing module 408 accordingly. The windowing module 408 may then operate the image sensor 406 to capture images inside a window with the properties specified by the windowing instruction. - The windowing instruction may also specify a frame rate. The
image sensor 406 may, for example, support a frame rate of 3000 frames/sec at a resolution of 16×16 pixels, which is well suited to navigation mode. These parameters can be changed while maintaining a specific performance. Performance of the image sensor 406 may be measured in counts per inch (cpi). An example specific performance is 1200 cpi. - The
image sensor 406 may, for another example, support a frame rate of 60 frames/sec at a resolution of 512×64 pixels, which is well suited to fingerprint scanning mode. - In the navigation mode, the
windowing module 408 may, for example, instruct the image sensor 406 to record images with a navigation mode window 702A that is 16 pixels wide and 16 pixels high, as illustrated in FIG. 7A. - In the navigation mode, the
image sensor 406 may transfer images captured through the navigation mode window 702A to the biometric and optical navigation subsystem processor 410 for processing and storage. The biometric and optical navigation subsystem processor 410 may process the images and transmit an indication, to the ISP 321 for passing to the microprocessor 328, of a movement of the user's finger. The processing aims to resolve details on the surface of a finger to determine which direction a user's finger is moving as the user's finger travels over the navigation lens 106. Such processing typically comprises processing the received image in the context of a plurality of previously received images to determine a change (also known as a “delta”, represented as “Δ”) in one linear direction (say, an “x” direction) and in another linear direction (say, a “y” direction), which is often perpendicular to the x direction. Accordingly, the indication that the biometric and optical navigation subsystem processor 410 transmits to the ISP processor 510 may include a value for each of Δx and Δy as well as a rate of change of position. That is, the indication may include a representation of a direction of motion of the finger along with a representation of the speed of the motion of the finger. The ISP processor 510 may, in turn, pass the indication to the microprocessor 328. - In the fingerprint scanning mode, the
windowing module 408 may, for example, instruct the image sensor 406 to record images with a fingerprint scanning mode window 702B having a width of 512 pixels and a height of 64 pixels, as illustrated in FIG. 7B. - In the fingerprint scanning mode, the
image sensor 406 may transfer images captured through the fingerprint scanning mode window 702B to the biometric and optical navigation subsystem processor 410. The biometric and optical navigation subsystem processor 410, in turn, transfers the images to the ISP 321 for processing and storage. The ISP processor 510 may process the images and provide, to the microprocessor 328, a candidate digital fingerprint, as will be described hereinafter. - In view of
FIG. 6, if the biometric and optical navigation subsystem processor 410 determines (step 604) that fingerprint scanning mode is not to be activated, operation (step 602) in navigation mode continues. However, the biometric and optical navigation subsystem processor 410 may determine (step 604) that fingerprint scanning mode is to be activated, for example, due to receipt of an instruction from the ISP processor 510 to activate fingerprint scanning mode. - The instruction from the
ISP processor 510 may be transmitted responsive to receipt, by the ISP processor 510, of an instruction from the microprocessor 328. In one implementation, the microprocessor 328 determines that there is a cause for the biometric and optical navigation subsystem 322 to operate in fingerprint scanning mode and indicates the fingerprint scanning mode in the instruction to the ISP 321. - Accordingly, in one implementation, determining (step 604) whether fingerprint scanning mode is to be activated comprises reviewing an instruction received from the
ISP processor 510 for an indication that fingerprint scanning mode is to be activated. - Upon determining (step 604) that fingerprint scanning mode is to be activated, the biometric and optical
navigation subsystem processor 410 may initialize (step 606) operation of the windowing module 408 in a fingerprint scanning mode. - The biometric and
optical navigation subsystem 322 may continue to operate in the fingerprint scanning mode until the biometric and optical navigation subsystem processor 410 determines (step 608) that fingerprint scanning mode is to be de-activated. - Upon determining (step 608) that fingerprint scanning mode is to be de-activated, the biometric and optical
navigation subsystem processor 410 may return to operation (step 602) in navigation mode. - As illustrated in
FIG. 3, the biometric and optical navigation subsystem 322 and the photography subsystem 320 connect to the same ISP 321. FIG. 8 illustrates example steps in a method of operation for the ISP processor 510. - It is proposed herein to configure the
ISP processor 510 with image stitching algorithms such that the ISP processor 510 may, upon receipt (step 802) of the multiple partial digital images of the finger from the biometric and optical navigation subsystem 322, combine the multiple partial digital images to form (step 804) a candidate digital fingerprint. The ISP processor 510 may then determine (step 806) whether the formed candidate digital fingerprint is suitable for use by the authentication module 330C. - A typical fingerprint identification algorithm seeks to detect, in a candidate digital fingerprint, a number of “features” suitable to enable an adequate level of discrimination relative to other fingerprints not from the enrolled user. A candidate digital fingerprint may be determined (step 806) to be suitable for use by the
authentication module 330C based on the detection, in the candidate digital fingerprint, of a number of features exceeding a predetermined threshold number of features. The predetermined threshold number of features is dependent on the desired level of discrimination (say, 1 in 1,000, 1 in 10,000, 1 in 1,000,000, etc.). - Upon determining (step 806) that the candidate digital fingerprint is not yet suitable for use by the
authentication module 330C, the ISP processor 510 may transmit (step 808) a request to the biometric and optical navigation subsystem 322 for further partial digital images. Upon determining (step 806) that the candidate digital fingerprint is suitable for use by the authentication module 330C, the ISP processor 510 may transmit (step 810), to the biometric and optical navigation subsystem 322, a confirmation that the fingerprint has been successfully captured. The ISP processor 510 may then transmit (step 812) the candidate digital fingerprint to the microprocessor 328. - Upon receiving the candidate digital fingerprint, the
microprocessor 328 may implement the authentication module 330C to perform a conventional comparison of the received candidate digital fingerprint to a previously stored template digital fingerprint. The result of the comparison may be represented by a value representative of a degree of match between the candidate and the template. Based on the degree of match, the microprocessor 328 may unlock the mobile communication device 100 for use by the user. - As will be clear to a person of ordinary skill in the art, an authentication dialog presented on the
display 126 under control of the microprocessor 328 may indicate a requirement for a combination of a password and one or more fingerprints to successfully unlock the mobile communication device 100. The use of a single fingerprint has been described above for simplicity of presentation. - In an alternative implementation, in fingerprint scanning mode, the actions of the
image sensor 406 when obtaining a candidate digital fingerprint may be described as obtaining a high-resolution raster scan as a user's finger passes over the navigation lens 106. The term “high-resolution” is used relative to the resolution used when the biometric and optical navigation subsystem 322 is operating in navigation mode; “resolution” here refers to the amount of information collected by the image sensor 406. - In an alternative implementation, the navigation mode may be termed a low-resolution full-frame mode.
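The FIG. 8 stitch-and-test loop (steps 802 through 812) described above can be sketched as follows. The sensor interface, the `stitch()` and `count_features()` helpers, and the threshold value are hypothetical stand-ins introduced for illustration; the patent does not name such elements.

```python
THRESHOLD_FEATURES = 20  # assumed value; in practice set by the desired discrimination level

def capture_fingerprint(sensor, stitch, count_features):
    """Accumulate partial images until the stitched candidate is suitable."""
    partials = []
    while True:
        partials.extend(sensor.read_partials())             # step 802: receive partial images
        candidate = stitch(partials)                        # step 804: form candidate fingerprint
        if count_features(candidate) > THRESHOLD_FEATURES:  # step 806: suitability test
            sensor.confirm_captured()                       # step 810: confirm successful capture
            return candidate                                # step 812: pass to the microprocessor
        sensor.request_more_images()                        # step 808: ask for further partials
```

The loop terminates only when the stitched candidate clears the feature threshold, which is why the effective capture time varies with the finger's path over the lens.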
- In a small-window stitching implementation, a single, static set of window dimensions (for example, 20 pixels×20 pixels) may be employed in conjunction with stitching algorithms suitable for forming a candidate fingerprint image (for example, 200 pixels×200 pixels) by combining a plurality of 20×20 images.
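As a minimal sketch of the geometry only, a 200×200 candidate can be assembled from 20×20 tiles. Placement by known offsets is an assumption for illustration; a real stitcher would align tiles by image content and handle overlap.

```python
TILE, CANVAS = 20, 200  # window and candidate dimensions from the example above

def blank_canvas():
    """A 200x200 grid of empty pixels, represented as nested lists."""
    return [[None] * CANVAS for _ in range(CANVAS)]

def paste_tile(canvas, tile, x, y):
    """Copy a TILE x TILE window into the canvas with top-left corner at (x, y)."""
    for r in range(TILE):
        for c in range(TILE):
            canvas[y + r][x + c] = tile[r][c]
```

Covering the canvas without gaps takes (200/20)² = 100 such tiles; in practice the tiles overlap, and their offsets would come from the estimated finger motion.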
- In operation of the small-window stitching implementation, a finger is moved over the surface of the
navigation lens 106. The biometric and optical navigation subsystem processor 410 may control the windowing module 408 to maintain a specifically sized window at the image sensor 406. The image sensor 406 captures a plurality of images and transfers the images to the biometric and optical navigation subsystem processor 410. A feature of the operation of the small-window stitching implementation is a lack of distinction between a fingerprint scanning mode and a navigation mode.
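In this mode, each captured image can serve both as raw fingerprint data and as input to motion estimation against its predecessor. The sketch below uses a sum-of-absolute-differences shift search as the motion estimator; the patent does not specify the algorithm, and the callback names are hypothetical.

```python
def estimate_shift(prev, curr, max_shift=2):
    """Return the (dx, dy) that best aligns curr with prev (grayscale nested lists)."""
    h, w = len(prev), len(prev[0])
    best, best_score = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = count = 0
            # Sum absolute pixel differences over the region where the shifted frames overlap.
            for y in range(max(0, dy), min(h, h + dy)):
                for x in range(max(0, dx), min(w, w + dx)):
                    err += abs(curr[y][x] - prev[y - dy][x - dx])
                    count += 1
            score = err / count if count else float("inf")  # normalize by overlap area
            if score < best_score:
                best_score, best = score, (dx, dy)
    return best

def handle_image(image, prev, send_image, send_motion):
    """Forward each image and, when a predecessor exists, a motion indication."""
    send_image(image)                              # image passed on for stitching
    if prev is not None:
        send_motion(*estimate_shift(prev, image))  # finger-motion deltas
    return image                                   # becomes prev for the next image
```

The same frame thus feeds fingerprint capture and cursor navigation simultaneously, matching the lack of a mode distinction noted above.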
FIG. 9 presents example steps in a method of processing images received at the biometric and optical navigation subsystem processor 410 from the image sensor 406. The biometric and optical navigation subsystem processor 410 receives (step 902) an image and transmits (step 904) the image to the ISP 321. The biometric and optical navigation subsystem processor 410 also processes (step 906) the image to determine representations of finger motion and transmits (step 908) a finger motion indication to the ISP 321. The finger motion indication may include a representation of the direction of motion of the finger along with a representation of the speed of the motion of the finger. - The
ISP 321 may then carry out the method of image processing presented above in view of FIG. 8. Upon receipt (step 802) of the multiple partial digital images of the finger from the biometric and optical navigation subsystem 322, the ISP processor 510 may combine the multiple digital images to form (step 804) a candidate digital fingerprint. The ISP processor 510 may then determine (step 806) whether the formed candidate digital fingerprint is suitable for use by the authentication module 330C. - Upon determining (step 806) that the candidate digital fingerprint is not yet suitable for use by the
authentication module 330C, the ISP processor 510 may transmit (step 808) a request to the biometric and optical navigation subsystem 322 for further partial digital images. Upon determining (step 806) that the candidate digital fingerprint is suitable for use by the authentication module 330C, the ISP processor 510 may transmit (step 810), to the biometric and optical navigation subsystem 322, a confirmation that the fingerprint has been successfully captured. The ISP processor 510 may then transmit (step 812) the candidate digital fingerprint to the microprocessor 328. - Successive images are stitched together until the
ISP processor 510 determines (step 806) that the candidate digital fingerprint is suitable. Accordingly, even though an actual frame rate of capture of small-window images may remain constant, an “effective” frame rate of acceptable larger-window images is expected to be variable. Such an “effective” frame rate may be determined based on several independent parameters, one such parameter being the exact path of finger motion. - Upon receiving the candidate digital fingerprint, the
microprocessor 328 may implement the authentication module 330C to perform a conventional comparison of the received candidate digital fingerprint to a previously stored template digital fingerprint. The result of the comparison may be represented by a value representative of a degree of match between the candidate and the template. Based on the degree of match, the microprocessor 328 may unlock the mobile communication device 100 for use by the user. - In a further alternative implementation, rather than transfer (step 812) the candidate digital fingerprint to the
microprocessor 328, the ISP processor 510 may perform (not shown) a matching function. Provided that the ISP processor 510 has access to an appropriate fingerprint template, the ISP processor 510 may attempt to determine whether the candidate digital fingerprint stitched together from multiple partial images is a match for the fingerprint template. That is, the ISP processor 510 may perform a conventional comparison of the received candidate digital fingerprint to a previously stored template digital fingerprint. The result of the comparison may be represented by a value representative of a degree of match between the candidate and the template. The ISP processor 510 may then transmit an indication of the degree of match to the microprocessor 328. Based on the degree of match received from the ISP 321, the microprocessor 328 may unlock the mobile communication device 100 for use by the user. - In conjunction with handling images received from the biometric and optical
navigation subsystem processor 410, the ISP 321 may also handle the deltas received from the biometric and optical navigation subsystem processor 410. In one case, the ISP 321 may simply pass the deltas to the microprocessor 328. Alternatively, the biometric and optical navigation subsystem processor 410 may, as discussed above, maintain a direct connection to the microprocessor 328. In such a case, the biometric and optical navigation subsystem processor 410 may transfer the deltas directly to the microprocessor 328, bypassing the ISP 321. - In use, when the
mobile communication device 100 is in locked mode, the fingerprint scanning function may be recognized as more helpful than the navigation function, even when both functions are available simultaneously, as may be the case in the small-window stitching implementation. Upon capturing a suitable candidate digital fingerprint and successfully authenticating a user, i.e., when the mobile communication device 100 is in unlocked mode, the navigation function may be recognized as more helpful than the fingerprint scanning function. - In an implementation made possible by the methods and hardware discussed above, a user may be allowed to gain access to operation of the mobile communication device merely by providing a correctly authenticated password. Thereafter, i.e., once the
mobile communication device 100 has completed a transition into unlocked mode and the user's primary use of the biometric and optical navigation subsystem 322 is for the navigation function, the biometric and optical navigation subsystem 322 may continue to transmit images to the ISP 321. At the ISP 321, and unbeknownst to the user, the images may be processed to obtain a candidate digital fingerprint that can be used to authenticate the user. - Upon successful authentication, the user's interaction with the
mobile communication device 100 may be allowed to continue unabated. However, upon authentication failure, the mobile communication device 100 may be returned to locked mode. - The above-described implementations of the present application are intended to be examples only. Alterations, modifications and variations may be effected to the particular implementations by those skilled in the art without departing from the scope of the application, which is defined by the claims appended hereto.
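The degree-of-match decision that drives both the initial unlock and the silent recheck described above can be sketched as follows, representing fingerprints as sets of minutiae descriptors. The set representation and the 0.8 threshold are assumptions for illustration only; the patent requires merely a value representative of a degree of match.

```python
def degree_of_match(candidate, template):
    """Fraction of template minutiae also present in the candidate (0.0 to 1.0)."""
    if not template:
        return 0.0
    return len(candidate & template) / len(template)

class Device:
    def __init__(self):
        self.mode = "unlocked"  # the user has already supplied a correct password

    def silent_recheck(self, candidate, template, threshold=0.8):
        # Background check, unbeknownst to the user: lock on authentication failure.
        if degree_of_match(candidate, template) < threshold:
            self.mode = "locked"
        return self.mode
```

A passing recheck leaves the user's interaction undisturbed; a failing one returns the device to locked mode, as the scenario above describes.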
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/793,960 US8792683B2 (en) | 2010-06-04 | 2010-06-04 | Fingerprint scanning with optical navigation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/793,960 US8792683B2 (en) | 2010-06-04 | 2010-06-04 | Fingerprint scanning with optical navigation |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110300830A1 true US20110300830A1 (en) | 2011-12-08 |
US8792683B2 US8792683B2 (en) | 2014-07-29 |
Family
ID=45064826
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/793,960 Active 2031-12-19 US8792683B2 (en) | 2010-06-04 | 2010-06-04 | Fingerprint scanning with optical navigation |
Country Status (1)
Country | Link |
---|---|
US (1) | US8792683B2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9760755B1 (en) * | 2014-10-03 | 2017-09-12 | Egis Technology Inc. | Fingerprint matching methods and device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0905646A1 (en) | 1997-09-30 | 1999-03-31 | Compaq Computer Corporation | Pointing and fingerprint identifier mechanism for a computer system |
KR100325381B1 (en) | 2000-02-11 | 2002-03-06 | 안준영 | A method of implementing touch pad using fingerprint reader and a touch pad apparatus for functioning as fingerprint scan |
JP2006107366A (en) | 2004-10-08 | 2006-04-20 | Fujitsu Ltd | Living body information input device, living body authentication device, living body information processing method, living body information processing program and computer readable recording medium with the program recorded thereon |
- 2010-06-04: US application US12/793,960 filed (granted as patent US8792683B2, status: Active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020067408A1 (en) * | 1997-10-06 | 2002-06-06 | Adair Edwin L. | Hand-held computers incorporating reduced area imaging devices |
US20050156882A1 (en) * | 2003-04-11 | 2005-07-21 | Microsoft Corporation | Self-orienting display |
US20060152488A1 (en) * | 2005-01-12 | 2006-07-13 | Kenneth Salsman | Electronic equipment for handheld vision based absolute pointing system |
US7852317B2 (en) * | 2005-01-12 | 2010-12-14 | Thinkoptics, Inc. | Handheld device for handheld vision based absolute pointing system |
US20090160769A1 (en) * | 2007-12-19 | 2009-06-25 | Lowles Robert J | Input Mechanism for Handheld Electronic Communication Device |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120105375A1 (en) * | 2010-10-27 | 2012-05-03 | Kyocera Corporation | Electronic device |
US20130027400A1 (en) * | 2011-07-27 | 2013-01-31 | Bo-Ram Kim | Display device and method of driving the same |
US9479344B2 (en) | 2011-09-16 | 2016-10-25 | Telecommunication Systems, Inc. | Anonymous voice conversation |
US9326143B2 (en) | 2011-12-16 | 2016-04-26 | Telecommunication Systems, Inc. | Authentication via motion of wireless device movement |
US20130160088A1 (en) * | 2011-12-16 | 2013-06-20 | Keith A. McFarland | Authentication Via Motion of Wireless Device Movement |
US8984591B2 (en) * | 2011-12-16 | 2015-03-17 | Telecommunications Systems, Inc. | Authentication via motion of wireless device movement |
US9338153B2 (en) | 2012-04-11 | 2016-05-10 | Telecommunication Systems, Inc. | Secure distribution of non-privileged authentication credentials |
US20130298224A1 (en) * | 2012-05-03 | 2013-11-07 | Authentec, Inc. | Electronic device including a finger sensor having a valid authentication threshold time period and related methods |
US10380402B2 (en) * | 2012-09-28 | 2019-08-13 | Synaptics Incorporated | Low power navigation devices, systems and methods |
US9301191B2 (en) | 2013-09-20 | 2016-03-29 | Telecommunication Systems, Inc. | Quality of service to over the top applications used with VPN |
WO2015116403A1 (en) * | 2014-01-30 | 2015-08-06 | Qualcomm Incorporated | Dynamic keyboard and touchscreen biometrics |
US20150213245A1 (en) * | 2014-01-30 | 2015-07-30 | Qualcomm Incorporated | Dynamic keyboard and touchscreen biometrics |
CN106415570A (en) * | 2014-01-30 | 2017-02-15 | 高通股份有限公司 | Dynamic keyboard and touchscreen biometrics |
US9747428B2 (en) * | 2014-01-30 | 2017-08-29 | Qualcomm Incorporated | Dynamic keyboard and touchscreen biometrics |
Also Published As
Publication number | Publication date |
---|---|
US8792683B2 (en) | 2014-07-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8792683B2 (en) | Fingerprint scanning with optical navigation | |
US10452829B2 (en) | Key with integral biometric input device | |
US9977944B2 (en) | Determining fingerprint scanning mode from capacitive touch sensor proximate to lens | |
US7988037B2 (en) | Device and method for contact information exchange | |
US7865174B2 (en) | Establishing a collaborative domain among communication terminals responsive to authentication | |
US20110013813A1 (en) | Electronic device having authentication function and authentication method | |
CA2719862C (en) | Device and method for contact information exchange | |
US9013581B2 (en) | Associating a work with a biometric indication of the identity of an author | |
CN108075899B (en) | Identity authentication method, mobile terminal and computer readable storage medium | |
CA2740624C (en) | Fingerprint scanning with a camera | |
CN106570383A (en) | Unlocking method and apparatus for mobile terminal, and mobile terminal | |
CA2754314C (en) | Keyboard having key with integral biometric input device | |
CN109844764A (en) | The verification method and terminal of fingerprint sensor function | |
EP2634719B1 (en) | System and method of providing biometric quick launch | |
CN106682609A (en) | Detecting method and device for legality of additionally recorded fingerprints | |
CA2742027C (en) | Fingerprint scanning with optical navigation | |
US20150022635A1 (en) | Using multiple flashes when obtaining a biometric image | |
WO2022032680A1 (en) | Operation method, terminal, and computer storage medium | |
US20110216154A1 (en) | Apparatus and method for omnidirectional caller detection in video call system | |
JP2002261914A (en) | Portable communication terminal having user certification function | |
WO2022252812A1 (en) | Information protection method and electronic device | |
CN110619201A (en) | Terminal control method, terminal and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RESEARCH IN MOTION LIMITED, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RAMRATTAN, COLIN SHIVA;HOSSEINPOR, HASSAN;TOWNSEND, GRAHAM CHARLES;AND OTHERS;SIGNING DATES FROM 20100603 TO 20100625;REEL/FRAME:024727/0271 |
|
AS | Assignment |
Owner name: BLACKBERRY LIMITED, ONTARIO Free format text: CHANGE OF NAME;ASSIGNOR:RESEARCH IN MOTION LIMITED;REEL/FRAME:032836/0938 Effective date: 20130709 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
AS | Assignment |
Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064104/0103 Effective date: 20230511 |
|
AS | Assignment |
Owner name: MALIKIE INNOVATIONS LIMITED, IRELAND Free format text: NUNC PRO TUNC ASSIGNMENT;ASSIGNOR:BLACKBERRY LIMITED;REEL/FRAME:064270/0001 Effective date: 20230511 |