WO2006053375A1 - Computer-based method and system for identifying a potential partner - Google Patents

Computer-based method and system for identifying a potential partner

Info

Publication number
WO2006053375A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
biometric data
database
face
server
Prior art date
Application number
PCT/AU2005/001733
Other languages
French (fr)
Inventor
Sinisa Cupac
Original Assignee
Sinisa Cupac
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2004906566A external-priority patent/AU2004906566A0/en
Application filed by Sinisa Cupac filed Critical Sinisa Cupac
Priority to AU2005306571A priority Critical patent/AU2005306571A1/en
Publication of WO2006053375A1 publication Critical patent/WO2006053375A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management

Definitions

  • the present invention relates generally to the field of communications. More particularly the invention comprises a method and system for identifying a potential partner using a computer-based system.
  • BACKGROUND TO THE INVENTION Dating and introduction services have historically relied on approaches whereby a client completes a questionnaire to ascertain and classify their interests, hobbies, likes, and dislikes.
  • the dating service will then cross match the questionnaire responses of the new client with those of existing clients in an effort to identify a number of potentially compatible individuals for the new client.
  • this approach is convenient only if the database of the dating service is quite small, since many people share interests such as reading, going to the movies etc.
  • the database is large, there will be many potential matches for a new client especially where the new client provides fairly generic responses to the questionnaire (as most are wont to do). Accordingly, a problem is that a large number of potential partners will be identified on the basis that they have provided a similarly generic response.
  • a further problem is that even where the new client is prepared to screen a very large number of potential partners identified on the basis of a questionnaire with a low discriminatory power, or is prepared to complete a more detailed questionnaire to provide a higher level of discrimination, the potential partners identified can often disappoint.
  • FIG 1 shows a schematic diagram of a generic biometrics-based system.
  • FIG 2 shows a schematic diagram of the communication paths between the face recognition software (BioID), the web server, the database server, the website (front end), and the BioID server.
  • FIG 3 shows a schematic representation of the system process, including interaction with the BioID components.
  • FIG 4 is a flow chart representing the system overview.
  • FIGS 5 and 6 represent the database structure in the form of an entity relationship diagram.
  • FIG 7 shows a flow chart representation of the user registration process.
  • FIG 8 shows a flow chart representation of the method by which a user creates a user profile.
  • FIG 9 shows an overview of the photo uploading and matching process.
  • FIG 10 shows a flow chart representation of the process by which a user logs into their account and uploads a photo via a website.
  • FIG 11 shows the process by which, after upload of the user's photo, the image is passed to BioID for enrolment.
  • FIG 12 shows a flow chart representation for uploading a user's image by mobile telephony device.
  • FIG 13 shows SMS notification, confirming upload of photograph as shown in FIG 12.
  • FIG 14 shows a flow chart representation of the process for matching a user with another user on the database having similar facial features.
  • FIG 15 shows delivery of matches to the user via web page.
  • FIG 16 shows delivery of matches to the user via mobile telephony device.
  • FIG 17 shows a schematic structure of an intermediate interface for the interaction between BioID components, the database, and the client browser.
  • the present invention provides a method for identifying a potential partner for a user, the method including the steps of: providing biometric data characterising a physical feature of the user and/or a parent of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, and identifying at least one individual characterised by biometric data that is the same or similar to that of the user.
  • biometric data for screening a database of potential partners may dramatically cut down on the number of potential partners identified as potentially compatible with a user.
  • potential partners identified by the methods and systems described herein may be of greater compatibility than those obtainable by methods of the prior art where no biometric comparisons are made between individuals.
  • the invention is proposed to rely on the subconscious desire for people to form relationships with others having similar physical characteristics or a similar level of attractiveness as their own. It is a widespread belief that human partners look alike. Positive assortative mating, mating with partners more similar than expected by chance, may result in more stable partnerships and may have genetic benefits, although costs of inbreeding may limit the amount of self-similarity that should be tolerated. Research has shown positive assortment for many physical features, and partners' faces resemble each other in ways that allow them to be identified as partners at levels above chance.
  • the level of similarity is not so great that the user is presented with potential partners who have a similarity so high that the user could view them as a relative.
  • humans, like many other animals, instinctively avoid mating with relatives for the simple biological reason of avoiding congenital disorders in their offspring.
  • the avoidance of exact matches is a point of difference from methods in the prior art that use biometric data for validation of an individual's identity.
  • the methods of the present invention are not directed to that result, since only similarities are required between the user and potential partner rather than strict congruence.
  • the present invention also provides the use of face recognition software for identifying a potential partner for a user.
  • the biometric data relates to the position or shape of anatomical features of the face and/or head such as the eyes, ears, nose and mouth.
  • the biometric data relates to the colouring of features such as the skin, eyes and hair.
  • the present invention provides a computer-based system capable of identifying a potential partner for a user.
  • the computer-based system includes software capable of executing a method as described herein.
  • the system may be implemented over the Internet, incorporated into a standard Internet match making website.
  • the system includes six major components.
  • the first component of an automated biometric identification/verification system is a data acquisition component that acquires the biometric data in digital format by using a sensor.
  • the sensor is typically a digital camera. This component is optional since the user may use their own digital camera.
  • the second and third components of the system are also optional. They are the data compression and decompression mechanisms, which are designed to meet the data transmission and storage requirements of the system.
  • the fourth component is the feature extraction algorithm.
  • the feature extraction algorithm may produce a feature vector, in which the components are numerical characterizations of the underlying biometrics.
  • the feature vectors are designed to characterize the underlying biometrics of the user for comparison with potential partners on the database.
  • the fifth component of the system is the "matcher," which compares feature vectors obtained from the feature extraction algorithm to produce a similarity score. This score indicates the degree of similarity between a pair of biometrics data under consideration.
  • the sixth component of the system is a decision-maker whereby a decision is made as to whether an individual on the database has sufficient biometric similarity to the user to warrant an introduction.
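  • By way of illustration only, the following sketch shows how the fourth, fifth and sixth components might fit together: a feature vector, a matcher producing a similarity score, and a decision-maker applying a threshold. The similarity measure, threshold value and data are illustrative assumptions, not taken from any particular face recognition package.

```csharp
// Illustrative sketch only (not the BioID SDK): feature vector -> matcher -> decision.
using System;

class MatcherSketch
{
    // Matcher: cosine similarity between two feature vectors, mapped to the range 0..1.
    static double Similarity(double[] a, double[] b)
    {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        double cosine = dot / (Math.Sqrt(na) * Math.Sqrt(nb)); // -1..1
        return (cosine + 1.0) / 2.0;                           // 0..1
    }

    // Decision-maker: is the individual similar enough to warrant an introduction?
    static bool Introduce(double score, double threshold = 0.6) => score >= threshold;

    static void Main()
    {
        double[] user      = { 0.42, 0.61, 0.18, 0.77 }; // hypothetical feature vectors
        double[] candidate = { 0.40, 0.66, 0.20, 0.70 };

        double score = Similarity(user, candidate);
        Console.WriteLine($"similarity = {score:F2}, introduce = {Introduce(score)}");
    }
}
```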
  • the present invention provides a method for identifying a potential partner for a user and/or a parent of the user, the method including the steps of: providing biometric data characterising a physical feature of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, and identifying at least one individual characterised by biometric data that is similar to that of the user.
  • biometric data for screening a database of potential partners may dramatically cut down on the number of potential partners identified as potentially compatible.
  • the potential partners identified by the methods and systems described herein may be of greater compatibility than that obtainable by methods of the prior art.
  • the invention is proposed to rely on the subconscious desire for people to form relationships with others having similar physical characteristics or a similar level of attractiveness as their own.
  • the level of similarity between the user and a potential partner is not less than a minimum value, such that the user has an attraction or affinity for the potential partner. It is proposed that when the present methods are implemented, the attraction or affinity between the user and potential partner is greater than that where there is no comparison of biometric data.
  • the level of similarity may not be greater than a maximum value, such that the user has little or no attraction or affinity for the potential partner.
  • the attraction or affinity may be on a physical, emotional or spiritual basis. Alternatively, the affinity or attraction may only be at the level of friendship. However, it is preferred that the affinity or attraction is predominantly physical in nature, at least at first instance.
  • the biometric data relates to the face.
  • Facial biometric-based systems are often used in security settings such as airports, or for identity authentication applications. However, they have yet to fulfil their promise, given that they rarely achieve even 90% accuracy.
  • these facial biometric-based systems are nevertheless well suited to the present invention, since it is not necessary (or desired) for the software to find an exact match between two faces.
  • the similarity between two faces could be ascertained by "eigenface" technology. This methodology uses the whole face by slicing it into hundreds of gray-scale layers, each with distinctive features.
  • the invention relies on a computer-based comparison of two sets of biometrics data and making a decision about whether or not they relate to persons having similar facial features.
  • the computer may perform this function by providing a similarity measurement in the form of a numerical score which informs as to the similarity of the pair of underlying biometrics data.
  • the computer may generate a list of pair-wise biometrics data comparisons that are in an ascending order, commonly known as a candidate list.
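  • A minimal sketch of building such a candidate list is given below; the distance measure, record layout and the choice to sort in ascending order of distance (most similar first) are assumptions for illustration.

```csharp
// Sketch only: compare the user's feature vector against every record in the
// database and sort the pair-wise scores into a candidate list.
using System;
using System.Collections.Generic;
using System.Linq;

class CandidateListSketch
{
    record Candidate(int Id, double[] Features);

    // Euclidean distance: smaller means more similar.
    static double Distance(double[] a, double[] b) =>
        Math.Sqrt(a.Zip(b, (x, y) => (x - y) * (x - y)).Sum());

    static void Main()
    {
        double[] user = { 0.3, 0.8, 0.5 };
        var database = new List<Candidate>
        {
            new(1, new[] { 0.31, 0.79, 0.52 }),
            new(2, new[] { 0.90, 0.10, 0.40 }),
            new(3, new[] { 0.35, 0.70, 0.55 }),
        };

        // Ascending order of distance: the most similar candidates come first.
        var candidateList = database
            .Select(c => new { c.Id, Score = Distance(user, c.Features) })
            .OrderBy(x => x.Score)
            .ToList();

        foreach (var entry in candidateList)
            Console.WriteLine($"candidate {entry.Id}: distance {entry.Score:F3}");
    }
}
```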
  • the present invention is not limited to any particular facial feature mapping function, but can include any known or yet to be created algorithm, suitable for the purposes described herein, for recognizing facial features, whether two-dimensional or three-dimensional, and those features may then also be used for ranging functions. Further, according to the present invention, ranging algorithms are used in combination with known face recognition software.
  • facial recognition software packages utilise the position of various anatomical "landmarks" in deciding whether two faces are the same.
  • Software packages often define these landmarks as nodal points. There are about 80 nodal points on a human face.
  • Useful anatomical landmarks may be selected from the group including right eye pupil, left eye pupil, right mouth corner, left mouth corner, outer end of right eyebrow, inner end of right eyebrow, inner end of left eyebrow, outer end of left eyebrow, right temple, outer corner of right eye, inner corner of right eye, inner corner of left eye, outer corner of left eye, left temple, tip of nose, right nostril, left nostril, centre point on outer edge of upper lip, centre point on outer edge of lower lip, and tip of chin.
  • the various distances between these nodal points may be measured to create a numerical code that represents the face in a database. For some software packages only 14 to 22 nodal points are needed to complete the face matching process. For the purposes of the present invention, it is not necessarily desirable for the algorithm to identify a very similar face (such that the two faces could be considered to be of the same person). All that is required is for the algorithm to identify faces that are similar to that of the user. The skilled person could trial different numbers of nodal points to find an optimal number that gives the greatest attractiveness to the user.
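  • The following sketch illustrates, under assumed landmark names and coordinates, how pair-wise distances between nodal points might be turned into such a numerical code; normalising by the inter-pupil distance is one possible way to make the code independent of image scale.

```csharp
// Sketch: turn a set of facial "nodal points" into a numerical code made of
// pair-wise distances, normalised by the inter-pupil distance. Landmark names
// and coordinates are illustrative.
using System;
using System.Collections.Generic;
using System.Linq;

class NodalPointSketch
{
    record Point(double X, double Y);

    static double Dist(Point a, Point b) =>
        Math.Sqrt((a.X - b.X) * (a.X - b.X) + (a.Y - b.Y) * (a.Y - b.Y));

    // Build a feature vector from all pair-wise distances between landmarks.
    static double[] FaceCode(Dictionary<string, Point> lm)
    {
        double scale = Dist(lm["rightPupil"], lm["leftPupil"]); // normalising unit
        var names = lm.Keys.OrderBy(n => n).ToArray();
        var code = new List<double>();
        for (int i = 0; i < names.Length; i++)
            for (int j = i + 1; j < names.Length; j++)
                code.Add(Dist(lm[names[i]], lm[names[j]]) / scale);
        return code.ToArray();
    }

    static void Main()
    {
        var landmarks = new Dictionary<string, Point>
        {
            ["rightPupil"] = new(60, 80),
            ["leftPupil"]  = new(120, 80),
            ["noseTip"]    = new(90, 120),
            ["chinTip"]    = new(90, 180),
        };
        Console.WriteLine(string.Join(", ", FaceCode(landmarks).Select(d => d.ToString("F2"))));
    }
}
```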
  • Apart from the use of distances between nodal points, consideration could also be given to the shape of various anatomical structures, for example:
  • the shape of the jaw (square, pointed or rounded)
  • the shape of the eyes (round or almond)
  • the shape of the nose (wide or narrow)
  • Other anatomical structures including the eye socket, nostril, ear, chin, cheek, forehead, head, teeth, eyebrow and eyelash could be incorporated into the biometric comparison.
  • the present invention is not limited to the use of anatomical landmark, or anatomical structure information, but extends to the colour of the skin, hair or eyes.
  • the importance of these variables has been discovered during studies showing that humans select long-term partners who not only look like themselves, but look like their opposite sex parents. It has been discovered that men are attracted to women who look like their mothers and women prefer men who resemble their fathers. The same research has also shown that humans select partners who remind them of themselves, particularly in relation to traits such as hair and eye colour. Studies examining hair and eye colour have shown evidence of positive assortment which may reflect attraction to self-similar characteristics but is also consistent with attraction to parental traits (Little, A.C., Penton-Voak, I.S., Burt D. M.
  • the level of similarity between the user and the potential partner may be varied. In general, the higher the similarity the better. However, at a certain point, further similarity does not improve attractiveness, and may even decrease it. Without wishing to be limited by theory, it is thought that too similar a match will trigger the user's instinct to avoid mating with family members. This trigger point may differ depending on race, sex, individual preference etc. However, the skilled person could ascertain the point at which similarity is negatively correlated to attractiveness by simple trial and error.
  • the user may choose to exclude any of their own features from the similarity analysis. For example, if the user was particularly dissatisfied with the size or shape of their nose, this feature could be excluded from the similarity analysis such that potential partners having a similarly large or misshapen nose are excluded. Of course, this may lead to less than optimal matches being generated by the computer but the similarity with other features such as the eyes or mouth may still result in the user finding attractiveness in a less than optimal match.
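  • The two preceding ideas, a similarity band with lower and upper limits and the exclusion of user-selected features, are sketched below. The band limits (0.60 and 0.95) and the feature indices are illustrative placeholders, not values taken from this specification.

```csharp
// Sketch: (1) accept only matches inside a similarity band (similar enough to
// be attractive, not so similar as to look like a relative), and (2) let the
// user exclude chosen features (e.g. the nose) from the comparison.
using System;

class BandAndMaskSketch
{
    // Similarity over the features the user has NOT excluded.
    static double MaskedSimilarity(double[] a, double[] b, bool[] exclude)
    {
        double sumSq = 0; int used = 0;
        for (int i = 0; i < a.Length; i++)
        {
            if (exclude[i]) continue;          // skip features the user opted out of
            sumSq += (a[i] - b[i]) * (a[i] - b[i]);
            used++;
        }
        // Map distance to a 0..1 similarity (features assumed to lie in 0..1).
        return 1.0 - Math.Sqrt(sumSq / used);
    }

    static bool WithinBand(double s, double min = 0.60, double max = 0.95) =>
        s >= min && s <= max;

    static void Main()
    {
        double[] user      = { 0.5, 0.7, 0.2, 0.9 };
        double[] candidate = { 0.4, 0.8, 0.8, 1.0 };
        bool[]   exclude   = { false, false, true, false }; // e.g. index 2 = nose shape

        double s = MaskedSimilarity(user, candidate, exclude);
        Console.WriteLine($"similarity = {s:F2}, within band = {WithinBand(s)}");
    }
}
```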
  • Wiskott, L., et al, "Face Recognition by Elastic Bunch Graph Matching", Internal Report IR-INI 96-08, Institut für Neuroinformatik, Ruhr-Universität Bochum, pp. 1-21, Apr. 1996; Wiskott, L., "Labeled Graphs and Dynamic Link Matching for Face Recognition and Scene Analysis", Verlag Harri Deutsch, Thun-Frankfurt am Main, Reihe Physik, Dec. 1995, pp. 1-109; Wiskott, L., "Phantom Faces for Face Analysis".
  • Wiskott, L., "Phantom Faces for Face Analysis", Internal Report IR-INI 96-06, Institut für Neuroinformatik, Ruhr-Universität Bochum, Germany, Apr. 1996, 12 pp; Wiskott, L., "Phantom Faces for Face Analysis", Pattern Recognition, vol. 30, no. 6, pp. 837-846, 1997; Wiskott, L., et al, "Face Recognition by Elastic Bunch Graph Matching", IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(7), pp.
  • the present invention provides a computer-based system adapted to identify a potential partner for a user.
  • the system includes six major components depicted in FIG.1
  • the first component of an automated biometric identification/verification system is a data acquisition component that acquires the biometric data in digital format by using a sensor. For face images the sensor is typically a camera.
  • the second and third components of the system are optional. They are the data compression and decompression mechanisms, which are designed to meet the data transmission and storage requirements of the system.
  • the fourth component is the feature extraction algorithm.
  • the feature extraction algorithm may produce a feature vector, in which the components are numerical characterizations of the underlying biometrics.
  • the feature vectors are designed to characterize the underlying biometrics of the user for comparison with potential partners on the database. In general, the larger the size of a feature vector (without much redundancy), the higher its discrimination power.
  • the discrimination power is the difference between a pair of feature vectors representing two different individuals.
  • the fifth component of the system is the "matcher,” which compares feature vectors obtained from the feature extraction algorithm to produce a similarity score. This score indicates the degree of similarity between a pair of biometrics data under consideration.
  • the sixth component of the system is a decision-maker.
  • biometric data that may find use in the present invention is that of facial asymmetry.
  • a person may be attracted to a person having a similar level of asymmetry in their face. It has been demonstrated that the asymmetry of specific facial regions captures individual differences that are robust to variation in facial expression. It has been further shown that facial asymmetry provides discriminating power orthogonal to conventional face identification methods. The degree of asymmetry can be quantitated by consideration of two dimensional or three dimensional measurements of the face and head.
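  • One simple way such an asymmetry score might be computed from 2D landmarks is sketched below: each left-side landmark is mirrored about an estimated vertical midline and the distance to its right-side counterpart is averaged. The landmark pairs and midline estimate are illustrative assumptions.

```csharp
// Sketch: quantify facial asymmetry from paired 2D landmarks by mirroring the
// left-side point about the facial midline and measuring the miss distance.
using System;

class AsymmetrySketch
{
    // Each row pairs a left-side landmark with its right-side counterpart: {lx, ly, rx, ry}.
    static double AsymmetryScore(double[,] pairs)
    {
        int n = pairs.GetLength(0);

        // Estimate the vertical midline as the mean x of all paired points.
        double midX = 0;
        for (int i = 0; i < n; i++) midX += (pairs[i, 0] + pairs[i, 2]) / 2.0;
        midX /= n;

        // Reflect the left point about the midline and measure the displacement.
        double total = 0;
        for (int i = 0; i < n; i++)
        {
            double mirroredX = 2 * midX - pairs[i, 0];
            double dx = mirroredX - pairs[i, 2];
            double dy = pairs[i, 1] - pairs[i, 3];
            total += Math.Sqrt(dx * dx + dy * dy);
        }
        return total / n; // mean displacement; 0 would be a perfectly symmetric face
    }

    static void Main()
    {
        double[,] face = {
            { 70, 80, 112, 81 },   // left pupil / right pupil
            { 55, 78, 128, 83 },   // left outer eye corner / right outer eye corner
            { 75, 140, 108, 142 }, // left mouth corner / right mouth corner
        };
        Console.WriteLine($"asymmetry = {AsymmetryScore(face):F2}");

        // Two users with similar scores have a similar level of facial asymmetry.
    }
}
```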
  • Three dimensional (3D) face recognition has advantages over 2D face recognition since it compares the 3D shape of the face, which is invariant under different lighting conditions. As long as the illumination of the face is in a range which allows 3D reconstruction of a sufficiently large portion of the face, a detailed analysis of the face is possible.
  • the present invention also provides the use of face recognition software for identifying a potential partner for a user.
  • SecureIDent products by BioDentity Systems Corporation (Ottawa, Canada) encompass everything from hardware to middleware to specialized application software.
  • the Preprocessor offers comprehensive facial capture and image recognition support regardless of lighting and other environmental factors.
  • the software has the capability of separating the facial image from the background for easy processing of information.
  • the SecureIDent Preprocessor offers onboard processing, and can provide front-end enhancement to any other facial-recognition system or application.
  • Other offerings in the family include the SecureIDent Face Recognition Controller, which is the primary interface between the Preprocessors and the rest of the application; the SecureIDent Photo Enrolment System, which automatically optimizes all images to create a high-quality database; and the SecureIDent Search Engines, which compare biographical or face biometric details, or a combination of the two, against very large databases.
  • BioID America Inc: the Single Sign-On technology by BioID America Inc (NC) offers the ability to analyze face, voice, and lip movement simultaneously, and requires only a standard USB camera and microphone for implementation.
  • the BioID dataset features 1,521 gray level images showing a frontal view of a face of one out of 23 different test persons. Images are stored in single files using portable graymap (pgm) data format.
  • the FaceVACS-Logon system offers automatic facial identification. It may be integrated with conventional access control or time and attendance systems, and a combination with card terminals is possible for high-security areas. Users' faces are captured by video camera, and image-processing algorithms extract a feature set from the digitized image, which the software compares to the user's reference set, stored on the computer. Features of the system include flexible operating modes, which enable it to be used as a stand-alone facial-recognition solution.
  • the package includes standard Webcam support, and support of Windows 98/2000/NT/XP/Me.
  • the FaceTrac facial-recognition system (Graphco Technologies, PA) performs image capture, comparison against images in a database, and matching of images. This open system can incorporate facial-recognition engine components from vendors such as Viisage, Visionics, and AcSys Biometrics. FaceTrac can match the facial geometry of an individual against portraits in a database.
  • Face ID facial-recognition program uses biometrics in combination with parallel processing to match faces through a mathematical formula that uses the eyes as a reference point.
  • the formula generates a data record representing the face, which is used to compare against a digital database of enrolled images. Images from scanned photographs or video may be queried using the program, and millions of images may be searched to identify matches.
  • ID-2000 software (Imagis Technologies Inc., BC, Canada) captures, compares, and quickly and efficiently displays an individual's face against a database. It enables an individual to be matched in seconds using only an image or photograph as the primary search criterion.
  • the IRID face-recognition technology can perform infrared facial recognition as well as continuous condition monitoring of individuals by using passive infrared imaging that is non-contact, non-invasive, and works under any lighting conditions or in total darkness.
  • the Tridentity 3 Dimensional Face system offers a three-dimensional approach to facial recognition, and can analyze subtle features of the face like bone structure, and enables images to be rotated to offer a better view of the subject.
  • the technology uses patterned light to create a three-dimensional image of the face, and once an image is captured, a 3D representation of the subject's face can be built from a single frame of video footage.
  • the solution can operate on single or multiple scans, and each scan can be processed in under one second on a 400 MHz Pentium system.
  • the search database size is limited by disk space and processor speed only, and the system may be expanded to scale up to multiple cameras and workstations.
  • the system is based on an open architecture, uses COTS components, and may be easily integrated as a component of a larger system.
  • the FaceOn technology uses neural network and artificial intelligence techniques to capture faceprints and determine or verify identity.
  • the FaceOn Logon AdminTool enables complete faceprint enrollment, as well as adding and deleting faceprints from the database.
  • the enhanced visual Access Log enables administrators to keep track of all users' access settings.
  • the FaceOn Surveillance system may be integrated with various types of CCTV systems, and offers multiple, real-time image enrollment, retrieval, and recognition (using the Invariant Feature Analysis technology).
  • the Viisage face-recognition (FR) technology (Viisage Technology, MA) is based on an algorithm developed at MIT, and enables software to translate the characteristics of a face into a unique set of numbers called an eigenface. This is used by identification and verification systems for facial comparisons made in real time, and may be used with databases containing millions of faces.
  • the technology enables software to instantly calculate an individual's eigenface from live video or a still digital image and then search a database to find similar or matching images.
  • the family of products includes the FaceFINDER, FaceEXPLORER, FacePASS, FacePIN, and FaceTOOLS applications. They offer the ability to search large databases of images, and a software development kit for developing additional applications.
  • the FaceIt facial-recognition software engine from Visionics Corporation (NJ) enables computers to rapidly and accurately detect and recognize faces, for everything from ID solutions to banking and e-commerce applications.
  • the software can detect one or multiple faces, and can also provide one-to-one or one-to-many matching. It also evaluates the quality of the image and prompts for an improved image if needed, and can crop faces from background imagery.
  • FaceIt can also compress facial images to 84 bytes for easy storage and transfer. It uses the local feature analysis technique to represent facial images in terms of local, statistically derived building blocks.
  • the software is resistant to changes in skin tone, lighting, facial expression, eyeglasses, and hair, and allows up to 35 degrees of change in pose in all directions.
  • the UnMask system (Visionsphere Technologies Inc., ON, Canada) offers face detection and location of key features, extraction of facial descriptors, and comparison of extracted information against a database. It locates the face and the eyes automatically through proprietary search algorithms, and then normalizes and crops the image to offer invariance to variations in head rotation, lighting, hairstyle, and facial expression. The system then uses VisionSphere's Holistic Feature Code (HFC) to provide discrimination for comparing faces at high confidence rates and fast processing speeds. Faces are then compared using a proprietary distance function, which stresses significant differences between faces.
  • VisionSphere also provides the UnMask Plus software artificial intelligence (AI) system, which provides identification and removal of duplicate or multiple images from large databases.
  • the software also includes an automatic computer logon authentication system, which offers hardware and software components for verifying the identity of a network or workstation user.
  • the FaceCam biometric user- verification terminal offers integration with applications for physical access control, time and attendance, and registration systems.
  • the ZN-Face physical access control system (ZN Security, a division of ZN Vision Technologies Bochum, Germany) enables automation of identity checks for access to secure areas.
  • the system uses a neural face-recognition routine to verify individuals, and also offers a refined optical filter system and a LiveCheck analysis procedure to prevent attempts at spoofing through photos or masks.
  • the system may be administered via Windows NT/2000, and supports the ODBC database interface standard to enable acceptance of the master data from external databases, and also features the ZN-SmartEye technology. This enables evaluation of pictures from a video camera, and reports the similarity of a face compared with others on the database.
  • the system also works with ZN-Phantomas, a computerized image database that can automatically compare faces.
  • the present invention provides a method for identifying a potential partner for a user over a computer network, the method including the use of a system described herein.
  • the systems and methods described herein may be implemented over any type of computer network.
  • the systems and process may be implemented over the Internet.
  • any other network such as WAN or LAN could be utilised. It is contemplated that wired or wireless networking protocols could be used.
  • the network could be implemented by a user carrying their own biometric information on a portable data storage device, and connecting the device to a computer holding the database of individuals. Upon connection of the device to the computer, the user's biometric data is compared with the biometric information held on the database.
  • the portable data storage device may be a flash disk, micro-hard drive, compact disc, magnetic medium such as a floppy disk, punched card, or EPROM device.
  • the user's image may be forwarded to the server for biometric analysis by means of mobile telephone equipment.
  • Many consumer telephones have the ability to take a digital photograph and transmit the photograph to a computer via a cellular network. It is envisaged that an image of the potential partner identified by the computer could be returned to the user's mobile telephone, along with the potential partner's contact details.
  • the methods and systems may incorporate other known methods useful for identifying a potential partner such as standard questionnaires and zodiac sign compatibility. Also incorporated may be other screening criteria such as hair colour, skin colour, ethnicity, height, weight, and the like. These further criteria could be selected for or against either before or after the computer selects a potential partner for the user.
  • the user and the potential partners may be a different sex or the same sex.
  • the invention may even be useful for identifying a potential animal companion based on similarities between the features of the potential owner and the potential pet.
  • BioID SDK V3.1 is used to compare the level of similarity between user faces on the match making database.
  • BioID is available from HumanScan AG, Grundstrasse 1, CH-6060 Sarnen, Switzerland.
  • Create Database: create a database in SQL Server that is to be used for this project.
  • Server-side scripting language: ASP.NET. Database: SQL Server 2000 or above. Operating system: Windows 2000 or above. Web server: IIS 5.0 or above.
  • Client side: HTML, with JavaScript or VBScript.
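  • A minimal sketch of the database creation step is given below. The actual tables are defined by the entity relationship diagrams of FIGS 5 and 6, which are not reproduced here, so the Users and Photos tables, the connection string and the database name are hypothetical stand-ins only.

```csharp
// Sketch only: create two hypothetical tables in an existing SQL Server database.
using System.Data.SqlClient;

class CreateDatabaseSketch
{
    static void Main()
    {
        const string connStr = "Server=localhost;Database=MatchMaking;Integrated Security=true";
        const string ddl = @"
            CREATE TABLE Users (
                UserId    INT IDENTITY PRIMARY KEY,
                Username  NVARCHAR(50)  NOT NULL UNIQUE,
                Email     NVARCHAR(200) NOT NULL
            );
            CREATE TABLE Photos (
                PhotoId   INT IDENTITY PRIMARY KEY,
                UserId    INT NOT NULL REFERENCES Users(UserId),
                FileName  NVARCHAR(260) NOT NULL,
                Enrolled  BIT NOT NULL DEFAULT 0
            );";

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (var cmd = new SqlCommand(ddl, conn))
                cmd.ExecuteNonQuery();   // create the hypothetical tables
        }
    }
}
```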
  • the potential partner match application will consist of several components, which will perform different tasks. Referring now to FIG 2 the components and communication paths of special interest are identified and described as follows.
  • the Web Server (Internet Information Server) will run as a service on Windows NT4, Windows 2000 or Windows XP.
  • The Web Server service uses an account to logon to the SQL server, depending on which SQL server is run.
  • If the database server is on the same PC as the Web Server Service, a local account can be used.
  • The BioID Server Service runs as a service on Windows XP and runs independently with an existing or a new account.
  • The BioID Server Service can use an existing account to logon to the SQL Server. If a new account is required, several factors determine where the account should be created (domain global or locally on the PC), as set out below.
  • Where the BioID Server Service will not be installed on a domain controller: if the database server is on the same PC as the BioID Server Service, use a local account; if the database server is on a different PC from the BioID Server Service, use a domain global account when the PC is a member of a domain, or a local account when the PC is standalone (not a member of a domain).
  • BioID Server Setup is run as an administrator. If it is necessary to create a domain global account during installation, administrator access is required to install the BioID server.
  • the BioID Server stores all data about the clients, users, etc. in a SQL database.
  • the BioID Server Service requires login account information.
  • the required data differs as follows:
  • the BioID Server Service needs the names of the database server and the database on that database server. It will logon to the database with the BioID Server Service's existing account. To work with the database, this account needs the right to logon to that database and to read, modify, create and delete datasets in any of the BioID database tables.
  • alternatively, the BioID Server Service needs the names of the database server and the database on that database server, together with a SQL server user account name and password. It will logon to the database with the given account. To work with the database, this account needs the right to logon to that database and to read, modify, create and delete datasets in any of the BioID database tables. On the BioID server side the communication runs via OLE DB.
  • BioID Server Setup can create the basic BioID database when installing the database on a MS SQL server.
  • BioID Server Setup needs an account which is a member of the SQL server's System Administrators role (sysadmin) to perform all necessary actions, such as creating the new database.
  • the BioID Server Setup can logon with the account that the BioID Server Setup is currently running as (the administrator's account). This account needs to be a member of the sysadmin Server Role on the SQL server. For MS SQL Server 7.0 or higher with SQL or mixed security:
  • the BioID Server Setup needs to know a SQL server user account name and password. It will logon to the SQL server with this given account information. This account has to be a member of the sysadmin Server Role on the SQL server (like the standard "SA" account). If it is necessary to create the database de novo, then there is no need to allow the BioID Server Setup access to the SQL server; the BioID Server Setup will install a SQL script, which can be used to create all necessary tables in the database. (E) Client Browser <-> BioID and Web Server
  • Communication between the Client Browser and the Web Server will be done using HTTP. The Web Server will in turn interact with the BioID Server.
  • HTTP stands for Hypertext Transfer Protocol. This is the network protocol used to deliver virtually all files and other data (collectively called resources) on the World Wide Web, such as HTML files, image files, query results. Typically, HTTP transfer takes place through TCP/IP sockets.
  • a browser is an HTTP client because it sends requests to an HTTP server (Web server), which then sends responses back to the client.
  • the standard (and default) port for HTTP servers to listen on is port 80.
  • HTTP is used to transmit resources, not just files.
  • a resource is a package of information that can be identified by a URL.
  • the most common type of resource is a file, but a resource may also be a dynamically generated query result, the output of a CGI script, a document that is available in several languages.
  • FIG 3 is a schematic representation of the system process (interaction with the BioID SDK), together with the description that follows:
  • Client Application: on the front end, the Client Application will run in a Web Browser. It will send an HTTP request to the web server through an ASP.NET script.
  • the Web Server, i.e. Internet Information Server (IIS), will receive the HTTP request and initiate the respective process.
  • the processes can include fetching the user's profile from the database.
  • Database resides on the back-end and will store all the data related to users and their photos.
  • HTTP Requests and Responses: there can be the following types of HTTP Requests and Responses: a. The user can send an HTTP Request for registration. In this case the Web Server will get the user's data and pass it on to the Database. The user's data is saved in the database and an HTTP Response will be sent to the Web Browser.
  • the Web Server will authenticate the user's username and password against the Database and send the HTTP Response back to the Web Browser.
  • the Web Server will get photo data from the client browser. The Web Server will save the photo on the server and update the Database. The web server will also call the BioID server and components to enroll and create a template for the photo. Then the web server will send the HTTP Response for photo upload confirmation back to the Web Browser.
  • the Web Server will fetch all other users' photos from the Database. The web server will then call the BioID server and components to match the photo using processes such as classification, verify and identify, which return the match % to the web server. The web server will then send the HTTP Response with the match % back to the Web Browser, but only if the match % is greater than 60%.
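  • The server-side matching step described above is sketched below. The BioID SDK's actual API is not reproduced in this document, so a hypothetical IFaceMatcher interface stands in for its enroll/identify calls; the 60% cut-off and the return of only qualifying matches follow the description above.

```csharp
// Sketch only: orchestrate matching of the user's photo against all other photos.
using System;
using System.Collections.Generic;
using System.Linq;

interface IFaceMatcher
{
    // Returns a similarity score in 0..1 between two stored photo templates.
    double Compare(string userPhotoId, string otherPhotoId);
}

class MatchService
{
    private readonly IFaceMatcher _matcher;
    public MatchService(IFaceMatcher matcher) => _matcher = matcher;

    // Compare the user's photo against all other photos and keep scores > 60%.
    public List<(string PhotoId, int Percent)> FindMatches(string userPhotoId,
                                                           IEnumerable<string> otherPhotoIds)
    {
        return otherPhotoIds
            .Select(id => (PhotoId: id, Percent: (int)Math.Round(_matcher.Compare(userPhotoId, id) * 100)))
            .Where(m => m.Percent > 60)
            .OrderByDescending(m => m.Percent)
            .ToList();
    }
}

class FakeMatcher : IFaceMatcher
{
    public double Compare(string a, string b) => a == b ? 1.0 : 0.72; // stub score
}

class Demo
{
    static void Main()
    {
        var service = new MatchService(new FakeMatcher());
        var matches = service.FindMatches("user-1", new[] { "user-2", "user-3" });
        foreach (var m in matches) Console.WriteLine($"{m.PhotoId}: {m.Percent}%");
    }
}
```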
  • FIG 4 is a flow chart representing the system overview:
  • the user can upload his/her picture in order to find a match.
  • The picture can be uploaded either through the website or sent via email using an Internet-enabled mobile phone. In order to upload a picture via mobile phone, the picture must meet the defined specification and be sent to the provided email address, from where the photos will be fetched.
  • Matching is performed using BioID SDK functions and processes, namely classification, verify and identify. The user will receive results of matches greater than 60% either on their web page or on their mobile device through SMS.
  • FIGS 5 and 6 show two detailed database structures in the form of entity relationships.
  • the user registers online in order to use match making services.
  • the flow chart shown at FIG 7 represents this process. Create profile process
  • the user chooses their mobile service provider to allow receipt of mobile messages from the site and also to upload their photograph.
  • a username and password is required.
  • This email will contain an activation link for that purpose.
  • user can login to the website by using their username and password.
  • user's session will be created to validate user on each requested page.
  • User is required to create their profile.
  • User can include various details such as introduction title, description, personal characteristics (e.g. smoker or drinker etc).
  • User may also include preferred personal characteristics of their desired partner.
  • the user uploads his/her digital photograph in order to find a match.
  • the photograph may be uploaded via the website or via email using their Internet enabled mobile phone.
  • In order to upload a picture via mobile phone, the picture must meet the defined specification and be sent to the provided email address, from where the photos will be retrieved.
  • the photograph must be of an acceptable format (jpg, gif), as mentioned on the website, and must not exceed the maximum allowed size (this is Admin controlled).
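  • A small sketch of this upload check is given below; the 2 MB limit is a placeholder for the Admin-controlled maximum size.

```csharp
// Sketch: accept only jpg/gif photographs under an Admin-controlled size limit.
using System;
using System.IO;
using System.Linq;

class PhotoUploadCheck
{
    static readonly string[] AllowedExtensions = { ".jpg", ".gif" };

    static bool IsAcceptable(string fileName, long sizeBytes, long maxBytes)
    {
        string ext = Path.GetExtension(fileName).ToLowerInvariant();
        return AllowedExtensions.Contains(ext) && sizeBytes <= maxBytes;
    }

    static void Main()
    {
        long maxBytes = 2 * 1024 * 1024; // hypothetical Admin-controlled limit
        Console.WriteLine(IsAcceptable("photo.jpg", 350_000, maxBytes) ? "accepted" : "rejected");
    }
}
```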
  • Enrollment is the process of collecting biometric samples from an applicant and the subsequent preparation and storage of biometric reference templates representing that person's identity.
  • The user's picture is passed to the BioID SDK and is enrolled using the following steps:
  • the BioID SDK uses a two-stage model-based algorithm to detect the location of a human face in an arbitrary image: a binary face model is matched against a binarized version of the current scene. The comparison is performed with the modified Hausdorff distance, which determines the optimal location, scaling and rotation of the model.
  • the estimated face position is refined by matching a more detailed eye region model, again using the Hausdorff distance for comparison.
  • the exact eye positions are determined by checking the output of an artificial neural network (ANN) specialized on detecting eye centers.
  • the eye positions allow for all further processing: using anthropomorphic knowledge, a normalized portion of the face and of the mouth region can be extracted.
  • the face is transformed to a uniform size. This procedure ensures that the appropriate biometric features of the face are analyzed, and not, for example, the size of the head, hair style, a tie, or piece of jewelry.
  • the template is made up of a separate part for each classification trait and can be understood as a compact representation of the collected feature data, where useless or redundant information is discarded. Independent of the number of training sequences, each part consists of a fixed amount of data representing each person's characteristics.
  • Each person enrolled in BioID is assigned a unique class, and the classifier compares a new recording (i.e. the feature vectors that are extracted out of this recording) with all (formerly trained and stored) prototypes of each class. The prototype with the highest similarity determines the class ("winner takes all" principle).
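  • The "winner takes all" step might be sketched as follows; the similarity measure, class identifiers and prototype data are illustrative only and do not reflect the BioID SDK's internal representation.

```csharp
// Sketch: nearest-prototype, winner-takes-all classification of a new recording.
using System;
using System.Collections.Generic;
using System.Linq;

class WinnerTakesAllSketch
{
    static double Similarity(double[] a, double[] b)
    {
        // 1 / (1 + Euclidean distance): higher means more similar, maximum 1.0.
        double d = Math.Sqrt(a.Zip(b, (x, y) => (x - y) * (x - y)).Sum());
        return 1.0 / (1.0 + d);
    }

    // Returns (classId, bestScore) for the winning class.
    static (int ClassId, double Score) Classify(double[] recording,
                                                Dictionary<int, List<double[]>> prototypes)
    {
        return prototypes
            .SelectMany(kv => kv.Value.Select(p => (ClassId: kv.Key, Score: Similarity(recording, p))))
            .OrderByDescending(x => x.Score)
            .First(); // winner takes all
    }

    static void Main()
    {
        var prototypes = new Dictionary<int, List<double[]>>
        {
            [101] = new() { new[] { 0.2, 0.9, 0.4 }, new[] { 0.25, 0.85, 0.45 } },
            [102] = new() { new[] { 0.7, 0.1, 0.6 } },
        };
        var (classId, score) = Classify(new[] { 0.22, 0.88, 0.42 }, prototypes);
        Console.WriteLine($"winner: class {classId}, similarity {score:F2}");
    }
}
```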
  • the identify function is used to match the pictures of existing users in order to find a match (where the match score is greater than 0.6, or 60%).
  • the output of a classifier therefore is the Class ID (the person with the best matching features) together with the similarity value. Similarity values are typically in the range between 0.0 and 1.0; a match of 1.0 would mean a perfect match. Note that values in biometrics are never really 100%. The output of a classifier will almost never be 1.0, and values of 0.8 to 0.9 are typical.
  • BioID (like almost all classification systems) uses thresholds to qualify a classification. Only if the similarity value exceeds a certain threshold will the user be recognized. This prevents poor matches from being falsely identified. These scores will be converted into percentages; a match greater than 60%, for example (which is a score of 0.6), will be considered a good match.
  • the user can upload his/her picture in order to find a match.
  • In order to upload the picture through the website, the user must login to their account by entering their username and password.
  • a photograph is uploaded either through the website or sent via email using an Internet enabled mobile phone.
  • In order to upload the photograph via mobile phone it is required that the picture meets a defined specification, and is sent to the provided email address from where the photos will be retrieved.
  • The user's picture will be enrolled using the BioID SDK enroll function and a template will be created for matching purposes.
  • a photograph must be of a proper format (jpg, gif), as mentioned on the website, and must not exceed the maximum allowed size (this is Admin controlled).
  • The photograph name will then be added to the database. If any pictures of the same user are already present with that name, 1, 2, 3... will be appended to the picture name.
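  • This naming rule can be sketched as follows.

```csharp
// Sketch: make a photo name unique by appending 1, 2, 3 ... when it already exists.
using System;
using System.Collections.Generic;
using System.IO;

class PhotoNameSketch
{
    static string UniqueName(string fileName, ICollection<string> existingNames)
    {
        if (!existingNames.Contains(fileName)) return fileName;

        string stem = Path.GetFileNameWithoutExtension(fileName);
        string ext  = Path.GetExtension(fileName);
        for (int i = 1; ; i++)
        {
            string candidate = $"{stem}{i}{ext}";
            if (!existingNames.Contains(candidate)) return candidate;
        }
    }

    static void Main()
    {
        var existing = new List<string> { "me.jpg", "me1.jpg" };
        Console.WriteLine(UniqueName("me.jpg", existing)); // prints me2.jpg
    }
}
```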
  • Once the picture is uploaded, the user will be sent a confirmation message via SMS (see FIG 13).
  • the matching process is shown generally in FIGS 14, 15 and 16. The user can start the matching process through the website after uploading their pictures. Matching is performed using BioID functions and processes, namely classification, verify and identify.
  • the output of a classifier therefore is the Class ID (the person with the best matching features) together with the similarity value. Similarity values are typically in the range between 0.0 and 1.0; a match of 1.0 would mean a perfect match. Note that values in biometrics are never really 100%. The score will be converted to a percentage, and a match of greater than 60% will be considered a potential match.
  • BioID components are called using the API; suppose, for example, that the BioID component's object is named BioIDAPI. Identification functions, BioIDCtrl_IdentificationReady and identify_Click, are called to identify similarity in photos.
  • the BioID SDK includes a set of components and a database, so a script (ASP or VBScript) is required to call the BioID components; that script is then executed through the browser (see FIG 17).

Abstract

A method for identifying a potential partner for a user, the method including the steps of: providing biometric data characterising a physical feature of the user and/or a parent of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, identifying at least one individual characterised by biometric data that is at least similar to that of the user and/or a parent of the user.

Description

COMPUTER-BASED METHOD AND SYSTEM FOR IDENTIFYING A
POTENTIAL PARTNER
FIELD OF THE INVENTION The present invention relates generally to the field of communications. More particularly the invention comprises a method and system for identifying a potential partner using a computer-based system.
BACKGROUND TO THE INVENTION Dating and introduction services have historically relied on approaches whereby a client completes a questionnaire to ascertain and classify their interests, hobbies, likes, and dislikes. The dating service will then cross match the questionnaire responses of the new client with those of existing clients in an effort to identify a number of potentially compatible individuals for the new client. Generally, this approach is convenient only if the database of the dating service is quite small, since many people share interests such as reading, going to the movies etc. However, where the database is large, there will be many potential matches for a new client especially where the new client provides fairly generic responses to the questionnaire (as most are wont to do). Accordingly, a problem is that a large number of potential partners will be identified on the basis that they have provided a similarly generic response.
With the advent of Internet-based dating services, databases of individuals searching for a suitable partner have become particularly large; for example, the service known as "America's Online Dating" currently has 3.5 million users. Clearly, a database having this number of individuals will expose a new client to a very large number of potential partners having similar interests. It is tedious for the new client to search through the many potential partners identified by these services to find a potential partner. Of course, the search could be narrowed to exclude individuals outside the geographical area of the new client, but for a large city there still may remain an unmanageably large number of potential clients to screen. Another problem is that many dating services require a new client to fill out lengthy questionnaires. This approach is intended to overcome the problem of a very large number of potential partners being identified for a new client, as is often the case where the questionnaire is simplistic and unable to discriminate. While lengthy questionnaires certainly provide a greater discriminatory power, they are tedious to prepare, and many clients find the questions overly personal and invasive.
A further problem is that even where the new client is prepared to screen a very large number of potential partners identified on the basis of a questionnaire with a low discriminatory power, or is prepared to complete a more detailed questionnaire to provide a higher level of discrimination, the potential partners identified can often disappoint.
It is an aspect of the present invention to at least alleviate a problem of the prior art by providing a system and method for identifying a potential partner over a computer network.
BRIEF DESCRIPTION OF THE DRAWINGS FIG 1 shows a schematic diagram of a generic biometrics-based system.
FIG 2 shows a schematic diagram of the communication paths between the face recognition software (BioID), the web server, the database server, the website
(front end), and the BioID server.
FIG 3 shows a schematic representation of the system process, including interaction with the BioID components.
FIG 4 is a flow chart representing the system overview.
FIGS 5 and 6 represent the database structure in the form of an entity relationship diagram.
FIG 7 shows a flow chart representation of the user registration process. FIG 8 shows a flow chart representation of the method by which a user creates a user profile.
FIG 9 shows an overview of the photo uploading and matching process.
FIG 10 shows a flow chart representation of the process by which a user logs into their account and uploads a photo via a website. FIG 11 shows the process by which, after upload of the user's photo, the image is passed to BioID for enrolment.
FIG 12 shows a flow chart representation for uploading a user's image by mobile telephony device. FIG 13 shows SMS notification, confirming upload of photograph as shown in FIG 12.
FIG 14 shows a flow chart representation of the process for matching a user with another user on the database having similar facial features. FIG 15 shows delivery of matches to the user via web page. FIG 16 shows delivery of matches to the user via mobile telephony device.
FIG 17 shows a schematic structure of an intermediate interface for the interaction between BioID components, the database, and the client browser.
SUMMARY OF THE INVENTION
In a first aspect, the present invention provides a method for identifying a potential partner for a user, the method including the steps of: providing biometric data characterising a physical feature of the user and/or a parent of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, and identifying at least one individual characterised by biometric data that is the same or similar to that of the user.
Applicants propose that in computer-based match making services, the use of biometric data for screening a database of potential partners may dramatically cut down on the number of potential partners identified as potentially compatible with a user. Furthermore, the potential partners identified by the methods and systems described herein may be of greater compatibility than those obtainable by methods of the prior art where no biometric comparisons are made between individuals.
Without wishing to be limited by theory, the invention is proposed to rely on the subconscious desire for people to form relationships with others having similar physical characteristics or a similar level of attractiveness as their own. It is a widespread belief that human partners look alike. Positive assortative mating, mating with partners more similar than expected by chance, may result in more stable partnerships and may have genetic benefits, although costs of inbreeding may limit the amount of self-similarity that should be tolerated. Research has shown positive assortment for many physical features, and partners' faces resemble each other in ways that allow them to be identified as partners at levels above chance.
Furthermore, it is proposed that there is a subconscious desire for people to form relationships with others having similar characteristics to their opposite sex parent, whether that parent is the biological or non-biological parent.
In a preferred form of the method, the level of similarity is not so great that the user is presented with potential partners who have a similarity so high that the user could view them as a relative. Instinctively, humans (like many other animals) avoid mating with relatives for the simple biological reason of avoiding congenital disorders in their offspring. The avoidance of exact matches is a point of difference from methods in the prior art that use biometric data for validation of an individual's identity. In the prior art methods, it is desirable for face recognition software to find an exact match to the individual under examination such that their identity can be validated. By contrast, the methods of the present invention are not directed to that result, since only similarities are required between the user and potential partner rather than strict congruence. Even in light of this, face recognition software packages of the prior art will be useful in the context of the present invention, as the algorithms used in the software generally have difficulty in identifying exact matches. Accordingly, the present invention also provides the use of face recognition software for identifying a potential partner for a user.
In a preferred form of the invention, the biometric data relates to the position or shape of anatomical features of the face and/or head such as the eyes, ears, nose and mouth. In another form of the invention the biometric data relates to the colouring of features such as the skin, eyes and hair. In a further aspect the present invention provides a computer-based system capable of identifying a potential partner for a user. The computer-based system includes software capable of executing a method as described herein. The system may be implemented over the Internet, incorporated into a standard Internet match making website. In one form of the invention the system includes six major components. The first component of an automated biometric identification/verification system is a data acquisition component that acquires the biometric data in digital format by using a sensor. For face images the sensor is typically a digital camera. This component is optional since the user may use their own digital camera. The second and third components of the system are also optional. They are the data compression and decompression mechanisms, which are designed to meet the data transmission and storage requirements of the system. The fourth component is the feature extraction algorithm. The feature extraction algorithm may produce a feature vector, in which the components are numerical characterizations of the underlying biometrics. The feature vectors are designed to characterize the underlying biometrics of the user for comparison with potential partners on the database. The fifth component of the system is the "matcher," which compares feature vectors obtained from the feature extraction algorithm to produce a similarity score. This score indicates the degree of similarity between a pair of biometrics data under consideration. The sixth component of the system is a decision-maker whereby a decision is made as to whether an individual on the database has sufficient biometric similarity to the user to warrant an introduction.
DETAILED DESCRIPTION OF THE INVENTION
In a first aspect, the present invention provides a method for identifying a potential partner for a user and/or a parent of the user, the method including the steps of: providing biometric data characterising a physical feature of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, and identifying at least one individual characterised by biometric data that is similar to that of the user. Applicants propose that the use of biometric data for screening a database of potential partners may dramatically cut down on the number of potential partners identified as potentially compatible. Furthermore, the potential partners identified by the methods and systems described herein may be of greater compatibility than that obtainable by methods of the prior art.
Without wishing to be limited by theory, the invention is proposed to rely on the subconscious desire for people to form relationships with others having similar physical characteristics or a similar level of attractiveness as their own. Thus, the level of similarity between the user and a potential partner is not less than a minimum value, such that the user has an attraction or affinity for the potential partner. It is proposed that when the present methods are implemented, the attraction or affinity between the user and potential partner is greater than that where there is no comparison of biometric data. However, it will be appreciated that the level of similarity may not be greater than a maximum value, such that the user has little or no attraction or affinity for the potential partner.
The attraction or affinity may be on a physical, emotional or spiritual basis. Alternatively, the affinity or attraction may only be at the level of friendship. However, it is preferred that the affinity or attraction is predominantly physical in nature, at least at first instance.
In a preferred form of the invention, the biometric data relates to the face. There are many algorithms known in the art capable of converting a digital image to a set of numerical values for the purposes of characterising a person's facial features. Facial biometric-based systems are often used in a security setting such as airports, or for identity authentication applications. However, they have yet to fulfil their promise in those settings, rarely achieving even 90% accuracy. By contrast, such facial biometric-based systems are well suited to the present invention, since it is not necessary (or desired) for the software to find an exact match between two faces.
Some of the methods by which facial recognition technology identifies a match between two photographs include consideration of the shape or size of the upper outlines of the eye sockets, the geometry of the cheekbone area, the shape and size of the sides of the mouth, the distance between the eyes, and the length or shape of the nose. In an alternative form of the invention the similarity between two faces could be ascertained by "eigenface" technology. This methodology uses the whole face by slicing it into hundreds of gray-scale layers, each with distinctive features.
The invention relies on a computer-based comparison of two sets of biometric data and a decision about whether or not they relate to persons having similar facial features. The computer may perform this function by providing a similarity measurement in the form of a numerical score which indicates the similarity of the pair of underlying biometric data. Alternatively, the computer may generate a list of pair-wise biometric data comparisons ranked by similarity, commonly known as a candidate list.
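By way of illustration only, the following Python sketch shows one way such a matcher could score a user's feature vector against a database of feature vectors and return a candidate list. The cosine-similarity measure, the vector size and the function names are assumptions made for this example and are not prescribed by the invention.

```python
import numpy as np

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face feature vectors, mapped to [0, 1]."""
    cos = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return (cos + 1.0) / 2.0  # 1.0 = identical direction, 0.0 = opposite

def candidate_list(user_vec, database, top_k=10):
    """Return (member_id, score) pairs ranked from most to least similar."""
    scores = [(member_id, similarity(user_vec, vec))
              for member_id, vec in database.items()]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:top_k]

# Example with random stand-in feature vectors (128 components each).
rng = np.random.default_rng(0)
db = {f"member_{i}": rng.normal(size=128) for i in range(1000)}
print(candidate_list(rng.normal(size=128), db, top_k=5))
```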
It is stressed that the present invention is not limited to any particular facial feature mapping function, but can include any known, or yet to be created, algorithm suitable for the purposes described herein for recognising facial features, whether two-dimensional or three-dimensional, which may then also be used for ranging functions. Further, according to the present invention, ranging algorithms may be used in combination with known face recognition software.
The skilled person will understand that for the purposes of the present invention, the particular facial features used by the face comparison algorithm can be optimised by routine experimentation. For example, many facial recognition software packages utilise the position of various anatomical "landmarks" in deciding whether two faces are the same. Software packages often define these landmarks as nodal points. There are about 80 nodal points on a human face. Useful anatomical landmarks may be selected from the group including right eye pupil, left eye pupil, right mouth corner, left mouth corner, outer end of right eye brow, inner end of right eye brow, inner end of left eye brow, outer end of left eye brow, right temple, outer corner of right eye, inner corner of right eye, inner corner of left eye, outer corner of left eye, left temple, tip of nose, right nostril, left nostril, centre point on outer edge of upper lip, centre point on outer edge of lower lip, and tip of chin.
The various distances between these nodal points may be measured to create a numerical code that represents the face in a database. For some software packages only 14 to 22 nodal points are needed to complete the face matching process. For the purposes of the present invention, it is not necessarily desirable for the algorithm to identify a very similar face (such that the two faces could be considered to be of the same person). All that is required is for the algorithm to identify faces that are similar to that of the user. The skilled person could trial different numbers of nodal points to find an optimal number, namely one that yields matches the user finds most attractive.
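As an illustration of how such a numerical code might be built, the short sketch below turns a set of nodal-point coordinates into a scale-invariant feature vector of pairwise distances. The particular landmark names, the normalisation by inter-pupillary distance and the helper function are assumptions made for the example, not features required by the invention.

```python
from itertools import combinations
import math

# Hypothetical nodal-point coordinates (in pixels) for one face image.
landmarks = {
    "right_eye_pupil": (310, 220), "left_eye_pupil": (410, 222),
    "tip_of_nose": (362, 300), "right_mouth_corner": (330, 360),
    "left_mouth_corner": (396, 362), "tip_of_chin": (364, 450),
}

def feature_vector(points: dict) -> list:
    """Pairwise distances between nodal points, normalised so that the
    inter-pupillary distance equals 1 (removes the effect of image scale)."""
    dist = lambda a, b: math.dist(points[a], points[b])
    scale = dist("right_eye_pupil", "left_eye_pupil")
    names = sorted(points)
    return [dist(a, b) / scale for a, b in combinations(names, 2)]

print(feature_vector(landmarks))
```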
Apart from the use of distances between nodal points, consideration could be given instead, or in addition, to the shape of various anatomical structures. For example, the shape of the jaw (square, pointed or rounded), the shape of the eyes (round or almond), the shape of the nose (wide or narrow), and the fullness of the lips could be considered. Other anatomical structures including the eye socket, nostril, ear, chin, cheek, forehead, head, teeth, eyebrow and eyelash could be incorporated into the biometric comparison.
The present invention is not limited to the use of anatomical landmark or anatomical structure information, but extends to the colour of the skin, hair or eyes. The importance of these variables has been discovered during studies showing that humans select long-term partners who not only look like themselves, but look like their opposite-sex parents. It has been discovered that men are attracted to women who look like their mothers, and women prefer men who resemble their fathers. The same research has also shown that humans select partners who remind them of themselves, particularly in relation to traits such as hair and eye colour. Studies examining hair and eye colour have shown evidence of positive assortment, which may reflect attraction to self-similar characteristics but is also consistent with attraction to parental traits (Little, A.C., Penton-Voak, I.S., Burt, D.M. & Perrett, D.I. (2003) An imprinting-like phenomenon in humans: partners and opposite-sex parents have similar hair and eye colour. Evolution and Human Behavior, 24: 43-51). This paper set out to establish whether the colouring of parents influenced choice of partner and found significant correlations between parental characteristics and actual partner characteristics for both men and women, indicating that parental colouring has an effect on human partner choice. In particular, it was found that colour traits in opposite-sex parents had more of an effect on partner choice than colour traits in self or the same-sex parent. In other words, the subjects were more likely to choose partners who resembled their opposite-sex parent.
The group found that the eye colour of opposite-sex parents significantly affected the choice of partner eye colour in both males and females. They also found that males' choice of partner hair colour was significantly positively affected by maternal hair colour. Without wishing to be limited by theory, it is proposed that humans (and some animals) are attracted to elements which are familiar, or that we are in some way 'imprinted' from birth with certain familiar characteristics which we are then comfortable with, or attracted to, in the future. Traits such as hair and eye colour are examples of parental characteristics that offspring may 'learn' or be imprinted with.
The theory of 'imprinting' is also thought to be one reason why individuals have different ideas of what is 'attractive'. Despite a high degree of agreement throughout the world and across different cultures over what is and what is not 'attractive', this learning of parental characteristics may explain some individual differences of opinion about which characteristics are attractive in a partner.
The level of similarity between the user and the potential partner may be varied. In general, the higher the similarity the better. However, at a certain point further similarity does not improve attractiveness, and may even decrease it. Without wishing to be limited by theory, it is thought that too similar a match will trigger the user's instinct to avoid mating with family members. This trigger point may differ depending on race, sex, individual preference etc. However, the skilled person could ascertain the point at which similarity becomes negatively correlated with attractiveness by simple trial and error.
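A minimal sketch of this idea is shown below: a candidate warrants an introduction only when the similarity score falls inside a band between a minimum and a maximum threshold. The two threshold values are placeholders that would be tuned empirically by the trial and error described above; they are not specified by the invention.

```python
# Illustrative sketch only: the thresholds below are placeholder values.
MIN_SIMILARITY = 0.60   # below this the user is unlikely to feel an affinity
MAX_SIMILARITY = 0.95   # above this the match may look "too familiar"

def warrants_introduction(score: float) -> bool:
    """Accept a candidate only if the similarity falls inside the preferred band."""
    return MIN_SIMILARITY <= score <= MAX_SIMILARITY

for s in (0.40, 0.72, 0.97):
    print(s, warrants_introduction(s))
```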
The user may choose to exclude any of their own features from the similarity analysis. For example, if the user were particularly dissatisfied with the size or shape of their nose, this feature could be excluded from the similarity analysis such that potential partners having a similarly large or misshapen nose are not selected on that basis. Of course, this may lead to less than optimal matches being generated by the computer, but the similarity of other features such as the eyes or mouth may still lead the user to find a less than optimal match attractive.
Many algorithms for determining facial similarity are known in the art, and it will be within the capacity of the skilled person to choose one or more algorithms suitable for use with the present invention. Algorithms that may be used in the context of the present invention include those detailed by: Wiskott et al, "Phantom Faces for Face Analysis", 1997, Institut fur Neuroinformatik, Germany, pp. 308-311; Wiskott et al, "Face Recognition by Elastic Bunch Graph Matching", 1997, IEEE, pp. 775-779; Tomasi, C., et al, "Stereo Without Search", Proceedings of European Conference on Computer Vision, Cambridge, UK, 1996, 14 pp. (7 sheets); Turk, M., et al, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience, vol. 3, No. 1, pp. 71-86, 1991; Wiskott, L., et al, "Face Recognition by Elastic Bunch Graph Matching", Internal Report IR-INI 96-08, Institut fur Neuroinformatik, Ruhr-Universitat, Bochum, pp. 1-21, Apr. 1996; Wiskott, L., "Labeled Graphs and Dynamic Link Matching for Face Recognition and Scene Analysis", Verlag Harri Deutsch, Thun-Frankfurt am Main, Reihe Physik, Dec. 1995, pp. 1-109; Wiskott, L., "Phantom Faces for Face Analysis", Proceedings of 3rd Joint Symposium on Neural Computation, Pasadena, CA, vol. 6, pp. 46-52, Jun. 1996; Wiskott, L., "Phantom Faces for Face Analysis", Internal Report IR-INI 96-06, Institut fur Neuroinformatik, Ruhr-Universitat, Bochum, Germany, Apr. 1996, 12 pp.; Wiskott, L., "Phantom Faces for Face Analysis", Pattern Recognition, vol. 30, No. 6, pp. 837-846, 1997; Wiskott, L., et al, "Face Recognition by Elastic Bunch Graph Matching", IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(7), pp. 775-779, 1997; Wong, R., et al, "PC-Based Human Face Recognition System", IEEE, pp. 641-644, 1992; Kruger, N., et al, "Object Recognition with a Sparse and Autonomously Learned Representation Based on Banana Wavelets", Internal Report 96-11, Institut fur Neuroinformatik, Dec. 1996, pp. 1-24; Kruger, N., et al, "Object Recognition with Banana Wavelets", European Symposium on Artificial Neural Networks (ESANN97), 1997, 6 pp.; Lades, M., et al, "Distortion Invariant Object Recognition in the Dynamic Link Architecture", IEEE Transactions on Computers, vol. 42, No. 3, 1993, 11 pp.; Manjunath, B.S., et al, "A Feature Based Approach to Face Recognition", In Proceedings IEEE Conference on Computer Vision and Pattern Recognition, pp. 373-378, Mar. 1992; Mauer, T., et al, "Single-View Based Recognition of Faces Rotated in Depth", In Proceedings of the International Workshop on Automatic Face and Gesture Recognition, pp. 248-253, Zurich, CH, Jun. 26, 1995; Mauer, T., et al, "Learning Feature Transformations to Recognize Faces Rotated in Depth", In Proceedings of the International Conference on Artificial Neural Networks, vol. 1, pp. 353-358, Paris, France, Oct. 9-13, 1995; Mauer, T., et al, "Tracking and Learning Graphs and Pose on Image Sequences of Faces", Proceedings of 2nd International Conference on Automatic Face and Gesture Recognition, Oct. 14-16, 1996, pp. 176-181; Peters, G., et al, "Learning Object Representations by Clustering Banana Wavelet Responses", Tech. Report IR-INI 96-09, Institut fur Neuroinformatik, Ruhr-Universitat, Bochum, 1996, 6 pp.; Phillips, P.J., et al, "The Face Recognition Technology (FERET) Program", Proceedings of Office of National Drug Control Policy, CTAC International Technology Symposium, Aug. 18-22, 1997, 10 pp.; Roy, S., et al, "A Maximum Flow Formulation of the N-Camera Stereo Correspondence Problem", IEEE, Proceedings of International Conference on Computer Vision, Bombay, India, Jan. 1998, pp. 1-6; Sara, R., et al, "3-D Data Acquisition and Interpretation for Virtual Reality and Telepresence", Proceedings IEEE Workshop Computer Vision for Virtual Reality Based Human Communication, Bombay, Jan. 1998, 7 pp.; Sara, R., et al, "On Occluding Contour Artifacts in Stereo Vision", IEEE, Proceedings of International Conference on Computer Vision and Pattern Recognition, Puerto Rico, 1997, 6 pp.; Steffens, J., et al, "PersonSpotter - Fast and Robust System for Human Detection, Tracking, and Recognition", Proceedings of International Conference on Automatic Face and Gesture Recognition, 6 pp., Japan, Apr. 1998; Hall, E.L., "Computer Image Processing and Recognition", Academic Press, 1979, pp. 468-484; Hong, H., et al, "Online Facial Recognition based on Personalized Gallery", Proceedings of Int'l Conference on Automatic Face and Gesture Recognition, pp. 1-6, Japan, Apr. 1997; Kolocsai, P., et al, "Statistical Analysis of Gabor-Filter Representation", Proceedings of International Conference on Automatic Face and Gesture Recognition, 1997, 4 pp.; Ayache, N., et al, "Rectification of Images for Binocular and Trinocular Stereovision", Proc. of 9th Int'l Conference on Pattern Recognition, 1, pp. 11-16, Italy, 1988; Beymer, D.J., "Face Recognition Under Varying Pose", MIT A.I. Lab, Memo No. 1461, pp. 1-13, Dec. 1993; Beymer, D.J., "Face Recognition Under Varying Pose", MIT A.I. Lab Research Report, 1994, pp. 756-761; Buhmann, J., et al, "Distortion Invariant Object Recognition By Matching Hierarchically Labeled Graphs", In Proceedings IJCNN Int'l Conf. on Neural Networks, Washington, D.C., Jun. 1989, pp. 155-159; DeCarlo, D., et al, "The Integration of Optical Flow and Deformable Models with Applications to Human Face Shape and Motion Estimation", In Proc. CVPR '96, pp. 231-238, Sep. 1996; Dhond, U., "Structure from Stereo: a Review", IEEE Transactions on Systems, Man, and Cybernetics, 19(6), pp. 1489-1510, 1989; Yang, Tzong Jer, "Face Analysis and Synthesis", Jun. 1, 1999, retrieved from the Internet at http://www.cmlab.csie.ntu.edu.tw/ on Oct. 25, 2002, 2 pp.
In a further aspect the present invention provides a computer-based system adapted to identify a potential partner for a user. In one preferred form of the invention the system includes six major components, depicted in FIG 1. The first component of an automated biometric identification/verification system is a data acquisition component that acquires the biometric data in digital format by using a sensor. For face images the sensor is typically a camera. The second and third components of the system are optional. They are the data compression and decompression mechanisms, which are designed to meet the data transmission and storage requirements of the system. The fourth component is the feature extraction algorithm. The feature extraction algorithm may produce a feature vector, in which the components are numerical characterizations of the underlying biometrics. The feature vectors are designed to characterize the underlying biometrics of the user for comparison with potential partners on the database. In general, the larger the size of a feature vector (without much redundancy), the higher its discrimination power. The discrimination power is the difference between a pair of feature vectors representing two different individuals. The fifth component of the system is the "matcher", which compares feature vectors obtained from the feature extraction algorithm to produce a similarity score. This score indicates the degree of similarity between the pair of biometric data under consideration. The sixth component of the system is a decision-maker.
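The following Python skeleton is a minimal sketch of how these six components could be chained together. Every function body here is a stand-in: the actual acquisition, compression, feature-extraction and scoring methods are left open by the invention, and the function and class names are assumptions made purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Match:
    member_id: str
    score: float  # similarity in [0.0, 1.0]

def acquire_image(path: str) -> bytes:          # component 1: data acquisition
    with open(path, "rb") as f:
        return f.read()

def compress(raw: bytes) -> bytes:              # component 2 (optional)
    return raw  # placeholder: e.g. JPEG re-encoding could go here

def decompress(data: bytes) -> bytes:           # component 3 (optional)
    return data

def extract_features(image: bytes) -> list:     # component 4: feature extraction
    # Placeholder: a real system would compute a numeric feature vector.
    return [float(b) for b in image[:16]]

def match(user_vec, db_vecs) -> list:           # component 5: matcher
    score = lambda a, b: 1.0 / (1.0 + sum((x - y) ** 2 for x, y in zip(a, b)))
    return [Match(mid, score(user_vec, vec)) for mid, vec in db_vecs.items()]

def decide(matches, threshold=0.6) -> list:     # component 6: decision-maker
    return [m for m in matches if m.score >= threshold]
```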
One class of biometric data that may find use in the present invention is facial asymmetry. A person may be attracted to a person having a similar level of asymmetry in their face. It has been demonstrated that the asymmetry of specific facial regions captures individual differences that are robust to variation in facial expression. It has been further shown that facial asymmetry provides discriminating power orthogonal to conventional face identification methods. The degree of asymmetry can be quantified from two-dimensional or three-dimensional measurements of the face and head.
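Purely as an illustration of how such a measure could be quantified from two-dimensional landmark data, the sketch below scores asymmetry from mirrored left/right landmark pairs about a vertical midline. The landmark pairs, the midline definition and the scoring formula are assumptions for the example, not a published asymmetry measure.

```python
# Hypothetical landmark coordinates (pixels); only the left/right pairing matters.
landmarks = {
    "right_eye": (310, 220), "left_eye": (412, 224),
    "right_mouth_corner": (330, 360), "left_mouth_corner": (398, 366),
    "tip_of_nose": (362, 300), "tip_of_chin": (364, 450),
}
PAIRS = [("right_eye", "left_eye"), ("right_mouth_corner", "left_mouth_corner")]

def asymmetry_score(points: dict) -> float:
    """Mean deviation of left/right landmark pairs from a vertical midline
    through the nose tip and chin (0.0 would be perfectly symmetric)."""
    midline_x = (points["tip_of_nose"][0] + points["tip_of_chin"][0]) / 2
    deviations = []
    for right, left in PAIRS:
        rx, ry = points[right]
        lx, ly = points[left]
        deviations.append(abs((midline_x - rx) - (lx - midline_x)))  # horizontal
        deviations.append(abs(ry - ly))                              # vertical
    return sum(deviations) / len(deviations)

print(asymmetry_score(landmarks))
```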
Three-dimensional (3D) face recognition has advantages over 2D face recognition because it compares the 3D shape of the face, which is invariant under different lighting conditions. As long as the illumination of the face is within a range that allows 3D reconstruction of a sufficiently large portion of the face, a detailed analysis of the face is possible.
A number of commercially available software systems may find use in the present invention. Accordingly, the present invention also provides the use of face recognition software for identifying a potential partner for a user. For example, the SecureIDent product range by BioDentity Systems Corporation (Ottawa, Canada) encompasses everything from hardware to middleware to specialized application software. The Preprocessor offers comprehensive facial capture and image recognition support regardless of lighting and other environmental factors. The software has the capability of separating the facial image from the background for easy processing of information.
The SecureIDent Preprocessor offers onboard processing, and can provide front-end enhancement to any other facial-recognition system or application. Other offerings in the family include the SecureIDent Face Recognition Controller, which is the primary interface between the Preprocessors and the rest of the application; the SecureIDent Photo Enrolment System, which automatically optimizes all images to create a high-quality database; and the SecureIDent Search Engines, which compare biographical or face biometric details, or a combination of the two, against very large databases.
Single Sign-On by BioID America Inc (NC) offers the ability to analyze face, voice, and lip movement simultaneously, and requires only a standard USB camera and microphone for implementation. The BioID dataset features 1,521 gray-level images showing a frontal view of the face of one of 23 different test persons. Images are stored in single files using portable graymap (pgm) data format.
The FaceVACS-Logon system (Cognitec, OR) offers automatic facial identification. It may be integrated with conventional access control or time and attendance systems, and a combination with card terminals is possible for high-security areas. Users' faces are captured by a video camera, and image-processing algorithms extract a feature set from the digitized image, which the software compares to the user's reference set stored on the computer. Features of the system include flexible operating modes, which enable it to be used as a stand-alone facial-recognition solution. The package includes standard webcam support, and support for Windows 98/2000/NT/XP/Me.
The FaceTrac facial-recognition system (Graphco Technologies, PA) performs image capture, comparison against images in a database, and matching of images. This open system can incorporate facial-recognition engine components from vendors such as Viisage, Visionics, and AcSys Biometrics. FaceTrac can match the facial geometry of an individual against portraits in a database.
The Face ID facial-recognition program (ImageWare Systems, CA) uses biometrics in combination with parallel processing to match faces through a mathematical formula that uses the eyes as a reference point. The formula generates a data record representing the face, which is compared against a digital database of enrolled images. Images from scanned photographs or video may be queried using the program, and millions of images may be searched to identify matches. Using more than 200 facial descriptors generated from an image analysis algorithm, the ID-2000 software (Imagis Technologies Inc., BC, Canada) captures, compares, and quickly and efficiently displays an individual's face against a database. It enables an individual to be matched in seconds using only an image or photograph as the primary search criterion.
The IRID face-recognition technology (Infrared Identification Inc. VA) can perform infrared facial recognition as well as continuous condition monitoring of individuals by using passive infrared imaging that is non-contact, non-invasive, and works under any lighting conditions or in total darkness.
The Tridentity 3 Dimensional Face system (Recognition Neurodynamics Limited, Cambridge, UK) offers a three-dimensional approach to facial recognition, can analyze subtle features of the face such as bone structure, and enables images to be rotated to offer a better view of the subject. The technology uses patterned light to create a three-dimensional image of the face, and once an image is captured, a 3D representation of the subject's face can be built from a single frame of video footage. The solution can operate on single or multiple scans, and each scan can be processed in under one second on a 400 MHz Pentium system. The search database size is limited only by disk space and processor speed, and the system may be expanded to scale up to multiple cameras and workstations. The system is based on an open architecture, uses COTS components, and may be easily integrated as a component of a larger system.
The FaceOn technology (Symtron Technology, CA) uses neural network and artificial intelligence techniques to capture faceprints and determine or verify identity.
The FaceOn Logon AdminTool enables complete faceprint enrollment, as well as adding and deleting faceprints from the database. The enhanced visual Access Log enables administrators to keep track of all users' access settings. The FaceOn Surveillance system may be integrated with various types of CCTV systems, and offers multiple, real-time image enrollment, retrieval, and recognition (using the Invariant Feature Analysis technology). The Viisage face-recognition (FR) technology (Viisage Technology, MA) is based on an algorithm developed at MIT, and enables software to translate the characteristics of a face into a unique set of numbers called an eigenface. This is used by identification and verification systems for facial comparisons made in real time, and may be used with databases containing millions of faces. The technology enables software to instantly calculate an individual's eigenface from live video or a still digital image and then search a database to find similar or matching images.
The family of products includes the FaceFINDER, FaceEXPLORER, FacePASS, FacePIN, and FaceTOOLS applications. They offer the ability to search large databases of images, and a software development kit for developing additional applications.
The FaceIt facial-recognition software engine from Visionics Corporation (NJ) enables computers to rapidly and accurately detect and recognize faces, for everything from ID solutions to banking and e-commerce applications. The software can detect one or multiple faces, and can also provide one-to-one or one-to-many matching. It also evaluates the quality of the image and prompts for an improved image if needed, and can crop faces from background imagery.
Other features include the ability to generate a faceprint, a digital code/template unique to an individual, as well as the ability to track faces over time. FaceIt can also compress facial images to 84 bytes for easy storage and transfer. It uses the local feature analysis technique to represent facial images in terms of local, statistically derived building blocks. The software is resistant to changes in skin tone, lighting, facial expression, eyeglasses, and hair, and allows up to 35 degrees of change in pose in all directions.
The UnMask system (Visionsphere Technologies Inc., ON, Canada) offers face detection and location of key features, extraction of facial descriptors, and comparison of extracted information against a database. It locates the face and the eyes automatically through proprietary search algorithms, and then normalizes and crops the image to offer invariance to variations in head rotation, lighting, hairstyle, and facial expression. The system then uses VisionSphere's Holistic Feature Code (HFC) to provide discrimination for comparing faces at high confidence rates and fast processing speeds. Faces are then compared using a proprietary distance function, which stresses significant differences between faces.
VisionSphere also provides the UnMask Plus artificial intelligence (AI) software system, which provides identification and removal of duplicate or multiple images from large databases. The suite also includes an automatic computer logon authentication system, which offers hardware and software components for verifying the identity of a network or workstation user. The FaceCam biometric user-verification terminal offers integration with applications for physical access control, time and attendance, and registration systems.
The ZN-Face physical access control system (ZN Security, a division of ZN Vision Technologies Bochum, Germany) enables automation of identity checks for access to secure areas. The system uses a neural face-recognition routine to verify individuals, and also offers a refined optical filter system and a LiveCheck analysis procedure to prevent attempts at spoofing through photos or masks.
The system may be administered via Windows NT/2000, and supports the ODBC database interface standard to enable acceptance of the master data from external databases, and also features the ZN-SmartEye technology. This enables evaluation of pictures from a video camera, and reports the similarity of a face compared with others on the database. The system also works with ZN-Phantomas, a computerized image database that can automatically compare faces.
In another aspect, the present invention provides a method for identifying a potential partner for a user over a computer network, the method including the use of a system described herein. The systems and methods described herein may be implemented over any type of computer network. In a highly preferred form of the invention the systems and processes are implemented over the Internet. However, any other network such as a WAN or LAN could be utilised. It is contemplated that wired or wireless networking protocols could be used.
It is further contemplated that the network could be implemented by a user carrying their own biometric information on a portable data storage device, and connecting the device to a computer holding the database of individuals. Upon connection of the device to the computer, the user's biometric data is compared with the biometric data on the database. The portable data storage device may be a flash disk, micro hard drive, compact disc, magnetic medium such as a floppy disk, punched card, or EPROM device.
It is further contemplated that the user's image may be forwarded to the server for biometric analysis by means of mobile telephone equipment. Many consumer telephones have the ability to take a digital photograph and transmit the photograph to a computer via a cellular network. It is envisaged that an image of the potential partner identified by the computer could be returned to the user's mobile telephone, along with the potential partner's contact details.
The methods and systems may incorporate other known methods useful for identifying a potential partner such as standard questionnaires and zodiac sign compatibility. Also incorporated may be other screening criteria such as hair colour, skin colour, ethnicity, height, weight, and the like. These further criteria could be selected for or against either before or after the computer selects a potential partner for the user.
The user and the potential partner may be of different sexes or of the same sex. The invention may even be useful for identifying a potential animal companion based on similarities between the features of the potential owner and the potential pet.
The present invention will now be more fully described by reference to the following example. It is emphasised that this example is not intended to be restrictive on the general disclosure supra.
EXAMPLE 1: HARDWARE AND SOFTWARE CONFIGURATION
Face analysis software
BioID SDK V3.1 is used to compare the level of similarity between user faces on the match-making database. BioID is available from HumanScan AG, Grundstrasse 1, CH-6060 Sarnen, Switzerland.
Below are the installation procedures for the HumanScan BioID SDK:
Assumption: Windows 2000 (or above), SQL Server 2000 (or above) and .NET Environment are already installed and we have Remote Desktop Connection to the server with Administrator privileges.
Create Database: Create a database in SQL server that is to be used for this project.
Upload Necessary Files: Firstly, all the necessary files related to the HumanScan BioID software need to be uploaded. They are listed as follows: BioIDAdmin.zip, BioIDServer.zip, BioIDClient.zip, BioIDSDK31.zip
Install Admin: Unzip BioIDAdmin.zip and double click on setup.exe. Follow the instructions on the screen and complete the installation.
Install Server: Unzip BioIDServer.zip and double click on setup.exe. Choose SQL Server and the update existing database option. Then choose the database created in Step 2. Follow the instructions on screen and complete the installation.
Setup Client: After installing the server part, go to Start Menu -> Programs -> HumanScan -> BioIDManagement. Click on "Clients" on the left-hand side and click on the "+" sign to add a client on the right-hand side. The computer name will be shown in the next window. Then set the security setting to 1. Keep pressing "Next" and then "Finish".
Install Client: Unzip BioIDClient.zip and double click on setup.exe. Follow the instructions on screen and complete the installation. When the installer asks about the client, select the client created in the Setup Client step above and move forward.
Install SDK: Unzip BioIDSDK31.zip and double click on setup.exe. Follow the instructions on screen and complete the installation.
Proposed Technology and Hardware
Technology
The following hardware/software configuration is used.
Server Side Scripting Language: ASP.NET
Database: SQL Server 2000 or above
Operating System: Windows 2000 or above
Web Server: IIS 5.0 or above
Browser Compatibility: Microsoft Internet Explorer 5.x, Netscape Communicator 6.x, Mozilla Firefox 1.0 and above
Scripting Language/User Interface: HTML
Client Side Scripting: JavaScript or VBScript
Hardware (Server Configuration)
Dual or Quad Xeon 2.0 GHz
2 to 4 GB DDR RAM
SuperMicro S811i Chassis
SuperMicro P4Sci Main Board
4 x 73 GB SCSI RAID 5
500 to 1000 GB Bandwidth/month
Dell Hardware
Plesk, Ensim or Cpanel Control Panel
At least 1 dedicated IP address
Firewall Protection
Remote Desktop Connection
Remote reboot option
MailEnable Pro Mail Server
Full Backup options
Description of Communication Paths
The potential partner match application will consist of several components, which will perform different tasks. Referring now to FIG 2, the components and communication paths of special interest are identified and described as follows.
(A) Web Server
The Web Server (Internet Information Server) will run as a service on Windows NT4, Windows 2000 or Windows XP.
Internet Information Services (IIS) is a powerful Web server that provides a highly reliable, manageable, and scalable Web application infrastructure for all versions of Windows Server. IIS assists in increasing Web site and application availability while lowering system administration costs. The Web server service uses an account to log on to the SQL server, depending on which SQL server is run. In this configuration the database server is on the same PC as the Web Server service; therefore a local account can be used.
(B) BioID Server Service
The BioID Server Service runs as a service on Windows XP and runs independently under an existing or a new account.
The BioID Server Service can use an existing account to log on to the SQL Server. If a new account is required, several factors determine where the account should be created (domain global, or locally on the PC), as set out below:
If the BioID Server Service will not be installed on a domain controller:
o database server is on the same PC as the BioID Server Service: local account
o database server is on a different PC from the BioID Server Service:
o PC is a member of a domain: domain global account
o PC is standalone / not a member of a domain: local account
If the BioID Server Service will be installed on a domain controller: domain global account
To allow the creation of the account, it is necessary to run the BioID server setup as an administrator. If a domain global account needs to be created during installation, administrator access is required to install the BioID server.
If using an existing account for the BioID Server Service, it is important that this account has the right to "log on as a service"; otherwise the BioID Server Service will not be able to start. To set this right, use the following:
NT4: User Manager/Policies/User Rights
W2K, XP: MMC Snap-In for Security Settings, Local Policies/User Rights Assignment/Log on as a service
(C) BioID and Web Server <-> SQL Server / BioID Database
The BioID Server stores all data about the clients, users, etc. in a SQL database. To connect to the SQL Server, the BioID Server Service requires login account information. Depending on the type and version of SQL server used, the required data differs as follows:
MS SQL server 7.0 or higher with integrated or mixed security
The BioID Server Service needs the names of the database server and of the database on that server. It will log on to the database with the BioID Server Service's existing account. To work with the database, this account needs the right to log on to that database and to read, modify, create and delete datasets in any of the BioID database tables.
MS SQL server 7.0 or higher with SQL or mixed security
The BioID Server Service needs the names of the database server and of the database on that server, together with a SQL server user account name and password. It will log on to the database with this account. To work with the database, this account needs the right to log on to that database and to read, modify, create and delete datasets in any of the BioID database tables. On the BioID server side the communication runs via OLE DB.
(D) BioID Server Setup <-> SQL Server
The BioID Server Setup can create the basic BioID database when installing the database on a MS SQL server. The BioID Server Setup needs an account that is a member of the SQL server's System Administrators role (sysadmin) to perform all necessary actions:
Create the new database
Create all required tables in the database
Create the new login for the BioID Server Service
Give all rights for the new BioID database to this new login
Which account can be used to perform these actions depends on the specific MS SQL server installation:
MS SQL server 7.0 or higher with integrated or mixed security
The BioID Server Setup can log on with the account under which it is currently running (the administrator's account). This account must be a member of the sysadmin Server Role on the SQL server.
MS SQL server 7.0 or higher with SQL or mixed security
The BioID Server Setup needs to know a SQL server user account name and password. It will log on to the SQL server with this account information. This account has to be a member of the sysadmin Server Role on the SQL server (like the standard "SA" account). If the database is to be created de novo (i.e. manually), there is no need to allow the BioID Server Setup access to the SQL server; the BioID Server Setup will install a SQL script which can be used to create all the necessary tables in the database.
(E) Client Browser <-> BioID and Web Server
The communication between the Client Browser and the Web Server will be done using HTTP. The Web Server will in turn interact with the BioID Server.
HTTP stands for Hypertext Transfer Protocol. This is the network protocol used to deliver virtually all files and other data (collectively called resources) on the World Wide Web, such as HTML files, image files, and query results. Typically, HTTP transfer takes place through TCP/IP sockets.
A browser is an HTTP client because it sends requests to an HTTP server (Web server), which then sends responses back to the client. The standard (and default) port for HTTP servers to listen on is port 80.
HTTP is used to transmit resources, not just files. A resource is a package of information that can be identified by a URL. The most common type of resource is a file, but a resource may also be a dynamically generated query result, the output of a CGI script, or a document that is available in several languages.
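To make the client/server exchange concrete, here is a minimal Python sketch of the kind of HTTP request a client could issue to upload a photo to the web server. The URL, form field names and response format are purely illustrative assumptions and do not describe the actual interface of the system.

```python
import requests  # third-party HTTP client library

# Hypothetical endpoint and form fields for the photo-upload request.
UPLOAD_URL = "http://example.com/upload_photo"

def upload_photo(username: str, photo_path: str) -> dict:
    """POST the user's photo over HTTP and return the server's JSON reply."""
    with open(photo_path, "rb") as photo:
        response = requests.post(
            UPLOAD_URL,
            data={"username": username},
            files={"photo": ("face.jpg", photo, "image/jpeg")},
            timeout=30,
        )
    response.raise_for_status()
    return response.json()

# Example usage (assumes the server exists and returns JSON):
# print(upload_photo("alice", "face.jpg"))
```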
System Process
Figure 3 is a schematic representation of the system process (interaction with the BioID SDK), with a description as follows:
1. On the front end, the client application will run in a web browser. It will send an HTTP request to the web server through an ASP.NET script.
2. The web server, i.e. Internet Information Server (IIS), will receive the HTTP request and initiate the respective process.
3. The processes include, for example, fetching the user's profile from the database. The database resides on the back end and will store all the data related to users and their photos.
4. There can be the following types of HTTP requests and responses: a. The user can send an HTTP request for registration. In this case the web server will get the user's data and pass it on to the database. The user's data is saved in the database and an HTTP response is sent back to the web browser.
b. The user can send an HTTP request for login. Here the web server will authenticate the user's username and password against the database and send the HTTP response back to the web browser.
c. The user can send an HTTP request to upload a photo. The web server will get the photo data from the client browser, save the photo on the server and update the database. The web server will also call the BioID server and components to enroll the photo and create a template for it. The web server will then send the HTTP response confirming the photo upload back to the web browser.
d. The user can send an HTTP request for matching the photo. In this case the web server will fetch all other users' photos from the database. The web server will then call the BioID server and components to match the photo using processes such as classification, verify and identify, which return the match percentage to the web server. The web server will then send the HTTP response containing the match percentage back to the web browser, but only if the match percentage is greater than 60%.
FIG 4 is a flow chart representing the system overview:
1. A user can register on the site by entering their personal details, email and mobile number. The user needs to choose their mobile service provider to allow receipt of mobile messages from the site and to send their pictures. In order to use the match making service, a username and password are required.
2. After registration the user will receive a confirmation email to activate their account. This email will contain an activation link for that purpose. 3. Once the account is activated, the user can log in to the website using their username and password. On successful login, the user's session will be created to validate the user on each requested page.
4. Once the account is activated, the user is required to create their profile. The user can provide various information such as an introduction title, description and personal characteristics (smoker/drinker etc.), as well as details of the person he/she is interested in.
5. The user can upload his/her picture in order to find a match. The picture can be uploaded either through the website or sent via email using an Internet-enabled mobile phone. In order to upload a picture via mobile phone, the picture must meet the defined specification and be sent to the provided email address, from where the photos will be fetched.
6. The user's image will then be passed to the BioID SDK and enrolled. After enrollment a template is created in BioID for the user's image. 7. The user can then search for members and choose members to perform a match with, or otherwise update their match preferences in order for the system to identify a match.
8. The match is performed using BioID SDK functions and processes, namely classification, verify and identify. 9. The user will receive any result of a >60% match either on their web page or on their mobile device through SMS.
Database Entity Relationship
FIGS 5 and 6 show two detailed database structures in the form of entity relationships.
Registration Process
The user registers online in order to use match making services. The flow chart shown at FIG 7 represents this process.
Create profile process
Once a user is registered on the website, the user creates their profile. This profile will be visible to other users for match making purposes. A flow chart of this process is shown at FIG 8.
Photo Upload and Matching Process Overview
The overview of photo uploading and matching process is shown at FIG 9.
The user registers on the site by entering their personal details, email and mobile telephone number. The user chooses their mobile service provider to allow receipt of mobile messages from the site and also to upload their photograph. In order to use the match making services, a username and password are required.
After registration the user will receive a confirmation email to activate their account. This email will contain an activation link for that purpose.
Once the account is activated, the user can log in to the website using their username and password. On successful login, the user's session will be created to validate the user on each requested page.
Once the account is activated, the user is required to create their profile. The user can include various details such as an introduction title, description and personal characteristics (e.g. smoker or drinker). The user may also include preferred personal characteristics of their desired partner.
The user uploads his/her digital photograph in order to find a match. The photograph may be uploaded via the website or via email using an Internet-enabled mobile phone. In order to upload a picture via mobile phone, the picture must meet the defined specification and be sent to the provided email address, from where the photos will be retrieved.
The photograph must be of an acceptable format (jpg, gif), as mentioned on the website, and must not exceed the maximum allowed size (this is Admin controlled).
When the user uploads the photograph, it is saved into a separate folder on the server. The picture name is then added to the database. If the same user already has a picture with that name, a suffix 1, 2, 3... is appended to the picture name.
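As a purely illustrative sketch of this file-naming rule (the folder location and naming convention are assumptions for the example), the following Python snippet appends 1, 2, 3... to the picture name until it no longer clashes with an existing file:

```python
from pathlib import Path

def unique_photo_path(upload_dir: str, filename: str) -> Path:
    """Return a path that does not collide with existing files by
    appending 1, 2, 3... to the stem of the picture name."""
    folder = Path(upload_dir)
    folder.mkdir(parents=True, exist_ok=True)
    candidate = folder / filename
    stem, suffix = candidate.stem, candidate.suffix
    counter = 1
    while candidate.exists():
        candidate = folder / f"{stem}{counter}{suffix}"
        counter += 1
    return candidate

# Example: a second upload of "face.jpg" would be stored as "face1.jpg".
print(unique_photo_path("uploads/alice", "face.jpg"))
```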
Enrollment is the process of collecting biometric samples from an applicant and the subsequent preparation and storage of biometric reference templates representing that person's identity. The user's picture is passed to the BioID SDK and is enrolled using the following steps:
Locating Faces
The BioID SDK uses a two-stage model-based algorithm to detect the location of a human face in an arbitrary image: a binary face model is matched in a binarized version of the current scene. The comparison is performed with the modified Hausdorff distance, which determines the optimal location, scaling and rotation of the model.
The estimated face position is refined by matching a more detailed eye region model, again using the Hausdorff distance for comparison. The exact eye positions are determined by checking the output of an artificial neural network (ANN) specialized on detecting eye centers.
The eye positions allow for all further processing: using anthropomorphic knowledge, a normalized portion of the face and of the mouth region can be extracted.
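For readers unfamiliar with the distance measure mentioned above, the following Python sketch shows the general form of the modified Hausdorff distance between two point sets. It illustrates the published measure in general terms; it is not taken from, and does not reproduce, the BioID SDK's own implementation.

```python
import numpy as np

def modified_hausdorff(a: np.ndarray, b: np.ndarray) -> float:
    """Modified Hausdorff distance between two point sets of shape (n, 2).

    d(A, B) is the mean, over points in A, of the distance to the nearest
    point in B; the modified Hausdorff distance is max(d(A, B), d(B, A)).
    """
    pairwise = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    d_ab = pairwise.min(axis=1).mean()   # A -> B
    d_ba = pairwise.min(axis=0).mean()   # B -> A
    return float(max(d_ab, d_ba))

# Tiny example: edge points of a face model vs. edge points found in the scene.
model = np.array([[0, 0], [1, 0], [0, 1]], dtype=float)
scene = np.array([[0.1, 0.0], [1.0, 0.1], [0.0, 1.2]], dtype=float)
print(modified_hausdorff(model, scene))
```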
Face Features
The face is transformed to a uniform size. This procedure ensures that the appropriate biometric features of the face are analyzed, and not, for example, the size of the head, hair style, a tie, or piece of jewelry.
Some further preprocessing steps reduce the impact of lighting conditions and colour variance. Feature extraction methods are then applied to the normalized image, resulting in a face feature vector, which is then used by the classifier.
Create Template
The template is made up of a separate part for each classification trait and can be understood as a compact representation of the collected feature data, where useless or redundant information is discarded. Independent of the number of training sequences, each part consists of a fixed amount of data representing each person's characteristics.
User's picture is now compared with other already stored pictures. This is the process of comparing biometric data with a previously stored reference template or templates.
Classification
Each person enrolled in BioID is assigned a unique class, and the classifier compares a new recording (i.e. the feature vectors that are extracted from this recording) with all (formerly trained and stored) prototypes of each class. The prototype with the highest similarity determines the class ("winner takes all" principle). The identify function is used to match the pictures of existing users in order to find a match (where the match score is greater than 0.6, or 60%).
The output of a classifier is therefore the Class ID (the person with the best matching features) together with the similarity value. Similarity values are typically in the range between 0.0 and 1.0; a match of 1.0 would mean a perfect match. Note that values in biometrics are never really 100%: the output of a classifier will almost never be 1.0, and values of 0.8 to 0.9 are typical.
BioID (like almost all classification systems) uses thresholds to qualify a classification. The user will be recognised only if the similarity value exceeds a certain threshold; this prevents poor matches from being falsely identified. These scores will be converted into percentages, and a match greater than 60%, for example, which corresponds to a score of 0.6, will be considered a good match.
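A minimal sketch of this scoring rule follows, with the 60% cut-off taken from the text and everything else (function name, result ordering) assumed purely for illustration:

```python
def matches_above_threshold(scores: dict, threshold: float = 0.6) -> list:
    """Convert 0.0-1.0 similarity scores to percentages and keep only those
    above the threshold, sorted from best to worst match."""
    results = [
        (member_id, round(score * 100, 1))
        for member_id, score in scores.items()
        if score > threshold
    ]
    return sorted(results, key=lambda pair: pair[1], reverse=True)

print(matches_above_threshold({"anna": 0.82, "ben": 0.44, "carla": 0.63}))
# [('anna', 82.0), ('carla', 63.0)]
```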
Since BioID uses individual classification mechanisms for each biometric channel, each channel has its own threshold. Thresholds can be altered in the BioID User Interface within reasonable limits (the system does not allow extreme threshold values such as 0.0 or 1.0 to be set).
Upload picture through website process
The user uploads their photograph to find a match by logging into their account, as shown schematically in FIG 10.
The user can upload his/her picture in order to find a match. In order to upload the picture through the website, the user must log in to their respective account by entering their username and password.
The user's image will then be passed to the BioID SDK and enrolled. After enrollment a template is created in BioID for the user's image and a confirmation email is sent to the user (see FIG 11).
Uploading picture through mobile phone process
The user can upload their photograph to find a match by using their mobile phone, as shown schematically in FIG 12.
A photograph is uploaded either through the website or sent via email using an Internet-enabled mobile phone. In order to upload the photograph via mobile phone, the picture must meet a defined specification and be sent to the provided email address, from where the photos will be retrieved. The user's picture will be enrolled using the BioID SDK enroll function and a template will be created for matching purposes. A photograph must be of an acceptable format (jpg, gif), as mentioned on the website, and must not exceed the maximum allowed size (this is Admin controlled). When a user uploads the photograph, it is saved into a separate folder on the server. The photograph name will then be added to the database. If the same user already has a picture with that name, a suffix 1, 2, 3... will be appended to the picture name. Once the picture is uploaded, the user will be sent a confirmation message via SMS (see FIG 13).
Matching Process
The matching process is shown generally in FIGS 14, 15 and 16. 1. The user can start the matching process through the website after uploading their pictures. The match is performed using BioID functions and processes, namely classification, verify and identify.
2. Each person enrolled in BioID is assigned a unique class, and the classifier compares a new picture with all (formerly trained and stored) prototypes of each class.
3. The output of a classifier is therefore the Class ID (the person with the best matching features) together with the similarity value. Similarity values are typically in the range between 0.0 and 1.0; a match of 1.0 would mean a perfect match. Note that values in biometrics are never really 100%. The score will be converted to a percentage, and a match of greater than 60% will be considered a potential match.
4. The BioID components are called using the API; suppose, for example, that the BioID component's object is named BioIDAPI. 5. The identification functions, namely BioIDCtrl_IdentificationReady and identify_Click, are called to identify similarity in photos.
6. The verification functions, namely BioIDCtrl_VerificationReady and verify_Click, are then called to verify the matches.
7. The user will receive the results of >60% matches, in descending order of percentage, either on their web page or on their mobile device through SMS.
Intermediate interface to interact with the BioID SDK
An intermediate script is required to transfer data between the BioID SDK and the web browser. The BioID SDK includes a set of components and a database, so a script (ASP or VBScript) is required to call the BioID components; that script is then executed through the browser (see FIG 17).
Finally, it is to be understood that various other modifications and/or alterations may be made without departing from the spirit of the present invention as outlined herein.

Claims

CLAIMS:
1. A method for identifying a potential partner for a user, the method including the steps of: providing biometric data characterising a physical feature of the user and/or a parent of the user, providing a database having biometric data characterising a physical feature of a plurality of individuals, comparing the biometric data of the user with at least a proportion of the biometric data on the database, identifying at least one individual characterised by biometric data that is at least similar to that of the user and/or a parent of the user.
2. A method according to claim 1 wherein the level of similarity is not less than a minimum value, such that the user has an attraction or affinity for the potential partner.
3. A method according to claim 2 wherein the attraction or affinity is greater than that where there is no comparison of biometric data.
4. A method according to claim 2 or claim 3 wherein the level of similarity is not greater than a maximum value, such that the user has little or no attraction or affinity for the potential partner.
5. A method according to any one of claims 2 to 4 wherein the attraction or affinity is physical.
6. A method according to any one of claims 1 to 5 implemented at least in part over a computer network.
7. A method according to claim 6 wherein the computer network is the Internet.
8. A method according to claim 6 or claim 7 wherein the method is at least partially implemented over a mobile telephone network.
9. A method according to any one of claims 6 to 8 wherein the potential partner(s) are returned to the user as a candidate list in descending order of similarity.
10. A method according to any one of claims 1 to 9 wherein the biometric data relates to the head and/or face of the user.
11. A method according to claim 10 wherein the biometric data relates to the position of anatomical landmarks on the head and/or face.
12. A method according to claim 11 wherein the anatomical landmark is selected from the group consisting of the right eye pupil, left eye pupil, right mouth corner, left mouth corner, outer end of right eye brow, inner end of right eye brow, inner end of left eye brow, outer end of left eye brow, right temple, outer corner of right eye, inner corner of right eye, inner corner of left eye, outer corner of left eye, left temple, tip of nose, right nostril, left nostril, centre point on outer edge of upper lip, centre point on outer edge of lower lip, and tip of chin.
13. A method according to any one of claims 10 to 12 wherein the biometric data relates to the shape of an anatomical structure of the face and/or head.
14. A method according to claim 13 wherein the anatomical structure is selected from the group consisting of the eye, eye socket, nose, nostril, ear, chin, jaw, cheek, forehead, head, mouth, lip, teeth, eyebrow, and eyelash.
15. A method according to any one of claims 1 to 14 wherein the step of comparing does not include a biometric characteristic of the user that the user wishes to exclude.
16. A method according to any one of claims 1 to 15 wherein the biometric data relates to colouring.
17. A method according to claim 16 wherein the colouring is of a feature selected from the group consisting of the skin, eyes, and hair.
18. A method according to any one of claims 1 to 17 wherein the biometric data is selected from the group consisting of an anatomical landmark, shape of an anatomical structure, and colouring.
19. Computer executable code capable of implementing a method according to any one of claims 1 to 18.
20. A computer system including a computer executable code according to claim 19.
21. A computer system according to claim 20 including a component selected from the group consisting of a data acquisition component, a data compression component, a data decompression component, a feature extraction component, a matcher component, and a decision maker component.
22. Use of face recognition software for identifying a potential partner for a user.
PCT/AU2005/001733 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner WO2006053375A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2005306571A AU2005306571A1 (en) 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2004906566 2004-11-16
AU2004906566A AU2004906566A0 (en) 2004-11-16 Computer-based method and system for identifying a potential partner

Publications (1)

Publication Number Publication Date
WO2006053375A1 true WO2006053375A1 (en) 2006-05-26

Family

ID=36406760

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2005/001733 WO2006053375A1 (en) 2004-11-16 2005-11-16 Computer-based method and system for identifying a potential partner

Country Status (1)

Country Link
WO (1) WO2006053375A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008142481A2 (en) * 2006-10-31 2008-11-27 Parana Vision Adaptive voice-feature-enhanced matchmaking method and system
US8015128B2 (en) 2006-10-01 2011-09-06 International Business Machines Corporation Biometric security using neuroplastic fidelity
US8798321B2 (en) 2005-09-28 2014-08-05 Facedouble, Inc. Method and system for tagging an image of an individual in plurality of photos
US8897506B2 (en) 2005-09-28 2014-11-25 Facedouble, Inc. Image classification and information retrieval over wireless digital networks and the internet
US9465817B2 (en) 2005-09-28 2016-10-11 9051147 Canada Inc. Method and system for attaching a metatag to a digital image
US9824262B1 (en) * 2014-10-04 2017-11-21 Jon Charles Daniels System and method for laterality adjusted identification of human attraction compatibility
US10223578B2 (en) 2005-09-28 2019-03-05 Avigilon Patent Holding Corporation System and method for utilizing facial recognition technology for identifying an unknown individual from a digital image

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2440398A1 (en) * 2002-12-20 2004-06-20 Yaron Mayer System and method for searching, finding and contacting dates on the internet in instant messaging networks and/or in other methods that enable immediate finding and creating immediate contact
US20050043897A1 (en) * 2003-08-09 2005-02-24 Meyer Robert W. Biometric compatibility matching system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2440398A1 (en) * 2002-12-20 2004-06-20 Yaron Mayer System and method for searching, finding and contacting dates on the internet in instant messaging networks and/or in other methods that enable immediate finding and creating immediate contact
US20050043897A1 (en) * 2003-08-09 2005-02-24 Meyer Robert W. Biometric compatibility matching system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
THIESSEN D. AND GREGG B.: "Human Assortative Mating and Genetic Equilibrium: An Evolutionary Perspective", DEPARTMENT OF PSYCHOLOGY, THE UNIVERSITY OF TEXAS, 1980, Retrieved from the Internet <URL:http://agserver.kku.ac.th/monchai/Teaching/PopGen/Paper/Human%20assortive%20and%20gene%20equilibrium.pdf> *

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9569659B2 (en) 2005-09-28 2017-02-14 Avigilon Patent Holding 1 Corporation Method and system for tagging an image of an individual in a plurality of photos
US8897506B2 (en) 2005-09-28 2014-11-25 Facedouble, Inc. Image classification and information retrieval over wireless digital networks and the internet
US10990811B2 (en) 2005-09-28 2021-04-27 Avigilon Patent Holding 1 Corporation Image classification and information retrieval over wireless digital networks and the internet
US8798321B2 (en) 2005-09-28 2014-08-05 Facedouble, Inc. Method and system for tagging an image of an individual in plurality of photos
US9798922B2 (en) 2005-09-28 2017-10-24 Avigilon Patent Holding 1 Corporation Image classification and information retrieval over wireless digital networks and the internet
US9224035B2 (en) 2005-09-28 2015-12-29 9051147 Canada Inc. Image classification and information retrieval over wireless digital networks and the internet
US9412009B2 (en) 2005-09-28 2016-08-09 9051147 Canada Inc. Image classification and information retrieval over wireless digital networks and the internet
US10853690B2 (en) 2005-09-28 2020-12-01 Avigilon Patent Holding 1 Corporation Method and system for attaching a metatag to a digital image
US10776611B2 (en) 2005-09-28 2020-09-15 Avigilon Patent Holding 1 Corporation Method and system for identifying an individual in a digital image using location meta-tags
US10223578B2 (en) 2005-09-28 2019-03-05 Avigilon Patent Holding Corporation System and method for utilizing facial recognition technology for identifying an unknown individual from a digital image
US9465817B2 (en) 2005-09-28 2016-10-11 9051147 Canada Inc. Method and system for attaching a metatag to a digital image
US9875395B2 (en) 2005-09-28 2018-01-23 Avigilon Patent Holding 1 Corporation Method and system for tagging an individual in a digital image
US10216980B2 (en) 2005-09-28 2019-02-26 Avigilon Patent Holding 1 Corporation Method and system for tagging an individual in a digital image
US8015128B2 (en) 2006-10-01 2011-09-06 International Business Machines Corporation Biometric security using neuroplastic fidelity
WO2008142481A3 (en) * 2006-10-31 2009-03-12 Parana Vision Adaptive voice-feature-enhanced matchmaking method and system
WO2008142481A2 (en) * 2006-10-31 2008-11-27 Parana Vision Adaptive voice-feature-enhanced matchmaking method and system
US9824262B1 (en) * 2014-10-04 2017-11-21 Jon Charles Daniels System and method for laterality adjusted identification of human attraction compatibility

Similar Documents

Publication Publication Date Title
US9798922B2 (en) Image classification and information retrieval over wireless digital networks and the internet
US10223578B2 (en) System and method for utilizing facial recognition technology for identifying an unknown individual from a digital image
US7599527B2 (en) Digital image search system and method
US7450740B2 (en) Image classification and information retrieval over wireless digital networks and the internet
US20090060289A1 (en) Digital Image Search System And Method
Manna et al. Face recognition from video using deep learning
WO2006053375A1 (en) Computer-based method and system for identifying a potential partner
Galbally et al. Study on face identification technology for its implementation in the Schengen information system
Saraswat et al. Anti-spoofing-enabled contactless attendance monitoring system in the COVID-19 pandemic
Nahar et al. Twins and Similar Faces Recognition Using Geometric and Photometric Features with Transfer Learning
Park et al. A study on the design and implementation of facial recognition application system
Arbab‐Zavar et al. On forensic use of biometrics
Shashikala et al. Attendance Monitoring System Using Face Recognition
De Marsico et al. Face recognition in adverse conditions: A look at achieved advancements
KR20220124446A (en) Method and system for providing animal face test service based on machine learning
AU2012201564A1 (en) Computer based method and system for identifying a potential partner
AU2005306571A1 (en) Computer-based method and system for identifying a potential partner
Espinosa Duró Face recognition by means of advanced contributions in machine learning
Vully Facial expression detection using principal component analysis
Galdámez et al. Ear biometrics: a small look at the process of ear recognition
Al-Kawaz Facial identification for digital forensic
Srivika et al. Biometric Verification using Periocular Features based on Convolutional Neural Network
Vinayagam et al. Experimental Evaluation of IoT based Human Gender Classification and Record Management using Intelligent Hybrid Learning Principles
Udayakumar 3D Image Based Face Recognition System Using LDA, PCA And HAAR
Karuppasamy et al. Real-Time Face Recognition in Attendance System using DCNN

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2005306571

Country of ref document: AU

ENP Entry into the national phase

Ref document number: 2005306571

Country of ref document: AU

Date of ref document: 20051116

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 2005306571

Country of ref document: AU

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC. EPO FORM 1205A DATED 16-11-2007

122 Ep: pct application non-entry in european phase

Ref document number: 05805563

Country of ref document: EP

Kind code of ref document: A1