US20160171292A1 - Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression - Google Patents


Info

Publication number
US20160171292A1
Authority
US (United States)
Prior art keywords
information processing
user
processing device
statistic
facial expression
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/048,188
Inventor
Junichi Rekimoto
Hitomi Tsujita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Sony Corp
Priority to US15/048,188 (US20160171292A1)
Assigned to SONY CORPORATION. Assignment of assignors interest (see document for details). Assignors: TSUJITA, HITOMI; REKIMOTO, JUNICHI
Publication of US20160171292A1
Current legal status: Abandoned


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • G06V40/167 Detection; Localisation; Normalisation using comparisons between temporally consecutive images
    • G06V40/174 Facial expression recognition
    • G06V40/175 Static expression
    • G06V40/176 Dynamic expression
    • G06K9/00315
    • G06K9/00261

Definitions

  • the present disclosure relates to an information processing device, an information processing method, and a program, and particularly to an information processing device, an information processing method, and a program which are designed to make a user actively have a specific facial expression.
  • an information processing device which includes a recognition unit which recognizes a facial expression of a user taken on a captured image, and a control unit which controls equipment to permit the user to use the equipment when the facial expression recognized by the recognition unit is a specific facial expression.
  • equipment which the user is permitted to use includes not only electric appliances such as a television set, an audio player, a personal computer, a vacuum cleaner, a washing machine, a refrigerator, and the like, but also items whose operation is controlled, such as a door, furniture, or the like.
  • the recognition unit can recognize a facial expression of the user taken on the image captured by the capturing unit.
  • as the information indicating a facial expression different from the specific facial expression, it is possible to output an image different from the image output as the information indicating the specific facial expression, and a sound different from the sound output as the information indicating the specific facial expression.
  • likewise, it is possible to output light different from the light output as the information indicating the specific facial expression, and vibration different from the vibration output as the information indicating the specific facial expression.
  • the control unit may control the equipment so as not to permit the user to use the equipment.
  • when a facial expression of a user taken on a captured image is recognized and the recognized facial expression is a specific facial expression, equipment is controlled so as to allow the user to use the equipment.
  • FIG. 1 is a diagram showing an example of the appearance of an information processing device according to an embodiment of the present disclosure
  • FIG. 2 is a diagram showing an example of a smile icon
  • FIG. 3 is a diagram showing an example of a sad icon
  • FIG. 4 is a diagram showing a use example of the information processing device according to the embodiment of the disclosure.
  • FIG. 5 is a diagram showing a display example of an icon viewed from a user standing in front of a mirror
  • FIG. 6 is a block diagram showing an inner configuration example of the information processing device according to the embodiment of the disclosure.
  • FIG. 7 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure.
  • FIG. 8 is a diagram showing a configuration example of a communication system
  • FIG. 9 is a diagram showing a configuration example of the information processing device of FIG. 8 according to another embodiment of the disclosure.
  • FIG. 10 is a block diagram showing a hardware configuration example of a web server
  • FIG. 11 is a block diagram showing a functional configuration example of the web server
  • FIG. 12 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure.
  • FIG. 13 is a diagram showing an example of a screen of an SNS site
  • FIG. 14 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure.
  • FIG. 15 is a diagram showing an example of a screen of a calendar
  • FIG. 16 is a diagram showing a control target device
  • FIG. 17 is a diagram showing the appearance of a refrigerator as a control target device.
  • FIG. 18 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure.
  • a smiling face help function is a function of helping a user to perceive whether or not the expression on his or her own face is a smile, and to make a smiling face when the expression is not a smile.
  • FIG. 1 is a diagram showing an example of the appearance of an information processing device according to an embodiment of the present disclosure.
  • the information processing device of FIG. 1 has a rectangular parallelepiped housing of a size that a user can grasp in one hand.
  • provided on the front surface of the housing of the information processing device 1 are a lens 11 and an LED matrix portion 12 constituted by a predetermined number of LEDs (Light Emitting Diodes).
  • the LED matrix portion 12 is formed by providing a total of 64 LEDs including 8 LEDs arranged in the vertical direction and 8 LEDs arranged in the horizontal direction.
  • a display such as an LCD (Liquid Crystal Display), or the like may be provided.
  • the information processing device 1 performs capturing by conducting photoelectric conversion for light brought in from the lens 11 , and, based on an image obtained from the capturing, recognizes the expression (expression on the face) of the user taken as a subject.
  • the information processing device 1 recognizes that the facial expression of the user is a smile
  • the information processing device 1 displays a smile icon that is information indicating that the facial expression of the user is a smile, by making predetermined LEDs constituting the LED matrix portion 12 emit light.
  • a smiling face recognition function is mounted in the information processing device 1 . Determination of whether or not the facial expression of a user within the capturing range of the information processing device 1 is a smile is repetitively performed in real time based on captured images.
  • the information processing device 1 displays a sad icon that is information indicating that the facial expression of the user is not a smile, by making predetermined LEDs constituting the LED matrix portion 12 emit light.
  • FIG. 2 is a diagram showing an example of the smile icon
  • FIG. 3 is a diagram showing an example of the sad icon.
  • each LED constituting the LED matrix portion 12 is denoted by # followed by a number according to its location (row and column) such that, for example, the LED located in the first row and the first column is LED # 11 .
  • the smile icon is displayed by LEDs # 23 , # 26 , # 33 , # 36 , # 61 , # 68 , # 72 , # 77 , and # 83 to # 86 emitting light.
  • the sad icon is displayed by LEDs # 23 , # 26 , # 32 , # 37 , # 63 to # 66 , # 72 , # 77 , # 81 , and # 88 emitting light.
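As an illustration of the LED numbering described above, the following sketch (an assumption for illustration, not part of the patent) renders the two icons as 8x8 text grids, mapping each LED # rc to row r, column c:

```python
# LED designators taken from the description: # rc means row r, column c
# (both 1-indexed) of the 8x8 LED matrix portion 12.
SMILE_LEDS = [23, 26, 33, 36, 61, 68, 72, 77, 83, 84, 85, 86]
SAD_LEDS = [23, 26, 32, 37, 63, 64, 65, 66, 72, 77, 81, 88]

def render(leds):
    """Return an 8x8 text grid with '*' for lit LEDs and '.' for unlit ones."""
    grid = [['.'] * 8 for _ in range(8)]
    for led in leds:
        row, col = divmod(led, 10)      # LED # rc -> row r, column c
        grid[row - 1][col - 1] = '*'
    return '\n'.join(''.join(r) for r in grid)

print(render(SMILE_LEDS))
```

Rendering both lists this way shows the mouth rows curving upward for the smile icon (bottom row lit at columns 3 to 6) and downward for the sad icon (bottom row lit at columns 1 and 8).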
  • the user positioned in front of the information processing device 1 can recognize that he or she is smiling by, for example, ascertaining that the smile icon is displayed.
  • the user can recognize that he or she has a facial expression other than a smiling face, for example, an angry expression or a sad expression after finding out that the sad icon is displayed, and then can be conscious of having a smiling face.
  • the information processing device 1 can make the emotion of the user positive by making the user aware of his or her own facial expression and having the user consciously make a smiling face when the expression is not a smile. Since maintaining a positive emotion is deemed to be connected to the prevention and improvement of mental diseases such as depression, and the like, it is also possible to reinforce the user's mental wellness by making the emotion of the user positive.
  • FIG. 4 is a diagram showing a use example of the information processing device 1 .
  • a mirror 21 is hung on a wall surface W in a building such as a house, or the like.
  • the information processing device 1 is installed on the rear side of the mirror 21 so that the front surface of the housing faces the front side of the wall surface W.
  • the front surface of the housing of the information processing device 1 may be installed so as to protrude from the mirror surface 21 A.
  • the mirror surface 21 A surrounded by a frame is constituted by a half-mirror.
  • light incident on the mirror 21 from the front side of the wall surface W is reflected by the mirror surface 21 A, while part of it transmits through the mirror surface 21 A and is brought in by the lens 11 of the information processing device 1 .
  • light from the LED matrix portion 12 , which faces the front side of the mirror 21 (the forward side of the wall surface W) from the rear side of the mirror 21 , transmits through the mirror surface 21 A, and the smile icon or the sad icon is displayed as if the icon emerges on the mirror surface 21 A.
  • the information processing device 1 recognizes the facial expression of the user in front of the mirror 21 , and displays the smile icon or the sad icon according to the recognized facial expression. The user sees the smile icon or the sad icon displayed as if emerging on the mirror surface 21 A, and then is aware of his or her own facial expression.
  • FIG. 5 is a diagram showing a display example of an icon viewed from the user side in front of the mirror 21 .
  • the upper body of the user who is the subject H, is reflected on the mirror surface 21 A, and the facial expression of the user is a smile.
  • a smile icon is displayed at the position of the mirror surface 21 A where the information processing device 1 is installed.
  • Icon I in FIG. 5 indicates the smile icon.
  • the installation position of the information processing device 1 is arbitrary; for example, the device may be provided on the rear side of a television set or the display of a PC (Personal Computer), or buried in a door in a building.
  • FIG. 6 is a block diagram showing an internal configuration example of the information processing device 1 . At least some of the configuration shown in FIG. 6 is realized by a CPU (Central Processing Unit) provided in the information processing device 1 executing a predetermined program.
  • the information processing device 1 includes a capturing unit 31 , a recognition unit 32 , and a control unit 33 .
  • the capturing unit 31 is constituted by imaging elements such as a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like.
  • the capturing unit 31 performs photoelectric conversion for light brought by the lens 11 , and carries out various processes such as A/D conversion, and the like, for a signal obtained from the photoelectric conversion.
  • the capturing unit 31 outputs image data obtained after implementing the various processes to the recognition unit 32 .
  • the capturing unit 31 repeats capturing, and supplies image data obtained after the repetitive capturing to the recognition unit 32 one after another.
  • the recognition unit 32 analyses images captured by the capturing unit 31 , and recognizes the facial expression of the user taken in the images as a subject.
  • the recognition unit 32 outputs information indicating the recognition result to the control unit 33 .
  • the recognition unit 32 also outputs, as appropriate, information indicating the degree of a smile, such as a big laugh, a faint smile, or the like.
  • the recognition unit 32 is given, in advance, information which has been obtained from the analysis of images of smiling faces and which indicates feature values of each part constituting a smiling face, such as the eyes, nose, and mouth, as data for recognition.
  • the recognition unit 32 detects the face of the user by performing contour detection, or the like from the images captured by the capturing unit 31 , and extracts feature values of each part of the face including eyes, nose, and mouth.
  • the recognition unit 32 compares the feature values extracted from the captured images to the feature values given as the data for recognition, and recognizes the facial expression of the user as a smiling face when the index of coincidence of both values is equal to or higher than a threshold value.
  • when the index of coincidence is lower than the threshold value, the recognition unit 32 recognizes that the facial expression is not a smiling face.
  • in other words, the smiling face recognized by the information processing device 1 is a facial expression in which the characteristics of each part of the face extracted from the captured images coincide, to the threshold value or greater, with the characteristics of the face extracted from face images selected as images of smiling faces when the function of recognizing a smiling face was developed, or the like.
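The comparison described above can be sketched in simplified form: feature values extracted from a captured face are compared with reference feature values prepared in advance, and the expression is judged to be a smile when the index of coincidence reaches a threshold. The feature layout, the cosine-similarity index, and the threshold value below are illustrative assumptions, not the patent's actual algorithm.

```python
import math

# Hypothetical reference feature values for the eyes, nose, and mouth,
# prepared in advance from analysed smiling-face images (data for recognition).
SMILE_REFERENCE = {"eyes": [0.8, 0.3], "nose": [0.5, 0.5], "mouth": [0.9, 0.7]}
THRESHOLD = 0.95  # illustrative threshold for the index of coincidence

def coincidence(a, b):
    """Cosine similarity, used here as the index of coincidence."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def is_smile(features):
    """Average the per-part coincidence and compare with the threshold."""
    scores = [coincidence(features[part], ref)
              for part, ref in SMILE_REFERENCE.items()]
    return sum(scores) / len(scores) >= THRESHOLD
```

A face whose extracted features closely match the reference is classified as a smiling face; anything below the threshold is not, matching the two-way determination the recognition unit 32 performs.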
  • the control unit 33 causes the LED matrix portion 12 to display the smile icon when the facial expression of the user is determined to be a smile, based on the information supplied from the recognition unit 32 . It may be possible that a different type of smile icon is displayed depending on the degree of the smiling face indicated by the information supplied from the recognition unit 32 . In addition, the control unit 33 causes the LED matrix portion 12 to display the sad icon when the facial expression of the user is determined not to be a smile.
  • the control unit 33 causes a speaker not shown in the drawing to output a predetermined melody appropriately in accordance with the display of the icon.
  • a melody output in accordance with the display of the smile icon and a melody output in accordance with the display of the sad icon are set to different ones respectively. It may be possible to use favorite music that the user has registered in advance as the melody output in accordance with the display of the smile icon. Accordingly, the user can recognize his or her own facial expression through the music, and boost the mood listening to the favorite music when the user has a smiling face.
  • it is also possible for the information processing device 1 not to include the capturing unit 31 .
  • images captured by an external device are acquired by the recognition unit 32 of the information processing device 1 as images to recognize the facial expression of the user.
  • the recognition unit 32 recognizes the facial expression of the user based on the acquired images, and the control unit 33 performs a process to give a feedback according to the facial expression to the user.
  • in Step S 1 , the capturing unit 31 performs capturing.
  • in Step S 2 , the recognition unit 32 analyses images captured by the capturing unit 31 , and recognizes the facial expression of the user taken in the images.
  • in Step S 3 , the control unit 33 determines whether or not the facial expression of the user is a smile based on information supplied from the recognition unit 32 .
  • in Step S 4 , when the facial expression is determined to be a smile, the control unit 33 causes the LED matrix portion 12 to display the smile icon.
  • in Step S 5 , when the facial expression is determined not to be a smile, the control unit 33 causes the LED matrix portion 12 to display the sad icon.
  • after the smile icon is displayed in Step S 4 , or after the sad icon is displayed in Step S 5 , the process returns to Step S 1 , and the above process is repeated.
  • when the user makes a smiling face, the display of the LED matrix portion 12 is switched from the sad icon to the smile icon.
  • the information processing device 1 can make the user aware of his or her own facial expression, and can put the user in a good mood by encouraging him or her to make a smiling face.
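The repeated process of Steps S 1 to S 5 above can be sketched as a simple loop. The DemoDevice class and its method names are hypothetical stand-ins for the capturing unit 31, recognition unit 32, and control unit 33; they are not from the patent.

```python
class DemoDevice:
    """Hypothetical stand-in for the capturing, recognition, and control units."""
    def __init__(self, frames):
        self.frames = iter(frames)
        self.displayed = []          # icons shown on the LED matrix portion 12

    def capture(self):               # Step S1: capturing unit 31
        return next(self.frames)

    def recognize(self, image):      # Step S2: recognition unit 32
        return image == "smiling face"

    def show_smile_icon(self):       # Step S4: control unit 33
        self.displayed.append("smile icon")

    def show_sad_icon(self):         # Step S5: control unit 33
        self.displayed.append("sad icon")

def run(device, iterations):
    """Steps S1 to S5, repeated: capture, recognize, display the matching icon."""
    for _ in range(iterations):
        image = device.capture()
        if device.recognize(image):  # Step S3: is the expression a smile?
            device.show_smile_icon()
        else:
            device.show_sad_icon()
```

Running the loop over a sequence of frames switches the display from the sad icon to the smile icon as soon as a smiling frame is recognized, mirroring the flowchart of FIG. 7.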
  • FIG. 8 is a diagram showing a configuration example of a communication system using the information processing device 1 .
  • the information processing device 1 of FIG. 8 has a communication function.
  • the information processing device 1 is connected to a network 51 including the Internet, or the like.
  • the network 51 is also connected to a PC 52 , a web server 53 , and a photo frame 54 .
  • the PC 52 is, for example, a PC that the user of the information processing device 1 uses.
  • the web server 53 is a server to manage SNS (Social Network Service) sites such as blogs, message boards, or the like.
  • the photo frame 54 is a digital photo frame installed in, for example, a person's home other than the user of the information processing device 1 .
  • the photo frame 54 is provided with a display, and can receive and display images transmitted through the network 51 .
  • functions such as contributing messages indicating the number of smiling faces to SNS sites, displaying a list of the number of smiling faces for every predetermined period, and transmitting images of smiling faces to the photo frame 54 are realized. Details of these functions will be described later.
  • FIG. 9 is a block diagram showing a configuration example of the information processing device 1 of FIG. 8 .
  • the same reference numerals are given to the same configurations as those shown in FIG. 6 . Overlapping description will be appropriately omitted.
  • the information processing device 1 includes the capturing unit 31 , the recognition unit 32 , the control unit 33 , a counter 61 , a display data generation unit 62 , a transmission unit 63 , and an image management unit 64 .
  • the capturing unit 31 outputs image data obtained from capturing.
  • the image data output from the capturing unit 31 is supplied to the recognition unit 32 and the image management unit 64 .
  • the recognition unit 32 analyses images captured by the capturing unit 31 , and recognizes facial expressions of the user.
  • the recognition unit 32 outputs information indicating the recognition result and information indicating the degree of a smiling face to the control unit 33 , the counter 61 , and the image management unit 64 .
  • when it is determined that the facial expression of the user is a smile based on the information supplied from the recognition unit 32 , the counter 61 increases the number of smiling faces by one, and manages information indicating the number of smiling faces by storing it in a memory. When a message indicating the number of smiling faces is contributed to an SNS site, the counter 61 outputs the information indicating the number of smiling faces to the display data generation unit 62 at the time point when the number of smiling faces for a predetermined period of time, such as one day, reaches a threshold value or higher.
  • the counter 61 outputs the information indicating the number of smiling faces to the transmission unit 63 .
  • hereinafter, a case in which the period for which the number of smiling faces is counted is one day will be described.
  • the display data generation unit 62 generates a message indicating the number of smiling faces counted by the counter 61 and outputs the data of the generated message to the transmission unit 63 .
  • the transmission unit 63 When the message indicating the number of smiling faces is contributed to an SNS site, the transmission unit 63 has access to the web server 53 through the network 51 , and transmits the message generated by the display data generation unit 62 .
  • the transmission unit 63 transmits the information indicating the number of smiling faces supplied from the counter 61 and an image selected by the image management unit 64 to the web server 53 .
  • the image management unit 64 specifies an image in which a smiling face of the user is taken from images captured by the capturing unit 31 based on the recognition result by the recognition unit 32 , and manages the specified smiling face image by storing in the memory.
  • the recognition unit 32 also supplies information indicating the degree of a smiling face, such as a big laugh, a faint smile, or the like.
  • the image management unit 64 manages the degree of the smiling face taken in each image in correspondence with the stored images of the smiling faces.
  • the image management unit 64 selects an image with the highest degree of a smiling face from images stored in the memory and outputs the image to the transmission unit 63 so as to transmit the image to the web server 53 .
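The roles of the counter 61 and the image management unit 64 described above can be sketched together as follows. The SmileTracker class, its method names, and the string images are illustrative assumptions, not the patent's implementation.

```python
class SmileTracker:
    """Sketch of counter 61 plus image management unit 64: count smiles and
    keep the image with the highest degree of smiling for transmission."""

    def __init__(self, threshold):
        self.threshold = threshold   # e.g. smiles per day before contributing
        self.count = 0
        self.best_image = None
        self.best_degree = -1.0

    def record(self, image, degree):
        """Called whenever the recognition unit reports a smiling face."""
        self.count += 1              # counter 61: one more smiling face
        if degree > self.best_degree:
            self.best_degree = degree    # image management unit 64: keep the
            self.best_image = image      # image with the highest degree

    def should_post(self):
        """True once the count reaches the threshold value or higher."""
        return self.count >= self.threshold
```

After a day of recording, `best_image` holds the image with the highest degree of a smiling face, which is what the image management unit 64 outputs to the transmission unit 63.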
  • FIG. 10 is a block diagram showing a hardware configuration example of the web server 53 .
  • in the web server 53 , a CPU (Central Processing Unit) 71 , a ROM (Read Only Memory) 72 , and a RAM (Random Access Memory) 73 are connected to one another via a bus 74 .
  • an input and output interface 75 is connected to the bus 74 .
  • the input and output interface 75 is connected to an input unit 76 including a keyboard, a mouse, and the like, and an output unit 77 including a display, a speaker, and the like.
  • the input and output interface 75 is connected to a storage unit 78 including a hard disk, a non-volatile memory, and the like, a communication unit 79 including a network interface, and the like, and a drive 80 for driving a removable medium 81 .
  • FIG. 11 is a block diagram showing a functional configuration example of the web server 53 . At least some of the function units shown in FIG. 11 are realized by executing predetermined programs by the CPU 71 of FIG. 10 .
  • in the web server 53 , an SNS site management unit 91 , a number management unit 92 , and a display control unit 93 are realized.
  • the SNS site management unit 91 manages SNS sites, and allows devices that have accessed it through the network 51 to view the sites.
  • the SNS site management unit 91 acquires messages transmitted from the devices connected to the network 51 by controlling the communication unit 79 , and adds the messages to the SNS sites.
  • the number management unit 92 acquires the information transmitted from the information processing device 1 by controlling the communication unit 79 .
  • the number management unit 92 causes the storage unit 78 to store the acquired information, for example, corresponding to identification information of the information processing device 1 .
  • the display control unit 93 acquires the information managed by the number management unit 92 , and causes the list of the number of smiling faces counted for each day to be displayed using, for example, the image of a calendar. In addition, when one day is selected on the image of the calendar, the display control unit 93 causes an image, which was captured on that day and transmitted from the information processing device 1 as the image with the highest degree of a smiling face, to be transmitted to the device that requested the display of the list of the number of smiling faces.
  • in Step S 11 , the capturing unit 31 performs capturing.
  • in Step S 12 , the recognition unit 32 analyses the image captured by the capturing unit 31 , and recognizes the facial expression of the user.
  • in Step S 13 , the control unit 33 determines whether or not the facial expression of the user is a smile based on the information supplied from the recognition unit 32 .
  • in Step S 14 , the control unit 33 gives feedback by causing the LED matrix portion 12 to display the smile icon, or the like.
  • when the facial expression of the user is determined not to be a smile in Step S 13 , the process returns to Step S 11 , and the above process is repeated.
  • in this case, the control unit 33 may give feedback to the user by causing the LED matrix portion 12 to display a sad icon, or the like.
  • in Step S 15 , the counter 61 counts the number of smiling faces.
  • in Step S 16 , the counter 61 determines whether or not the number of smiling faces for the predetermined period is equal to or higher than a predetermined number that is a threshold value.
  • in Step S 17 , the display data generation unit 62 generates a message indicating the number of smiling faces counted by the counter 61 .
  • in Step S 18 , the transmission unit 63 transmits the message generated by the display data generation unit 62 to the web server 53 to contribute the message to the SNS site. After the message is contributed, or when the number of smiling faces is determined not to be equal to or higher than the threshold value in Step S 16 , the process returns to Step S 11 , and the above process is repeated.
  • the information transmitted from the transmission unit 63 is acquired by the SNS site management unit 91 of the web server 53 , and posted on the SNS site.
  • FIG. 13 is a diagram showing an example of the screen of the SNS site.
  • the screen of FIG. 13 is displayed when the PC 52 or another device, such as a mobile telephone, or the like, accesses the web server 53 using a browsing function.
  • Contribution # 3 is the message contributed from the information processing device 1 , and shows that the number of smiling faces counted for one day is six.
  • the number of smiling faces is expressed by the number of images indicating smiling faces. “Could make smiles OOOOOO times today” indicated as Contribution # 3 (O is an image showing a smiling face) is the message generated by the display data generation unit 62 .
  • Contributions # 1 and # 2 of FIG. 13 are messages contributed by the friends, and the like of the user of the information processing device 1 , praising or supporting the number of smiling faces.
  • the user of the information processing device 1 can share the number of smiling faces with the friends, and the like.
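The contribution shown above, in which the count is expressed by repeating a smiling face image, could be generated as in the following sketch. The function name and the plain "O" placeholder (standing in for the smiling face image) are assumptions for illustration.

```python
def make_contribution(count, symbol="O"):
    """Build the SNS message with one smiling face symbol per counted smile,
    following the example wording of Contribution # 3."""
    return f"Could make smiles {symbol * count} times today"
```

For a day on which six smiling faces were counted, this produces the message shown in FIG. 13.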
  • processes of Steps S 31 to S 35 of FIG. 14 are basically the same as those of Steps S 11 to S 15 of FIG. 12 .
  • the capturing unit 31 performs capturing. Images captured are supplied to the recognition unit 32 and to the image management unit 64 .
  • in Step S 32 , the recognition unit 32 analyses the images captured by the capturing unit 31 and recognizes the facial expression of the user.
  • in Step S 33 , the control unit 33 determines whether or not the facial expression of the user is a smile based on the information supplied from the recognition unit 32 .
  • in Step S 34 , the control unit 33 gives feedback by causing the LED matrix portion 12 to display a smile icon, or the like.
  • when the facial expression of the user is determined not to be a smile in Step S 33 , the processes after Step S 31 are repeated.
  • in Step S 35 , the counter 61 counts the number of smiling faces.
  • in Step S 36 , the image management unit 64 causes a memory to store, for management, an image in which a smiling face is recognized as being taken, together with information indicating the degree of the smiling face recognized by the recognition unit 32 .
  • in Step S 37 , the counter 61 determines whether or not one day, which is the period for which the number of smiling faces has to be counted, has passed.
  • in Step S 38 , when one day has passed, the image management unit 64 selects the image with the highest degree of a smiling face.
  • in Step S 39 , the transmission unit 63 transmits information indicating the number of smiling faces counted by the counter 61 and the image of a smiling face selected by the image management unit 64 to the web server 53 .
  • after the information indicating the number of smiling faces and the image of the smiling face are transmitted, or when the period for which the number of smiling faces has to be counted is determined to have not passed in Step S 37 , the process returns to Step S 31 , and the above processes are repeated.
  • the information transmitted from the transmission unit 63 is acquired by the number management unit 92 of the web server 53 , and managed.
  • the user of the information processing device 1 accesses the web server 53 by operating, for example, the PC 52 , and requests display of the list of the number of smiling faces.
  • in response, an image of a calendar for displaying the list of the number of smiling faces is displayed on the display of the PC 52 based on information transmitted from the display control unit 93 .
  • FIG. 15 is a diagram showing an example of a screen of the calendar displayed on the display of the PC 52 based on the information transmitted from the web server 53 .
  • Calendar image P of FIG. 15 is an image of a calendar for a certain month, and icons indicating the number of smiling faces counted for each day are displayed in the cells for the first to fifteenth days out of cells corresponding to first to thirty-first days. Calendar image P is an image displayed when the fifteenth day has passed.
  • icons with different degrees of smiling faces are displayed in accordance with the range of the number of smiling faces.
  • the icon with a dark facial expression displayed in the cell for the seventh day indicates that the number of smiling faces counted on that day is, for example, zero to five
  • the icon with a smiling face displayed in the cell for the fifth day indicates that the number of smiling faces counted on that day is, for example, six to ten
  • the icon with the big laugh displayed in the cell for the first day indicates that the number of smiling faces counted on that day is, for example, eleven or more.
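The count-to-icon mapping for calendar image P described above can be sketched as follows. The boundary values follow the examples given in the text (zero to five, six to ten, eleven), and the icon names are illustrative assumptions.

```python
def day_icon(smile_count):
    """Map a day's smile count to the icon displayed in its calendar cell."""
    if smile_count <= 5:
        return "dark face"       # e.g. the cell for the seventh day
    if smile_count <= 10:
        return "smiling face"    # e.g. the cell for the fifth day
    return "big laugh"           # e.g. the cell for the first day
```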
  • when an icon is selected, the display control unit 93 of the web server 53 transmits to the PC 52 the image selected by the information processing device 1 as the image with the highest degree of a smiling face among the images (photos) captured on the day for which the selected icon is displayed, and causes the display to display the image.
  • the user of the information processing device 1 can recognize the number of smiling faces for each day from the types of the icons displayed on calendar image P, and can remind himself or herself to make a smiling face more actively when the number is low.
  • friends and the like of the user of the information processing device 1 can access the web server 53 through an operation of their own devices to see calendar image P, and can check whether the user of the information processing device 1 makes smiling faces, in other words, whether the user is in a bright mood.
  • the functions of managing the number of smiling faces and displaying the list thereof may be installed in the PC 52 .
  • the PC 52 is provided with the number management unit 92 and the display control unit 93 of FIG. 11 , and performs a process of displaying the list of the number of smiling faces based on information transmitted from the information processing device 1 .
  • the information processing device 1 may be provided with the number management unit 92 and the display control unit 93 so that the number of smiling faces can be checked through the display of the LED matrix unit 12 or an LCD provided in the information processing device 1.
  • the information processing device 1 of FIG. 8 recognizes a facial expression of the user taken in a captured image, and gives feedback to the user by displaying icons, or the like.
  • the information processing device 1 transmits an image in which the recognized smiling face is taken to the photo frame 54 .
  • the photo frame 54 that receives the image transmitted from the information processing device 1 causes the display to display the received image. Accordingly, for example, when the photo frame 54 is installed in the home of somebody other than the user of the information processing device 1, the user of the photo frame 54 can see the image of a smiling face of the user of the information processing device 1. This function is considered to be particularly useful when the user of the information processing device 1 is an old person living alone and the user of the photo frame 54 is his or her offspring. The offspring who is the user of the photo frame 54 can recognize a smiling face of his or her parent who is the user of the information processing device 1.
  • a smiling face gateway function that is a function of requesting a smiling face when the user takes an action will be described.
  • the information processing device 1 is connected, in a wired or wireless manner, to control target equipment 101, which is the equipment to be controlled. A case where the control target equipment 101 is a refrigerator will be described.
  • FIG. 17 is a diagram showing the appearance of a refrigerator to which the information processing device 1 is fixed.
  • the refrigerator 111 of FIG. 17 is constituted by a door 111 A and the main body 111 B.
  • the door 111A is fixed to the main body 111B so as to be opened and closed via a hinge (not shown in the drawing) provided on the right side face of the door 111A.
  • a grip 121 is provided on the left side of the surface of the door 111A. The door 111A is opened by grasping the grip 121 with, for example, the right hand and pulling it to the front as shown by the white arrow.
  • the information processing device 1 is installed at the position that is on the surface of the door 111 A and above the grip 121 so that the front face of the housing thereof faces the front direction of the refrigerator 111 .
  • a member 122A including an electric lock 122 is provided, and on the left side face of the main body 111B, a member 122B constituting the electric lock 122 and making a pair with the member 122A is provided.
  • the members 122 A and 122 B are provided at the position where the members adjoin each other when the door 111 A is closed.
  • the electric lock 122 is connected to the information processing device 1 as shown by the dotted line, and switches the door 111 A to a locked state or an unlocked state according to a control signal supplied from the information processing device 1 .
  • when the electric lock 122 is in the locked state, the user is not able to open the door 111A, but is able to open the door 111A when the lock is in the unlocked state.
  • the information processing device 1 installed in the refrigerator 111 in such a state gives feedback to the user in accordance with the facial expressions by capturing images, recognizing the facial expressions of the user, and displaying icons, in the same manner as the information processing device 1 of FIG. 1 .
  • when a facial expression of the user is recognized to be a smile, the information processing device 1 outputs a control signal instructing the electric lock 122 to enter the unlocked state so that the door 111A can be opened, whereby the user is allowed to use the refrigerator 111. In addition, when the user closes the door 111A after finishing the use of the refrigerator 111 while the electric lock 122 is in the unlocked state, the information processing device 1 outputs a control signal instructing the electric lock 122 to enter the locked state so as to prohibit the door 111A from being opened.
  • the information processing device 1 can make the user be aware of making smiling faces at ordinary times by requesting smiling faces during ordinary actions such as using the refrigerator 111, in other words, by compelling smiling faces.
  • the information processing device 1 of FIG. 17 has the same configuration as that in FIG. 6 or FIG. 10 .
  • In Step S51, the capturing unit 31 performs capturing.
  • In Step S52, the recognition unit 32 analyzes images captured by the capturing unit 31, and recognizes a facial expression of the user.
  • In Step S53, the control unit 33 determines whether or not the facial expression of the user is a smile based on information supplied from the recognition unit 32.
  • In Step S54, the control unit 33 gives feedback by causing the LED matrix unit 12 to display a smile icon.
  • In Step S55, the control unit 33 outputs a control signal to the electric lock 122 to put the door 111A in an unlocked state. Accordingly, the user can open the door 111A and take out food from the refrigerator 111, or the like.
  • In Step S56, the control unit 33 determines whether or not a fixed time has passed after, for example, it is detected that the user is in front of the refrigerator 111. Whether or not the user is in front of the refrigerator 111 is recognized by the recognition unit 32 based on, for example, a captured image.
  • when it is determined in Step S56 that a fixed time has passed, the control unit 33 likewise outputs a control signal to the electric lock 122 in Step S55 to put the door 111A in the unlocked state.
  • the door 111 A is designed to be opened without making a smiling face when a fixed time has passed after the user is detected to be in front of the refrigerator 111 , but the user may not be allowed to open the door unless the user makes a smiling face.
  • In Step S57, the control unit 33 determines whether or not the user has closed the door 111A and departed from the refrigerator 111, and stands by until the user is determined to have departed from the refrigerator 111. For example, when the door 111A is detected to be closed, it is determined that the user has departed from the refrigerator 111.
  • In Step S58, the control unit 33 outputs a control signal to the electric lock 122 to put the door 111A in a locked state. After the door 111A is put in the locked state, or when it is determined in Step S56 that a fixed time has not passed, the process returns to Step S51, and the above processes are repeated.
  • when the facial expression of the user is determined not to be a smile in Step S53, feedback may be given to the user by causing the LED matrix unit 12 to display a sad icon, or the like.
  • the information processing device 1 can, so to speak, “compel” the user to make a smiling face, and can make the user be aware of making a smiling face.
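The unlock decision in Steps S53 through S56 can be summarized as a small rule. The function name, the timeout value, and the string return values below are hypothetical, and the fallback unlock corresponds to the fixed-time condition described above.

```python
def decide_lock_action(is_smile: bool, seconds_waiting: float,
                       timeout: float = 10.0) -> str:
    """Decide what the gateway does on each pass through the loop:
    unlock on a recognized smile (Steps S53-S55), unlock anyway once a
    fixed time has passed (Step S56), otherwise keep the door locked."""
    if is_smile:
        return "unlock"
    if seconds_waiting >= timeout:
        return "unlock"
    return "keep_locked"
```

After the user departs (Step S57), the device would issue the lock signal again (Step S58), which is outside the scope of this per-frame rule.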
  • the equipment to be controlled is set to the refrigerator 111 in this example, but it is possible to control access (permission or non-permission) to various items that the user uses in his or her daily life.
  • a series of processes described above can be executed by hardware, and also by software.
  • a program constituting the software is installed in a computer incorporated in dedicated hardware, or a general-purpose personal computer, or the like.
  • a program to be installed is provided by being recorded on an optical disc (CD-ROM (Compact Disc-Read Only Memory), a DVD (Digital Versatile Disc), or the like), or a removable medium 81 of FIG. 10 including a semiconductor memory, or the like.
  • such a program may be provided through a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting.
  • a program can be installed in advance in the ROM 72 or the storage unit 78 .
  • a program executed by a computer may be a program of which the process is performed in time series following the order described in the present specification, or may be a program of which the process is performed in parallel or at a necessary time point, such as when it is called up.
  • An embodiment of the disclosure is not limited to the embodiments described above, and can be variously modified in the scope not departing from the gist of the disclosure.
  • An information processing device including a recognition unit which recognizes a facial expression of a user taken on a captured image, and a control unit which controls equipment to permit the user to use the equipment when the facial expression recognized by the recognition unit is a specific facial expression.
  • the information processing device described in (1) above further including a capturing unit which captures an image of the user, in which the recognition unit recognizes a facial expression of the user taken on the image captured by the capturing unit.
  • the information processing device described in (1), (2), or (3) above further including an output unit which outputs information indicating the specific facial expression when the specific facial expression is recognized by the recognition unit and outputs information indicating a different facial expression from the specific expression when a facial expression that is not the specific expression is recognized by the recognition unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Processing (AREA)

Abstract

An information processing device includes a recognition unit which recognizes a facial expression of a user taken on a captured image, and a control unit which controls equipment to permit the user to use the equipment when the facial expression recognized by the recognition unit is a specific facial expression.

Description

    CROSS-REFERENCE TO PRIOR APPLICATION
  • This application is a continuation of U.S. patent application Ser. No. 13/363,799 (filed on Feb. 1, 2012), which claims priority to Japanese Patent Application No. 2011-027680 (filed on Feb. 10, 2011), which are all hereby incorporated by reference in their entirety.
  • BACKGROUND
  • The present disclosure relates to an information processing device, an information processing method, and a program, and particularly to an information processing device, an information processing method, and a program which are designed to make a user actively have a specific facial expression.
  • As William James stated (The Principles of Psychology (Vol. 2), 1950, New York, Dover Publications (the original work was published in 1890)), there is the belief that “We do not laugh because we are happy—we are happy because we laugh”. Recent studies have also yielded abundant findings that support this belief.
  • For example, the experiment of Kleinke et al. disclosed in “Effects of Self-Generated Facial Expressions on Mood” (by Chris L. Kleinke, Thomas R. Peterson, and Thomas R. Rutledge, Journal of Personality and Social Psychology, 1998, Vol. 74, No. 1, 272-279) verified whether or not there is a difference in emotions between a subject group in which photos of various facial expressions (laughing and sad) are shown to subjects and the subjects are instructed to make “the same facial expressions as in the photos” and another subject group in which the photos are just shown to subjects. From the experiment, it was ascertained that the action of seeing a photo of a smiling face and then actually making such a smiling face contributes to making the emotion of a subject bright.
  • Japanese Unexamined Patent Application Publication Nos. 2010-34686 and 2009-290819 are examples of related art.
  • SUMMARY
  • The above-described experiment disclosed in “Effects of Self-Generated Facial Expressions on Mood” indicates that there is a possibility that the emotional state of a person can be made positive by making the person actively have a smiling face.
  • It is therefore desirable to make a user actively have a specific facial expression.
  • According to an embodiment of the present disclosure, there is provided an information processing device which includes a recognition unit which recognizes a facial expression of a user taken on a captured image, and a control unit which controls equipment to permit the user to use the equipment when the facial expression recognized by the recognition unit is a specific facial expression.
  • It may be possible to set the specific facial expression to a smile. It is also possible to set the specific facial expression to a facial expression other than a smile, such as an angry face, a weeping face, or the like. Equipment which the user is permitted to use includes not only electric appliances such as a television set, an audio player, a personal computer, a vacuum cleaner, a washing machine, a refrigerator, and the like, but also items such as a door, furniture, or the like, of which the operation is controlled.
  • It may be possible to further include a capturing unit which captures an image of the user. In this case, the recognition unit can recognize a facial expression of the user taken on the image captured by the capturing unit.
  • It may be possible to further include an output unit which outputs information indicating the specific facial expression when the specific facial expression is recognized by the recognition unit and outputs information indicating a different facial expression from the specific expression when the different facial expression that is not the specific expression is recognized by the recognition unit.
  • It is possible to output, for example, an image, a sound, light, or vibration as information indicating the specific facial expression. As information indicating the different facial expression from the specific expression, it is possible to output an image different from an image output as the information indicating the specific facial expression and a sound different from a sound output as the information indicating the specific facial expression. In addition, it is possible to output light different from light output as the information indicating the specific facial expression and vibration different from vibration output as the information indicating the specific facial expression.
  • When the user finishes using the equipment, the control unit may control the equipment to set non-permission for the user to use the equipment.
  • In the disclosure, when a facial expression of a user taken on a captured image is recognized and the recognized facial expression is a specific facial expression, equipment is controlled so as to allow the user to use the equipment.
  • According to the present disclosure, it is possible to make a user actively have a specific facial expression.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram showing an example of the appearance of an information processing device according to an embodiment of the present disclosure;
  • FIG. 2 is a diagram showing an example of a smile icon;
  • FIG. 3 is a diagram showing an example of a sad icon;
  • FIG. 4 is a diagram showing a use example of the information processing device according to the embodiment of the disclosure;
  • FIG. 5 is a diagram showing a display example of an icon viewed from a user standing in front of a mirror;
  • FIG. 6 is a block diagram showing an inner configuration example of the information processing device according to the embodiment of the disclosure;
  • FIG. 7 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure;
  • FIG. 8 is a diagram showing a configuration example of a communication system;
  • FIG. 9 is a diagram showing a configuration example of the information processing device of FIG. 8 according to another embodiment of the disclosure;
  • FIG. 10 is a block diagram showing a hardware configuration example of a web server;
  • FIG. 11 is a block diagram showing a functional configuration example of the web server;
  • FIG. 12 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure;
  • FIG. 13 is a diagram showing an example of a screen of an SNS site;
  • FIG. 14 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure;
  • FIG. 15 is a diagram showing an example of a screen of a calendar;
  • FIG. 16 is a diagram showing a control target device;
  • FIG. 17 is a diagram showing the appearance of a refrigerator as a control target device; and
  • FIG. 18 is a flowchart describing a process of the information processing device according to the embodiment of the disclosure.
  • DETAILED DESCRIPTION OF EMBODIMENTS
  • Hereinafter, embodiments for implementing the disclosure will be described. Description thereof will be provided in the following order.
  • 1. First Embodiment (Smiling Face Help Function)
  • 2. Second Embodiment (Smiling Face Counting Function)
  • 3. Third Embodiment (Smiling Face Gateway Function)
  • First Embodiment
  • A smiling face help function will be described, which is a function of helping a user to perceive whether or not the expression on a user's own face is a smile, and to have a smiling face when the expression is not a smile.
  • [Appearance of Information Processing Device]
  • FIG. 1 is a diagram showing an example of the appearance of an information processing device according to an embodiment of the present disclosure.
  • The information processing device of FIG. 1 has a rectangular parallelepiped housing of a size such that a user can grasp the device in one hand. A lens 11 and an LED matrix portion 12 constituted by a predetermined number of LEDs (Light Emitting Diodes) are provided on the front surface of the housing of the information processing device 1. In the example of FIG. 1, the LED matrix portion 12 is formed by providing a total of 64 LEDs, with 8 LEDs arranged in the vertical direction and 8 LEDs arranged in the horizontal direction. Instead of the LED matrix portion 12, a display such as an LCD (Liquid Crystal Display), or the like may be provided.
  • The information processing device 1 performs capturing by conducting photoelectric conversion for light brought in from the lens 11, and, based on an image obtained from the capturing, recognizes the expression (expression on the face) of the user taken as a subject. When the information processing device 1 recognizes that the facial expression of the user is a smile, the information processing device 1 displays a smile icon that is information indicating that the facial expression of the user is a smile, by making predetermined LEDs constituting the LED matrix portion 12 emit light.
  • In other words, a smiling face recognition function is mounted in the information processing device 1. Determination of whether or not the facial expression of a user within the capturing range of the information processing device 1 is a smile is repetitively performed in real time based on captured images.
  • On the other hand, when it is recognized that the facial expression of the user is not a smile, the information processing device 1 displays a sad icon that is information indicating that the facial expression of the user is not a smile, by making predetermined LEDs constituting the LED matrix portion 12 emit light.
  • FIG. 2 is a diagram showing an example of the smile icon, and FIG. 3 is a diagram showing an example of the sad icon. Each LED constituting the LED matrix portion 12 is denoted by # followed by a number according to its location (row and column); for example, the LED located in the first row and the first column is denoted LED #11.
  • As shown in FIG. 2, the smile icon is displayed by LEDs #23, #26, #33, #36, #61, #68, #72, #77, and #83 to #86 emitting light. In addition, as shown in FIG. 3, the sad icon is displayed by LEDs #23, #26, #32, #37, #63 to #66, #72, #77, #81, and #88 emitting light.
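Interpreting LED #rc as row r, column c, the smile icon pattern above can be reproduced programmatically. The rendering helper below is just an illustration of the addressing scheme, not part of the device.

```python
# Coordinates (row, column) of the lit LEDs for the smile icon, taken
# from the list above: LED #rc denotes row r, column c on the 8x8 matrix.
SMILE = {(2, 3), (2, 6), (3, 3), (3, 6), (6, 1), (6, 8),
         (7, 2), (7, 7), (8, 3), (8, 4), (8, 5), (8, 6)}

def render(lit):
    """Render an 8x8 LED pattern as text, '#' for a lit LED, '.' for off."""
    return "\n".join(
        "".join("#" if (r, c) in lit else "." for c in range(1, 9))
        for r in range(1, 9)
    )
```

Rendering `SMILE` shows the eyes on rows 2-3, the cheeks on rows 6-7, and the mouth across columns 3-6 of row 8.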
  • The user positioned in front of the information processing device 1 can recognize that he or she is smiling by, for example, ascertaining that the smile icon is displayed. In addition, the user can recognize that he or she has a facial expression other than a smiling face, for example, an angry expression or a sad expression after finding out that the sad icon is displayed, and then can be conscious of having a smiling face.
  • Based on the belief “We do not laugh because we are happy—we are happy because we laugh”, the information processing device 1 can make the emotion of the user positive by making the user aware of his or her own facial expression and having the user consciously make a smiling face when the expression is not a smile. Since maintaining a positive emotion is deemed to be connected to the prevention and improvement of mental diseases such as depression, it is also possible to reinforce the mental wellness of the user by making the emotion of the user positive.
  • [Use Example of Information Processing Device]
  • FIG. 4 is a diagram showing a use example of the information processing device 1.
  • In the example of FIG. 4, a mirror 21 is hung on a wall surface W in a building such as a house, or the like. As indicated by the dotted line, the information processing device 1 is installed on the rear side of the mirror 21 so that the front surface of the housing faces the front side of the wall surface W. The information processing device 1 may be installed so that the front surface of the housing projects from the mirror surface 21A.
  • The mirror surface 21A surrounded by a frame is constituted by a half-mirror. Light incident on the mirror 21 from the front side of the wall surface W is reflected on the mirror surface 21A, and part of it transmits through the mirror surface 21A and is brought in by the lens 11 of the information processing device 1. In addition, light from the LED matrix portion 12, which faces the surface side (the front side of the wall surface W) of the mirror 21 from the rear side of the mirror 21, transmits through the mirror surface 21A, and the smile icon or the sad icon is displayed as if the icon emerges on the mirror surface 21A.
  • When the user stands in front of the mirror 21 in order to tidy himself or herself up, or the like, the information processing device 1 recognizes the facial expression of the user in front of the mirror 21, and displays the smile icon or the sad icon according to the recognized facial expression. The user sees the smile icon or the sad icon displayed as if emerging on the mirror surface 21A, and then is aware of his or her own facial expression.
  • FIG. 5 is a diagram showing a display example of an icon viewed from the user side in front of the mirror 21. In the example of FIG. 5, the upper body of the user, who is the subject H, is reflected on the mirror surface 21A, and the facial expression of the user is a smile. A smile icon is displayed at the position of the mirror surface 21A where the information processing device 1 is installed. Icon I in FIG. 5 indicates the smile icon.
  • As such, by providing the information processing device 1 at a position where the user's face appears in daily life, the user perceives his or her own facial expression at ordinary times, whereby it is possible to habitually have a smiling face. The installation position of the information processing device 1 is arbitrary; for example, the device may be provided on the rear side of a television set or the display of a PC (Personal Computer), or buried in a door in a building.
  • [Inner Configuration of Information Processing Device]
  • FIG. 6 is a block diagram showing an internal configuration example of the information processing device 1. At least some of the configuration shown in FIG. 6 is realized by a CPU (Central Processing Unit) provided in the information processing device 1 executing a predetermined program.
  • The information processing device 1 includes a capturing unit 31, a recognition unit 32, and a control unit 33.
  • The capturing unit 31 is constituted by an imaging element such as a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like. The capturing unit 31 performs photoelectric conversion for light brought in from the lens 11, and carries out various processes such as A/D conversion, and the like, for the signal obtained from the photoelectric conversion. The capturing unit 31 outputs image data obtained after implementing the various processes to the recognition unit 32. The capturing unit 31 repeats capturing, and supplies image data obtained from the repeated capturing to the recognition unit 32 one after another.
  • The recognition unit 32 analyzes images captured by the capturing unit 31, and recognizes the facial expression of the user taken in the images as a subject. The recognition unit 32 outputs information indicating the recognition result to the control unit 33. In addition, the recognition unit 32 appropriately outputs information indicating the degree of a smile, such as a big laugh, a faint smile, or the like.
  • The recognition unit 32 is given, in advance, information which has been obtained from analysis of images of smiling faces and which indicates feature values of each part constituting a smiling face, such as the eyes, nose, and mouth, as data for recognition. The recognition unit 32 detects the face of the user by performing contour detection, or the like, on the images captured by the capturing unit 31, and extracts feature values of each part of the face including the eyes, nose, and mouth.
  • In addition, the recognition unit 32 compares the feature values extracted from the captured images to the feature values given as the data for recognition, and recognizes the facial expression of the user as a smiling face when the index of coincidence of both values is equal to or higher than a threshold value. On the other hand, when the index of coincidence between the feature values extracted from the captured images and the feature values given as the data for recognition is less than the threshold value, the recognition unit 32 recognizes that the facial expression is not a smiling face. In other words, the smiling face recognized by the information processing device 1 is a facial expression whose characteristics of each part of the face, extracted from the captured images, coincide, to the threshold value or greater, with the characteristics of the face extracted from the face image selected as an image of a smiling face when the function of recognizing a smiling face was developed, or the like.
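The threshold comparison of feature values described above might be sketched with a cosine-similarity measure. The actual feature extraction and coincidence index used by the recognition unit 32 are not specified, so this is only an assumed stand-in for the "index of coincidence" against the pre-registered data for recognition.

```python
import math

def is_smile(features, smile_template, threshold=0.9):
    """Hypothetical smile test: compare feature values extracted from a
    captured image against the smiling-face template (the 'data for
    recognition') using cosine similarity, and accept when the
    coincidence index is at or above the threshold."""
    dot = sum(a * b for a, b in zip(features, smile_template))
    norm = (math.sqrt(sum(a * a for a in features))
            * math.sqrt(sum(b * b for b in smile_template)))
    # Degenerate (all-zero) feature vectors are never classified as a smile.
    return norm > 0 and dot / norm >= threshold
```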
  • The control unit 33 causes the LED matrix portion 12 to display the smile icon when the facial expression of the user is determined to be a smile, based on the information supplied from the recognition unit 32. It may be possible that a different type of smile icon is displayed depending on the degree of the smiling face indicated by the information supplied from the recognition unit 32. In addition, the control unit 33 causes the LED matrix portion 12 to display the sad icon when the facial expression of the user is determined not to be a smile.
  • The control unit 33 appropriately causes a speaker (not shown in the drawing) to output a predetermined melody in accordance with the display of the icon. The melody output in accordance with the display of the smile icon and the melody output in accordance with the display of the sad icon are set to be different from each other. It may be possible to use favorite music that the user has registered in advance as the melody output in accordance with the display of the smile icon. Accordingly, the user can recognize his or her own facial expression through the music, and can boost his or her mood by listening to the favorite music when the user has a smiling face.
  • As such, as feedback denoting a facial expression recognition result, it is possible to adopt various means such as the output of a melody, the generation of vibration, or the like. In addition, it is also possible for the information processing device 1 not to include the capturing unit 31. In this case, for example, images captured by an external device are acquired by the recognition unit 32 of the information processing device 1 as images for recognizing the facial expression of the user. The recognition unit 32 recognizes the facial expression of the user based on the acquired images, and the control unit 33 performs a process to give feedback according to the facial expression to the user.
  • [Operation of Information Processing Device]
  • Herein, with reference to the flowchart of FIG. 7, a process of the information processing device 1 will be described, which makes the user be aware of his or her own facial expression using the display of an icon and helps the user to have a smiling face.
  • In Step S1, the capturing unit 31 performs capturing.
  • In Step S2, the recognition unit 32 analyzes images captured by the capturing unit 31, and recognizes the facial expression of the user taken in the images.
  • In Step S3, the control unit 33 determines whether or not the facial expression of the user is a smile based on information supplied from the recognition unit 32.
  • When the facial expression of the user is determined to be a smile in Step S3, in Step S4, the control unit 33 causes the LED matrix portion 12 to display the smile icon.
  • On the other hand, when the facial expression of the user is determined not to be a smile in Step S3, in Step S5, the control unit 33 causes the LED matrix portion 12 to display the sad icon.
  • After the smile icon is displayed in Step S4, or after the sad icon is displayed in Step S5, the process returns to Step S1, and the above process is repeated. When the user, who has recognized from the display of the sad icon that he or she did not have a smiling face, intentionally makes a smiling face, the display of the LED matrix portion 12 is switched from the sad icon to the smile icon.
  • With the above process, the information processing device 1 can make the user aware of his or her own facial expression, and can put the user in a good mood by prompting him or her to smile.
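The feedback loop of Steps S1 to S5 can be summarized in a short sketch. This is an illustrative reconstruction only, not the device's actual implementation; the icon strings and function names are assumptions introduced here.

```python
# Hypothetical sketch of the feedback loop of FIG. 7 (Steps S1 to S5).
# The stand-ins below model the recognition unit 32 (expression labels)
# and the LED matrix unit 12 (icon display); names are assumptions.

SMILE_ICON = ":)"   # stands in for the smile icon of Step S4
SAD_ICON = ":("     # stands in for the sad icon of Step S5

def feedback_icon(is_smile: bool) -> str:
    """Return the icon the control unit 33 would display (Step S4 or S5)."""
    return SMILE_ICON if is_smile else SAD_ICON

def run_feedback_loop(recognized_expressions):
    """One icon per captured frame; the process repeats from Step S1."""
    return [feedback_icon(expr == "smile") for expr in recognized_expressions]

# A user who sees the sad icon and intentionally smiles switches the display:
print(run_feedback_loop(["neutral", "neutral", "smile"]))
```

As in the text, the display switches from the sad icon to the smile icon as soon as the recognized expression becomes a smile.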
  • Second Embodiment
  • A smiling face counting function, which counts and presents the number of smiling faces, will now be described.
  • [Configuration of Communication System]
  • FIG. 8 is a diagram showing a configuration example of a communication system using the information processing device 1.
  • The information processing device 1 of FIG. 8 has a communication function. The information processing device 1 is connected to a network 51 including the Internet, or the like. The network 51 is also connected to a PC 52, a web server 53, and a photo frame 54.
  • The PC 52 is, for example, a PC that the user of the information processing device 1 uses. The web server 53 is a server that manages SNS (Social Network Service) sites such as blogs, message boards, or the like. The photo frame 54 is a digital photo frame installed in, for example, the home of a person other than the user of the information processing device 1. The photo frame 54 is provided with a display, and is capable of receiving and displaying images transmitted through the network 51.
  • The communication system of FIG. 8 realizes functions of contributing messages indicating the number of smiling faces to SNS sites, displaying a list of the number of smiling faces for each predetermined period, and transmitting images of smiling faces to the photo frame 54. Details of these functions will be described later.
  • FIG. 9 is a block diagram showing a configuration example of the information processing device 1 of FIG. 8. In the configuration shown in FIG. 9, the same reference numerals are given to the same configurations as those shown in FIG. 6. Overlapping description will be appropriately omitted.
  • The information processing device 1 includes the capturing unit 31, the recognition unit 32, the control unit 33, a counter 61, a display data generation unit 62, a transmission unit 63, and an image management unit 64.
  • The capturing unit 31 outputs image data obtained from capturing. The image data output from the capturing unit 31 is supplied to the recognition unit 32 and the image management unit 64.
  • The recognition unit 32 analyses images captured by the capturing unit 31, and recognizes facial expressions of the user. The recognition unit 32 outputs information indicating the recognition result and information indicating the degree of a smiling face to the control unit 33, the counter 61, and the image management unit 64.
  • When it is determined that the facial expression of the user is a smile based on the information supplied from the recognition unit 32, the counter 61 increases the number of smiling faces by one, and manages the information indicating the number of smiling faces by storing it in a memory. When a message indicating the number of smiling faces is to be contributed to an SNS site, the counter 61 outputs the information indicating the number of smiling faces to the display data generation unit 62 at the time point when the number of smiling faces counted for a predetermined period of time, such as one day, reaches a threshold value or higher.
  • In addition, when the list of the number of smiling faces is displayed and a predetermined period such as one day, one week, one month, or the like has passed, the counter 61 outputs the information indicating the number of smiling faces to the transmission unit 63. Hereinafter, a case where the period for which the number of smiling faces is counted is one day will be described.
  • The display data generation unit 62 generates a message indicating the number of smiling faces counted by the counter 61 and outputs the data of the generated message to the transmission unit 63.
  • When the message indicating the number of smiling faces is contributed to an SNS site, the transmission unit 63 accesses the web server 53 through the network 51, and transmits the message generated by the display data generation unit 62.
  • In addition, when the list of the number of smiling faces is displayed, the transmission unit 63 transmits the information indicating the number of smiling faces supplied from the counter 61 and an image selected by the image management unit 64 to the web server 53.
  • When the list of the number of smiling faces is displayed, the image management unit 64 specifies images in which a smiling face of the user is taken from among the images captured by the capturing unit 31 based on the recognition result of the recognition unit 32, and manages the specified smiling face images by storing them in the memory. As described above, the recognition unit 32 also supplies information indicating the degree of a smiling face, such as a big laugh, a faint smile, or the like. The image management unit 64 manages the degree of the smiling face taken in each image in association with that image.
  • The image management unit 64 selects an image with the highest degree of a smiling face from images stored in the memory and outputs the image to the transmission unit 63 so as to transmit the image to the web server 53.
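The combined behavior of the counter 61 and the image management unit 64 described above can be sketched as follows. The class and method names are hypothetical — the document does not specify an API — and the degree of a smiling face is represented here as a plain number.

```python
# Illustrative sketch (assumed API) of counter 61 and image management
# unit 64: count smiles, store smile images with their degrees, and
# select the image with the highest degree (Step S38 of FIG. 14).

class SmileImageManager:
    def __init__(self):
        self.count = 0     # counter 61: number of smiling faces
        self.images = []   # unit 64: (image_id, smile_degree) pairs

    def on_recognition(self, image_id, is_smile, smile_degree=0.0):
        """Called once per recognition result from the recognition unit 32."""
        if is_smile:
            self.count += 1
            self.images.append((image_id, smile_degree))

    def best_image(self):
        """Return the image with the highest degree of a smiling face."""
        if not self.images:
            return None
        return max(self.images, key=lambda item: item[1])[0]
```

For example, after a faint smile (degree 0.4), a non-smile, and a big laugh (degree 0.9), the count is 2 and the big-laugh image is selected for transmission to the web server 53.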
  • FIG. 10 is a block diagram showing a hardware configuration example of the web server 53.
  • A CPU (Central Processing Unit) 71, a ROM (Read Only Memory) 72, and a RAM (Random Access Memory) 73 are connected to one another via a bus 74.
  • Furthermore, an input and output interface 75 is connected to the bus 74. The input and output interface 75 is connected to an input unit 76 including a keyboard, a mouse, and the like, and an output unit 77 including a display, a speaker, and the like. In addition, the input and output interface 75 is connected to a storage unit 78 including a hard disk, a non-volatile memory, and the like, a communication unit 79 including a network interface, and the like, and a drive 80 for driving a removable medium 81.
  • FIG. 11 is a block diagram showing a functional configuration example of the web server 53. At least some of the function units shown in FIG. 11 are realized by executing predetermined programs by the CPU 71 of FIG. 10.
  • In the web server 53, an SNS site management unit 91, a number management unit 92, and a display control unit 93 are realized.
  • The SNS site management unit 91 manages SNS sites, and allows devices that access it through the network 51 to browse the sites. By controlling the communication unit 79, the SNS site management unit 91 acquires messages transmitted from the devices connected to the network 51, and adds the messages to the SNS sites.
  • When the information indicating the number of smiling faces counted for one day in the information processing device 1 and the image selected in the information processing device 1 as the image with the highest degree of a smiling face are transmitted from the information processing device 1, the number management unit 92 acquires the information by controlling the communication unit 79.
  • The number management unit 92 causes the storage unit 78 to store the acquired information in association with, for example, identification information of the information processing device 1.
  • When a device connected to the network 51 requests display of the list of the number of smiling faces, the display control unit 93 acquires the information managed by the number management unit 92, and causes the list of the number of smiling faces counted for each day to be displayed using, for example, an image of a calendar. In addition, when one day is selected on the calendar image, the display control unit 93 causes the image captured on that day and transmitted from the information processing device 1 as the image with the highest degree of a smiling face to be transmitted to the device that requested display of the list.
  • [Contribution to SNS Site]
  • Herein, with reference to the flowchart of FIG. 12, a process of the information processing device 1 for contributing a message indicating the number of smiling faces to an SNS site will be described.
  • In Step S11, the capturing unit 31 performs capturing.
  • In Step S12, the recognition unit 32 analyzes the image captured by the capturing unit 31, and recognizes the facial expression of the user.
  • In Step S13, the control unit 33 determines whether or not the facial expression of the user is a smile based on the information supplied from the recognition unit 32.
  • When the facial expression of the user is determined to be a smile in Step S13, in Step S14, the control unit 33 gives feedback by causing the LED matrix unit 12 to display a smile icon, or the like.
  • On the other hand, when the facial expression of the user is determined not to be a smile in Step S13, the process returns to Step S11, and the above process is repeated. When the facial expression of the user is determined not to be a smile, the control unit 33 may give feedback to the user by causing the LED matrix unit 12 to display a sad icon, or the like.
  • After the smile icon is displayed in Step S14, in Step S15, the counter 61 counts the number of smiling faces.
  • In Step S16, the counter 61 determines whether or not the number of smiling faces for a predetermined period is equal to or higher than a predetermined number that is a threshold value.
  • When the number of smiling faces is determined to be equal to or higher than the threshold value in Step S16, in Step S17, the display data generation unit 62 generates a message indicating the number of smiling faces counted by the counter 61.
  • In Step S18, the transmission unit 63 transmits the message generated by the display data generation unit 62 to the web server 53 to contribute the message to the SNS site. After the message is contributed thereto, or when the number of smiling faces is determined not to be equal to or higher than the threshold value in Step S16, the process returns to Step S11, and the above process is repeated.
  • The information transmitted from the transmission unit 63 is acquired by the SNS site management unit 91 of the web server 53, and posted on the SNS site.
  • FIG. 13 is a diagram showing an example of the screen of the SNS site. The screen of FIG. 13 is displayed when the PC 52 or another device, such as a mobile telephone, accesses the web server 53 using a browsing function.
  • On the screen of the SNS site of FIG. 13, messages contributed from each device are displayed in time-series order: Contribution #1, which is the latest contribution; Contribution #2, which was contributed before Contribution #1; and Contribution #3, which was contributed before Contribution #2.
  • Contribution #3 is the message contributed from the information processing device 1, and shows that the number of smiling faces counted for one day is six. In the example of FIG. 13, the number of smiling faces is expressed by the number of images indicating smiling faces. "Could make smiles OOOOOO times today" indicated as Contribution #3 (O is an image showing a smiling face) is the message generated by the display data generation unit 62.
  • Contributions #1 and #2 of FIG. 13 are messages contributed by friends and the like of the user of the information processing device 1 through their own devices, praising or supporting the number of smiling faces. By checking such contributions from his or her friends, the user of the information processing device 1 can motivate himself or herself to smile more actively. In addition, the user of the information processing device 1 can share the number of smiling faces with his or her friends and the like.
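The threshold check of Steps S16 to S18 and the message of the display data generation unit 62 can be sketched as below. The threshold value of six and the use of the letter "O" to stand for the smiling-face image follow the example of FIG. 13 and are assumptions, not values fixed by the description.

```python
# Illustrative sketch of the SNS contribution of FIG. 12 (Steps S16-S18).
# The threshold (6) and the "O" placeholder for the smiling-face image
# are assumed from the FIG. 13 example; function names are hypothetical.

SMILE_MARK = "O"  # stands in for the image showing a smiling face

def make_contribution(count: int) -> str:
    """Message generated by the display data generation unit 62."""
    return f"Could make smiles {SMILE_MARK * count} times today"

def maybe_contribute(count: int, threshold: int = 6):
    """Contribute only when the daily count reaches the threshold (Step S16);
    otherwise return None and keep counting."""
    return make_contribution(count) if count >= threshold else None
```

With a daily count of six, this yields the message shown as Contribution #3; with a count below the threshold, nothing is contributed and the capturing loop continues.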
  • [Display of List of the Number of Smiling Faces]
  • Next, with reference to the flowchart of FIG. 14, a process of the information processing device 1 for displaying the list of the number of smiling faces for each predetermined period will be described.
  • Processes of Steps S31 to S35 of FIG. 14 are basically the same processes as those of Steps S11 to S15 of FIG. 12. In Step S31, the capturing unit 31 performs capturing. Images captured are supplied to the recognition unit 32 and to the image management unit 64.
  • In Step S32, the recognition unit 32 analyses the images captured by the capturing unit 31 and recognizes the facial expression of the user.
  • In Step S33, the control unit 33 determines whether or not the facial expression of the user is a smile based on the information supplied from the recognition unit 32.
  • When the facial expression of the user is determined to be a smile in Step S33, in Step S34, the control unit 33 gives feedback causing the LED matrix unit 12 to display a smile icon, or the like. On the other hand, when the facial expression of the user is determined not to be a smile in Step S33, processes after Step S31 are repeated.
  • In Step S35, the counter 61 counts the number of smiling faces.
  • In Step S36, the image management unit 64 causes the memory to store, for management, an image in which a smiling face is recognized to be taken, together with information indicating the degree of the smiling face recognized by the recognition unit 32.
  • In Step S37, the counter 61 determines whether or not one day, which is the period for which the number of smiling faces is to be counted, has passed.
  • When the period for which the number of smiling faces is to be counted is determined to have passed in Step S37, in Step S38, the image management unit 64 selects the image with the highest degree of a smiling face.
  • In Step S39, the transmission unit 63 transmits information indicating the number of smiling faces counted by the counter 61 and the image of a smiling face selected by the image management unit 64 to the web server 53.
  • After the information indicating the number of smiling faces and the image of the smiling face are transmitted, or when the period for which the number of smiling faces has to be counted is determined to have not passed in Step S37, the process returns to Step S31, and the above processes are repeated.
  • The information transmitted from the transmission unit 63 is acquired and managed by the number management unit 92 of the web server 53. When the user of the information processing device 1 accesses the web server 53 by operating, for example, the PC 52 and requests display of the list of the number of smiling faces, an image of a calendar for displaying the list is displayed on the display of the PC 52 based on information transmitted from the display control unit 93.
  • FIG. 15 is a diagram showing an example of a screen of the calendar displayed on the display of the PC 52 based on the information transmitted from the web server 53.
  • Calendar image P of FIG. 15 is an image of a calendar for a certain month, and icons indicating the number of smiling faces counted for each day are displayed in the cells for the first to fifteenth days out of cells corresponding to first to thirty-first days. Calendar image P is an image displayed when the fifteenth day has passed.
  • In the example of FIG. 15, icons with different degrees of smiling faces are displayed in accordance with the range of the number of smiling faces. For example, the icon with a dark facial expression displayed in the cell for the seventh day indicates that the number of smiling faces counted on that day is, for example, zero to five, and the icon with a smiling face displayed in the cell for the fifth day indicates that the number counted on that day is, for example, six to ten. The icon with the big laugh displayed in the cell for the first day indicates that the number counted on that day is, for example, eleven or more.
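The mapping from a daily smile count to a calendar icon can be sketched using the example ranges above. The range boundaries (zero to five, six to ten, eleven and above) are taken from the illustrative example in the text, not fixed values, and the icon labels are placeholders.

```python
# Illustrative sketch of the calendar-icon mapping of FIG. 15.
# Thresholds follow the example ranges in the text (assumed values);
# the returned labels stand in for the actual icon images.

def day_icon(smile_count: int) -> str:
    """Map a daily smiling-face count to a calendar icon label."""
    if smile_count <= 5:
        return "dark"        # dark facial expression icon (0-5 smiles)
    if smile_count <= 10:
        return "smile"       # smiling face icon (6-10 smiles)
    return "big_laugh"       # big laugh icon (11 or more smiles)

# Building a month of cells, as on calendar image P:
daily_counts = {1: 12, 5: 7, 7: 2}
icons = {day: day_icon(n) for day, n in daily_counts.items()}
print(icons)
```

This reproduces the FIG. 15 example: a big-laugh icon on the first day, a smiling-face icon on the fifth, and a dark icon on the seventh.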
  • When an icon indicating the number of smiling faces displayed on calendar image P is selected through a mouse operation, or the like, the display control unit 93 of the web server 53 transmits, to the PC 52, the image selected by the information processing device 1 as the image with the highest degree of a smiling face among the images (photos) captured on the day for which the selected icon is displayed, and causes the image to be displayed on the display.
  • The user of the information processing device 1 can recognize the number of smiling faces for each day from the types of the icons displayed on calendar image P, and can motivate himself or herself to smile more actively when the number is low. In addition, friends and the like of the user of the information processing device 1 can access the web server 53 by operating their own devices to see calendar image P, and can check whether the user of the information processing device 1 has been smiling, in other words, whether the user is in a bright mood.
  • Furthermore, the functions of managing the number of smiling faces and displaying the list thereof described above may be installed in the PC 52. In this case, the PC 52 is provided with the number management unit 92 and the display control unit 93 of FIG. 11, and performs the process of displaying the list of the number of smiling faces based on information transmitted from the information processing device 1. Alternatively, the information processing device 1 itself may be provided with the number management unit 92 and the display control unit 93, so that the user can check the number of smiling faces through the display of the LED matrix unit 12 or an LCD provided in the information processing device 1.
  • [Transmission to Photo Frame]
  • A function of transmitting images to the photo frame 54 will be described. In this case, the information processing device 1 of FIG. 8 recognizes a facial expression of the user taken in a captured image, and gives feedback to the user by displaying icons, or the like. In addition, when a smiling face is recognized, the information processing device 1 transmits an image in which the recognized smiling face is taken to the photo frame 54.
  • The photo frame 54, which receives the image transmitted from the information processing device 1, causes its display to display the received image. Accordingly, for example, when the photo frame 54 is installed in the home of somebody other than the user of the information processing device 1, the user of the photo frame 54 can see the image of a smiling face of the user of the information processing device 1. This function is considered to be particularly useful when the user of the information processing device 1 is an elderly person living alone and the user of the photo frame 54 is his or her offspring. The offspring, who is the user of the photo frame 54, can recognize a smiling face of his or her parent, who is the user of the information processing device 1.
  • Third Embodiment
  • A smiling face gateway function that is a function of requesting a smiling face when the user takes an action will be described.
  • As shown in FIG. 16, the information processing device 1 is electrically connected to control target equipment 101 that is equipment to be controlled in a wired or wireless manner. A case where the control target equipment 101 is a refrigerator will be described.
  • FIG. 17 is a diagram showing the appearance of a refrigerator to which the information processing device 1 is fixed.
  • The refrigerator 111 of FIG. 17 is constituted by a door 111A and a main body 111B. The door 111A is fixed to the main body 111B so as to be opened and closed via a hinge (not shown in the drawing) provided on the right side face of the door 111A. A grip 121 is provided on the left side of the surface of the door 111A. The door 111A is opened by grasping the grip 121 with, for example, the right hand and pulling the grip forward as shown by the white arrow.
  • The information processing device 1 is installed at a position on the surface of the door 111A above the grip 121 so that the front face of its housing faces the front direction of the refrigerator 111. On the left side face of the door 111A, a member 122A constituting an electric lock 122 is provided, and on the left side face of the main body 111B, a member 122B constituting the electric lock 122 as a pair with the member 122A is provided. The members 122A and 122B are located at positions where they adjoin each other when the door 111A is closed.
  • The electric lock 122 is connected to the information processing device 1 as shown by the dotted line, and switches the door 111A to a locked state or an unlocked state according to a control signal supplied from the information processing device 1. When the electric lock 122 is in the locked state, the user is not able to open the door 111A, but is able to open the door 111A when the lock is in the unlocked state.
  • The information processing device 1 installed on the refrigerator 111 in this manner gives feedback to the user in accordance with his or her facial expression by capturing images, recognizing the facial expression of the user, and displaying icons, in the same manner as the information processing device 1 of FIG. 1.
  • In addition, when a facial expression of the user is recognized to be a smile, the information processing device 1 outputs a control signal instructing the electric lock 122 to enter the unlocked state so that the door 111A can be opened, whereby the user is allowed to use the refrigerator 111. In addition, when the user closes the door 111A after finishing using the refrigerator 111 in the unlocked state of the electric lock 122, the information processing device 1 outputs a control signal instructing the electric lock 122 to enter the locked state so as to prevent the door 111A from being opened.
  • In other words, it is necessary for the user to make a smiling face in order to open the door 111A of the refrigerator 111, that is, to use the refrigerator 111. As such, the information processing device 1 can make the user conscious of smiling at ordinary times by requesting smiling faces during ordinary actions such as using the refrigerator 111, in other words, by compelling smiling faces.
  • Herein, with reference to the flowchart of FIG. 18, a process of the information processing device 1 for controlling permission and non-permission to use the refrigerator 111 will be described. Furthermore, the information processing device 1 of FIG. 17 has the same configuration as that in FIG. 6 or FIG. 10.
  • In Step S51, the capturing unit 31 performs capturing.
  • In Step S52, the recognition unit 32 analyzes images captured by the capturing unit 31, and recognizes a facial expression of the user.
  • In Step S53, the control unit 33 determines whether or not the facial expression of the user is a smile based on information supplied from the recognition unit 32.
  • When the facial expression of the user is determined to be a smile in Step S53, in Step S54, the control unit 33 gives feedback by causing the LED matrix unit 12 to display a smile icon.
  • In Step S55, the control unit 33 outputs a control signal to the electric lock 122 to put the door 111A in an unlocked state. Accordingly, the user can open the door 111A and take out food from the refrigerator 111, or the like.
  • On the other hand, when the facial expression of the user is determined not to be a smile in Step S53, in Step S56, the control unit 33 determines whether or not a fixed time has passed after, for example, it is detected that the user is in front of the refrigerator 111. Whether or not the user is in front of the refrigerator 111 is recognized by the recognition unit 32 based on, for example, a captured image.
  • When it is determined that a fixed time has passed in Step S56, the control unit 33 outputs a control signal to the electric lock 122 to put the door 111A in the unlocked state in Step S55 in the same manner. In this example, the door 111A is designed to be opened without making a smiling face when a fixed time has passed after the user is detected to be in front of the refrigerator 111, but the user may not be allowed to open the door unless the user makes a smiling face.
  • After the door 111A is put in the unlocked state in Step S55, in Step S57, the control unit 33 determines whether or not the user has closed the door 111A and departed from the refrigerator 111, and stands by until the user is determined to have departed from the refrigerator 111. For example, when the door 111A is detected to be closed, it is determined that the user has departed from the refrigerator 111.
  • When the user is determined to have departed from the refrigerator 111 in Step S57, in Step S58, the control unit 33 outputs a control signal to the electric lock 122 to put the door 111A in a locked state. After the door 111A is in the locked state, or when it is determined that a fixed time has passed in Step S56, the process returns to Step S51, and the above processes are repeated. When the facial expression of the user is determined not to be a smile in Step S53, feedback may be given to the user by causing the LED matrix unit 12 to display a sad icon, or the like.
  • With the above process, the information processing device 1 can, so to speak, "compel" the user to make a smiling face, and can thereby make the user conscious of smiling.
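The lock-control decision of Steps S53 to S56 can be sketched as follows. The function name and the fixed-time value of ten seconds are assumptions; the description only states that the door unlocks on a smile, or after a fixed time has passed with the user standing in front of the refrigerator.

```python
# Illustrative sketch of the lock decision of FIG. 18 (Steps S53-S56).
# Unlock on a recognized smile, or after a fixed waiting time (here an
# assumed 10 s timeout); otherwise the electric lock 122 stays locked.

LOCKED = "locked"
UNLOCKED = "unlocked"

def lock_decision(is_smile: bool, seconds_in_front: float,
                  timeout: float = 10.0) -> str:
    """State the control unit 33 would command the electric lock 122 into."""
    if is_smile or seconds_in_front >= timeout:
        return UNLOCKED
    return LOCKED
```

For example, a smiling user is unlocked immediately; a non-smiling user is kept locked until the fixed time has passed, after which the door opens anyway, as described in the flow above.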
  • In the above description, the equipment to be controlled is the refrigerator 111, but it is possible to control access (permission or non-permission) to various items that the user uses in his or her daily life.
  • For example, by setting the equipment to be controlled to an electric lock installed in a door of a building and controlling permission or non-permission to open the door, it is possible to make the user conscious of smiling in a lighthearted way, since the user is not able to enter the room without a smiling face. In addition, by setting the equipment to be controlled to an IH cooker and controlling permission or non-permission to use it, the user is similarly made conscious of smiling, since he or she is not allowed to boil water without smiling.
  • Furthermore, by setting the equipment to be controlled to a television set and controlling permission or non-permission to use it, the user is not allowed to watch television programs without smiling. Various functions of equipment can be realized by setting a smiling face as a condition for access, such as setting the equipment to be controlled to the engine of an automobile that does not start without a smiling face of the user, or to a PC that does not start without a smiling face of the user.
  • Modified Example and Others
  • A series of processes described above can be executed by hardware, and also by software. When the series of processes is executed by software, a program constituting the software is installed in a computer incorporated in dedicated hardware, or a general-purpose personal computer, or the like.
  • A program to be installed is provided by being recorded on an optical disc (a CD-ROM (Compact Disc-Read Only Memory), a DVD (Digital Versatile Disc), or the like) or on the removable medium 81 of FIG. 10 including a semiconductor memory, or the like. In addition, such a program may be provided through a wired or wireless transmission medium including a local area network, the Internet, or digital broadcasting. A program can also be installed in advance in the ROM 72 or the storage unit 78.
  • Furthermore, a program executed by a computer may be a program of which the process is performed in time series following the order described in the present specification, or may be a program of which the process is performed in parallel or at a necessary time point when it is called up, or the like.
  • An embodiment of the disclosure is not limited to the embodiments described above, and can be variously modified in the scope not departing from the gist of the disclosure.
  • Furthermore, the present disclosure can also be configured as below.
  • (1) An information processing device including a recognition unit which recognizes a facial expression of a user taken on a captured image, and a control unit which controls equipment to permit the user to use the equipment when the facial expression recognized by the recognition unit is a specific facial expression.
  • (2) The information processing device described in (1) above further including a capturing unit which captures an image of the user, in which the recognition unit recognizes a facial expression of the user taken on the image captured by the capturing unit.
  • (3) The information processing device described in (1) or (2) above in which the specific facial expression is a smile.
  • (4) The information processing device described in (1), (2), or (3) above further including an output unit which outputs information indicating the specific facial expression when the specific facial expression is recognized by the recognition unit and outputs information indicating a different facial expression from the specific expression when the facial expression that is not the specific expression is recognized by the recognition unit.
  • (5) The information processing device described in any one of (1) to (4) above, in which, when the user finishes using the equipment, the control unit controls the equipment so as not to permit the user to use the equipment.
  • It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (15)

What is claimed is:
1. An information processing apparatus comprising:
circuitry configured to obtain a statistic based on a recognition result of a facial emotion expression of a user from captured images, and
initiate output of a statistic information based on the obtained statistic.
2. The information processing apparatus according to claim 1, wherein the obtained statistic corresponds to a number of a predetermined emotion expression that is counted during a predetermined period of time from the captured images.
3. The information processing apparatus according to claim 2, wherein the predetermined emotion expression is a smile.
4. The information processing apparatus according to claim 2, wherein the statistic information comprises the counted number of the predetermined emotion expression.
5. The information processing apparatus according to claim 4, wherein the statistic information is output by displaying the counted number of the predetermined emotion expression.
6. The information processing apparatus according to claim 2, wherein the counted number corresponds to a number of captured images within which the predetermined emotion expression of the user is recognized.
7. The information processing apparatus according to claim 1, wherein the circuitry obtains the statistic by counting a number of times a predetermined emotion expression is recognized.
8. The information processing apparatus according to claim 7, wherein the predetermined emotion expression is a smile.
9. The information processing apparatus according to claim 1, wherein the statistic information comprises the counted number of the predetermined emotion expression.
10. The information processing apparatus according to claim 1, wherein the statistic information is output to be shared with another user.
11. The information processing apparatus according to claim 1, wherein the statistic information is output in relation to a calendar.
12. The information processing apparatus according to claim 11, wherein the statistic information comprises the obtained statistic correlated with days indicated by the calendar.
13. The information processing apparatus according to claim 1, wherein the statistic information is output as displayed icons correlated with dates the captured images were captured.
14. An information processing method implemented via an information processing apparatus, the method comprising:
obtaining a statistic based on a recognition result of a facial emotion expression of a user from captured images; and
outputting statistic information based on the obtained statistic.
15. A non-transitory computer-readable medium having embodied thereon a program, which when executed by a computer causes the computer to execute a method, the method comprising:
obtaining a statistic based on a recognition result of a facial emotion expression of a user from captured images; and
outputting statistic information based on the obtained statistic.
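The claimed method (obtaining a statistic by counting images in which a predetermined expression such as a smile is recognized, then outputting that statistic correlated with calendar days) can be illustrated with a minimal sketch. This is not the patented implementation; the data below is hypothetical, and in a real system each `(date, expression)` pair would come from running a facial-expression recognizer over a captured image.

```python
from collections import Counter
from datetime import date

# Hypothetical recognition results: one entry per captured image,
# pairing the capture date with the expression recognized in it.
captured = [
    (date(2011, 2, 10), "smile"),
    (date(2011, 2, 10), "neutral"),
    (date(2011, 2, 10), "smile"),
    (date(2011, 2, 11), "smile"),
]

def smile_statistic(results, target="smile"):
    """Count, per day, the captured images in which the target
    expression was recognized (cf. claims 2 and 6: the statistic is
    the number of such images within a period of time)."""
    return dict(Counter(day for day, expr in results if expr == target))

stats = smile_statistic(captured)

# Output the statistic information in relation to a calendar
# (cf. claims 11-13: counts correlated with capture dates).
for day, n in sorted(stats.items()):
    print(f"{day.isoformat()}: {n}")
```

Running this prints one line per day with its smile count, corresponding to the claim-13 notion of displayed indicators correlated with the dates the images were captured.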
US15/048,188 2011-02-10 2016-02-19 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression Abandoned US20160171292A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/048,188 US20160171292A1 (en) 2011-02-10 2016-02-19 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2011-027680 2011-02-10
JP2011027680A JP2012169777A (en) 2011-02-10 2011-02-10 Information processor, information processing method, and program
US13/363,799 US9298977B2 (en) 2011-02-10 2012-02-01 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression
US15/048,188 US20160171292A1 (en) 2011-02-10 2016-02-19 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/363,799 Continuation US9298977B2 (en) 2011-02-10 2012-02-01 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression

Publications (1)

Publication Number Publication Date
US20160171292A1 true US20160171292A1 (en) 2016-06-16

Family

ID=46636626

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/363,799 Active 2033-10-03 US9298977B2 (en) 2011-02-10 2012-02-01 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression
US15/048,188 Abandoned US20160171292A1 (en) 2011-02-10 2016-02-19 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/363,799 Active 2033-10-03 US9298977B2 (en) 2011-02-10 2012-02-01 Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression

Country Status (3)

Country Link
US (2) US9298977B2 (en)
JP (1) JP2012169777A (en)
CN (1) CN102693002A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107220591A (en) * 2017-04-28 2017-09-29 哈尔滨工业大学深圳研究生院 Multi-modal intelligent mood sensing system
CN107239195A (en) * 2017-06-12 2017-10-10 河南职业技术学院 Computer based desktop icon management method and desktop icons managing device
CN108664932A (en) * 2017-05-12 2018-10-16 华中师范大学 A kind of Latent abilities state identification method based on Multi-source Information Fusion
CN111262637A (en) * 2020-01-15 2020-06-09 湖南工商大学 Human body behavior identification method based on Wi-Fi channel state information CSI

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9013264B2 (en) 2011-03-12 2015-04-21 Perceptive Devices, Llc Multipurpose controller for electronic devices, facial expressions management and drowsiness detection
US9355366B1 (en) 2011-12-19 2016-05-31 Hello-Hello, Inc. Automated systems for improving communication at the human-machine interface
CN103677226B (en) * 2012-09-04 2016-08-03 北方工业大学 expression recognition input method
MX2015006623A (en) * 2012-11-29 2015-08-05 Vorwerk Co Interholding Food processor.
AU2013351227B2 (en) 2012-11-29 2019-01-03 Vorwerk & Co. Interholding Gmbh Food processor
CN102999164B (en) * 2012-11-30 2016-08-03 广东欧珀移动通信有限公司 A kind of e-book flipping-over control method and intelligent terminal
CN103873917A (en) * 2012-12-13 2014-06-18 联想(北京)有限公司 Multimedia playing processing method and system
US10070192B2 (en) * 2013-03-15 2018-09-04 Disney Enterprises, Inc. Application for determining and responding to user sentiments during viewed media content
FR3004831B1 (en) * 2013-04-19 2022-05-06 La Gorce Baptiste De DIGITAL CONTROL OF THE SOUND EFFECTS OF A MUSICAL INSTRUMENT.
US9451122B2 (en) * 2013-04-22 2016-09-20 Socialmatic LLC System and method for sharing photographic content
US10268983B2 (en) 2013-06-26 2019-04-23 Amazon Technologies, Inc. Detecting item interaction and movement
US10176513B1 (en) * 2013-06-26 2019-01-08 Amazon Technologies, Inc. Using gestures and expressions to assist users
CN103473040A (en) * 2013-07-08 2013-12-25 北京百纳威尔科技有限公司 Multimedia playing method and device
JP6435595B2 (en) * 2013-08-05 2018-12-12 カシオ計算機株式会社 Training support system, server, terminal, camera, method and program
US9570019B2 (en) * 2014-03-20 2017-02-14 Dell Products, Lp System and method for coordinating image capture in a camera hidden behind a display device
CN104143139A (en) * 2014-08-07 2014-11-12 辽宁蓝卡医疗投资管理有限公司 Payment method and system based on facial expressions
US10074009B2 (en) * 2014-12-22 2018-09-11 International Business Machines Corporation Object popularity detection
JP2017014806A (en) * 2015-07-01 2017-01-19 アイフォーコム株式会社 Smiling-face interlocking control system
WO2018023634A1 (en) * 2016-08-04 2018-02-08 汤隆初 Unlocking method, and door lock
WO2018023606A1 (en) * 2016-08-04 2018-02-08 汤隆初 Technical information notification method for expression-based unlocking, and door lock
US10282530B2 (en) 2016-10-03 2019-05-07 Microsoft Technology Licensing, Llc Verifying identity based on facial dynamics
WO2018066190A1 (en) 2016-10-07 2018-04-12 ソニー株式会社 Information processing device, information processing method, and program
US10043406B1 (en) 2017-03-10 2018-08-07 Intel Corporation Augmented emotion display for austistic persons
CN107181852A (en) * 2017-07-19 2017-09-19 维沃移动通信有限公司 A kind of method for sending information, method for information display and mobile terminal
US10448762B2 (en) * 2017-09-15 2019-10-22 Kohler Co. Mirror
US10636419B2 (en) 2017-12-06 2020-04-28 Sony Interactive Entertainment Inc. Automatic dialogue design
WO2019146199A1 (en) * 2018-01-23 2019-08-01 ソニー株式会社 Information processing device and information processing method
CN114245021B (en) * 2022-02-14 2023-08-08 北京火山引擎科技有限公司 Interactive shooting method, electronic equipment, storage medium and computer program product

Citations (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020133552A1 (en) * 2001-03-15 2002-09-19 Sony Corporation Information processing apparatus, information processing method, information exchanging method, recording medium, and program
US20050261032A1 (en) * 2004-04-23 2005-11-24 Jeong-Wook Seo Device and method for displaying a status of a portable terminal by using a character image
US20060023575A1 (en) * 2004-07-27 2006-02-02 Hiroyuki Hayashi Disk tag reading device
US20060281064A1 (en) * 2005-05-25 2006-12-14 Oki Electric Industry Co., Ltd. Image communication system for compositing an image according to emotion input
US7155037B2 (en) * 2000-03-02 2006-12-26 Honda Giken Kogyo Kabushiki Kaisha Face recognition apparatus
US20070009139A1 (en) * 2005-07-11 2007-01-11 Agere Systems Inc. Facial recognition device for a handheld electronic device and a method of using the same
US20070025722A1 (en) * 2005-07-26 2007-02-01 Canon Kabushiki Kaisha Image capturing apparatus and image capturing method
US20070086626A1 (en) * 2003-10-08 2007-04-19 Xid Technologies Pte Ltd Individual identity authentication systems
US20070122036A1 (en) * 2005-09-26 2007-05-31 Yuji Kaneda Information processing apparatus and control method therefor
US7258272B2 (en) * 2004-04-09 2007-08-21 Oki Electric Industry Co., Ltd. Identification system using face authentication and consumer transaction facility
US20080028570A1 (en) * 2006-08-04 2008-02-07 Cascio Gregory R Floor cleaner
US20080037836A1 (en) * 2006-08-09 2008-02-14 Arcsoft, Inc. Method for driving virtual facial expressions by automatically detecting facial expressions of a face image
US7401357B2 (en) * 2001-11-22 2008-07-15 Ntt Docomo, Inc. Authentication system, mobile terminal, and authentication method
US20080275830A1 (en) * 2007-05-03 2008-11-06 Darryl Greig Annotating audio-visual data
US20090082005A1 (en) * 2007-09-24 2009-03-26 Motorola, Inc. Methods and devices for coordinating a single telephone number
US20090092294A1 (en) * 2006-03-01 2009-04-09 Kaoru Uchida Face authenticating apparatus, face authenticating method, and program
US20090109180A1 (en) * 2007-10-25 2009-04-30 International Business Machines Corporation Arrangements for identifying users in a multi-touch surface environment
US20090110248A1 (en) * 2006-03-23 2009-04-30 Oki Electric Industry Co., Ltd Face Recognition System
US20090190803A1 (en) * 2008-01-29 2009-07-30 Fotonation Ireland Limited Detecting facial expressions in digital images
US20090251560A1 (en) * 2005-06-16 2009-10-08 Cyrus Azar Video light system and method for improving facial recognition using a video camera
US7639282B2 (en) * 2005-04-01 2009-12-29 Canon Kabushiki Kaisha Image sensing device that acquires a movie of a person or an object and senses a still image of the person or the object, and control method thereof
US7643671B2 (en) * 2003-03-24 2010-01-05 Animetrics Inc. Facial recognition system and method
US20100005393A1 (en) * 2007-01-22 2010-01-07 Sony Corporation Information processing apparatus, information processing method, and program
US20100033590A1 (en) * 2008-08-07 2010-02-11 Canon Kabushiki Kaisha Image sensing apparatus, image capturing method, and program
US20100123724A1 (en) * 2008-11-19 2010-05-20 Bradford Allen Moore Portable Touch Screen Device, Method, and Graphical User Interface for Using Emoji Characters
US20100124363A1 (en) * 2008-11-20 2010-05-20 Sony Ericsson Mobile Communications Ab Display privacy system
US20100211397A1 (en) * 2009-02-18 2010-08-19 Park Chi-Youn Facial expression representation apparatus
US20100237991A1 (en) * 2009-03-17 2010-09-23 Prabhu Krishnanand Biometric scanning arrangement and methods thereof
US20110058713A1 (en) * 2009-09-04 2011-03-10 Casio Computer Co., Ltd. Digital photo frame, control method and recording medium with control program
US20120075452A1 (en) * 2009-06-16 2012-03-29 Bran Ferren Controlled access to functionality of a wireless device
US8154384B2 (en) * 2008-06-16 2012-04-10 Canon Kabushiki Kaisha Personal authentication apparatus and personal authentication method
US8165399B2 (en) * 2007-01-30 2012-04-24 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and program
US8218833B2 (en) * 2008-09-09 2012-07-10 Casio Computer Co., Ltd. Image capturing apparatus, method of determining presence or absence of image area, and recording medium
US20130114865A1 (en) * 2005-06-16 2013-05-09 Sensible Vision, Inc. System and Method for Providing Secure Access to an Electronic Device Using Facial Biometrics
US20130273968A1 (en) * 2008-08-19 2013-10-17 Digimarc Corporation Methods and systems for content processing
US20130300645A1 (en) * 2012-05-12 2013-11-14 Mikhail Fedorov Human-Computer Interface System
US20140193036A1 (en) * 2013-01-05 2014-07-10 Hon Hai Precision Industry Co., Ltd. Display device and method for adjusting observation distances thereof

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04258477A (en) * 1991-02-08 1992-09-14 Nec Software Kansai Ltd Door opening/closing apparatus by fingerprint collation
JP2003233816A (en) * 2002-02-13 2003-08-22 Nippon Signal Co Ltd:The Access control system
JP4877762B2 (en) * 2006-07-19 2012-02-15 株式会社ソニー・コンピュータエンタテインメント Facial expression guidance device, facial expression guidance method, and facial expression guidance system
JP2008276345A (en) * 2007-04-26 2008-11-13 Kyocera Corp Electronic device, authentication method, and program
JP5120777B2 (en) * 2008-04-11 2013-01-16 カシオ計算機株式会社 Electronic data editing apparatus, electronic data editing method and program
JP5181841B2 (en) 2008-06-02 2013-04-10 カシオ計算機株式会社 Imaging apparatus, imaging control program, image reproducing apparatus, and image reproducing program
JP2010034686A (en) 2008-07-25 2010-02-12 Nikon Corp Digital camera
JP5071404B2 (en) * 2009-02-13 2012-11-14 オムロン株式会社 Image processing method, image processing apparatus, and image processing program

Also Published As

Publication number Publication date
US20120206603A1 (en) 2012-08-16
JP2012169777A (en) 2012-09-06
US9298977B2 (en) 2016-03-29
CN102693002A (en) 2012-09-26

Similar Documents

Publication Publication Date Title
US20160171292A1 (en) Information processing device, information processing method, and program for recognizing facial expression and permitting use of equipment based on the recognized facial emotion expression
US10565763B2 (en) Method and camera device for processing image
US9674485B1 (en) System and method for image processing
KR101988279B1 (en) Operating Method of User Function based on a Face Recognition and Electronic Device supporting the same
WO2019120029A1 (en) Intelligent screen brightness adjustment method and apparatus, and storage medium and mobile terminal
US20140101611A1 (en) Mobile Device And Method For Using The Mobile Device
US9021395B2 (en) Display control device, integrated circuit, and display control method
CN107194817B (en) User social information display method and device and computer equipment
JP2009521186A (en) Method and apparatus for providing user profiling based on facial recognition
US11809479B2 (en) Content push method and apparatus, and device
WO2008150427A1 (en) Multi-camera residential communication system
US10856043B2 (en) Simultaneous motion of users to trigger establishment of network communications channel
EP2149258A2 (en) A residential video communication system
US10325144B2 (en) Wearable apparatus and information processing method and device thereof
CN108363939B (en) Characteristic image acquisition method and device and user authentication method
CN111988493B (en) Interaction processing method, device, equipment and storage medium
KR20140052263A (en) Contents service system, method and apparatus for service contents in the system
EP2402839A2 (en) System and method for indexing content viewed on an electronic device
CN111984347A (en) Interaction processing method, device, equipment and storage medium
CN104318209B (en) Iris image acquiring method and equipment
CA3050456C (en) Facial modelling and matching systems and methods
CN113190748B (en) Account recommendation method, account recommendation device, electronic equipment and computer readable storage medium
CN115525188A (en) Shooting method and electronic equipment
CN113749614B (en) Skin detection method and apparatus
EP4329320A1 (en) Method and apparatus for video playback

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:REKIMOTO, JUNICHI;TSUJITA, HITOMI;SIGNING DATES FROM 20160208 TO 20160210;REEL/FRAME:037777/0382

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION