WO2010044073A1 - System and method for aiding a disabled person - Google Patents


Info

Publication number
WO2010044073A1
WO2010044073A1 (PCT/IB2009/054546)
Authority
WO
WIPO (PCT)
Prior art keywords
disabled person
robotic arm
mark
camera
person
Prior art date
Application number
PCT/IB2009/054546
Other languages
French (fr)
Inventor
Aaron Shafir
Original Assignee
Aaron Shafir
Priority date
Filing date
Publication date
Application filed by Aaron Shafir filed Critical Aaron Shafir
Publication of WO2010044073A1 publication Critical patent/WO2010044073A1/en

Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F — FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 4/00 — Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/002 — Specific input/output arrangements not covered by G06F 3/01–G06F 3/16
    • G06F 3/005 — Input arrangements through a video camera
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 — Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/16 — Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/20 — Movements or behaviour, e.g. gesture recognition

Abstract

A system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the disabled person's face; an image processing module for processing the image; and a control sub-system for controlling the position of the robotic arm(s), based on repositioning of one or more facial elements by the disabled person. One or more facial elements is an artificial mark being a transparent mark detectable by the camera(s) operably connected to the robotic arm and/or a mark projected onto the person by a projection mechanism.

Description

Title of Invention: SYSTEM AND METHOD FOR AIDING A DISABLED PERSON
[1] FIELD OF THE INVENTION
[2] The present invention relates to a system for aiding a disabled person, particularly to aiding disabled persons with limited use of the upper limbs.
[3] BACKGROUND OF THE INVENTION
[4] Disabled people have difficulties in carrying out simple every day operations such as eating and drinking.
[5] To help such disabled persons, patent application JP09224965 discloses a meal-support robot dedicated to handling food. A person with an upper-limb handicap can operate the robot independently: the robot provides a light-projecting part that projects a directional light, a light-receiving part that receives the light, and a holding part that is elastically deformed by contact with the operator, directing the light to a desired position on the light-receiving part.
[6] U.S. patent 6,961,623 discloses a remote-control method for use by a disabled person, e.g. to actuate a muscle-stimulator cuff, in which detected mechanical vibrations trigger a signal that controls the operation of a device or process.
[7] SUMMARY OF THE INVENTION
[8] According to one aspect of the present invention there is provided a system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the face of the disabled person; an image processing module for processing the image; and a control sub-system for controlling the position of the at least one robotic arm, based on repositioning of one or more facial elements by the disabled person, wherein the one or more facial elements is an artificial mark being a transparent mark detectable by the at least one camera operably connected to the robotic arm and/or a mark projected onto the person by a projection mechanism.
[9] The camera is adapted to sense movement of the artificial marks, including gestures (e.g. movement of an eye, ear, mouth and/or nose; smiling; tongue movement; raising an eyebrow; and the like) and head movements (e.g. nodding yes or no, tilting, etc.). In some embodiments, the mark is a sticker applied to the face with an 'x' written thereon, or a mark (e.g. an 'x') projected onto the face by a projection mechanism or device such as a projector, laser, etc., which capability according to some embodiments is incorporated into the camera(s).
[10] According to another aspect, the present invention relates to a method for aiding a disabled person comprising: applying at least one artificial mark on the person; detecting the at least one artificial mark by at least one camera; and moving an appliance attachable to a controllable robotic arm corresponding to the detected artificial mark(s), wherein the artificial mark is a transparent mark detectable by the at least one camera operably connected to the robotic arm and/or a mark projected onto the person by the camera(s).
[11] The present system and method can be used for a variety of activities including eating/feeding, drawing, writing, teeth brushing and the like.
[12] BRIEF DESCRIPTION OF THE DRAWINGS
[13] The invention will be understood more clearly and other features and advantages shall become apparent from the following detailed description of exemplary embodiments, with reference to the following figures, wherein:
[14] Fig. 1 is an isometric view of an exemplary system in accordance with the present invention as operated by a person in a wheel chair;
[15] Fig. 2A is an isometric front view of the disabled person's face;
[16] Fig. 2B is an isometric profile view of the disabled person's face;
[17] Fig. 3 is an isometric side view of a robotic arm;
[18] Fig. 4A is a front view of the disabled person's face, including arrows designating the person's facial movements; and
[19] Fig. 4B is a profile view of the disabled person's face, including arrows designating the person's facial movements.
[20] DESCRIPTION OF SOME EMBODIMENTS OF THE PRESENT INVENTION
[21] Fig. 1 illustrates an embodiment of a system for aiding a disabled person 30 in accordance with the present invention, operated by the disabled person, for example, in a chair or wheelchair 32. A digital camera 34 is attached to a chair support 36 of chair 32. Resting upon a table 38 are a food plate 40, a robotic arm 42, and a storage means 44, which stores appliances that can be attached to a robotic catch 45 disposed on an end 46 of the robotic arm. Examples of such appliances include a fork, a cup and so forth. Robotic arm 42 has freedom to move along one or more axes and preferably about those axes. Examples of such robotic arms are the VP 5-axis series and VP 6-axis series arms (DENSO Robotics, 3900 Via Oro Avenue, Long Beach, CA 90810, USA), which provide five and six degrees of freedom of movement, respectively.
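The arm's five or six degrees of freedom can be represented as a pose with three translational and three rotational components. The following sketch is purely illustrative and is not part of the patent disclosure; the class and field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ArmPose:
    """Hypothetical 6-DOF arm pose: translations plus rotations."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0   # rotation about the x-axis
    pitch: float = 0.0  # rotation about the y-axis
    yaw: float = 0.0    # rotation about the z-axis

    def step(self, axis: str, delta: float) -> "ArmPose":
        """Return a new pose with one component nudged by `delta`."""
        values = self.__dict__.copy()
        values[axis] += delta
        return ArmPose(**values)

# Two incremental motions: translate along x, then rotate about z.
pose = ArmPose().step("x", 10.0).step("yaw", 5.0)
```

A 5-axis arm would simply omit one rotational component from such a model.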
[22] Typically, an additional camera 48 is attached to table 38. Camera 48 images a front view of the face 58 of disabled person 30, and camera 34 images the profile of the disabled person's face. The digital images are stored in a digital memory means, not shown, for further processing.
[23] According to some embodiments, the system further includes a voice recognition sub-system, not shown, for recognizing voice commands of disabled person 30. One example of such a voice command is 'change tool', which commands robotic arm 42 to substitute the appliance currently attached to catch 45 with another appliance stored in storage means 44. The command 'open catch', for example, is another voice command, which commands catch 45 of robotic arm 42 to open.
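Dispatching recognized phrases such as 'change tool' and 'open catch' to arm actions can be sketched as a simple lookup. This is an illustrative sketch only; the function name, dictionary, and return strings are hypothetical, not from the patent text:

```python
def handle_voice_command(command, arm_actions):
    """Dispatch a recognized voice phrase to a robot-arm action.

    `arm_actions` maps normalized phrases to zero-argument callables.
    Unknown phrases are reported rather than silently ignored.
    """
    action = arm_actions.get(command.strip().lower())
    if action is None:
        return "unrecognized"
    return action()

# Hypothetical action table for the two commands named in the text.
actions = {
    "change tool": lambda: "swapping appliance from storage 44",
    "open catch": lambda: "opening catch 45",
}
result = handle_voice_command("Open Catch", actions)
```

Rejecting unknown phrases is a deliberate safety choice for an assistive device: the arm should only move on explicitly whitelisted commands.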
[24] An isometric front view and an isometric profile view of disabled person 30 are shown in Figs. 2A and 2B, respectively, to which reference is now made. In this example, four marks 60, 62, 64 and 66 are placed on the face 58 of disabled person 30 (e.g. via stickers, or projected thereon). These marks can be natural marks appearing on face 58, facial gestures, or marks artificially placed on the disabled person's face 58. Marks 60, 62, 64 and 66 are detected by cameras 34 and 48 as known in the art, implemented by an image processing module, not shown. Examples of useful natural marks are wrinkles and moles. In some embodiments of the present invention, a transparent sticker, invisible to the human eye but detectable by cameras 34 and 48, can be applied to the disabled person's face 58. Mark 60 is positioned on the forehead of disabled person 30; movement of the disabled person's forehead is accompanied by a corresponding movement of mark 60. Mark 64 is disposed on the chin of the disabled person; when the disabled person moves his chin, mark 64 moves as well. Marks 62 and 66 are positioned on the cheeks of the disabled person. In some embodiments of the present invention, marks 62 and 66 are moved by moving the disabled person's tongue towards the person's right and left cheeks, respectively.
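Once the image processing module reports pixel coordinates for each mark, tracking reduces to comparing positions across frames. A minimal sketch, assuming mark positions arrive as id-to-coordinate dictionaries (the function name and data layout are hypothetical):

```python
def mark_displacements(before, after):
    """Per-mark (dx, dy) pixel displacement between two frames.

    `before` and `after` map mark ids (e.g. 60, 62, 64, 66) to (x, y)
    pixel coordinates; marks missing from either frame are skipped.
    """
    return {
        mark: (after[mark][0] - before[mark][0],
               after[mark][1] - before[mark][1])
        for mark in before if mark in after
    }

# Forehead mark 60 and chin mark 64 both shift 10 pixels upward
# between frames, consistent with a nod.
frame1 = {60: (100, 50), 64: (100, 150)}
frame2 = {60: (100, 40), 64: (100, 140)}
moves = mark_displacements(frame1, frame2)
```

Skipping marks absent from either frame makes the sketch tolerant of momentary occlusion, e.g. when a hand or appliance passes in front of the face.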
[25] A side view of robotic arm 42 is shown in Fig. 3, to which reference is now made. Robotic arm 42 can be described as being controllable with reference to a Cartesian grid. Double-headed arrow 70 designates the movement direction of robotic arm 42 along the x-axis. Double-headed arrow 72 designates the movement direction of robotic arm 42 along the z-axis. Double-headed arrow 74 designates the movement direction of robotic arm 42 along the y-axis. Rotational movements around axes 70, 72 and 74 are designated by arrows 76, 78 and 80, respectively.
[26] A front and a profile view of the face 58 of disabled person 30, including arrows describing the facial movements, are shown in Figs. 4A and 4B, to which reference is now made. The movements of the disabled person's head are analyzed by determining the distance between any of marks 60, 62, 64 and 66 before and after a facial movement or facial gesture is performed. Nodding of the disabled person's head up and down, in the direction designated by double-headed arrow 90, is accompanied by a corresponding movement of marks 60 and 64. Sideways turning of the disabled person's head, in the direction designated by double-headed arrow 92, is accompanied by a corresponding movement of marks 60 and 64 to the left or to the right. Sideways tilting of the disabled person's head, in the direction designated by double-headed arrow 94, is accompanied by a corresponding movement of mark 60 to the right and mark 64 to the left, or of mark 60 to the left and mark 64 to the right. Movement of the disabled person's chin in the direction designated by double-headed arrow 96 moves mark 64 up or down with respect to mark 60. Movement of the disabled person's chin in the direction designated by arrows 98 is accompanied by a corresponding movement of mark 64 left or right with respect to mark 60. A movement of the disabled person's tongue in the direction designated by arrow 100 can cause mark 66 to move to the left with respect to marks 60 and 64. A movement of the disabled person's tongue in the direction designated by arrow 102 moves mark 62 to the right with respect to marks 60 and 64.
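The distance-based analysis above can be sketched as a classifier over the displacements of forehead mark 60 and chin mark 64. This is an illustrative sketch only, not part of the patent disclosure; the tolerance value, function name, and gesture labels are hypothetical:

```python
def classify_gesture(d60, d64, tol=2):
    """Classify a head gesture from the (dx, dy) pixel displacements
    of forehead mark 60 and chin mark 64, mirroring arrows 90-98.

    Marks moving together vertically -> nod; together horizontally
    -> turn; in opposite horizontal directions -> tilt; chin mark
    moving relative to the forehead mark -> chin movement.
    """
    dx60, dy60 = d60
    dx64, dy64 = d64
    if abs(dy60) > tol and abs(dy64) > tol and dy60 * dy64 > 0:
        return "nod (90)"
    if abs(dx60) > tol and abs(dx64) > tol and dx60 * dx64 > 0:
        return "turn (92)"
    if abs(dx60) > tol and abs(dx64) > tol and dx60 * dx64 < 0:
        return "tilt (94)"
    if abs(dy64 - dy60) > tol:
        return "chin up/down (96)"
    if abs(dx64 - dx60) > tol:
        return "chin sideways (98)"
    return "none"

# Both marks move 10 pixels upward together: a nod along arrow 90.
gesture = classify_gesture((0, -10), (0, -10))
```

The tolerance `tol` suppresses small jitter from camera noise or involuntary movement, so the arm only reacts to deliberate gestures.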
[27] A control sub-system, not shown, controls the movement of robotic arm 42. Robotic arm 42 is controlled/driven by analyzing the aforementioned movement(s) of the head of disabled person 30 and issuing commands to driving mechanisms, not shown, for controlling the robotic arm (Fig. 3). Head movement in the directions designated by double headed arrow 90 will actuate robotic arm 42 in the directions designated by double headed arrow 70. A movement in direction 92 will actuate robotic arm 42 in the direction designated by double headed arrow 72. A movement in direction 96 will actuate robotic arm 42 in the direction designated by double headed arrow 74. A movement in direction 98 will actuate robotic arm 42 in the direction designated by arrow 76. A movement in direction 94 will actuate robotic arm 42 in the direction designated by arrow 78. A movement in direction 100 or direction 102 will actuate robotic arm 42 in the direction designated by arrow 80.
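The head-movement-to-arm-motion mapping of paragraph [27] amounts to a fixed lookup table from the facial-movement arrows of Figs. 4A/4B to the arm-motion arrows of Fig. 3. A minimal sketch, with the table and function names hypothetical:

```python
# Facial-movement arrow -> arm-motion arrow, as stated in the text.
GESTURE_TO_ARM_AXIS = {
    90: 70,   # nod up/down            -> translate along x (arrow 70)
    92: 72,   # turn left/right        -> translate along z (arrow 72)
    96: 74,   # chin up/down           -> translate along y (arrow 74)
    98: 76,   # chin sideways          -> rotate about axis 70 (arrow 76)
    94: 78,   # head tilt              -> rotate about axis 72 (arrow 78)
    100: 80,  # tongue to cheek mark 66 -> rotate about axis 74 (arrow 80)
    102: 80,  # tongue to cheek mark 62 -> rotate about axis 74 (arrow 80)
}

def arm_command(head_movement):
    """Return the arm-motion arrow actuated by a head-movement arrow."""
    return GESTURE_TO_ARM_AXIS[head_movement]
```

Keeping the mapping in one table makes it easy to recalibrate for an individual user's range of motion without touching the control loop.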
[28] In some embodiments of the present invention, marks 60, 62, 64 and 66 can be projected onto the person's face 58 by one of cameras 34 and 48, or by an additional component; robotic arm 42 is then controlled by the relative movement of the face and the marks.

Claims

[Claim 1] A system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the face of said disabled person; an image processing module for processing said image; and a control sub-system for controlling the position of said at least one robotic arm, based on repositioning of one or more facial elements by said disabled person, wherein said one or more facial elements is an artificial mark being a transparent mark detectable by the at least one camera operably connected to said robotic arm and/or a mark projected onto the person by a projection mechanism.
[Claim 2] A system as in claim 1, comprising two cameras, wherein one camera images a front view of the face of said disabled person and a second camera images a profile of the face of said disabled person.
[Claim 3] A system as in claim 1, wherein said system further comprises storage means which stores appliances that can be attached to said at least one robotic arm.
[Claim 4] A system as in claim 1, wherein said robotic arm is adapted to move in a plurality of axes.
[Claim 5] A system as in claim 1, further comprising a voice recognition sub-system for recognizing voice commands of said disabled person.
[Claim 6] A system as in claim 1, wherein the projection mechanism is produced by a laser.
[Claim 7] A system as in claim 1, wherein the projection mechanism is produced by a projector.
[Claim 8] A system as in claim 1, wherein the camera comprises the projection mechanism.
[Claim 9] A method for aiding a disabled person comprising: applying at least one artificial mark on said person; detecting said at least one artificial mark by at least one camera; and moving a robotic arm corresponding to movement of the detected artificial mark(s), wherein said artificial mark is a transparent mark detectable by the at least one camera operably connected to said robotic arm and/or a mark projected onto the person by the camera(s).
PCT/IB2009/054546 2008-10-16 2009-10-15 System and method for aiding a disabled person WO2010044073A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0818942A GB2464486A (en) 2008-10-16 2008-10-16 Control of a robotic arm by the recognition and analysis of facial gestures.
GB0818942.5 2008-10-16

Publications (1)

Publication Number Publication Date
WO2010044073A1 true WO2010044073A1 (en) 2010-04-22

Family

ID=40084106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/054546 WO2010044073A1 (en) 2008-10-16 2009-10-15 System and method for aiding a disabled person

Country Status (2)

Country Link
GB (1) GB2464486A (en)
WO (1) WO2010044073A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102720249A (en) * 2011-03-29 2012-10-10 梁剑文 Washbowl with mechanic hand

Citations (5)

Publication number Priority date Publication date Assignee Title
US5323470A (en) * 1992-05-08 1994-06-21 Atsushi Kara Method and apparatus for automatically tracking an object
US5532824A (en) * 1994-01-25 1996-07-02 Mts Systems Corporation Optical motion sensor
US20020126090A1 (en) * 2001-01-18 2002-09-12 International Business Machines Corporation Navigating and selecting a portion of a screen by utilizing a state of an object as viewed by a camera
US20080132383A1 (en) * 2004-12-07 2008-06-05 Tylerton International Inc. Device And Method For Training, Rehabilitation And/Or Support
US20080188959A1 (en) * 2005-05-31 2008-08-07 Koninklijke Philips Electronics, N.V. Method for Control of a Device

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
US4207959A (en) * 1978-06-02 1980-06-17 New York University Wheelchair mounted control apparatus
US5812978A (en) * 1996-12-09 1998-09-22 Tracer Round Associaties, Ltd. Wheelchair voice control apparatus
CA2227361A1 (en) * 1998-01-19 1999-07-19 Taarna Studios Inc. Method and apparatus for providing real-time animation utilizing a database of expressions
US6215471B1 (en) * 1998-04-28 2001-04-10 Deluca Michael Joseph Vision pointer method and apparatus
US6072496A (en) * 1998-06-08 2000-06-06 Microsoft Corporation Method and system for capturing and representing 3D geometry, color and shading of facial expressions and other animated objects
IT1315644B1 (en) * 2000-07-06 2003-03-14 Uni Di Modena E Reggio Emilia SYSTEM FOR INTERACTION BETWEEN THE EYE MOVEMENT OF A SUBJECT AND UNPERSONAL COMPUTER
US7306337B2 (en) * 2003-03-06 2007-12-11 Rensselaer Polytechnic Institute Calibration-free gaze tracking under natural head movement
US7218320B2 (en) * 2003-03-13 2007-05-15 Sony Corporation System and method for capturing facial and body motion
EP1667049A3 (en) * 2004-12-03 2007-03-28 Invacare International Sàrl Facial feature analysis system
US20070217891A1 (en) * 2006-03-15 2007-09-20 Charles Folcik Robotic feeding system for physically challenged persons
JP2007310914A (en) * 2007-08-31 2007-11-29 Nippon Telegr & Teleph Corp <Ntt> Mouse alternating method, mouse alternating program and recording medium


Also Published As

Publication number Publication date
GB0818942D0 (en) 2008-11-19
GB2464486A (en) 2010-04-21


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09820337

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09820337

Country of ref document: EP

Kind code of ref document: A1