WO2010044073A1 - System and method for aiding a disabled person
System and method for aiding a disabled person
- Publication number
- WO2010044073A1 (PCT/IB2009/054546)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- disabled person
- robotic arm
- mark
- camera
- person
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61F—FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
- A61F4/00—Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Vascular Medicine (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Manipulator (AREA)
Abstract
A system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the disabled person's face; an image processing module for processing the image; and a control sub-system for controlling the position of the robotic arm(s) based on repositioning of one or more facial elements by the disabled person. The one or more facial elements is an artificial mark: a transparent mark detectable by the camera(s) operably connected to the robotic arm, and/or a mark projected onto the person by a projection mechanism.
Description
Title of Invention: SYSTEM AND METHOD FOR AIDING A DISABLED PERSON
[ 1 ] FIELD OF THE INVENTION
[2] The present invention relates to a system for aiding a disabled person, particularly a disabled person with limited use of the upper limbs.
[3] BACKGROUND OF THE INVENTION
[4] Disabled people have difficulties in carrying out simple everyday operations such as eating and drinking.
[5] To help such disabled persons, patent application JP09224965 discloses a meal-support robot dedicated to handling food. A person with an upper-limb handicap can operate the robot independently: the robot provides a light-projecting part for projecting a directional light, a light-receiving part for receiving the light, and a holding part that is elastically deformed by contact with the operator, thereby directing the light to a desired position on the light-receiving part.
[6] U.S. patent 6,961,623 discloses a remote-control method for use by a disabled person, e.g. to actuate a muscle stimulator cuff, in which detected mechanical vibrations trigger a signal that controls operation of the device or process.
[7] SUMMARY OF THE INVENTION
[8] According to one aspect of the present invention there is provided a system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the face of the disabled person; an image processing module for processing the image; and a control sub-system for controlling the position of the at least one robotic arm based on repositioning of one or more facial elements by the disabled person, wherein the one or more facial elements is an artificial mark: a transparent mark detectable by the at least one camera operably connected to the robotic arm, and/or a mark projected onto the person by a projection mechanism.
[9] The camera is adapted to sense movement of the artificial marks, including gestures (e.g. movement of an eye, ear, mouth and/or nose; smiling; tongue movement; raising an eyebrow; and the like) and head movements (e.g. nodding yes or no, tilting, etc.). In some embodiments, the mark is a sticker applied to the face with an 'x' written thereon, or a mark (e.g. an 'x') projected onto the face by a projection mechanism or device such as a projector, laser, etc.; according to some embodiments this projection capability is incorporated into the camera(s).
[10] According to another aspect, the present invention relates to a method for aiding a disabled person comprising: applying at least one artificial mark on the person; detecting the at least one artificial mark by at least one camera; and moving an appliance, attachable to a controllable robotic arm, in correspondence with the detected artificial mark(s), wherein the artificial mark is a transparent mark detectable by the at least one camera operably connected to the robotic arm and/or a mark projected onto the person by the camera(s).
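By way of illustration only, the following is a minimal sketch of this detect-and-follow loop in Python with OpenCV. It assumes the transparent mark appears as a bright blob to the camera (e.g. an IR-reflective sticker), and the RoboticArm class and its jog() method are hypothetical stand-ins for a real arm driver, not an API from this disclosure.

```python
import cv2
import numpy as np

class RoboticArm:
    """Hypothetical stand-in for a real arm driver (e.g. a DENSO VP arm)."""
    def jog(self, dx: float, dy: float) -> None:
        print(f"arm jog: dx={dx:+.1f}px, dy={dy:+.1f}px")

def detect_mark(frame: np.ndarray):
    """Return the centroid of the brightest blob, standing in for a mark
    that is invisible to the eye but bright to the camera."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 240, 255, cv2.THRESH_BINARY)
    m = cv2.moments(binary)
    if m["m00"] == 0:          # no mark visible in this frame
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def run(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    arm, last = RoboticArm(), None
    try:
        while True:                      # stop with Ctrl+C
            ok, frame = cap.read()
            if not ok:
                break
            pos = detect_mark(frame)     # step 2: detect the artificial mark
            if pos is not None and last is not None:
                # step 3: move the arm in correspondence with the mark
                arm.jog(pos[0] - last[0], pos[1] - last[1])
            last = pos if pos is not None else last
    finally:
        cap.release()
```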
[11] The present system and method can be used for a variety of activities including eating/feeding, drawing, writing, teeth brushing and the like.
[12] BRIEF DESCRIPTION OF THE DRAWINGS
[13] The invention will be understood more clearly and other features and advantages shall become apparent from the following detailed description of exemplary embodiments, with reference to the following figures, wherein:
[14] Fig. 1 is an isometric view of an exemplary system in accordance with the present invention as operated by a person in a wheelchair;
[15] Fig. 2A is an isometric front view of the disabled person's face;
[16] Fig. 2B is an isometric profile view of the disabled person's face;
[17] Fig. 3 is an isometric side view of a robotic arm;
[18] Fig. 4A is a front view of the disabled person's face, including arrows designating the person's facial movements; and
[19] Fig. 4B is a profile view of the disabled person's face, including arrows designating the person's facial movements.
[20] DESCRIPTION OF SOME EMBODIMENTS OF THE PRESENT INVENTION
[21] Fig. 1 illustrates an embodiment of a system for aiding a disabled person 30 in accordance with the present invention, operated by the disabled person, for example, in a chair or wheelchair 32. A digital camera 34 is attached to a chair support 36 of chair 32. Resting upon a table 38 are a food plate 40, a robotic arm 42, and a storage means 44, which stores appliances that can be attached to a robotic catch 45 disposed on an end 46 of the robotic arm. Examples of such appliances include a fork, a cup and so forth. Robotic arm 42 has freedom to move along one or more axes and preferably about those axes. Examples of such robotic arms are the VP 5-axis series and VP 6-axis series arms (DENSO Robotics, 3900 Via Oro Avenue, Long Beach, CA 90810, USA), which provide five and six degrees of freedom of movement, respectively.
[22] Typically, an additional camera 48 is attached to table 38. Camera 48 images a front view of face 58 of disabled person 30, and camera 34 is used for imaging the profile of the disabled person's face. The digital images are stored in a digital memory means, not shown, for further processing.
[23] According to some embodiments, the system further includes a voice recognition sub-system, not shown, for recognizing voice commands of disabled person 30. An example of such a voice command is 'change tool', which commands robotic arm 42 to substitute the appliance currently attached to catch 45 with another appliance stored in storage means 44. Another example, 'open catch', commands catch 45 of robotic arm 42 to open.
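As a sketch of such a dispatch, assuming a recognizer that yields lower-case phrases; the swap_appliance() and open_catch() method names are hypothetical, not part of this disclosure:

```python
from typing import Callable, Dict

def make_voice_dispatch(arm) -> Dict[str, Callable[[], None]]:
    """Map recognized phrases to arm actions. 'arm' is any object exposing
    the hypothetical swap_appliance() and open_catch() methods."""
    return {
        "change tool": arm.swap_appliance,  # fetch another appliance from storage 44
        "open catch": arm.open_catch,       # open catch 45
    }

def handle_phrase(phrase: str, dispatch: Dict[str, Callable[[], None]]) -> None:
    action = dispatch.get(phrase.strip().lower())
    if action is not None:
        action()  # unrecognized phrases are silently ignored
```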
[24] An isometric front view and an isometric profile view of disabled person 30 are shown in Figs. 2A and 2B respectively, to which reference is now made. In this example, four marks 60, 62, 64 and 66 are placed on the face 58 of disabled person 30 (e.g. via stickers, or projected thereon). These marks can be natural marks appearing on face 58, facial gestures, or marks artificially placed on the disabled person's face 58. Marks 60, 62, 64 and 66 are detected by cameras 34 and 48, as known in the art, implemented by an image processing module, not shown. Examples of useful natural marks are wrinkles and moles. In some embodiments of the present invention, a transparent sticker, invisible to the human eye but detectable by cameras 34 and 48, can be applied to the disabled person's face 58. Mark 60 is positioned on the forehead of disabled person 30; movement of the forehead is accompanied by a corresponding movement of mark 60. Mark 64 is disposed on the chin of the disabled person; when the disabled person moves his chin, mark 64 moves as well. Marks 62 and 66 are positioned on the cheeks of the disabled person. In some embodiments of the present invention, marks 62 and 66 are moved by moving the tongue of the disabled person towards the person's right and left cheeks, respectively.
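Assuming the image processing module yields one (x, y) point per detected mark, the following sketch illustrates one way to assign the four points to the mark positions of Fig. 2A; the mirror mapping of cheeks to image sides is an assumption of the sketch:

```python
from typing import Dict, List, Tuple

Point = Tuple[int, int]

def label_marks(points: List[Point]) -> Dict[int, Point]:
    """Assign four detected points to marks 60 (forehead), 64 (chin)
    and 62/66 (cheeks) by their layout in a front-view image."""
    if len(points) != 4:
        raise ValueError("expected exactly four marks")
    by_y = sorted(points, key=lambda p: p[1])        # image y grows downward
    forehead, chin = by_y[0], by_y[3]
    left_img, right_img = sorted(by_y[1:3], key=lambda p: p[0])
    # In a front view the person's right cheek (mark 62) appears on the
    # image's left side; this mirror mapping is an assumption.
    return {60: forehead, 62: left_img, 64: chin, 66: right_img}
```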
[25] A side view of robotic arm 42 is shown in Fig. 3, to which reference is now made. Robotic arm 42 can be described as being controllable with reference to a Cartesian grid. Double-headed arrow 70 designates the movement direction of robotic arm 42 along the x-axis, double-headed arrow 72 along the z-axis, and double-headed arrow 74 along the y-axis. Rotational movements around axes 70, 72 and 74 are designated by arrows 76, 78 and 80, respectively.
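A minimal sketch of a six-degree-of-freedom command following the axis layout of Fig. 3; the dataclass and its field names are illustrative assumptions, not part of this disclosure:

```python
from dataclasses import dataclass

@dataclass
class ArmCommand:
    """One incremental motion step for the arm, per the Fig. 3 axes."""
    dx: float = 0.0   # arrow 70: translation along x
    dz: float = 0.0   # arrow 72: translation along z
    dy: float = 0.0   # arrow 74: translation along y
    rx: float = 0.0   # arrow 76: rotation about x
    rz: float = 0.0   # arrow 78: rotation about z
    ry: float = 0.0   # arrow 80: rotation about y
```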
[26] A front view and a profile view of the face 58 of disabled person 30, including arrows describing the facial movements, are shown in Figs. 4A and 4B, to which reference is now made. The movements of the disabled person's head are analyzed by determining the distance between any of marks 60, 62, 64 and 66 before and after a facial movement or gesture is performed. Nodding the head up and down, in the direction designated by double-headed arrow 90, is accompanied by a corresponding movement of marks 60 and 64. Sideways turning of the head, in the direction designated by double-headed arrow 92, is accompanied by a corresponding movement of marks 60 and 64 to the left or to the right. Sideways tilting of the head, in the direction designated by double-headed arrow 94, is accompanied by a corresponding movement of mark 60 to the right and mark 64 to the left, or of mark 60 to the left and mark 64 to the right. Movement of the disabled person's chin in the direction designated by double-headed arrow 96 moves mark 64 up or down with respect to mark 60. Movement of the chin in the direction designated by arrows 98 is accompanied by a corresponding movement of mark 64 left or right with respect to mark 60. A movement of the tongue in the direction designated by arrow 100 can cause mark 66 to move to the left with respect to marks 60 and 64, and a movement of the tongue in the direction designated by arrow 102 moves mark 62 to the right with respect to marks 60 and 64.
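As an illustration of this analysis, the following sketch compares mark positions before and after a movement and classifies the gesture from the relative displacements; the thresholds and gesture labels are assumptions of the sketch:

```python
from typing import Dict, Tuple

Point = Tuple[float, float]

def classify(before: Dict[int, Point], after: Dict[int, Point],
             min_px: float = 8.0) -> str:
    """Classify a facial movement from mark displacements. Both dicts must
    contain entries for marks 60, 62, 64 and 66."""
    d = {k: (after[k][0] - before[k][0], after[k][1] - before[k][1])
         for k in before}
    dx60, dy60 = d[60]
    dx64, dy64 = d[64]
    if abs(dy60) > min_px and abs(dy64) > min_px and dy60 * dy64 > 0:
        return "nod"             # arrow 90: marks 60 and 64 move together vertically
    if abs(dx60) > min_px and abs(dx64) > min_px and dx60 * dx64 > 0:
        return "turn"            # arrow 92: both marks shift left or right together
    if abs(dx60) > min_px and abs(dx64) > min_px and dx60 * dx64 < 0:
        return "tilt"            # arrow 94: marks 60 and 64 move oppositely
    if abs(dy64 - dy60) > min_px:
        return "chin_vertical"   # arrow 96: mark 64 moves vertically vs. mark 60
    if abs(dx64 - dx60) > min_px:
        return "chin_lateral"    # arrow 98: mark 64 moves laterally vs. mark 60
    if abs(d[66][0]) > min_px:
        return "tongue_left"     # arrow 100: mark 66 moves left
    if abs(d[62][0]) > min_px:
        return "tongue_right"    # arrow 102: mark 62 moves right
    return "none"
```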
[27] A control sub-system, not shown, controls the movement of robotic arm 42. Robotic arm 42 is controlled/driven by analyzing the aforementioned movements of the head of disabled person 30 and issuing commands to driving mechanisms, not shown, that control the robotic arm (Fig. 3). Head movement in the directions designated by double-headed arrow 90 actuates robotic arm 42 in the directions designated by double-headed arrow 70; a movement in direction 92 actuates the arm in the direction designated by double-headed arrow 72; a movement in direction 96, in the direction designated by double-headed arrow 74; a movement in direction 98, in the direction designated by arrow 76; a movement in direction 94, in the direction designated by arrow 78; and a movement in direction 100 or 102, in the direction designated by arrow 80.
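The arrow-to-arrow mapping above can be written down directly; this sketch reuses the hypothetical ArmCommand dataclass and the gesture labels from the earlier sketches, and the step size is an arbitrary assumption:

```python
# Fig. 4 arrow -> Fig. 3 arrow mapping, as spelled out in paragraph [27].
GESTURE_TO_AXIS = {
    "nod":           "dx",  # arrow 90  -> arrow 70 (x translation)
    "turn":          "dz",  # arrow 92  -> arrow 72 (z translation)
    "chin_vertical": "dy",  # arrow 96  -> arrow 74 (y translation)
    "chin_lateral":  "rx",  # arrow 98  -> arrow 76 (rotation about x)
    "tilt":          "rz",  # arrow 94  -> arrow 78 (rotation about z)
    "tongue_left":   "ry",  # arrow 100 -> arrow 80 (rotation about y)
    "tongue_right":  "ry",  # arrow 102 -> arrow 80 (rotation about y)
}

def gesture_to_command(gesture: str, step: float = 5.0) -> "ArmCommand":
    """Convert a classified gesture into a single-axis ArmCommand step."""
    cmd = ArmCommand()
    axis = GESTURE_TO_AXIS.get(gesture)
    if axis is not None:
        # Tongue-left and tongue-right rotate about y in opposite senses.
        setattr(cmd, axis, -step if gesture == "tongue_left" else step)
    return cmd
```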
[28] In some embodiments of the present invention, marks 60, 62, 64 and 66 can be projected onto the person's face 58 by one of cameras 34 and 48, or by an additional component, and robotic arm 42 is controlled by the relative movement of the face and the mark.
Claims
[Claim 1] A system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the face of said disabled person; an image processing module for processing said image; and a control sub-system for controlling the position of said at least one robotic arm, based on repositioning of one or more facial elements by said disabled person, wherein said one or more facial elements is an artificial mark being a transparent mark detectable by the at least one camera operably connected to said robotic arm and/or a mark projected onto the person by a projection mechanism.
[Claim 2] A system as in claim 1, comprising two cameras, wherein one camera images a front view of the face of said disabled person and a second camera images a profile of the face of said disabled person.
[Claim 3] A system as in claim 1, wherein said system further comprises storage means which stores appliances that can be attached to said at least one robotic arm.
[Claim 4] A system as in claim 1, wherein said robotic arm is adapted to move in a plurality of axes.
[Claim 5] A system as in claim 1, further comprising a voice recognition subsystem for recognizing voice commands of said disabled person.
[Claim 6] A system as in claim 1, wherein the projection mechanism is produced by a laser.
[Claim 7] A system as in claim 1, wherein the projection mechanism is produced by a projector.
[Claim 8] A system as in claim 1, wherein the camera comprises the projection mechanism.
[Claim 9] A method for aiding a disabled person comprising: applying at least one artificial mark on said person; detecting said at least one artificial mark by at least one camera; and moving a robotic arm corresponding to movement of the detected artificial mark(s), wherein said artificial mark is a transparent mark detectable by the at least one camera operably connected to said robotic arm and/or a mark projected onto the person by the camera(s).
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB0818942.5 | 2008-10-16 | | |
| GB0818942A GB2464486A (en) | 2008-10-16 | 2008-10-16 | Control of a robotic arm by the recognition and analysis of facial gestures. |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2010044073A1 (en) | 2010-04-22 |
Family
Family ID: 40084106
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2009/054546 (WO2010044073A1, ceased) | System and method for aiding a disabled person | 2008-10-16 | 2009-10-15 |
Country Status (2)
| Country | Link |
|---|---|
| GB (1) | GB2464486A (en) |
| WO (1) | WO2010044073A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102720249A (en) * | 2011-03-29 | 2012-10-10 | 梁剑文 | Washbowl with mechanical hand |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5323470A (en) * | 1992-05-08 | 1994-06-21 | Atsushi Kara | Method and apparatus for automatically tracking an object |
| US5532824A (en) * | 1994-01-25 | 1996-07-02 | Mts Systems Corporation | Optical motion sensor |
| US20020126090A1 (en) * | 2001-01-18 | 2002-09-12 | International Business Machines Corporation | Navigating and selecting a portion of a screen by utilizing a state of an object as viewed by a camera |
| US20080132383A1 (en) * | 2004-12-07 | 2008-06-05 | Tylerton International Inc. | Device And Method For Training, Rehabilitation And/Or Support |
| US20080188959A1 (en) * | 2005-05-31 | 2008-08-07 | Koninklijke Philips Electronics, N.V. | Method for Control of a Device |
Family Cites Families (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US4207959A (en) * | 1978-06-02 | 1980-06-17 | New York University | Wheelchair mounted control apparatus |
| US5812978A (en) * | 1996-12-09 | 1998-09-22 | Tracer Round Associates, Ltd. | Wheelchair voice control apparatus |
| CA2227361A1 (en) * | 1998-01-19 | 1999-07-19 | Taarna Studios Inc. | Method and apparatus for providing real-time animation utilizing a database of expressions |
| US6215471B1 (en) * | 1998-04-28 | 2001-04-10 | Deluca Michael Joseph | Vision pointer method and apparatus |
| US6072496A (en) * | 1998-06-08 | 2000-06-06 | Microsoft Corporation | Method and system for capturing and representing 3D geometry, color and shading of facial expressions and other animated objects |
| IT1315644B1 (en) * | 2000-07-06 | 2003-03-14 | Uni Di Modena E Reggio Emilia | SYSTEM FOR INTERACTION BETWEEN THE EYE MOVEMENT OF A SUBJECT AND A PERSONAL COMPUTER |
| US7306337B2 (en) * | 2003-03-06 | 2007-12-11 | Rensselaer Polytechnic Institute | Calibration-free gaze tracking under natural head movement |
| US7218320B2 (en) * | 2003-03-13 | 2007-05-15 | Sony Corporation | System and method for capturing facial and body motion |
| EP1667049A3 (en) * | 2004-12-03 | 2007-03-28 | Invacare International Sàrl | Facial feature analysis system |
| US20070217891A1 (en) * | 2006-03-15 | 2007-09-20 | Charles Folcik | Robotic feeding system for physically challenged persons |
| JP2007310914A (en) * | 2007-08-31 | 2007-11-29 | Nippon Telegr & Teleph Corp <Ntt> | Mouse replacement method, mouse replacement program, and recording medium |
- 2008-10-16: GB0818942A filed in Great Britain; published as GB2464486A (not in force, withdrawn)
- 2009-10-15: PCT/IB2009/054546 filed under the PCT; published as WO2010044073A1 (ceased)
Also Published As
| Publication number | Publication date |
|---|---|
| GB2464486A (en) | 2010-04-21 |
| GB0818942D0 (en) | 2008-11-19 |
Similar Documents
| Publication | Title |
|---|---|
| Markovic et al. | Stereovision and augmented reality for closed-loop control of grasping in hand prostheses |
| JP7747032B2 | Information processing device and information processing method |
| Diftler et al. | Evolution of the NASA/DARPA robonaut control system |
| US11260530B2 | Upper limb motion support apparatus and upper limb motion support system |
| US20150094851A1 | Robot control system, robot control method and output control method |
| US20070265495A1 | Method and apparatus for field of view tracking |
| CN109571513B | An immersive mobile grabbing service robot system |
| Maimon-Mor et al. | Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking |
| CN103271784A | Man-machine interactive manipulator control system and method based on binocular vision |
| WO2004041078A3 | Housing device for head-worn image recording and method for control of the housing device |
| JP2006289508A | Robot apparatus and expression control method thereof |
| JP5186723B2 | Communication robot system and communication robot gaze control method |
| JP2004329490A | Finger motor function recovery support tool and finger motor function recovery support system |
| US20170282359A1 | Robot and control method thereof |
| Chu et al. | The helping hand: An assistive manipulation framework using augmented reality and tongue-drive interfaces |
| Li et al. | An egocentric computer vision based co-robot wheelchair |
| Matsumoto et al. | The essential components of human-friendly robot systems |
| WO2010044073A1 | System and method for aiding a disabled person |
| JP6158665B2 | Robot, robot control method, and robot control program |
| JP7360158B2 | Control system and control program |
| Chu et al. | Hands-free assistive manipulator using augmented reality and tongue drive system |
| Yang et al. | Head-free, human gaze-driven assistive robotic system for reaching and grasping |
| JP2003266353A | Robot apparatus and control method thereof |
| JP7133733B1 | ROBOT SYSTEM, ROBOT OPERATING METHOD, AND ROBOT OPERATING PROGRAM |
| Buckley et al. | Sensor suites for assistive arm prosthetics |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 09820337; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 09820337; Country of ref document: EP; Kind code of ref document: A1 |