
WO2010044073A1 - System and method for aiding a disabled person - Google Patents

System and method for aiding a disabled person

Info

Publication number
WO2010044073A1
Authority
WO
WIPO (PCT)
Prior art keywords
disabled person
robotic arm
mark
camera
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2009/054546
Other languages
English (en)
Inventor
Aaron Shafir
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2010044073A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F4/00 Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/002 Specific input/output arrangements not covered by G06F3/01 - G06F3/16
    • G06F3/005 Input arrangements through a video camera
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Definitions

  • The present invention relates to a system for aiding a disabled person, particularly a disabled person with limited use of the upper limbs.
  • Patent application JP09224965 discloses a meal support robot dedicated to handling food.
  • A person with an upper-limb handicap can independently operate that robot, which provides a light-projecting part for projecting directional light, a light-receiving part for receiving the light, and a holding part that is elastically deformed by contact with the operator so as to irradiate the light at a desired position on the light-receiving part.
  • U.S. patent 6,961,623 discloses a remote control method for use by a disabled person, e.g. to actuate a muscle stimulator cuff, in which detected mechanical vibrations trigger a signal that controls operation of the device or process.
  • In one aspect there is provided a system for aiding a disabled person comprising: at least one robotic arm; at least one camera for producing at least one image of the face of the disabled person; an image processing module for processing the image; and a control sub-system for controlling the position of the at least one robotic arm based on repositioning of one or more facial elements by the disabled person, wherein the one or more facial elements is an artificial mark, being a transparent mark detectable by the at least one camera operably connected to the robotic arm and/or a mark projected onto the person by a projection mechanism.
  • The camera is adapted to sense movement of the artificial marks, including gestures.
  • The mark may be a sticker applied to the face with an 'x' written thereon, or a mark (e.g. an 'x') projected onto the face by a projection mechanism or device such as a projector, laser, etc.; according to some embodiments this projection capability is incorporated into the camera(s).
  • The present invention further relates to a method for aiding a disabled person comprising: applying at least one artificial mark to the person; detecting the at least one artificial mark by at least one camera; and moving an appliance attachable to a controllable robotic arm in correspondence with the detected artificial mark(s), wherein the artificial mark is a transparent mark detectable by the at least one camera operably connected to the robotic arm and/or a mark projected onto the person by the camera(s).
  • The present system and method can be used for a variety of activities, including eating/feeding, drawing, writing, teeth brushing and the like.
  • Fig. 1 is an isometric view of an exemplary system in accordance with the present invention, as operated by a person in a wheelchair;
  • Fig. 2A is an isometric front view of the disabled person's face;
  • Fig. 2B is an isometric profile view of the disabled person's face;
  • Fig. 3 is an isometric side view of a robotic arm;
  • Fig. 4A is a front view of the disabled person's face, including arrows designating the movements of the person's face; and
  • Fig. 4B is a profile view of the disabled person's face, including arrows designating the movements of the person's face.
  • Fig. 1 illustrates an embodiment of a system for aiding a disabled person 30 in accordance with the present invention, operated by the disabled person seated, for example, in a chair or wheelchair 32.
  • A digital camera 34 is attached to a chair support 36 of chair 32.
  • Robotic arm 42 has freedom to move along one or more axes and preferably to rotate about those axes. Examples of such robotic arms are the VP 5-axis series and VP 6-axis series arms (DENSO Robotics, 3900 Via Oro Avenue, Long Beach, CA 90810, USA), which provide five and six degrees of freedom of movement, respectively.
  • An additional camera 48 is attached to table 38.
  • Camera 48 images a front view of face 58 of disabled person 30, and camera 34 images the profile of the disabled person's face.
  • The digital images are stored in digital memory means (not shown) for further processing.
  • The system further includes a voice recognition sub-system (not shown) for recognizing voice commands of disabled person 30.
  • One such voice command, 'change tool', commands robotic arm 42 to substitute the appliance currently attached to catch 45 with another appliance stored in storage means 44.
  • Another voice command, 'open catch', commands catch 45 of robotic arm 42 to open. An illustrative command-dispatch sketch follows this list.
  • Face 58 of disabled person 30 is shown in front and profile views in Figs. 2A and 2B respectively, to which reference is now made. Face 58 carries marks 60, 62, 64 and 66, e.g. applied via stickers or projected thereon.
  • These marks can be natural marks appearing on face 58, facial gestures, or marks artificially placed on the disabled person's face 58.
  • Marks 60, 62, 64 and 66 are detected by cameras 34 and 48, as known in the art, by an image processing module (not shown). Examples of useful natural marks are wrinkles and moles.
  • A transparent sticker, invisible to the human eye but detectable by cameras 34 and 48, can be applied to the disabled person's face 58.
  • Mark 60 is positioned on the forehead of disabled person 30.
  • When the disabled person moves his forehead, mark 60 moves correspondingly.
  • Mark 64 is disposed on the chin of the disabled person. When the disabled person moves his chin, mark 64 moves as well.
  • Marks 62 and 66 are positioned on the cheeks of the disabled person. In some embodiments of the present invention, marks 62 and 66 are moved by moving the tongue towards the person's right and left cheeks, respectively.
  • A side view of robotic arm 42 is shown in Fig. 3, to which reference is now made.
  • Robotic arm 42 can be described as being controllable with reference to a Cartesian grid.
  • Double headed arrow 70 designates the movement direction of robotic arm 42 in the x-axis.
  • Double headed arrow 72 designates the movement direction of robotic arm 42 in the z-axis.
  • Double headed arrow 74 designates the movement direction of robotic arm 42 in the y-axis. Rotational movements around axes 70, 72 and 74 are designated by arrows 76, 78 and 80, respectively.
  • A front view and a profile view of face 58 of disabled person 30, including arrows describing the facial movements, are shown in Figs. 4A and 4B, to which reference is now made.
  • The movements of the disabled person's head are analyzed by determining the positions of, and the distances between, marks 60, 62, 64 and 66 before and after a facial movement or gesture is performed. An illustrative displacement computation follows this list.
  • Nodding of the disabled person's head up and down, in the direction designated by double headed arrow 90, is accompanied by a corresponding movement of marks 60 and 64.
  • Sideways turning of the disabled person's head in the direction designated by double headed arrow 92 is accompanied by a corresponding movement of marks 60 and 64 to the left or to the right.
  • A control sub-system controls the movement of robotic arm 42.
  • Robotic arm 42 is controlled/driven by analyzing the aforementioned movements of the head of disabled person 30 and issuing commands to driving mechanisms (not shown) for controlling the robotic arm (Fig. 3).
  • Head movement in the directions designated by double headed arrow 90 will actuate robotic arm 42 in the directions designated by double headed arrow 70.
  • A movement in direction 92 will actuate robotic arm 42 in the direction designated by double headed arrow 72.
  • A movement in direction 96 will actuate robotic arm 42 in the direction designated by double headed arrow 74.
  • A movement in direction 98 will actuate robotic arm 42 in the direction designated by arrow 76.
  • A movement in direction 94 will actuate robotic arm 42 in the direction designated by arrow 78.
  • A movement in direction 100 or direction 102 will actuate robotic arm 42 in the direction designated by arrow 80. This correspondence is summarized in the illustrative mapping sketch following this list.
  • Alternatively, marks 60, 62, 64 and 66 can be projected onto the person's face 58 by one of the cameras 34 and 48 or by an additional component; robotic arm 42 is then controlled by the relative movement of the face and the mark.
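
The following sketch is purely illustrative and is not part of the published application; it shows one plausible way an image processing module of the kind described above could locate an artificial mark in successive camera frames and report its displacement. Python with OpenCV template matching is assumed, and the names in it (find_mark, mark_template.png) are hypothetical.

    # Illustrative sketch only (assumed approach, not disclosed in the application):
    # tracking a high-contrast artificial mark, e.g. a sticker bearing an 'x'.
    import cv2

    # Hypothetical template image of the mark; any robust detector would serve.
    TEMPLATE = cv2.imread("mark_template.png", cv2.IMREAD_GRAYSCALE)

    def find_mark(frame_gray):
        """Return the (x, y) pixel centre of the best template match."""
        scores = cv2.matchTemplate(frame_gray, TEMPLATE, cv2.TM_CCOEFF_NORMED)
        _, _, _, best = cv2.minMaxLoc(scores)  # location of the highest score
        h, w = TEMPLATE.shape
        return (best[0] + w // 2, best[1] + h // 2)

    def mark_displacement(prev_frame, curr_frame):
        """Displacement (dx, dy) of the mark between two consecutive BGR frames."""
        x0, y0 = find_mark(cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY))
        x1, y1 = find_mark(cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY))
        return (x1 - x0, y1 - y0)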
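
Likewise illustrative rather than disclosed: the fixed correspondence between head-movement directions 90-102 (Figs. 4A and 4B) and arm motions 70-80 (Fig. 3) can be encoded as a simple lookup table. The movement labels and command names below are assumptions made for the sketch.

    # Hypothetical mapping of classified head movements to arm motions, following
    # the correspondence stated above (numbers refer to the arrows of Figs. 3-4B).
    HEAD_TO_ARM = {
        "nod_90": "translate_70",      # nod up/down       -> arm x-axis (arrow 70)
        "turn_92": "translate_72",     # sideways turn     -> arm z-axis (arrow 72)
        "move_96": "translate_74",     # movement 96       -> arm y-axis (arrow 74)
        "move_98": "rotate_76",        # movement 98       -> rotation (arrow 76)
        "move_94": "rotate_78",        # movement 94       -> rotation (arrow 78)
        "cheek_100_102": "rotate_80",  # cheek marks 62/66 -> rotation (arrow 80)
    }

    def arm_command(head_movement: str) -> str:
        """Translate a classified head movement into an arm command; hold otherwise."""
        return HEAD_TO_ARM.get(head_movement, "hold_position")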
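
Finally, the voice commands mentioned above lend themselves to a small dispatch table. This too is a sketch under assumed interfaces: the robot methods (swap_appliance, open_catch) and the exact utterance strings are hypothetical, not part of the application.

    # Hypothetical dispatch of recognized voice commands to arm actions.
    VOICE_COMMANDS = {
        "change tool": lambda arm: arm.swap_appliance(),  # fetch another appliance from storage means 44
        "open catch": lambda arm: arm.open_catch(),       # open catch 45
    }

    def handle_utterance(utterance: str, arm) -> None:
        """Run the action bound to a recognized utterance; ignore unknown ones."""
        action = VOICE_COMMANDS.get(utterance.strip().lower())
        if action is not None:
            action(arm)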

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Vascular Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Manipulator (AREA)

Abstract

The invention concerns a system comprising: at least one robotic arm; at least one camera producing at least one image of the face of the disabled person; an image processing module; and a sub-system for controlling the position of the robotic arm based on the repositioning of one or more facial elements by the disabled person. One or more of the facial elements form a transparent artificial mark detectable by the camera(s), which is operably connected to the robotic arm, and/or a mark projected onto the person by a projection mechanism.
PCT/IB2009/054546 2008-10-16 2009-10-15 System and method for aiding a disabled person Ceased WO2010044073A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0818942.5 2008-10-16
GB0818942A GB2464486A (en) 2008-10-16 2008-10-16 Control of a robotic arm by the recognition and analysis of facial gestures.

Publications (1)

Publication Number Publication Date
WO2010044073A1 (fr) 2010-04-22

Family

ID=40084106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2009/054546 Ceased WO2010044073A1 (fr) 2008-10-16 2009-10-15 System and method for aiding a disabled person

Country Status (2)

Country Link
GB (1) GB2464486A (fr)
WO (1) WO2010044073A1 (fr)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4207959A (en) * 1978-06-02 1980-06-17 New York University Wheelchair mounted control apparatus
US5812978A (en) * 1996-12-09 1998-09-22 Tracer Round Associaties, Ltd. Wheelchair voice control apparatus
  • CA2227361A1 (fr) * 1998-01-19 1999-07-19 Taarna Studios Inc. Method and apparatus for providing real-time animation using an expression database
US6215471B1 (en) * 1998-04-28 2001-04-10 Deluca Michael Joseph Vision pointer method and apparatus
US6072496A (en) * 1998-06-08 2000-06-06 Microsoft Corporation Method and system for capturing and representing 3D geometry, color and shading of facial expressions and other animated objects
  • IT1315644B1 (it) * 2000-07-06 2003-03-14 Uni Di Modena E Reggio Emilia System for the interaction between a subject's eye movement and a personal computer
US7306337B2 (en) * 2003-03-06 2007-12-11 Rensselaer Polytechnic Institute Calibration-free gaze tracking under natural head movement
US7218320B2 (en) * 2003-03-13 2007-05-15 Sony Corporation System and method for capturing facial and body motion
  • EP1667049A3 (fr) * 2004-12-03 2007-03-28 Invacare International Sàrl Facial feature analysis system
US20070217891A1 (en) * 2006-03-15 2007-09-20 Charles Folcik Robotic feeding system for physically challenged persons
  • JP2007310914A (ja) * 2007-08-31 2007-11-29 Nippon Telegr & Teleph Corp <Ntt> Mouse substitution method, mouse substitution program, and recording medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5323470A (en) * 1992-05-08 1994-06-21 Atsushi Kara Method and apparatus for automatically tracking an object
US5532824A (en) * 1994-01-25 1996-07-02 Mts Systems Corporation Optical motion sensor
US20020126090A1 (en) * 2001-01-18 2002-09-12 International Business Machines Corporation Navigating and selecting a portion of a screen by utilizing a state of an object as viewed by a camera
US20080132383A1 (en) * 2004-12-07 2008-06-05 Tylerton International Inc. Device And Method For Training, Rehabilitation And/Or Support
US20080188959A1 (en) * 2005-05-31 2008-08-07 Koninklijke Philips Electronics, N.V. Method for Control of a Device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102720249A (zh) * 2011-03-29 2012-10-10 梁剑文 Washbasin with a manipulator

Also Published As

Publication number Publication date
GB2464486A (en) 2010-04-21
GB0818942D0 (en) 2008-11-19

Similar Documents

Publication Publication Date Title
Markovic et al. Stereovision and augmented reality for closed-loop control of grasping in hand prostheses
  • CN103271784B (zh) Binocular-vision-based human-machine interactive manipulator control system and control method
  • JP7747032B2 (ja) Information processing device and information processing method
US20070265495A1 (en) Method and apparatus for field of view tracking
Maimon-Mor et al. Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking
CN109571513B (zh) 一种沉浸式移动抓取服务机器人系统
  • WO2004041078A3 (fr) Camera apparatus for capturing images assisted by movements of the head, and method for controlling such a camera apparatus
  • CN113056315B (zh) Information processing device, information processing method, and program
  • JP5186723B2 (ja) Communication robot system and gaze control method for a communication robot
Baldi et al. Design of a wearable interface for lightweight robotic arm for people with mobility impairments
  • JP2004329490A (ja) Finger motor function recovery support tool and finger motor function recovery support system
US12239590B2 (en) Nursing bed system and nursing bed posture changing device
Staub et al. Human-computer interfaces for interaction with surgical tools in robotic surgery
Li et al. An egocentric computer vision based co-robot wheelchair
Matsumoto et al. The essential components of human-friendly robot systems
  • WO2010044073A1 (fr) System and method for aiding a disabled person
  • JP6158665B2 (ja) Robot, robot control method, and robot control program
  • JP7360158B2 (ja) Control system and control program
Chu et al. Hands-free assistive manipulator using augmented reality and tongue drive system
Yang et al. Head-free, human gaze-driven assistive robotic system for reaching and grasping
  • JP2003266353A (ja) Robot device and control method therefor
  • JP7133733B1 (ja) Robot system, robot operation method, and robot operation program
Tooyama et al. Development of an assistive system for position control of a human hand with high speed and high accuracy
Kim et al. A human-robot interface using eye-gaze tracking system for people with motor disabilities
  • RU2227930C2 (ru) Method for contactless input of information into a computer and system for implementing it

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09820337

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09820337

Country of ref document: EP

Kind code of ref document: A1