
WO2004008427A1 - Closed-loop augmented reality apparatus - Google Patents

Closed-loop augmented reality apparatus

Info

Publication number
WO2004008427A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
signals
filtering
movement
filtering unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IL2002/000586
Other languages
English (en)
Inventor
Yoram Baram
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to AU2002368068A priority Critical patent/AU2002368068A1/en
Priority to PCT/IL2002/000586 priority patent/WO2004008427A1/fr
Publication of WO2004008427A1 publication Critical patent/WO2004008427A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/112Gait analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique
    • A61B5/1128Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb using a particular sensing technique using image analysis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/40Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4082Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/486Biofeedback
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient; User input means
    • A61B5/742Details of notification to user or communication with user or patient; User input means using visual displays
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/72Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7207Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
    • A61B5/721Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts using a separate sensor to detect motion or using motion information derived from signals other than the physiological signal to be measured

Definitions

  • the present invention relates to a closed-loop augmented reality system for assisting people with motion disorders.
  • Certain neurological disorders, such as those associated with Parkinson's Disease (PD), are known to cause both motor and visual impairments. These impairments may include tremor, motor fluctuations, and involuntary arm, leg and head movements. In addition, patients with these disorders may have trouble initiating and sustaining movement. Although people with such disorders have distorted visual feedback, they are even more dependent on such feedback than healthy people.
  • US Patent Number 5,597,309 describes a method for stimulating and sustaining ambulation in Parkinson's patients by creating virtual visual cues. The method, however, is based only on open-loop visual cue presentation, wherein initiating and sustaining cues are given at predetermined speeds using an image-generating device.
  • an apparatus for adaptive image generation includes at least one non-radiating sensor, mountable on a body, for detecting body movements and producing signals related to the body movements, and a processor configured to receive the signals and generate an image, wherein the generated image is adapted according to the detected body movements.
  • the processor may include a filtering unit for filtering noise from the received signals, the unit having an adaptive filtering element, and an image generator for providing the generated and adapted images from the filtered and received signals.
  • the filtering unit may include linear elements and non-linear elements, and may be a neural network.
  • the non-radiating sensor is an accelerometer. There may be two sensors for producing signals related to movement of a head and body.
  • the generated image may include a geometric pattern, such as a tiled floor or parallel stripes, or it may include a view from real life.
  • an apparatus for augmented reality which includes at least one sensor mountable on at least one part of a body for producing signals from movements of a body part, and a processor for adapting an augmented image based only on the produced signals.
  • a system for adaptive augmented or virtual reality which includes at least one non-radiating sensor, mountable on at least one part of a body, for detecting body movements and producing signals related to the body movements, a processor configured to receive the signals and generate an image which is adapted according to the detected body movements, and a display for displaying the generated and adapted images.
  • the system provides closed-loop biofeedback for adaptation of body movements.
  • an apparatus for treating a movement disorder includes at least one sensor, mountable on a body, for detecting body movements and producing signals related to the body movements, and a processor configured to receive the signals and generate an image, wherein the generated image is adapted according to the detected body movements.
  • a system and method for reducing involuntary movement artifacts from a signal including a voluntary movement processor for filtering a voluntary movement signal representative of a voluntary movement having involuntary movements therein, an adaptive involuntary movement processor for adaptively filtering a vertical motion signal, and a subtractor for subtracting the involuntary movements from the voluntary movement signal to produce a reduced artifact signal.
  • the adaptive involuntary movement processor adapts its processing using the reduced artifact signal.
  • Involuntary movement may include tremor or other unwanted movements.
  • Voluntary movement may include walking or other full body movements such as turning, running, etc.
  • a method for interaction of an image with body movement including the steps of providing an image to a person, receiving signals related to movements of the person, adapting the image according to the received signals, and providing the adapted image to the person, wherein the adapted image enables the person to adjust body movements.
  • the steps may be performed repeatedly so as to provide continuous assistance of body movement.
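A minimal sketch of this repeated sense-adapt-display cycle, in Python, with hypothetical stand-ins for the sensor, adaptation, and display functions (the "image" is reduced to a single floor offset that moves opposite to the sensed motion):

```python
def closed_loop(sensor_read, adapt_image, display, image, steps):
    """Repeatedly receive movement signals, adapt the image to them, and
    provide the adapted image back to the person (closed-loop biofeedback)."""
    for _ in range(steps):
        signals = sensor_read()              # receive signals related to movement
        image = adapt_image(image, signals)  # adapt the image to those signals
        display(image)                       # provide the adapted image
    return image

# Toy stand-ins: readings are walking speeds; the floor offset moves
# opposite to the motion, as a real floor appears to a walker.
readings = iter([0.0, 0.5, 1.0, 1.0])
shown = []
final = closed_loop(
    sensor_read=lambda: next(readings),
    adapt_image=lambda offset, v: offset - v,
    display=shown.append,
    image=0.0,
    steps=4,
)
```

In the real apparatus each iteration would read the accelerometers, filter the signals, and redraw the virtual floor; here the loop structure is the point, not the stand-in functions.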
  • the image may be virtual or augmented.
  • Interaction may include therapy, recreational activities (sports, sex, etc.) or physical assistance.
  • a method for treating a movement disorder including the steps of providing an image to a person, receiving at least one signal from the person, filtering unwanted noise from the signal, adapting the image based on the received and filtered signal, and providing the adapted image to the person, wherein the adapted image enables the person to adjust body movements.
  • the step of filtering may be accomplished using a filtering unit having an adaptive filtering element.
  • the method may also include the step of measuring a walking parameter.
  • Figs. 1A and 1B are schematic illustrations of a user wearing one embodiment of the present invention;
  • Figs. 2A and 2B are illustrations of images viewed by the user of Figs. 1A and 1B;
  • Fig. 3 is a block diagram illustration of a processor;
  • Fig. 4 is a block diagram illustration of one component of the processor of Fig. 3 in greater detail;
  • Fig. 5 is a block diagram illustration of another component of the processor of Fig. 3 in greater detail;
  • Fig. 6 is a block diagram illustration of open-loop and closed-loop control;
  • Fig. 7 is a table showing results from tests performed using one embodiment of the present invention.
  • the proposed invention creates an adaptive augmented reality of motion over a virtual image, such as a tiled floor.
  • The system is portable, and can be used for a variety of therapeutic, healing, assistive, or recreational purposes.
  • PD: Parkinson's Disease
  • Fig. 1A shows an overview of the system, and Fig. 1B shows a detailed view of a portion of the system.
  • The adaptive augmented reality apparatus, generally referenced 50, is portable.
  • Head-mounted assembly 52, comprising a sensor 60A and a display 64, is attached to a pair of glasses 40.
  • Sensors 60A and 60B are non-radiating sensors, such as accelerometers. Other types of non-radiating sensors may be used as well.
  • Display 64 overlays a portion of one lens of glasses 40, protruding outward from the lens.
  • Display 64 is a small (for example, 1 cm x 1 cm) piece, situated directly in front of one eye 41. In this way, display 64 is close enough to eye 41 to allow the user to see a full view image on display 64 without obscuring any view of the surroundings.
  • Display 64 may be, for example, a liquid crystal display (LCD). Alternatively, integrated eyeglasses may be used, where display 64 is already incorporated within glasses 40. Such integrated glasses are available from, for example, i-glasses LC Model # 500881, i-O Display Systems, LLC, Menlo Park, CA, USA; or The MicroOptical Corporation, Westwood, MA, USA. Display 64, whether located internally or externally to glasses 40, is equipped with VGA or video connectors (not shown).
  • Sensor 60A is, for example, a tilt sensor such as Model # CXTILT02E or Model # CXTA02, available from Crossbow Technology, Inc., San Jose, CA, USA.
  • Alternatively, sensor 60A may be a sensor that can detect other movements as well as head tilt, such as a 3-axis accelerometer.
  • Head-mounted assembly 52 is connected to body-mounted assembly 54.
  • Body-mounted assembly 54 comprises a processor 62 and a 3-axis translational accelerometer 60B.
  • Body-mounted assembly 54 may be configured in a box of a reasonable size for a person to wear, for example, but not limited to, one having dimensions 7 x 12 x 3 cm. Body-mounted assembly 54 is preferably attached to a belt, but may be connected to the body in any number of ways such as by a chest strap, adhesive, or other connecting device.
  • Figs. 2A and 2B show examples of images viewed by the user while wearing system 50. The image of Fig. 2A is adapted during movement, as shown in Fig. 2B.
  • Fig. 2A shows a virtual tiled floor image as displayed to the user during a normal walk. The floor moves as the user walks, in the opposite direction as depicted by arrow 43, to simulate a real floor as it appears to someone walking. If the user stumbles or falls forward, an image such as the one shown in Fig. 2B is displayed to the user, to simulate the actual view of a real tiled floor during a stumble or fall.
  • the image is continuously adapted to the motions of the user to create more realistic dynamics of the virtual world viewed by the patient. Consequently, the virtual floor viewed by the user moves only during actual motion, at a rate equal to this motion, as in real life.
  • the image is not restricted to tiled floors, and may include other geometric patterns, such as parallel stripes.
  • other images may be generated, such as views from real life (i.e. outdoors in a park or the like).
  • the image may be a virtual image, in which the outside world is blocked out, or it may be an augmented image, in which the image is superimposed onto the person's view of the real world.
  • Fig. 3 shows details of processor 62 located within body-mounted assembly 54.
  • Processor 62 may be a wearable computer or a microprocessor.
  • Input data to processor 62 is obtained from sensors 60A and 60B at input ports 74A and 74B, respectively, and output data from processor 62 is sent to display 64 through output port 72.
  • Signals may be, but are not limited to, proportional direct current (DC) signals indicating some motion parameter.
  • signals may contain acceleration data that is later converted to velocity data.
  • signals may relate to an angle of head tilt, or other body movements.
  • Signals from processor 62 to display 64 may be analog video signals, for example, PAL or NTSC, or they may be digital (e.g. VGA) signals. Conversion from analog to digital (A/D) or from digital to analog (D/A) may either be performed within processor 62, or external to processor 62 using a converter.
  • Processor 62 includes at least two components: a filtering unit 48, and an image generator 40.
  • Filtering unit 48 filters signals received at input port 74B from sensor 60B. Signals from sensor 60A relating to movements other than head tilt may be filtered as well, as shown by dashed lines. Filtering eliminates unwanted components from the sensor signals, such as tremor, motor fluctuations and involuntary arm, leg and head movements, as described in further detail below.
  • Image generator 40 then incorporates filtered data, as well as signals received directly from sensor 60A at input port 74A, and translates the received and filtered proportional signals into rates and degrees of motion of the displayed virtual floor. Image generator 40 then adapts the base image (such as the one shown in Fig. 2A) according to the generated rate and degree of motion information. Adapted images are sent through output port 72 to display 64 to be viewed by the user.
  • Fig. 4 is a block diagram illustration of a filtering component 45 of filtering unit 48, used for filtering tremor and other unwanted motions.
  • Each filtering component 45 in filtering unit 48 is used for filtering signals related to motion in one particular axis or direction.
  • filtering unit 48 may have one or several filtering components 45, depending on the number of axes of movement being measured.
  • Noisy sensor data are generally cleaned by filtering. Signals relating to vertical movement (up/down), representing tremor and other involuntary movements, are then subtracted from signals relating to translational movement (forward/back or side/side) or other voluntary movements. In this way, both sensor noise and unwanted motions such as tremor are filtered out.
  • Filtering unit 48 has an upper path 47 and a lower path 49.
  • Lower path 49 is used for eliminating tremor and involuntary movement, based on receipt of vertical (up/down) movements. Vertical movements may also be obtained from a 3-axis accelerometer, or by other measuring means.
  • In upper path 47, a linear filtering element 76 is used to clean signals of voluntary movement in one axis, for example, forward acceleration.
  • an adaptive linear filtering element 77 is used in lower path 49.
  • Adaptive linear filtering element 77 is, for example, 5-dimensional, and is similar to the linear adaptive noise canceller proposed by Widrow B. and Winter R. in "Neural nets for adaptive filtering and adaptive pattern recognition", Computer 21(3): p. 25, 1988, incorporated herein by reference in its entirety. Similar to linear filtering element 76, output is related to input by a weighted sum of the K most recent inputs, y(i) = Σk=1..K bk·x(i−k+1), where the bk are variable weights. K was taken to be 5, but may be any number.
  • Linear filter 76 and adaptive linear filtering element 77 both feed into sigmoidal elements 78.
  • The output y2(i) from adaptive linear filtering element 77 is subtracted from the output y1(i) from linear filtering element 76 to obtain a final output r(i) = y1(i) − y2(i). The weights bk in adaptive linear filtering element 77 are then adjusted so as to minimize the squared final output r2(i).
  • Filtering unit 48 "learns" the user's motions.
  • Filtering unit 48 may be considered a neural network.
  • Each axis of movement uses its own filtering component 45.
  • the cleaned signal is sent from filtering unit 48 to image generator 40.
  • image generator 40 may simultaneously obtain multiple filtered signals from filtering unit 48, as well as signals directly from sensor 60A, such as a tilt sensor.
  • FIG. 5 is a block diagram illustration of image generator 40, used for creating images and adapting the images based on received filtered data.
  • An initial image 80 of a tiled floor, or another image, is created using an imaging software package (OpenGL™, Silicon Graphics, Inc., Mountain View, CA, USA).
  • Data from sensors, which may be filtered or unfiltered, are fed into image generator 40, and are used to make corresponding proportional changes in floor angle and speed of movement of image 80, resulting in an updated image 80', also provided by the imaging software.
  • the filtered acceleration signals are converted into rate of motion data within image generator 40, typically using an integrator.
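The integrator mentioned above can be as simple as a running sum of acceleration samples scaled by the sampling interval; a sketch (the interval dt is an assumed parameter, not specified in the text):

```python
def acceleration_to_velocity(accel_samples, dt):
    """Convert filtered acceleration samples to rate-of-motion (velocity)
    data by numerical integration: v[i] = v[i-1] + a[i] * dt."""
    v = 0.0
    velocities = []
    for a in accel_samples:
        v += a * dt           # accumulate acceleration over one interval
        velocities.append(v)
    return velocities

# e.g. accelerate for two samples, then brake: velocity returns to zero
rates = acceleration_to_velocity([1.0, 1.0, -2.0], dt=0.5)
```

The resulting rates would then drive the speed of the virtual floor's counter-motion.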
  • the tilt angle received from sensor 60A is translated into an inclination angle of the virtual tiled floor so as to create a realistic view of the floor. Tripping or falling motions result in larger angles, and are translated into a proportional outward expansion of image 80, as in real-life vision.
  • Sensors 60A and 60B may also detect turning motions, which are translated into counter-turning motions of the virtual floor.
  • the rates of motion of the virtual tiled floor are the same as the rates of body motion of the user, occurring in opposite directions so as to create the sensation of a floor fixed in space.
  • the tilt of the virtual floor is the same as that of the user's head, as measured by head-mounted sensor 60A. Parameters such as tile size, color and intensity of the virtual floor are adjustable.
  • Because of filtering unit 48, a forward motion of the tiled floor will not be triggered by leg tremor, and expansion of tile images, indicating a stumble or a fall, will not be caused by head tremor. Learning and filtering are performed on-line, as the patient's dynamic characteristics keep changing in time.
  • the present invention may potentially be used for anything that other virtual reality devices are used for, such as entertainment, industry, science and medicine.
  • the use of accelerometers allows for free movement and is not restricted by location or space.
  • it allows for adaptation of the image to full body motions.
  • One embodiment of the invention may include a device which would enable a sport or any other recreational activity (i.e. sexual activity) to be performed with a virtual background scene, outside of an entertainment room, allowing for more body movement.
  • The device could be connected to the Internet, allowing for direct interaction between patients and doctors or between users. Movement disorders may include stroke, trauma, PD, or other central nervous system disorders and degenerative diseases, as well as birth defects and the effects of aging.
  • Fig. 6 illustrates the concept of open-loop versus closed-loop control.
  • an image generator 40 produces a display 64 for a user 44 to see. User 44 may then react to display 64, and voluntarily begin to move. This, however, has no effect on image generator 40.
  • the motion of user 44 is sensed by motion sensors 60, which send signals related to this motion through a filtering unit 48 and back to image generator 40.
  • the closed-loop system incorporates signals from motion sensors 60 into display 64.
  • Fig. 7 is a table showing details about the subjects who participated in the study, and the results obtained with the display off, with open-loop display, and with closed-loop display.
  • In the open-loop display test, no sensors were activated on the subject for measuring movements, resulting in an image displayed at a predetermined speed towards the observer.
  • Speed and stride length are listed for each test per subject, and the final two columns list a percentage change for the tested parameters.
  • Each test consisted of a subject walking a stretch of 10 meters 4 times. Only results from the last two out of four tests in each category were used, to eliminate the effect of training. At the start of each test, the subject was verbally instructed to start walking. The length of time and the number of steps to completion of the 10-meter stretch were recorded for each test. Speed in meters/second (m/s) and stride length in meters (m) were calculated. In the first test (the reference test) the display was turned off. In the second, the open-loop system was turned on, displaying a virtual tiled floor in perpetual motion towards the observer at the maximal speed level comfortable for the subject. The third test employed the adaptive closed-loop system.
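The two reported gait parameters follow directly from the recorded time and step count; as a purely illustrative helper (the example numbers are invented, not taken from Fig. 7):

```python
def gait_parameters(distance_m, time_s, steps):
    """Compute the walking parameters recorded per test:
    speed = distance / time (m/s), stride length = distance / steps (m)."""
    return distance_m / time_s, distance_m / steps

# e.g. a 10-meter stretch walked in 12.5 s with 20 steps
speed, stride = gait_parameters(10.0, 12.5, 20)
```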
  • Fig. 7 lists the percentage changes in the performance parameters obtained for the closed-loop system with respect to the reference test. It can be seen that, in all cases but one, performance improved significantly with respect to the reference test when the closed-loop system was turned on (higher speed, longer strides).
  • The subjects were comfortable with the closed-loop adaptive system and indicated a clear preference for it over the open-loop system.
  • These results suggest that adaptive augmented reality can significantly improve the walking abilities of most PD patients without causing the discomfort and the freezing phenomena associated with the open-loop system.
  • See "Effects of bilateral posteroventral pallidotomy on gait in subjects with Parkinson's disease", Arch. Neurol., 57, 198, 2000. However, medication causes involuntary movement which disturbs gait further.
  • the proposed approach may make it possible to reduce medication and postpone surgical intervention.
  • the proposed invention may be useful as treatment, as therapy, or as an assistive device. It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather the scope of the invention is defined by the claims which follow:

Abstract

The present invention relates to an apparatus and system (50) for adaptive augmented reality. The invention comprises at least one non-radiating sensor (60) mountable on a body for detecting its movements; this sensor produces signals related to the body movements. The invention also comprises a processor (62) configured to receive these signals and generate an image, which is adapted according to the detected body movements. The invention, which provides closed-loop biofeedback for adaptation of body movements, is suitable for treating movement disorders, notably in the case of Parkinson's disease.
PCT/IL2002/000586 2002-07-17 2002-07-17 Appareil en boucle fermee pour realite augmentee Ceased WO2004008427A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2002368068A AU2002368068A1 (en) 2002-07-17 2002-07-17 Closed-loop augmented reality apparatus
PCT/IL2002/000586 WO2004008427A1 (fr) 2002-07-17 2002-07-17 Appareil en boucle fermee pour realite augmentee

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IL2002/000586 WO2004008427A1 (fr) 2002-07-17 2002-07-17 Appareil en boucle fermee pour realite augmentee

Publications (1)

Publication Number Publication Date
WO2004008427A1 true WO2004008427A1 (fr) 2004-01-22

Family

ID=30011845

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2002/000586 Ceased WO2004008427A1 (fr) 2002-07-17 2002-07-17 Appareil en boucle fermee pour realite augmentee

Country Status (2)

Country Link
AU (1) AU2002368068A1 (fr)
WO (1) WO2004008427A1 (fr)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005011494A1 (fr) * 2003-07-25 2005-02-10 Consejo Superior De Investigaciones Científicas Procede et dispositif biomecaniques d'elimination de tremblements pathologiques
EP1591064A1 (fr) * 2004-04-30 2005-11-02 Rupp + Hubrach Optik GmbH appareil de mesure
US7383728B2 (en) 2005-07-13 2008-06-10 Ultimate Balance, Inc. Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices
US7634379B2 (en) 2007-05-18 2009-12-15 Ultimate Balance, Inc. Newtonian physical activity monitor
US20110270135A1 (en) * 2009-11-30 2011-11-03 Christopher John Dooley Augmented reality for testing and training of human performance
EP2499965A1 (fr) * 2011-03-15 2012-09-19 Universite Paris-Sud (Paris 11) Procédé de fourniture à une personne d'informations d'orientation spatiale
WO2012152976A1 (fr) * 2011-05-10 2012-11-15 Universidade Da Coruña Système de réalité virtuelle pour l'évaluation et le traitement des troubles moteurs associés aux maladies neurodégénératives et à l'âge
CN104434129A (zh) * 2014-12-25 2015-03-25 中国科学院合肥物质科学研究院 一种帕金森及相关锥体外系疾病运动障碍症状量化评测装置及方法
US9078598B2 (en) 2012-04-19 2015-07-14 Barry J. French Cognitive function evaluation and rehabilitation methods and systems
EP2921100A1 (fr) * 2014-03-21 2015-09-23 Siemens Aktiengesellschaft Procédé pour adapter un système médical au mouvement du patient se produisant lors d'un examen médical et système associé
US9530299B2 (en) 2015-03-03 2016-12-27 Toyota Motor Engineering & Manufacturing North America, Inc. Methods and apparatuses for assisting a visually-impaired user
US9826921B2 (en) 2008-06-12 2017-11-28 Global Kinetics Corporation Limited Detection of hypokinetic and hyperkinetic states
CN108836347A (zh) * 2018-05-10 2018-11-20 中国科学院宁波材料技术与工程研究所 帕金森患者康复训练方法和系统
CN109068985A (zh) * 2016-03-31 2018-12-21 皇家飞利浦有限公司 用于检测对象的肌肉发作的设备和系统
US10292635B2 (en) 2013-03-01 2019-05-21 Global Kinetics Pty Ltd System and method for assessing impulse control disorder
CN110402129A (zh) * 2017-03-15 2019-11-01 本田技研工业株式会社 步行支援系统、步行支援方法、以及程序
US10736577B2 (en) 2014-03-03 2020-08-11 Global Kinetics Pty Ltd Method and system for assessing motion symptoms

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
JP5728159B2 (ja) 2010-02-02 2015-06-03 ソニー株式会社 画像処理装置、画像処理方法及びプログラム

Citations (2)

Publication number Priority date Publication date Assignee Title
US5722420A (en) * 1995-11-28 1998-03-03 National Science Council EMG biofeedback traction modality for rehabilitation
US6176837B1 (en) * 1998-04-17 2001-01-23 Massachusetts Institute Of Technology Motion tracking system

Cited By (23)

Publication number Priority date Publication date Assignee Title
WO2005011494A1 (fr) * 2003-07-25 2005-02-10 Consejo Superior De Investigaciones Científicas Procede et dispositif biomecaniques d'elimination de tremblements pathologiques
EP1591064A1 (fr) * 2004-04-30 2005-11-02 Rupp + Hubrach Optik GmbH appareil de mesure
US7383728B2 (en) 2005-07-13 2008-06-10 Ultimate Balance, Inc. Orientation and motion sensing in athletic training systems, physical rehabilitation and evaluation systems, and hand-held devices
US7634379B2 (en) 2007-05-18 2009-12-15 Ultimate Balance, Inc. Newtonian physical activity monitor
US9826921B2 (en) 2008-06-12 2017-11-28 Global Kinetics Corporation Limited Detection of hypokinetic and hyperkinetic states
US11596327B2 (en) 2008-06-12 2023-03-07 Global Kinetics Pty Ltd Detection of hypokinetic and hyperkinetic states
US20110270135A1 (en) * 2009-11-30 2011-11-03 Christopher John Dooley Augmented reality for testing and training of human performance
EP2499965A1 (fr) * 2011-03-15 2012-09-19 Universite Paris-Sud (Paris 11) Method for providing a person with spatial orientation information
WO2012123524A1 (fr) * 2011-03-15 2012-09-20 Universite Paris Sud (Paris 11) Method for providing spatial orientation information to a person
WO2012152976A1 (fr) * 2011-05-10 2012-11-15 Universidade Da Coruña Virtual reality system for the evaluation and treatment of motor disorders associated with neurodegenerative diseases and aging
ES2397031A1 (es) * 2011-05-10 2013-03-04 Universidade Da Coruña Virtual reality system for the evaluation and treatment of motor disorders associated with neurodegenerative diseases and aging
US9078598B2 (en) 2012-04-19 2015-07-14 Barry J. French Cognitive function evaluation and rehabilitation methods and systems
US10292635B2 (en) 2013-03-01 2019-05-21 Global Kinetics Pty Ltd System and method for assessing impulse control disorder
US10736577B2 (en) 2014-03-03 2020-08-11 Global Kinetics Pty Ltd Method and system for assessing motion symptoms
EP2921100A1 (fr) * 2014-03-21 2015-09-23 Siemens Aktiengesellschaft Method for adapting a medical system to patient movement occurring during a medical examination, and associated system
US11259752B2 (en) 2014-03-21 2022-03-01 Siemens Aktiengesellschaft Method for adapting a medical system to patient motion during medical examination, and system therefor
CN104434129A (zh) * 2014-12-25 2015-03-25 Hefei Institutes of Physical Science, Chinese Academy of Sciences Device and method for quantitatively assessing movement disorder symptoms of Parkinson's disease and related extrapyramidal diseases
US9530299B2 (en) 2015-03-03 2016-12-27 Toyota Motor Engineering & Manufacturing North America, Inc. Methods and apparatuses for assisting a visually-impaired user
CN109068985A (zh) * 2016-03-31 2018-12-21 Koninklijke Philips N.V. Device and system for detecting muscle onsets of a subject
CN110402129A (zh) * 2017-03-15 2019-11-01 Honda Motor Co., Ltd. Walking support system, walking support method, and program
CN110402129B (zh) * 2017-03-15 2021-09-14 Honda Motor Co., Ltd. Walking support system, walking support method, and recording medium
CN108836347A (zh) * 2018-05-10 2018-11-20 Ningbo Institute of Materials Technology and Engineering, Chinese Academy of Sciences Rehabilitation training method and system for Parkinson's disease patients
CN108836347B (zh) * 2018-05-10 2021-10-22 Ningbo Institute of Materials Technology and Engineering, Chinese Academy of Sciences Rehabilitation training method and system for Parkinson's disease patients

Also Published As

Publication number Publication date
AU2002368068A1 (en) 2004-02-02

Similar Documents

Publication Publication Date Title
US6734834B1 (en) Closed-loop augmented reality apparatus
US11273344B2 (en) Multimodal sensory feedback system and method for treatment and assessment of disequilibrium, balance and motion disorders
WO2004008427A1 (fr) Closed-loop augmented reality apparatus
US10258259B1 (en) Multimodal sensory feedback system and method for treatment and assessment of disequilibrium, balance and motion disorders
US12233216B2 (en) Immersive multisensory simulation system
US10716469B2 (en) Ocular-performance-based head impact measurement applied to rotationally-centered impact mitigation systems and methods
US12042294B2 (en) Systems and methods to measure ocular parameters and determine neurologic health status
US9994228B2 (en) Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment
US9788714B2 (en) Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance
US9370302B2 (en) System and method for the measurement of vestibulo-ocular reflex to improve human performance in an occupational environment
EP3515284B1 (fr) Screening apparatus
Wade et al. The effect of ocular torsional position on perception of the roll-tilt of visual stimuli
JP2007518469A5 (fr)
EP2329761A1 (fr) Apparatus and method for establishing and/or improving enhanced binocular vision
US20100305411A1 (en) Control of operating characteristics of devices relevant to distance of visual fixation using input from respiratory system and/or from eyelid function
Baram et al. Walking on virtual tiles
KR102220837B1 (ko) Augmented reality-based mirror exercise system for motor rehabilitation of neurological and musculoskeletal patients
WO2020174636A1 (fr) Visual information changing device, prism glasses, and lens selection method for prism glasses
KR101730699B1 (ko) Pain treatment apparatus using virtual reality for externally asymmetric disorders
Zhang et al. Sensory interactions for head and trunk control in space in young and older adults during normal and narrow-base walking
JP2023091678A (ja) Support device
Hao et al. External visual perturbation impacts muscle activation while walking on incline treadmill
Takami et al. Immediate effect of video viewing with an illusion of walking at a faster speed using virtual reality on actual walking of stroke patients
CN116997288A (zh) Method and device for determining visual performance
Bugnariu et al. Virtual environments and sensory integration: Effects of aging and stroke

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: The EPO has been informed by WIPO that EP was designated in this application
122 Ep: PCT application non-entry in European phase
NENP Non-entry into the national phase

Ref country code: JP

WWW WIPO information: withdrawn in national office

Country of ref document: JP