
WO2019171216A1 - Augmented reality device and/or system and/or method for using same for assisting in walking or movement disorders - Google Patents


Info

Publication number
WO2019171216A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
wearable device
view
walking
augmented
Legal status
Ceased
Application number
PCT/IB2019/051588
Other languages
French (fr)
Other versions
WO2019171216A4 (en)
Inventor
Elon Littwitz
Ron Lefeber
Gabi Horowitz
Hanan Rom
Current Assignee
Individual
Original Assignee
Individual
Application filed by Individual
Publication of WO2019171216A1
Publication of WO2019171216A4
Status: Ceased

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B2027/0178 Eyeglass type
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • Embodiments of the invention relate to an augmented reality device and/or system and method for using same, in particular for therapeutic purposes.
  • Augmented reality relates to provision of a composed view of a physical environment with computer-generated elements that have been inserted into the view to e.g. enhance the viewed reality.
  • Augmented visual cues may be used as patterns composed into a view of a physical environment for assisting or guiding movement of people within the environment. For example, such cues may be projected as visual indications on a walking plane for affecting parameters of walking such as step size or the like.
  • AR displays can be rendered on devices resembling eyeglasses.
  • Examples of such devices may include eyewear that utilizes cameras to intercept the real-world view of a user and display augmented effects in this view.
  • Augmented effects may also be projected on eyewear (e.g. eyeglass devices) that may not necessarily have a camera or may make use of inertial sensors.
  • Such display of effects may make use of AR imagery that is projected through or reflected off the lenses of the eyewear.
  • Some examples of technologies for wearable AR displays may include see-through video wearable displays, which may be summarized e.g. as “curved mirror” based and “waveguide” based. Using a waveguide based technique may for example permit a fully unobstructed view of the physical world by moving physical display(s) and electronics to the side near the user’s temples.
  • Waveguide techniques may include: diffractive techniques using e.g. deep slanted diffraction gratings, holographic techniques that use e.g. a holographic element for diffracting light, polarized waveguide techniques that use e.g. multilayer coatings and embedded polarized reflectors, reflective technologies that make use of e.g. semi-reflective mirrors for reflecting light, and a “clear-vu” approach that uses e.g. a surface structure made up of several reflecting structures.
  • US6734834 is an example of a system for adaptive augmented reality that makes use of a micro display in front of the eyeglass for blocking part of the field of view.
  • This system includes non-radiating sensors, mountable on a body, for detecting body movements and producing signals related to the body movements.
  • the system further includes a processor for receiving the signals and for generating an image with moving cues according to the detected body movements.
  • the system provides closed-loop biofeedback for adaptation of body movements, and may be used for treating a movement disorder, such as Parkinson's Disease.
  • US2017206691 is a further example that describes a display system that includes a wearable display device for displaying augmented reality content.
  • the display device has a display area with light redirecting features that direct light to a user.
  • the display area is at least partially transparent and provides a view of an ambient environment through the display area.
  • An aspect of at least certain embodiments of the present invention relates to a wearable device (WD) suitable for assisting users wearing the device with mobility related difficulties.
  • Such mobility related difficulties may be due to health disorders, such as disorders in the central nervous system of a user that mainly affects the motor system - e.g. Parkinson's disease.
  • embodiments of the wearable device may be arranged to be and/or include a super-imposer (SM) member that is configured to superimpose information onto a real-world field of view of a user wearing the WD.
  • Such superimposing of information may be onto a plane/surface within a line of sight of a user, such as onto a surface of a lens placed in-front of a user’s eye.
  • SM members of at least certain wearable device (WD) embodiments may be arranged to form (e.g. project) such superimposed ‘information’ (e.g. superimposed cues) on a surface of the WD (e.g. surface of a lens) in-front of a user’s eye - resulting in the superimposed ‘information’ being arranged to appear substantially in focus to a person using the WD by appearing in front of the WD substantially on a focal plane of the WD.
  • Such a wearable device (WD) may be or may include: an eyeglass device and/or an eyeglass with a micro display device in front of the lenses (etc.).
  • Wearable device (WD) embodiments with SM members including and/or in the form of an array of pixels for forming the superimposed ‘information’ may be arranged to light each pixel independently of the other pixels.
  • the wearable device may be an eyewear of the user that includes accessories worn on or over the user’s eyes.
  • Such eyewear possibly in the form of spectacles or eyeglasses, may include at least one of first and second optic members each being placed in-front of a respective eye of the user.
  • one of the members may possibly be embodied as an optical lens, a frame for a lens, a multifocal lens (or the like); and the other member, e.g. the second member, may optionally include a portion generally similar to the first member and an additional portion that is or includes a super-imposer (SM) member.
  • the super imposer (SM) in various forms may be configured to superimpose information on the real-world field of view using computer generated information possibly projected/formed to intercept the real-world view of a user and display in this view augmented effects. Such interception may be upon a plane/surface within a line of sight of a user, such as a surface of a lens placed in- front of a user’s eye.
  • the super imposer (SM) in various embodiments may be a smart-glass or wearable computer glasses configured for adding information alongside what the wearer sees.
  • Such SM embodiments may be arranged to change optical properties of members through which eye sight of a user passes (possibly at runtime) in order to form such augmented information.
  • a super imposer may be programmed to change tint by electronic means.
  • superimposing information onto a field of view may be achieved through transparent heads-up displays.
  • the super imposer (SM) may be an augmented reality overlay capable of reflecting/projecting digital images as well as allowing the user to see through it.
  • a wearable device (WD) and/or super imposer (SM) may be “curved mirror” based and/or “waveguide” based. “Waveguide” based technologies may include diffractive techniques using e.g. deep slanted diffraction gratings, holographic techniques that use e.g. a holographic element for diffracting light, polarized waveguide techniques that use e.g. multilayer coatings and embedded polarized reflectors, reflective technologies that make use of e.g. semi-reflective mirrors for reflecting light, and/or a “clear-vu” approach that uses e.g. a surface structure made up of several reflecting structures.
  • both the first and second members may be arranged to include a super imposer (SM) possibly in the form of an augmented reality member or lens.
  • the first and second members may be integrally formed as a unitary one piece member.
  • the super imposer may include one or more optic members, possibly optic lenses, prisms (or the like).
  • the SM may be mounted on an eyewear, possibly with a field of view of the SM directed generally downwards to reflect a region generally located at the tips of the user’s feet or footwear.
  • a super imposer (SM) included in a wearable device (WD) fitted to a user may be configured to have a focal point that generally lies upon a surface/plane where tips of the user’s feet or footwear are generally located.
  • Such configuration being preferably observed while the user is in a general upright walking state with his line of sight aimed generally forward and while viewing the tips of his feet or footwear at least through marginal lower regions of the SM’s optic.
  • a wearable device may include the first member’s lens in front of one eye of the user and at least a portion of the second member’s SM in front of the other eye of the user.
  • the second member’s SM may be positioned at an angle relative to the user’s line of sight so that the user while walking in a generally upright posture, may view through the SM generally simultaneously both regions adjacent tips of his feet or footwear and regions slightly more distant ahead in his route of advance and/or may switch his/her line of sight between the two fields of view while e.g. walking upright.
  • the first member’s optic may take various forms, such as: merely a frame for an optic, a contact lens located on the user’s eye, an implanted lens (or the like). In certain cases, the first member’s optic may also be absent.
  • a system for assisting a user with mobility related difficulties may employ in addition to a wearable device (WD) according to various embodiments of the invention, also one or more sensors such as: accelerometer(s), GPS sensors (or the like) - in order to monitor the user’s walking condition(s) or advancement condition(s).
  • a system for assisting a user with mobility related difficulties may employ, in addition to a wearable device (WD) according to various embodiments of the invention, also one or more cameras, each including a field of view capable of imaging both the user’s feet, preferably tips of the feet; the user’s knees, the user’s hands and/or the user’s arms.
  • a system employing any of the above noted features may include a controller for controlling e.g. the cameras, accelerometer sensors, and/or the SM members included in wearable devices (WD) of the system.
  • Such system may also include wired or wireless communication to a distinct (possibly relatively remote) device and/or processor (such as of a mobile device, tablet, or the like) where processes affecting and/or controlling embodiments of the SM, cameras (etc.) may be configured to run and/or operate.
  • processes may also be configured to run in a cloud based environment (or the like).
  • a super imposer SM and cameras included in a system and/or wearable device may be configured to have fields of view that are generally aligned. Such alignment may be part of a pre-performed alignment process where alignment between pixels in the cameras and pixels/comparable-zones in the field of view of the SM are computed/calculated.
  • a wearable device may include hearing means such as an earphone capable of audio communication in audible frequencies that are audible to the average human or in frequencies that are beyond those that are audible to the average human.
  • a wearable device may include odor substances possibly controllable via a processor located on the device for controlling odor dispersion, possibly from receptacle(s) housing same.
  • a system including a controller for controlling operation of a wearable device (WD) of at least certain embodiments of the invention, where said controller is configured to receive image feeds from cameras of or in association with such device for possible processing (e.g. image recognition/analysis) of the image feeds.
  • the image feeds may be communicated for handling to a processor possibly running on a computing means (such as a mobile phone, tablet, a remote processor such as in the cloud or therapist computer, or the like) where said processing (e.g. image recognition/analysis) of the image feeds may be performed.
  • handling may be conducted during or after a walking action of a user using a WD.
  • handling of image feed by a therapist may be during or after a walking action.
  • the controller may be configured to project an augmented cue (such as a generally horizontal line) on a SM member of the device at a location upon the SM’s optic where the projected cue is configured to be viewed by a user of the device as being generally located adjacent the user’s foot or shoe.
  • the projected line may be configured to be located at a given pre-defined distance ahead of a front terminal area of a foot or footwear of a user of the WD (such as toe, a shoe’s front tip or the like); where the front direction being defined as being the direction of advancement of the user in a walking condition.
  • Such line being configured to represent an indication/target that the user should preferably try to step over during his advancement.
  • the controller may be configured to project more than one line, possibly two or more lines further possibly being generally parallel one to the other. All the lines may possibly be in front of a forward tip of the user’s foot while he is walking. Possibly at least some of the lines, preferably one line, may be located at least momentarily behind one or more feet (possibly behind tips of the feet) of the user while he is walking in the direction of the lines.
  • the controller may be configured to communicate the image feeds to a computing means (e.g. a personal computer) where according to inputs such as: walking quality, walking rhythm, stride-length, user’s posture and/or variance in stride-length (or the like) - the controller may be configured to determine parameters for augmented cue appearances.
  • Such parameters may include: rate of appearance of cue(s) (so that the rate generally correlates to user’s walking speed/rate); distance of the first cue (line) most proximal to the user’s foot from the foot’s tip; distance between adjacent lines in a direction ahead of the user where such distance possibly correlates to a general stride-length of the user; colors of lines (e.g. all lines same general color or of different color); line width possibly measured in pixel values of cameras and/or SM member pixel; (and the like).
  • audio signals may be configured to be played to the user - where said audio being possibly synchronized with the user’s rhythm of walk. Said audio possibly being configured to mimic the above discussed line appearances, such as distance between lines being correlative e.g. to time span between audio signals (etc.).
  • Possibly, audio and visual signals may be configured to be used/played in conjunction to enhance feedback to a user.
  • information(s) processed by the controller may be used for providing stimuli feedbacks (such as visual and/or audial feedbacks) to the user indicative of a ‘quality’ measure of his walk.
  • visual feedbacks may be provided to the user on the super imposing (SM) member, which are indicative of a ‘quality’ measure of his walk.
  • the indications may be provided for walking quality of both feet together or each foot independently.
  • Indications to ‘quality’ of a walk of a user may be computed by comparison of filmed footage of a user’s walk and a reference walk. Possibly, portions of the knee and leg of the user being observed may be compared to reference movements of similar body parts of a reference walk to provide a ‘quality’ rank of the user’s walk. Such ‘quality’ rank in certain cases may be binary-like, such as “good”/“bad” (or the like).
  • audio and/or visual feedback may include instructions to the user such as to move his feet/hands or certain feet/hands when required.
  • Image feeds captured by cameras on wearable devices fitted to the user may be compared in order to assess (possibly via the controller) that the user abides by the instructions.
  • a system may be configured, via e.g. the controller, in response to change in walking rhythm of a user to increase or decrease distance between lines placed in the direction of advance of the user via the SM member.
  • Possible instructions may be provided to the user to alter (e.g. increase) his stride-length in order to possibly avoid falling.
  • such avoidance of falling of a user during a walking process may be detected from signals received from accelerometer(s) or the like mounted to the user.
  • various “awakening” signals/effects may be applied to the user.
  • Such signals may include video clips shown to the user possibly projected onto or by the SM member in his wearable device (WD); audio signals in varying rhythms; triggering of possible trembling means fixed to the user e.g. to sleeves of the user’s shirt, dispersion of odor (or the like).
  • Such “awakening” signals/effects may be halted once movement of the user is identified as commencing, possibly from camera footage and/or from accelerometer(s) fixed to the user. Such “awakening” signals/effects may be manually triggered or triggered by sound/voice recognition (or the like).
  • “frozen” states of a user which are about to occur may be identified from camera feeds filming ahead of the user. For example, a narrow entry through which the user is about to pass may be indicative of incidents where the user in the past experienced entry into a “frozen” state.
  • a system including a wearable device (WD) of some embodiments of the invention may be configured to detect obstacles in the route of advancement of the user. Such obstacles may be characterized as including varying heights in a surface ahead of the user (such as tiles having different heights), a step, (etc.).
  • a controller of at least some embodiments employed in such system may sound a visual, audio and/or other type of alarm in anticipation of the obstacle - such an alarm indicating to the user to lift his legs in order to surpass the obstacle (etc.).
  • augmented indications to the user’s feet may also be projected on or by the SM e.g. in cases where the user is unable to view his feet in his field of view while walking. Together with such augmented indications to the user’s feet, lines placed in advance of the projected feet may also be indicated.
  • FIG. 1 schematically shows a user fitted with an embodiment of a wearable device in accordance with the present invention providing one example of augmented cues
  • FIGs. 1A and 1B schematically show additional examples of augmented cues possibly formed by embodiments of a wearable device;
  • Fig. 2 schematically shows a close-up view of Fig. 1 focused on a head area of the user;
  • FIG. 3 schematically shows an embodiment of a wearable device of the invention
  • FIG. 4 schematically shows front views of possible optic members of an embodiment of a wearable device of the invention
  • FIGs. 5A, 5B, 6A and 6B schematically show side views of possible optic members of an embodiment of a wearable device of the invention.
  • FIGs. 7A and 7B schematically show a user in different, respective, postures fitted with an embodiment of a wearable device of the present invention.
  • FIGs. 1 and 2 schematically illustrating full and partial views of a user 10 fitted with an embodiment of a wearable device (WD) 12 of the invention, here in the possible form of an eyewear or spectacle.
  • Wearable device (WD) 12 may be configured to assist users with mobility related difficulties due to health disorders, such as disorders in the central nervous system affecting the motor system - e.g. Parkinson's disease.
  • Wearable device (WD) 12 in at least certain embodiments may permit a user to maintain a relative frontal directed field of view (FOV) 14 (here marked by the ‘dashed’ lines) that has a central axis 141 and which is aimed at a relative frontal direction D towards which the user is walking or attempting to walk.
  • Wearable device (WD) 12 in addition may permit a user to maintain a relative downward and lower directed field of view (FOV) 16 (here marked by the ‘dotted-dashed’ lines) that has a central axis 161 and may be aimed at a relative downward direction towards an area generally covering the feet and/or footwear 18 of the user, for example a frontal tip 17 of the feet.
  • Each central axis 141, 161 may be defined as generally extending towards and/or along a center of its respective FOV 14, 16.
  • Various wearable device (WD) embodiments may thus be defined as having different fields of view 14, 16 that are angled/tilted one in relation to the other and/or that possibly do not substantially overlap.
  • the fields of view 14, 16 may in addition or alternatively be defined as being formed with a tilt angle α between their respective central axes 141 and 161.
  • each optic defining a given FOV may be independently adjustable so that the axes 141, 161 of each FOV may be aimed at a desired location, resulting in an angle α located therebetween.
  • Adjustment of tilt between the fields of view may be according to a user’s condition.
  • In Figs. 7A and 7B, the angular orientation of the fields of view 14, 16 will be discussed with respect to a user in two posture conditions and/or two different users in two different postures.
  • a user having a relative bent or hunched posture when standing up or walking is exhibited where FOV 14 is directed to a relative close distance d1.
  • This relative close distance d1 may be a result of the user being bent forward while the user’s vision and/or optics defining FOV 14 are fixed to the user’s anatomy (here the head) and thus are bent downwards together with this anatomy.
  • FOV 16 may be adjusted generally towards the user’s feet, resulting in an angle α1 being defined between axes 141 and 161 in this illustration.
  • In Fig. 7B another user, or the same user (possibly after practice/treatment/training with a WD embodiment of the invention), is illustrated in a more upright and less bent posture.
  • FOV 14 is seen being aimed at a distance d2 that is greater than d1 due to the user’s vision and/or optics defining FOV 14 being tilted slightly upwards as a result of the user’s more upright posture.
  • FOV 16 may be adjusted generally towards the user’s feet, resulting in an angle α2 defined between axes 141 and 161 in this illustration being larger than α1.
  • WD 12 in this example includes first and second optic members 20, 22. Each one of the optic members may be placed in-front of a respective one of the eyes of the user. Each optic member may include at least a portion in a form of an optical lens, a frame for a lens, a multifocal lens (or the like).
  • Wearable device (WD) 12 includes in addition a super imposer (SM) 24, for superimposing information on the real-world field of view using e.g. computer generated information.
  • the shown embodiment of super imposer (SM) 24 includes a possible module 25 for projecting/creating augmented information that intercepts the real-world view of a user in order to display in this view augmented effects 26. Such interception may be upon a plane/surface within a line of sight of a user.
  • Module 25 may include or be e.g.: display optics, a projector, a transmitter and/or electronics - used for creating or assisting in creating augmented effects 26.
  • Super imposer (SM) 24 in various embodiments may be a smart-glass or wearable computer glasses configured for adding information alongside what the wearer sees. Such an SM may be arranged to change optical properties of members through which eye sight of a user passes (possibly at runtime) in order to form such augmented information. In a non-binding example, such an SM or a wearable device (WD) including such an SM may utilize waveguide based techniques for creating augmented effects.
  • such plane/surface upon which the effects 26 appear may be an optic member such as optic member 22, which may be considered as being part of the SM or may be distinct from the SM. The effects 26 appearing on optic member 22 are configured to appear at least in a part of this optic, which may be arranged to have the relative lower field of view 16 aimed at the user’s feet or footwear.
  • a user fitted with the various discussed embodiments of wearable device (WD) 12 may be enriched at least in the WD’s relative lower field of view with effects 26 that seemingly appear adjacent his feet while in fact they are not present in reality at this location near his feet.
  • the visual effects 26 may take the form of one or more lines extending generally transverse (possibly generally orthogonal) to an axis extending along the direction of advancement D of a user fitted with an embodiment of wearable device (WD) 12. These lines as seen may preferably be located in-front of a front terminal area 17 (such as a toe, a shoe’s front tip or the like) of a foot or footwear of the user. With attention drawn to Figs. 1A and 1B it is apparent that the visual effects 26 may take various forms, such as a stair-like formation (seen in Fig. 1A), a tile-like formation (seen in Fig. 1B) and the like.
  • the visual effects 26 may be generally static relative to the user or may be controlled to dynamically move relative to the user.
  • the visual effects 26 may be controlled to move in a given pace relative to the user e.g. in the frontal direction D towards which the user is walking or attempting to walk or in an opposing direction to the frontal direction D.
  • FIG. 4 schematically illustrating front views of optic members generally similar to optic members 20 and 22.
  • each optic member 20, 22 may be seen including two possible optic segments 1, 2.
  • these segments may be formed into an optic member as a unitary one piece.
  • Segment 1 may be configured to provide a wearable device (WD) including such segments with the frontal and downward fields of views 14, 16.
  • segment 2 is seen illustrated being angled downwards to provide the downward directed field of view 16.
  • the optic members 20, 22 are illustrated including only one segment for providing the downward directed field of view 16. In Fig. 6A this segment is seen generally upright and in Fig. 6B it is illustrated being angled downwards.
  • wearable devices (WD) of various embodiments of the invention may not necessarily include two optic members 20, 22 as illustrated e.g. in Fig. 3.
  • only one optic member 20, 22 or segment 1, 2 may be provided for providing the downward directed field of view 16 - while the frontal field of view 14 may be provided without assistance of an optic or with optics either fitted directly to the eye such as contact or implanted lens (or the like).
  • Wearable devices in accordance with at least certain embodiments of the invention may be configured to include eyewear (e.g. eyeglasses) with one optic member 20, 22 being arranged to provide a frontal directed field of view 14, while the other optic member 20, 22 being arranged to provide the downward lower directed field of view 16.
  • a system employing or in cooperation with wearable device embodiments of the invention may be arranged to include a controller 29 for controlling e.g. cameras, accelerometer sensors, and/or SM members included in wearable devices (WD) of the system.
  • Such system may also include wired or wireless communication to a distinct (possibly relatively remote) device and/or processor (such as of a mobile device, tablet, or the like) where processes affecting and/or controlling embodiments of the SM, cameras (etc.) may be configured to run and/or operate.
  • Such processes may also be configured to run in a cloud based environment (or the like).
  • the controller may be configured to communicate image feeds to a computing means (e.g. a personal computer) where according to inputs such as: walking quality, walking rhythm, stride-length, user’s posture and/or variance in stride-length (or the like) - the controller may be configured to determine parameters for augmented cue appearances.
  • a derived ‘quality measure’ of a user’s walking and/or advancement may be broadcast/communicated as feedback to the user by e.g. a so-called ‘processor engine’ running on a processor (e.g. a remote processor) and/or by a human therapist of the user.
  • Such feedbacks may be communicated to the user by various stimuli measures such as audio, visual (or the like), where e.g., a human therapist may communicate such feedback via audio means.
  • the derived ‘quality measure’ and its resulting feedbacks can be used in some cases in conjunction with customized medicine treatment to achieve optimized movement of the user.
  • online monitoring of the user’s movement may be used as a means to analyze the amount of medicine the user requires.
  • gathered information of the user’s movement can be used as a database for research on possible diseases affecting the user’s condition, such as Parkinson’s disease. Such gathered information may in addition or alternatively be used by a therapist treating the user, for example for off-line monitoring of the user.
  • each of the verbs “comprise”, “include” and “have”, and conjugates thereof, are used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements or parts of the subject or subjects of the verb.


Abstract

A wearable device (WD) for use by a user has a super imposer (SM) for superimposing augmented information on a real-world field of view of the user. The wearable device (WD) provides the user with a relative downward directed field of view aimed at his foot or footwear, and the superimposed information is designed to appear, at least within the relative downward directed field of view, as being present close to the foot or footwear.

Description

AUGMENTED REALITY DEVICE AND/OR SYSTEM AND/OR METHOD FOR USING SAME FOR ASSISTING IN WALKING OR MOVEMENT
DISORDERS
TECHNICAL FIELD
[001] Embodiments of the invention relate to an augmented reality device and/or system and method for using same, in particular for therapeutic purposes.
BACKGROUND
[002] Augmented reality (AR) relates to provision of a composed view of a physical environment with computer-generated elements that have been inserted into the view to e.g. enhance the viewed reality.
[003] Augmented visual cues may be used as patterns composed into a view of a physical environment for assisting or guiding movement of people within the environment. For example, such cues may be projected as visual indications on a walking plane for affecting parameters of walking such as step size or the like.
[004] AR displays can be rendered on devices resembling eyeglasses. Examples of such devices may include eyewear that utilizes cameras to intercept the real-world view of a user and display augmented effects in this view. Augmented effects may also be projected on eyewear (e.g. eyeglass devices) that may not necessarily have a camera or may make use of inertial sensors. Such display of effects may make use of AR imagery that is projected through or reflected off the lenses of the eyewear.
[005] Some examples of technologies for wearable AR displays may include see-through video wearable displays, which may be summarized e.g. as “curved mirror” based and “waveguide” based. Using a waveguide based technique may for example permit a fully unobstructed view of the physical world by moving physical display(s) and electronics to the side near the user’s temples.
[006] Waveguide techniques may include: diffractive techniques using e.g. deep slanted diffraction gratings, holographic techniques that use e.g. a holographic element for diffracting light, polarized waveguide techniques that use e.g. multilayer coatings and embedded polarized reflectors, reflective technologies that make use of e.g. semi-reflective mirrors for reflecting light, and a “clear-vu” approach that uses e.g. a surface structure made up of several reflecting structures.
[007] US6734834 is an example of a system for adaptive augmented reality that makes use of a micro display in front of the eyeglass for blocking part of the field of view. This system includes non-radiating sensors, mountable on a body, for detecting body movements and producing signals related to the body movements. The system further includes a processor for receiving the signals and for generating an image with moving cues according to the detected body movements. The system provides closed-loop biofeedback for adaptation of body movements, and may be used for treating a movement disorder, such as Parkinson's Disease.
[008] US2017206691 is a further example that describes a display system that includes a wearable display device for displaying augmented reality content. The display device has a display area with light redirecting features that direct light to a user. The display area is at least partially transparent and provides a view of an ambient environment through the display area.
SUMMARY
[009] The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods which are meant to be exemplary and illustrative, not limiting in scope.
[010] An aspect of at least certain embodiments of the present invention relates to a wearable device (WD) suitable for assisting users wearing the device with mobility related difficulties. Such mobility related difficulties may be due to health disorders, such as disorders in the central nervous system of a user that mainly affect the motor system - e.g. Parkinson's disease.
[011] In an aspect of the invention, embodiments of the wearable device (WD) may be arranged to be and/or include a super-imposer (SM) member that is configured to superimpose information onto a real-world field of view of a user wearing the WD. Such superimposing of information may be onto a plane/surface within a line of sight of a user, such as onto a surface of a lens placed in-front of a user’s eye.
[012] SM members of at least certain wearable device (WD) embodiments may be arranged to form (e.g. project) such superimposed ‘information’ (e.g. superimposed cues) on a surface of the WD (e.g. surface of a lens) in-front of a user’s eye - resulting in the superimposed ‘information’ being arranged to appear substantially in focus to a person using the WD by appearing in front of the WD substantially on a focal plane of the WD. Such a wearable device (WD) may be or may include: an eyeglass device and/or an eyeglass with a micro display device in front of the lenses (etc.).
[013] Wearable device (WD) embodiments with SM members including and/or in the form of an array of pixels for forming the superimposed ‘information’ may be arranged to light each pixel independently of the other pixels.
[014] In some embodiments, the wearable device (WD) may be an eyewear of the user that includes accessories worn on or over the user’s eyes. Such eyewear, possibly in the form of spectacles or eyeglasses, may include at least one of first and second optic members each being placed in-front of a respective eye of the user.
[015] In an embodiment, one of the members, e.g. the first member, may possibly be embodied as an optical lens, a frame for a lens, a multifocal lens (or the like); and the other member, e.g. the second member, may optionally include a portion generally similar to the first member and an additional portion that is or includes a super-imposer (SM) member.
[016] The super imposer (SM) in various forms may be configured to superimpose information on the real-world field of view using computer generated information possibly projected/formed to intercept the real-world view of a user and display in this view augmented effects. Such interception may be upon a plane/surface within a line of sight of a user, such as a surface of a lens placed in- front of a user’s eye.
[017] The super imposer (SM) in various embodiments may be a smart-glass or wearable computer glasses configured for adding information alongside what the wearer sees. Such SM embodiments may be arranged to change optical properties of members through which eye sight of a user passes (possibly at runtime) in order to form such augmented information.
[018] In some cases, a super imposer (SM) may be programmed to change tint by electronic means. In some cases, superimposing information onto a field of view may be achieved through transparent heads-up displays. In some cases, the super imposer (SM) may be an augmented reality overlay capable of reflecting/projecting digital images as well as allowing the user to see through it.
[019] In certain embodiments, a wearable device (WD) and/or super imposer (SM) may be “curved mirror” based and/or “waveguide” based. “Waveguide” based technologies may include diffractive techniques using e.g. deep slanted diffraction gratings, holographic techniques that use e.g. a holographic element for diffracting light, polarized waveguide techniques that use e.g. multilayer coatings and embedded polarized reflectors, reflective technologies that make use of e.g. semi-reflective mirrors for reflecting light, and/or a “clear-vu” approach that uses e.g. a surface structure made up of several reflecting structures.
[020] In some embodiments, both the first and second members may be arranged to include a super imposer (SM), possibly in the form of an augmented reality member or lens.
[021] In certain embodiments, the first and second members may be integrally formed as a unitary one piece member.
[022] In an embodiment, the super imposer (SM) may include one or more optic members, possibly optic lenses, prisms (or the like). The SM may be mounted on an eyewear, possibly with a field of view of the SM directed generally downwards to reflect a region generally located at the tips of the user’s feet or footwear.
[023] In an embodiment, a super imposer (SM) included in a wearable device (WD) fitted to a user, may be configured to have a focal point that generally lies upon a surface/plane where tips of the user’s feet or footwear are generally located. Such configuration being preferably observed while the user is in a general upright walking state with his line of sight aimed generally forward and while viewing the tips of his feet or footwear at least through marginal lower regions of the SM’s optic.
[024] In an embodiment, a wearable device (WD) may include the first member’s lens in front of one eye of the user and at least a portion of the second member’s SM in front of the other eye of the user.
[025] In certain embodiments, the second member’s SM may be positioned at an angle relative to the user’s line of sight so that the user while walking in a generally upright posture, may view through the SM generally simultaneously both regions adjacent tips of his feet or footwear and regions slightly more distant ahead in his route of advance and/or may switch his/her line of sight between the two fields of view while e.g. walking upright.
[026] In certain embodiments, the first member’s optic may take various forms, such as: merely a frame for an optic, a contact lens located on the user’s eye, an implanted lens (or the like). In certain cases, the first member’s optic may also be absent.
[027] In certain embodiments, a system for assisting a user with mobility related difficulties may employ in addition to a wearable device (WD) according to various embodiments of the invention, also one or more sensors such as: accelerometer(s), GPS sensors (or the like) - in order to monitor the user’s walking condition(s) or advancement condition(s).
[028] In certain embodiments, a system for assisting a user with mobility related difficulties may employ, in addition to a wearable device (WD) according to various embodiments of the invention, also one or more cameras, each including a field of view capable of imaging both the user’s feet, preferably tips of the feet; the user’s knees, the user’s hands and/or the user’s arms.
[029] In certain embodiments, a system employing any of the above noted features may include a controller for controlling e.g. the cameras, accelerometer sensors, and/or the SM members included in wearable devices (WD) of the system. Such system may also include wired or wireless communication to a distinct (possibly relatively remote) device and/or processor (such as of a mobile device, tablet, or the like) where processes affecting and/or controlling embodiments of the SM, cameras (etc.) may be configured to run and/or operate. Such processes may also be configured to run in a cloud based environment (or the like).
[030] In certain embodiments, a super imposer (SM) and cameras included in a system and/or wearable device (WD) may be configured to have fields of view that are generally aligned. Such alignment may be part of a pre-performed alignment process where alignment between pixels in the cameras and pixels/comparable-zones in the field of view of the SM are computed/calculated.
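As a non-authoritative illustration of the pre-performed alignment process mentioned in [030] (the patent does not specify an algorithm), one common way to map camera pixels onto zones of the SM's field of view is a homography fitted to a few matched calibration points. All function names and point values below are assumptions for the sketch, not taken from the patent:

```python
import numpy as np

def fit_homography(cam_pts, sm_pts):
    """Estimate a 3x3 homography H mapping camera pixels to SM display
    pixels from >= 4 matched calibration points (direct linear transform)."""
    rows = []
    for (x, y), (u, v) in zip(cam_pts, sm_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)  # null-space vector of the design matrix

def cam_to_sm(H, x, y):
    """Map a single camera pixel into SM display coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Illustrative calibration pairs: camera pixel -> SM display pixel.
cam = [(100, 100), (540, 120), (520, 420), (90, 400)]
sm = [(0, 0), (640, 0), (640, 480), (0, 480)]
H = fit_homography(cam, sm)
print(cam_to_sm(H, 300, 250))  # e.g. a camera-detected foot-tip pixel
```

With such a mapping computed once per fitting, anything the cameras detect (a foot tip, an obstacle) can be drawn at the corresponding spot in the SM's view.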
[031] In certain embodiments, a wearable device (WD) may include hearing means such as an earphone capable of audio communication in audible frequencies that are audible to the average human or in frequencies that are beyond those that are audible to the average human.
[032] In certain embodiments, a wearable device (WD) may include odor substances possibly controllable via a processor located on the device for controlling odor dispersion, possibly from receptacle(s) housing same.
[033] In an aspect of the invention there is provided a system including a controller for controlling operation of a wearable device (WD) of at least certain embodiments of the invention, where said controller is configured to receive image feeds from cameras of or in association with such device for possible processing (e.g. image recognition/analysis) of the image feeds.
[034] Possibly, the image feeds may be communicated for handling to a processor possibly running on a computing means (such as a mobile phone, tablet, a remote processor such as in the cloud or therapist computer, or the like) where said processing (e.g. image recognition/analysis) of the image feeds may be performed. Possibly, such handling may be conducted during or after a walking action of a user using a WD. For example, handling of image feed by a therapist may be during or after a walking action.
[035] Based on image recognition/analysis, the controller may be configured to project an augmented cue (such as a generally horizontal line) on a SM member of the device at a location upon the SM’s optic where the projected cue is configured to be viewed by a user of the device as being generally located adjacent the user’s foot or shoe.
[036] Possibly, the projected line may be configured to be located at a given pre-defined distance ahead of a front terminal area of a foot or footwear of a user of the WD (such as a toe, a shoe’s front tip or the like); where the front direction is defined as the direction of advancement of the user in a walking condition. Such line being configured to represent an indication/target that the user should preferably try to step over during his advancement.
[037] In an embodiment, the controller may be configured to project more than one line, possibly two or more lines further possibly being generally parallel one to the other. All the lines may possibly be in front of a forward tip of the user’s foot while he is walking. Possibly at least some of the lines, preferably one line, may be located at least momentarily behind one or more feet (possibly behind tips of the feet) of the user while he is walking in the direction of the lines.
[038] In certain embodiments, the controller may be configured to communicate the image feeds to a computing means (e.g. a personal computer) where according to inputs such as: walking quality, walking rhythm, stride-length, user’s posture and/or variance in stride-length (or the like) - the controller may be configured to determine parameters for augmented cue appearances.
[039] Such parameters may include: rate of appearance of cue(s) (so that the rate generally correlates to user’s walking speed/rate); distance of the first cue (line) most proximal to the user’s foot from the foot’s tip; distance between adjacent lines in a direction ahead of the user where such distance possibly correlates to a general stride-length of the user; colors of lines (e.g. all lines same general color or of different color); line width possibly measured in pixel values of cameras and/or SM member pixel; (and the like).
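To make the spacing parameters of [036]-[039] concrete, the following minimal sketch (an illustration, not the patent's method) places the first cue line a pre-defined lead distance ahead of the detected toe and subsequent lines one stride-length apart, in ground-plane metres; each resulting position could then be mapped into SM pixels, e.g. via an alignment like the homography sketched above:

```python
def cue_line_offsets(lead_distance_m, stride_length_m, n_lines):
    """Ground-plane distances (metres ahead of the detected foot tip) at
    which horizontal cue lines are placed: the first line a pre-defined
    lead distance ahead of the toe, later lines one stride length apart."""
    return [lead_distance_m + i * stride_length_m for i in range(n_lines)]

# e.g. first line 0.3 m ahead of the shoe tip, lines spaced per a 0.6 m stride
print(cue_line_offsets(0.3, 0.6, 3))  # [0.3, 0.9, 1.5]
```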
[040] Possibly, according to parameters such as those discussed, audio signals may be configured to be played to the user - where said audio being possibly synchronized with the user’s rhythm of walk. Said audio possibly being configured to mimic the above discussed line appearances, such as distance between lines being correlative e.g. to time span between audio signals (etc.).
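A worked example of the audio pacing in [040], under the stated correlation that the time span between audio signals mirrors the spatial spacing of the lines at the user's current walking speed (the values are illustrative):

```python
def audio_interval_s(line_spacing_m, walking_speed_m_s):
    """Seconds between audio cues so that, at the current walking speed,
    one cue sounds per line the user is expected to step over."""
    return line_spacing_m / walking_speed_m_s

# 0.6 m between lines at 0.8 m/s -> an audio cue every 0.75 s
print(audio_interval_s(0.6, 0.8))
```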
[041] Possibly, audio and visual signals may be configured to be used/played in conjunction to enhance feedback to a user.
[042] Possibly, information(s) processed by the controller (as discussed here above) may be used for providing stimuli feedbacks (such as visual and/or audial feedbacks) to the user indicative of a ‘quality’ measure of his walk. In one example, visual feedbacks may be provided to the user on the super imposing (SM) member, which are indicative of a ‘quality’ measure of his walk. Possibly the indications may be provided for walking quality of both feet together or each foot independently.
[043] Indications to ‘quality’ of a walk of a user may be computed by comparison of filmed footage of a user’s walk and a reference walk. Possibly, portions of the knee and leg of the user being observed may be compared to reference movements of similar body parts of a reference walk to provide a ‘quality’ rank of the user’s walk. Such ‘quality’ rank in certain cases may be binary-like, such as “good”/“bad” (or the like).
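One hypothetical way to turn such a comparison into the binary-like rank of [043], assuming stride lengths have already been extracted from the filmed footage (the patent fixes neither the body parts nor the metric used):

```python
import numpy as np

def walk_quality(user_strides_m, reference_strides_m, tolerance=0.15):
    """Binary-like 'quality' rank: 'good' if the user's stride lengths
    stay, on average, within a relative tolerance of the reference walk."""
    user = np.asarray(user_strides_m, dtype=float)
    ref_mean = float(np.mean(reference_strides_m))
    deviation = float(np.mean(np.abs(user - ref_mean)) / ref_mean)
    return "good" if deviation <= tolerance else "bad"

print(walk_quality([0.58, 0.61, 0.57], [0.60, 0.62, 0.59]))  # good
print(walk_quality([0.31, 0.28, 0.35], [0.60, 0.62, 0.59]))  # bad
```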
[044] In certain embodiments, audio and/or visual feedback may include instructions to the user such as to move his feet/hands or certain feet/hands when required. Image feeds captured by cameras on wearable devices fitted to the user may be compared in order to assess (possibly via the controller) that the user abides by the instructions.
[045] In certain embodiments, a system may be configured, via e.g. the controller, in response to change in walking rhythm of a user to increase or decrease distance between lines placed in the direction of advance of the user via the SM member.
[046] Possible instructions may be provided to the user to alter (e.g. increase) his stride-length in order to possibly avoid falling. In certain embodiments, such avoidance of falling of a user during a walking process may be detected from signals received from accelerometer(s) or the like mounted to the user.
[047] In certain cases, upon detection of a “frozen” state of the user where walking is at least momentarily in halt, various “awakening” signals/effects may be applied to the user. Such signals may include video clips shown to the user, possibly projected onto or by the SM member in his wearable device (WD); audio signals in varying rhythms; triggering of possible trembling means fixed to the user, e.g. to sleeves of the user’s shirt; dispersion of odor (or the like).
[048] Such “awakening” signals/effects may be halted once movement of the user is identified as commencing, possibly from camera footage and/or from accelerometer(s) fixed to the user. Such “awakening” signals/effects may be manually triggered or triggered by sound/voice recognition (or the like).
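A minimal sketch of accelerometer-based detection of the “frozen” state and gating of the “awakening” signals described in [047]-[048], under the assumption that the variance of the acceleration magnitude collapses when walking halts; the window size and thresholds are illustrative, not values from the patent:

```python
from collections import deque
import statistics

class FreezeDetector:
    """Flag a 'frozen' state when the variance of the acceleration
    magnitude over a sliding window drops below a threshold, and clear
    the flag (halting any 'awakening' signal) once movement resumes."""

    def __init__(self, window=50, freeze_var=0.02, resume_var=0.08):
        self.samples = deque(maxlen=window)
        self.freeze_var = freeze_var    # enter-freeze threshold
        self.resume_var = resume_var    # exit-freeze threshold (hysteresis)
        self.frozen = False

    def update(self, accel_magnitude):
        self.samples.append(accel_magnitude)
        if len(self.samples) < self.samples.maxlen:
            return self.frozen          # wait until the window is full
        var = statistics.pvariance(self.samples)
        if not self.frozen and var < self.freeze_var:
            self.frozen = True          # trigger an awakening cue here
        elif self.frozen and var > self.resume_var:
            self.frozen = False         # movement identified: halt the cue
        return self.frozen

det = FreezeDetector(window=10)
for i in range(10):
    det.update(9.81)                    # standing still: variance ~0
print(det.frozen)                       # True -> awakening signals on
for i in range(10):
    det.update(9.81 + (i % 2))          # gait resumes: variance rises
print(det.frozen)                       # False -> awakening signals halted
```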
[049] In at least certain cases, “frozen” states of a user which are about to occur may be identified from camera feeds filming ahead of the user. For example, a narrow entry through which the user is about to pass may be indicative of incidents where the user in the past experienced entry into a “frozen” state.
[050] In at least certain embodiments, a system including a wearable device (WD) of some embodiments of the invention may be configured to detect obstacles in the route of advancement of the user. Such obstacles may be characterized as including varying heights in a surface ahead of the user (such as tiles having different heights), a step (etc.). A controller of at least some embodiments employed in such system may sound a visual, audio and/or other type of alarm in anticipation of the obstacle - such an alarm indicating to the user to lift his legs in order to surpass the obstacle (etc.).
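As a rough illustration of the obstacle detection in [050], assuming a ground-height profile ahead of the user is available (e.g. from a depth or stereo camera, which the patent does not mandate), abrupt height changes such as uneven tiles or a step can be flagged and used to trigger the anticipatory alarm:

```python
def find_steps(profile, height_jump_m=0.04):
    """Scan (distance_m, height_m) samples of the ground ahead and report
    locations where the surface height changes abruptly (tile edge, step)."""
    alerts = []
    for (d0, h0), (d1, h1) in zip(profile, profile[1:]):
        if abs(h1 - h0) >= height_jump_m:
            alerts.append((d1, h1 - h0))  # (distance ahead, height change)
    return alerts

profile = [(0.5, 0.00), (1.0, 0.01), (1.5, 0.16), (2.0, 0.16)]
print(find_steps(profile))  # [(1.5, ~0.15)] -> warn the user to lift his leg
```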
[051] In certain eyewear embodiments, augmented indications to the user’s feet may also be projected on or by the SM e.g. in cases where the user is unable to view his feet in his field of view while walking. Together with such augmented indications to the user’s feet lines placed in advance of the projected feet may also be indicated.
[052] In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the figures and by study of the following detailed descriptions.
BRIEF DESCRIPTION OF THE FIGURES
[053] Exemplary embodiments are illustrated in referenced figures. It is intended that the embodiments and figures disclosed herein are to be considered illustrative, rather than restrictive. The invention, however, both as to organization and method of operation, together with objects, features, and advantages thereof, may best be understood by reference to the following detailed description when read with the accompanying figures, in which:
[054] Fig. 1 schematically shows a user fitted with an embodiment of a wearable device in accordance with the present invention providing one example of augmented cues;
[055] Figs. 1A and 1B schematically show additional examples of augmented cues possibly formed by embodiments of a wearable device;
[056] Fig. 2 schematically shows a close-up view of Fig. 1 focused on a head area of the user;
[057] Fig. 3 schematically shows an embodiment of a wearable device of the invention;
[058] Fig. 4 schematically shows front views of possible optic members of an embodiment of a wearable device of the invention;
[059] Figs. 5A, 5B, 6A and 6B schematically show side views of possible optic members of an embodiment of a wearable device of the invention; and
[060] Figs. 7A and 7B schematically show a user in different, respective, postures fitted with an embodiment of a wearable device of the present invention.
[061] It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated within the figures to indicate like elements.
DETAILED DESCRIPTION
[062] Attention is drawn to Figs. 1 and 2 schematically illustrating full and partial views of a user 10 fitted with an embodiment of a wearable device (WD) 12 of the invention, here in the possible form of an eyewear or spectacle. Wearable device (WD) 12 may be configured to assist users with mobility related difficulties due to health disorders, such as disorders in the central nervous system affecting the motor system - e.g. Parkinson's disease.
[063] Wearable device (WD) 12 in at least certain embodiments may permit a user to maintain a relative frontal directed field of view (FOV) 14 (here marked by the ‘dashed’ lines) that has a central axis 141 and which is aimed at a relative frontal direction D towards which the user is walking or attempting to walk. Wearable device (WD) 12 in addition may permit a user to maintain a relative downward and lower directed field of view (FOV) 16 (here marked by the ‘dotted-dashed’ lines) that has a central axis 161 and may be aimed at a relative downward direction towards an area generally covering the feet and/or footwear 18 of the user, for example a frontal tip 17 of the feet. Each central axis 141, 161 may be defined as generally extending towards and/or along a center of its respective FOV 14, 16.
[064] Various wearable device (WD) embodiments may thus be defined as having different fields of view 14, 16 that are angled/tilted one in relation to the other and/or that possibly do not substantially overlap. The fields of view 14, 16 may in addition or alternatively be defined as being formed with a tilt angle α between their respective central axes 141 and 161.
[065] In certain embodiments, relative angular orientation between the FOVs 14, 16 may be adjustable so that angle α may assume different tilt values. This may be achieved by an optic of such WD (e.g., segment 1 or 2 in Figs. 5A and 5B) that defines one of the FOVs (e.g. FOV 16) being arranged to be tilted relative to the other optic defining the other FOV (e.g. FOV 14). Possibly, each optic defining a given FOV may be independently adjustable so that the axes 141, 161 of each FOV may be aimed at a desired location, with the angle α resulting therebetween.
[066] Adjustment of tilt between the fields of view (FOVs) may be according to a user's condition. With attention drawn to Figs. 7A and 7B, the angular orientation of the fields of view 14, 16 will be discussed with respect to a user in two posture conditions and/or two different users in two different postures.
[067] In Fig. 7A, a user having a relatively bent or hunched posture when standing up or walking is shown, where FOV 14 is directed to a relatively close distance d1. This relatively close distance d1 may be a result of the user being bent forward while the user's vision and/or the optics defining FOV 14 are fixed to the user's anatomy (here, the head) and thus are bent downwards together with this anatomy. FOV 16 may be adjusted generally towards the user's feet, resulting in an angle α1 being defined between axes 141 and 161 in this illustration.
[068] In Fig. 7B, another user, or the same user (possibly after practice/treatment/training with a WD embodiment of the invention), is illustrated in a more upright and less bent posture. Here, FOV 14 is seen being aimed at a distance d2 that is greater than d1, due to the user's vision and/or the optics defining FOV 14 being tilted slightly upwards as a result of the user's more upright posture. FOV 16 may be adjusted generally towards the user's feet, resulting in an angle α2 defined between axes 141 and 161 in this illustration that is larger than α1.
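By way of a non-limiting illustration of the geometry discussed above, a simple planar model may relate posture to the distances d1, d2 and the angles α1, α2: a FOV central axis pitched by a given angle below the horizontal meets the ground at a distance of roughly eye height divided by the tangent of that pitch. All numeric values (eye heights, pitch angles) in the following Python sketch are assumptions for illustration and are not taken from the disclosure:

```python
import math

def gaze_ground_distance(eye_height_m: float, axis_pitch_deg: float) -> float:
    """Distance along the ground at which a FOV central axis, pitched
    axis_pitch_deg below the horizontal, intersects the ground
    (flat-ground, planar approximation)."""
    return eye_height_m / math.tan(math.radians(axis_pitch_deg))

# Hunched posture (cf. Fig. 7A): axis 141 pitched well below horizontal.
d1 = gaze_ground_distance(eye_height_m=1.5, axis_pitch_deg=35.0)  # ~2.1 m
# More upright posture (cf. Fig. 7B): axis 141 closer to horizontal.
d2 = gaze_ground_distance(eye_height_m=1.6, axis_pitch_deg=12.0)  # ~7.5 m

# With axis 161 kept aimed at the feet (assumed 70 deg below horizontal),
# the tilt angle between axes 141 and 161 grows as posture straightens:
alpha1 = 70.0 - 35.0  # 35 degrees (hunched)
alpha2 = 70.0 - 12.0  # 58 degrees (upright), i.e. alpha2 > alpha1
print(d1, d2, alpha1, alpha2)
```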
[069] Attention is drawn back to Fig. 3 illustrating an embodiment of a wearable device (WD) 12 generally similar to that illustrated in Figs. 1 and 2. WD 12 in this example includes first and second optic members 20, 22. Each one of the optic members may be placed in front of a respective one of the eyes of the user. Each optic member may include at least a portion in the form of an optical lens, a frame for a lens, a multifocal lens (or the like).
[070] Wearable device (WD) 12 includes in addition a super imposer (SM) 24 for superimposing information on the real-world field of view using e.g. computer-generated information. In the illustrated example, the shown embodiment of super imposer (SM) 24 includes a possible module 25 for projecting/creating augmented information that intercepts the real-world view of a user in order to display in this view augmented effects 26. Such interception may be upon a plane/surface within a line of sight of a user. Module 25 may include or be, e.g., display optics, a projector, a transmitter and/or electronics used for creating or assisting in creating augmented effects 26.
[071] Super imposer (SM) 24 in various embodiments may be a smart-glass or wearable computer glasses configured for adding information alongside what the wearer sees. Such an SM may be arranged to change optical properties of members through which the eyesight of a user passes (possibly at runtime) in order to form such augmented information. In a non-binding example, such an SM, or a wearable device (WD) including such an SM, may utilize waveguide-based techniques for creating augmented effects.
[072] In the illustrated example, such a plane/surface upon which the effects 26 appear may be an optic member such as optic member 22, which may be considered as being part of the SM or may be distinct from the SM. The effects 26 appearing on optic member 22 are configured to appear at least in a part of this optic, which may be arranged to have the relative lower field of view 16 aimed at the user's feet or footwear.
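By way of a non-binding sketch only, placing an effect 26 on such a surface so that it appears at a desired ground location may be modelled with a simple pinhole projection; the focal length, eye height and the head-fixed, horizontal-axis view assumed below are illustrative assumptions, not parameters fixed by the disclosure:

```python
def ground_point_to_display(x_m: float, z_m: float, eye_height_m: float = 1.6,
                            focal_px: float = 500.0) -> tuple[float, float]:
    """Project a ground-plane point (lateral offset x, forward distance z,
    both in metres, relative to the eye) to display coordinates (pixels)
    of a head-fixed view with a horizontal optical axis, using a pinhole
    model. A planar sketch only."""
    u = focal_px * x_m / z_m           # horizontal pixel offset from center
    v = focal_px * eye_height_m / z_m  # vertical pixel offset below the horizon
    return u, v

# A cue line 1.0 m ahead, centred on the walking direction:
print(ground_point_to_display(x_m=0.0, z_m=1.0))  # (0.0, 800.0) px below horizon
```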
[073] Thus, the view of a user fitted with the various discussed embodiments of wearable device (WD) 12 may be enriched, at least in the WD's relative lower field of view, with effects 26 that seemingly appear adjacent to his feet while in fact not being present in reality at this location near his feet.
[074] In the illustrated example in Fig. 1, the visual effects 26 may take the form of one or more lines extending generally transverse (possibly generally orthogonal) to an axis extending along the direction of advancement D of a user fitted with an embodiment of wearable device (WD) 12. These lines, as seen, may preferably be located in front of a front terminal area 17 (such as a toe, a shoe's front tip or the like) of a foot or footwear of the user. With attention drawn to Figs. 1A and 1B it is apparent that the visual effects 26 may take various forms, such as a stair-like formation (seen in Fig. 1A), a tile-like formation (seen in Fig. 1B) and the like.
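To illustrate one way such cue lines could be placed, the following non-binding Python sketch generates positions of a few lines ahead of a detected foot tip along the direction of advancement D; the offset, spacing and count values are illustrative assumptions only:

```python
def cue_line_positions(foot_tip_x_m: float, first_offset_m: float = 0.3,
                       spacing_m: float = 0.5, count: int = 3) -> list[float]:
    """Positions (along direction D, in metres) of augmented cue lines 26,
    each placed in front of the front terminal area 17 of the foot."""
    return [foot_tip_x_m + first_offset_m + i * spacing_m for i in range(count)]

# Lines at 0.3 m, 0.8 m and 1.3 m ahead of the detected foot tip:
print(cue_line_positions(foot_tip_x_m=0.0))  # [0.3, 0.8, 1.3]
```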
[075] In certain embodiments, the visual effects 26 may be generally static relative to the user or may be controlled to dynamically move relative to the user. For example, the visual effects 26 may be controlled to move at a given pace relative to the user, e.g. in the frontal direction D towards which the user is walking or attempting to walk, or in a direction opposing the frontal direction D.
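A minimal sketch of this dynamic option, assuming a fixed display-frame interval and a constant cue pace (both values hypothetical), might update the cue positions of the previous sketch once per frame:

```python
def step_cues(positions_m: list[float], pace_m_s: float, dt_s: float,
              towards_user: bool = False) -> list[float]:
    """Advance cue line positions by one display frame. A positive pace
    moves the cues in the frontal direction D; towards_user reverses
    the motion, i.e. moves them in the direction opposing D."""
    delta = -pace_m_s * dt_s if towards_user else pace_m_s * dt_s
    return [p + delta for p in positions_m]

# Example: cues drifting forward at 0.5 m/s, with 60 Hz display frames.
cues = [0.3, 0.8, 1.3]
cues = step_cues(cues, pace_m_s=0.5, dt_s=1 / 60)
```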
[076] Attention is drawn to Fig. 4 schematically illustrating front views of optic members generally similar to optic members 20 and 22. With attention drawn to the side views illustrated in Figs. 5A and 5B, each optic member 20, 22 may be seen as including two possible optic segments 1, 2. In Fig. 5A these segments may be formed into an optic member as a unitary one-piece. Segments 1, 2 may be configured to provide a wearable device (WD) including such segments with the frontal and downward fields of view 14, 16. In the embodiment of Fig. 5B, segment 2 is seen illustrated being angled downwards to provide the downward directed field of view 16.
[077] With attention drawn to the side views illustrated in Figs. 6A and 6B, the optic members 20, 22 are illustrated including only one segment for providing the downward directed field of view 16. In Fig. 6A this segment is seen generally upright and in Fig. 6B it is illustrated being angled downwards.
[078] It is noted that wearable devices (WD) of various embodiments of the invention may not necessarily include two optic members 20, 22 as illustrated e.g. in Fig. 3. For example, in certain embodiments only one optic member 20, 22 or segment 1, 2 may be provided for providing the downward directed field of view 16, while the frontal field of view 14 may be provided without assistance of an optic, or with optics fitted directly to the eye, such as contact or implanted lenses (or the like).
[079] Wearable devices (WD) in accordance with at least certain embodiments of the invention may be configured to include eyewear (e.g. eyeglasses) with one optic member 20, 22 being arranged to provide a frontal directed field of view 14, while the other optic member 20, 22 is arranged to provide the downward lower directed field of view 16. Thus, in such a configuration, one eye of the user may be arranged to look straight ahead while the other looks downward to observe the augmented effects 26.
[080] In at least certain embodiments, a system employing or in cooperation with wearable device embodiments of the invention may be arranged to include a controller 29 for controlling e.g. cameras, accelerometer sensors, and/or SM members included in wearable devices (WD) of the system. Such a system may also include wired or wireless communication to a distinct (possibly relatively remote) device and/or processor (such as of a mobile device, tablet, or the like) where processes affecting and/or controlling embodiments of the SM, cameras (etc.) may be configured to run and/or operate. Such processes may also be configured to run in a cloud-based environment (or the like).
[081] In certain embodiments, the controller may be configured to communicate image feeds to a computing means (e.g. a personal computer), where, according to inputs such as walking quality, walking rhythm, stride-length, the user's posture and/or variance in stride-length (or the like), the controller may be configured to determine parameters for augmented cue appearances.
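By way of a hedged, non-binding example, such a determination of cue-appearance parameters from gait inputs might be sketched as follows in Python; the particular mapping, constants and parameter names below are hypothetical and are not fixed by the disclosure:

```python
from dataclasses import dataclass

@dataclass
class CueParams:
    spacing_m: float       # distance between adjacent cue lines
    first_offset_m: float  # distance of the nearest line from the foot tip
    rate_hz: float         # rate of appearance of new cues

def params_from_gait(stride_length_m: float, cadence_steps_min: float) -> CueParams:
    """Map measured gait inputs (stride length, walking rhythm) to cue
    parameters. The rule of thumb used here -- cue spacing slightly
    shorter than the stride, one new cue per step -- is an assumption."""
    spacing = max(0.3, 0.9 * stride_length_m)
    return CueParams(spacing_m=spacing,
                     first_offset_m=0.5 * spacing,
                     rate_hz=cadence_steps_min / 60.0)

# Example: 0.8 m strides at 100 steps/min.
print(params_from_gait(0.8, 100.0))
```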
[082] A derived 'quality measure' of a user's walking and/or advancement may be broadcast/communicated as feedback to the user by, e.g., a so-called 'processor engine' running on a processor (e.g. a remote processor) and/or by a human therapist of the user. Such feedback may be communicated to the user by various stimuli such as audio, visual (or the like), where, e.g., a human therapist may communicate such feedback via audio means.
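As a purely illustrative example of such a 'quality measure' (the disclosure does not fix any formula), walking regularity could be scored from stride-length variance, e.g. via the coefficient of variation of stride length:

```python
import statistics

def walking_quality(stride_lengths_m: list[float]) -> float:
    """Score gait regularity in [0, 1]: 1 minus the coefficient of
    variation of stride length, clipped, so that steadier strides score
    closer to 1. A hypothetical measure, assumed for illustration."""
    mean = statistics.fmean(stride_lengths_m)
    if mean == 0.0:
        return 0.0
    cv = statistics.pstdev(stride_lengths_m) / mean
    return max(0.0, min(1.0, 1.0 - cv))

print(walking_quality([1.02, 0.98, 1.05, 0.95]))  # ~0.96, a fairly regular gait
```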
[083] The derived 'quality measure' and its resulting feedbacks can be used in some cases in conjunction with customized medicine treatment to achieve optimized movement of the user. In some cases, online monitoring of the user's movement may be used as a means to analyze the amount of medicine the user requires. In some cases, gathered information on the user's movement can be used as a database for research on possible diseases affecting the user's condition, such as Parkinson's disease. Such gathered information may in addition or alternatively be used by a therapist treating the user, for example for offline monitoring of the user.
[084] In the description and claims of the present application, each of the verbs "comprise", "include" and "have", and conjugates thereof, are used to indicate that the object or objects of the verb are not necessarily a complete listing of members, components, elements or parts of the subject or subjects of the verb.
[085] Furthermore, while the present application or technology has been illustrated and described in detail in the drawings and the foregoing description, such illustration and description are to be considered illustrative or exemplary and non-restrictive; the technology is thus not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art practicing the claimed technology, from a study of the drawings, the disclosure, and the appended claims.
[086] In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
[087] The present technology is also understood to encompass the exact terms, features, numerical values or ranges etc., if herein such terms, features, numerical values or ranges etc. are referred to in connection with terms such as "about", "ca.", "substantially", "generally", "at least" etc. In other words, "about 3" shall also comprise "3" and "substantially perpendicular" shall also comprise "perpendicular". Any reference signs in the claims should not be considered as limiting the scope.
[088] Although the present embodiments have been described to a certain degree of particularity, it should be understood that various alterations and modifications could be made without departing from the scope of the invention as hereinafter claimed.


CLAIMS:
1. A wearable device (WD) for a user comprising a super imposer (SM) for superimposing augmented information on a real-world field of view of the user, the wearable device (WD) being arranged to provide to the user at least one relative downward directed field of view aimed at a front terminal area of a foot or footwear of the user, and the superimposed information being configured to appear at least within the relative downward directed field of view as being present adjacent the front terminal area of a foot or footwear.
2. The wearable device (WD) of claim 1 and being arranged to provide to the user also a relative forward directed field of view aimed at a direction of advancement of the user.
3. The wearable device (WD) of claim 1 or 2, wherein superimposed augmented information appears to a user as present in front of the front terminal area of the foot or footwear of the user.
4. The wearable device (WD) of any one of the preceding claims, and being used for assisting users suffering from mobility-related difficulties due to health disorders, such as disorders in the central nervous system of a user that mainly affect the motor system - e.g. Parkinson's disease.
5. The wearable device (WD) of any one of the preceding claims, wherein the superimposing of information is onto a surface within a line of sight of the user, for example onto a surface of an optic member of the WD placed in front of at least one of the eyes of the user.
6. The wearable device (WD) of any one of the preceding claims, wherein the superimposed information is arranged to appear substantially in focus to the user at a location in front of the WD substantially on a focal plane of the WD.
7. The wearable device (WD) of any one of the preceding claims and being an eyewear worn on or over the user’s eyes.
8. The wearable device (WD) of any one of the preceding claims, wherein the superimposed information is computer generated information.
9. The wearable device (WD) of any one of the preceding claims, wherein the superimposed information is projected to intercept the real-world view of the user in order to display in this view augmented effects.
10. The wearable device (WD) of any one of the preceding claims, wherein the super imposer (SM) or wearable device (WD) is at least one of: a smart-glass, wearable computer glasses configured for adding information alongside what the wearer sees, glasses or a headset arranged to change optical properties of members through which the eyesight of the user passes.
11. The wearable device (WD) of any one of the preceding claims, wherein the downward directed field of view aimed at a front terminal area of a foot or footwear of the user is provided also when the user is walking in a generally upright posture.
12. The wearable device (WD) of any one of the preceding claims and comprising one or more sensors for monitoring the user's walking condition or advancement condition.
13. The wearable device (WD) of claim 12, wherein the sensors are at least one of: accelerometer(s), GPS sensors.
14. The wearable device (WD) of any one of the preceding claims and comprising one or more sensors in the form of cameras with fields of view capable of imaging: the user's feet, preferably tips of the feet; the user's knees; the user's hands; and/or the user's arms.
15. The wearable device (WD) of claim 14, wherein the super imposer (SM) and cameras are arranged to have fields of view that are generally aligned.
16. The wearable device (WD) of claim 15, wherein alignment is between pixels in the cameras and pixels/comparable-zones in the field of view of the SM.
17. The wearable device (WD) of any one of claims 12 to 14, wherein upon detection by the sensors of a "frozen" state of the user, where walking is at least momentarily in halt, a user of the WD is provided with awakening signals/effects, possibly in the form of at least one of: video clips; audio signals; triggering of trembling means fixed to the user, e.g. to sleeves of the user's shirt; dispersion of odor.
18. The wearable device (WD) of any one of the preceding claims and comprising hearing means for producing audio communication to the user.
19. The wearable device (WD) of claim 18, wherein the audio communication is in audible frequencies to an average user/human and/or in frequencies that are beyond those that are audible to the average human.
20. The wearable device (WD) of any one of the preceding claims and comprising odor dispenser for dispensing odor to the user.
21. A method for assisting a user in walking comprising the steps of:
providing the user with a wearable device (WD) arranged to create augmented effects appearing to the user as present adjacent a front terminal area of a foot or footwear of the user, wherein
the augmented effects serve as an indication/target that the user should preferably try to step over during his walking.
22. The method of claim 21, wherein the augmented effects comprise lines extending transverse to a direction of advance of the user when walking.
23. The method of claim 22, wherein the lines are generally parallel one to the other.
24. The method of any one of claims 21 to 23, wherein the augmented effects are all in front of a forward tip of the user’s foot while he is walking.
25. The method of any one of claims 21 to 23, wherein at least some of the augmented effects are at least momentarily behind a tip of the foot of the user while he is walking.
26. The method of claim 22 or 24, wherein the lines are part of a larger pattern, for example a pattern resembling a stair-like or tile-like formation.
27. The method of any one of claims 21 to 26 and comprising a step of receiving image feeds of the walking of the user.
28. The method of claim 27, wherein the image feeds are from camera(s) located on the wearable device and/or outside of the wearable device.
29. The method of claim 27 or 28, wherein the image feeds are processed in order to determine appearance of the augmented effects.
30. The method of any one of claims 21 to 29, wherein the augmented effects are at a pre-defined distance ahead of a front terminal area of a foot or footwear of the user.
31. The method of any one of claims 27 to 29, wherein the image feeds are processed in order to assess at least one parameter of the user including: a walking quality, a walking rhythm, a stride-length, a posture and/or variance in stride-length; and the at least one parameter being used for affecting appearance of the augmented effects.
32. The method of claim 31, wherein affecting appearance of the augmented effects comprises at least one of: rate of appearance of augmented effects; distance of the first augmented effect most proximal to the user's foot from the foot's tip; distance between adjacent augmented effects in a direction ahead of the user; colors of augmented effects; augmented effect width, possibly measured in pixel values.
33. The method of any one of claims 21 to 32 and comprising sensors for monitoring a user's walking condition or advancement condition, and wherein upon detection by the sensors of a "frozen" state of the user, where walking is at least momentarily in halt, the user is provided with awakening signals/effects, possibly in the form of at least one of: video clips; audio signals; triggering of trembling means fixed to the user, e.g. to sleeves of the user's shirt; dispersion of odor.
34. A wearable device (WD) for a user comprising a super imposer (SM) for superimposing augmented information on a real-world field of view of the user, the wearable device (WD) being arranged to provide to the user at least one relative downward directed field of view, and the superimposed information being configured to appear at least within the relative downward directed field of view.
35. The wearable device (WD) of claim 34, wherein the relative downward directed field of view does not substantially cover a forward directed view of a user fitted with the WD, when said user is standing in an upright position and looking forward.
36. The wearable device (WD) of claim 34 or 35, wherein the relative downward directed field of view is generally aimed at a foot or footwear of the user.
37. The wearable device (WD) of claim 36, wherein the superimposed information is configured to appear as present adjacent the front terminal area of a foot or footwear.
38. The wearable device (WD) of any one of claims 34 to 37 and being arranged to provide to the user also a relative forward directed field of view, possibly aimed at a direction of advancement of the user.
39. A wearable device (WD) for a user comprising a super imposer (SM) for superimposing augmented information on a real-world field of view of the user, the wearable device (WD) being arranged to provide to the user at least two fields of view, and the superimposed information being arranged to appear in one of the fields of view.
40. The wearable device (WD) of claim 39, wherein the superimposed information is arranged to appear only in one of the fields of view.
41. The wearable device (WD) of claim 39 or 40, wherein the at least two fields of view are different fields of view that substantially do not overlap.
42. The wearable device (WD) of any one of claims 39 to 41, wherein when fitted to a user one field of view is a relative downward directed field of view and the other field of view is a relative forward directed field of view aimed at a direction of advancement of the user.
43. The wearable device (WD) of claim 42, wherein the superimposed information is arranged to appear in the relative downward directed field of view.
44. The wearable device (WD) of any one of claims 39 to 43, wherein the superimposing of information is onto a surface within a line of sight of the user, for example onto a surface of an optic member of the WD placed in front of at least one of the eyes of the user.
45. The wearable device (WD) of any one of claims 39 to 44, wherein the superimposed information is arranged to appear substantially in focus to the user at a location in front of the WD substantially on a focal plane of the WD.
46. The wearable device (WD) of any one of claims 39 to 45 and being an eyewear worn on or over the user’s eyes.
47. The wearable device (WD) of any one of claims 39 to 46, wherein the superimposed information is computer generated information.
48. The wearable device (WD) of any one of claims 39 to 47, wherein the superimposed information is projected to intercept the real-world view of the user in order to display in this view augmented effects.
49. The wearable device (WD) of any one of claims 39 to 48, wherein the super imposer (SM) or wearable device (WD) is at least one of: a smart-glass, wearable computer glasses configured for adding information alongside what the wearer sees, glasses or a headset arranged to change optical properties of members through which the eyesight of the user passes.
50. The wearable device (WD) of any one of claims 39 to 49 and comprising one or more sensors for monitoring the user's walking condition or advancement condition.
51. The wearable device (WD) of claim 50, wherein the sensors are at least one of: accelerometer(s), GPS sensors.
52. The wearable device (WD) of any one of claims 39 to 51 and comprising one or more sensors in the form of cameras with fields of view capable of imaging: the user's feet, preferably tips of the feet; the user's knees; the user's hands; and/or the user's arms.
53. The wearable device (WD) of claim 52, wherein the super imposer (SM) and cameras are arranged to have fields of view that are generally aligned.
54. The wearable device (WD) of claim 53, wherein alignment is between pixels in the cameras and pixels/comparable-zones in the field of view of the SM.
55. The wearable device (WD) of any one of claims 50 to 54, wherein upon detection by the sensors of a "frozen" state of the user, where walking is at least momentarily in halt, a user of the WD is provided with awakening signals/effects, possibly in the form of at least one of: video clips; audio signals; triggering of trembling means fixed to the user, e.g. to sleeves of the user's shirt; dispersion of odor.
PCT/IB2019/051588 2018-03-07 2019-02-27 Augmented reality device and/or system and/or method for using same for assisting in walking or movement disorders Ceased WO2019171216A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862639516P 2018-03-07 2018-03-07
US62/639,516 2018-03-07

Publications (2)

Publication Number Publication Date
WO2019171216A1 true WO2019171216A1 (en) 2019-09-12
WO2019171216A4 WO2019171216A4 (en) 2019-10-31

Family

ID=66397285

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2019/051588 Ceased WO2019171216A1 (en) 2018-03-07 2019-02-27 Augmented reality device and/or system and/or method for using same for assisting in walking or movement disorders

Country Status (1)

Country Link
WO (1) WO2019171216A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023017243A1 (en) * 2021-08-07 2023-02-16 Gaitar Limited Apparatus and method
NL2035744A (en) * 2023-05-12 2024-12-02 Yan Zhu A specialized voice-controlled augmented reality glasses for Parkinson's patients

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997047993A1 (en) * 1996-06-13 1997-12-18 Enlightened Technologies Associates, Inc. Interactive light field
US6734834B1 (en) 2000-02-11 2004-05-11 Yoram Baram Closed-loop augmented reality apparatus
US20150002374A1 (en) * 2011-12-19 2015-01-01 Dolby Laboratories Licensing Corporation Head-Mounted Display
US20150241708A1 (en) * 2013-08-23 2015-08-27 Panasonic Intellectual Property Corporation Of America Head-mounted display
US20150355709A1 (en) * 2014-06-10 2015-12-10 Lg Electronics Inc. Wearable device and method of controlling therefor
WO2016168047A1 (en) * 2015-04-15 2016-10-20 Sony Computer Entertainment Inc. Pinch and hold gesture navigation on a head-mounted display
US20160335917A1 (en) * 2015-05-13 2016-11-17 Abl Ip Holding Llc System and method to assist users having reduced visual capability utilizing lighting device provided information
US20170206691A1 (en) 2014-03-14 2017-07-20 Magic Leap, Inc. Augmented reality systems and methods utilizing reflections



Also Published As

Publication number Publication date
WO2019171216A4 (en) 2019-10-31

Similar Documents

Publication Publication Date Title
US5597309A (en) Method and apparatus for treatment of gait problems associated with parkinson's disease
US10231614B2 (en) Systems and methods for using virtual reality, augmented reality, and/or a synthetic 3-dimensional information for the measurement of human ocular performance
JP6083880B2 (en) Wearable device with input / output mechanism
US8831278B2 (en) Method of identifying motion sickness
US20060098087A1 (en) Housing device for head-worn image recording and method for control of the housing device
CN104094280A (en) Systems and methods for high-resolution gaze tracking
JP6953247B2 (en) Goggles type display device, line-of-sight detection method and line-of-sight detection system
US12260580B2 (en) System and method for enhancing visual acuity of head wearable displays
US11983310B2 (en) Gaze tracking apparatus and systems
US11619813B2 (en) Coordinating an eye-mounted imager with an external camera
US11579690B2 (en) Gaze tracking apparatus and systems
KR20140037730A (en) Wearable system for providing information
US20250355492A1 (en) Gaze tracking system and method
US11743447B2 (en) Gaze tracking apparatus and systems
US11747897B2 (en) Data processing apparatus and method of using gaze data to generate images
KR102619429B1 (en) Head-mounted display apparatus that automatically adjusts the inter-pupillary distance through eye tracking
JP2018513656A (en) Eyeglass structure for image enhancement
WO2019171216A1 (en) Augmented reality device and/or system and/or method for using same for assisting in walking or movement disorders
JP2017191546A (en) Medical use head-mounted display, program of medical use head-mounted display, and control method of medical use head-mounted display
US12393027B2 (en) Head-mountable display apparatus and methods
TW201805689A (en) External near-eye display device
KR102765188B1 (en) Head-mounted display apparatus having reset button for re-adjusting user focus during inter-pupillary distance control using eye tracking
US12373030B2 (en) Optical sightline tracking for a wearable system
JP2016133541A (en) Electronic spectacle and method for controlling the same
GB2598953A (en) Head mounted display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19721716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19721716

Country of ref document: EP

Kind code of ref document: A1