
WO2015121846A1 - System and method for aiding a visually impaired person to navigate - Google Patents


Info

Publication number
WO2015121846A1
WO2015121846A1 · PCT/IB2015/051156
Authority
WO
WIPO (PCT)
Prior art keywords
obstacles
person
feedback
processing device
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2015/051156
Other languages
French (fr)
Inventor
Gaurav Mittal
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2015121846A1 publication Critical patent/WO2015121846A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
        • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
            • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
                • G09B21/00 Teaching, or communicating with, the blind, deaf or mute
                • G09B21/001 Teaching or communicating with blind persons
                • G09B21/003 Teaching or communicating with blind persons using tactile presentation of the information, e.g. Braille displays
                • G09B21/005 Details of specially-adapted software to access information, e.g. to browse through hyperlinked information
                • G09B21/006 Teaching or communicating with blind persons using audible presentation of the information
                • G09B21/007 Teaching or communicating with blind persons using both tactile and audible presentation of the information
    • A HUMAN NECESSITIES
        • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
            • A61H PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
                • A61H3/00 Appliances for aiding patients or disabled persons to walk about
                • A61H3/06 Walking aids for blind persons
                • A61H3/061 Walking aids for blind persons with electronic detecting or guiding means
                • A61H2003/063 Walking aids for blind persons with electronic detecting or guiding means, with tactile perception
                • A61H2201/16 Physical interface with patient
                • A61H2201/1602 Physical interface with patient, kind of interface, e.g. head rest, knee support or lumbar support
                • A61H2201/1635 Hand or arm, e.g. handle
                • A61H2201/165 Wearable interfaces
                • A61H2201/50 Control means thereof
                • A61H2201/5023 Interfaces to the user
                • A61H2201/5048 Audio interfaces, e.g. voice or music controlled
                • A61H2201/5058 Sensors or detectors
                • A61H2201/5092 Optical sensor
                • A61H2201/5097 Control means thereof, wireless

Definitions

  • the disclosed subject matter relates to the field of navigation and more particularly but not exclusively to navigational aids for visually impaired people.
  • Some conventional systems are designed to transmit ultrasound in the direction of motion of the user, which is reflected back and received by the transmitting device upon hitting an obstacle in the path, up to a range of 3 meters. These systems produce audio signals of different frequencies as an alert. The intensity of the alert signal upon detecting an obstacle remains uniform as the obstacle approaches the user, so proper distance information is missing in these systems. Further, this technique has been found to be less effective in detecting obstacles that are angularly located with respect to such systems.
  • Some conventional systems use technologies such as GPS to assist an individual in navigation. Such systems can only provide navigation assistance at a macro level. However, a visually impaired person requires assistance at a micro level, such as feedback on obstacles present in his path.
  • In light of the foregoing discussion, there is a need for a technique that can help a visually impaired person navigate more effectively.
  • An embodiment discloses a system for aiding a visually impaired person to navigate.
  • the system includes at least one image capturing device, a processing device, and at least one feedback module.
  • the image capturing device captures a series of images and communicates the images to the processing device.
  • the processing device processes the images and identifies obstacles in the path taken by the person.
  • the processing device further determines the feedback to be provided to the person based on the identified obstacles.
  • the system further includes a sensor that transmits ultrasonic waves in the path taken by the person, which upon hitting any obstacle in the path, are reflected back to the sensor. Based on the time required by the reflected wave to reach the sensor, distance between the person and the obstacles is estimated, and a feedback is provided to the person.
  • the feedback module provides the feedback to the person in the form of audio feedback or haptic feedback.
  • An embodiment discloses a method for aiding a visually impaired person to navigate.
  • the method includes capturing a series of images, processing the images to identify obstacles in the path taken by the person, and determining feedback to be provided to the person, at least based on the identified obstacles.
  • distance of the obstacles from the person, and size of the obstacles are determined by transmitting ultrasonic waves and receiving reflected waves. Such determination is used to refine the feedback provided to the person.
  • FIG. 1 is an illustration of the exemplary system 100 for aiding a visually impaired person to navigate, in accordance with an embodiment
  • FIG. 1A illustrates an exemplary system 100, showing the circuit board 103 with the processing device 104 as a part of it and other units as peripheral devices, in accordance with an embodiment
  • FIG. 1B illustrates an exemplary system 100, showing the circuit board 103 with the peripheral devices, while the processing device 104 is provided as a separate unit, in accordance with an embodiment
  • FIG. 2 is an illustration of the exemplary system 100 that can be worn on the dorsal side of the hand between the fingers and wrist, in accordance with an embodiment
  • FIG. 3 illustrates an exemplary system 100, with the sensor 302 in addition to the image capturing device 102, the processing device 104 and the feedback module 108, in accordance with an embodiment
  • FIG. 4 is an illustration of the exemplary sensor 302 transmitting ultrasound 402 and receiving reflected wave 404 to enable distance measurement, in accordance with an embodiment
  • FIG. 5 illustrates the exemplary feedback module 108, connected with the processing device 104 by means of a well-known communication protocol, outputting audio signals, in accordance with an embodiment
  • FIG. 6 illustrates an exemplary vibration motor 108 mounted on the circuit board 103 and connected by well-known communication protocols through the slots provided on the circuit board 103, in accordance with an embodiment
  • FIG. 7 is an illustration of the exemplary method 700 for aiding a visually impaired person to navigate, in accordance with an embodiment; and [0019] FIG. 8 is an illustration of the exemplary method of identification of obstacles at step 706, in accordance with an embodiment.
  • An embodiment discloses a technique for aiding a visually impaired person to navigate.
  • the system includes at least one image capturing device, a processing device and at least one feedback module.
  • the image capturing device captures a series of images which can be referred to as videos, and transfers the series of images to the processing device.
  • the processing device processes the series of image frames and identifies obstacles in the path taken by the person.
  • the processing device determines the distance and position of the identified obstacle from the spatial arrangement of the obstacle at any particular instant, wherein the spatial arrangement at a particular instant changes due to the relative motion of the obstacle and the person. Additionally, the processing device also defines boundaries of obstacles in the path. Based on the processing, feedback to be provided is determined by the processing device.
  • the system further includes a sensor that is configured to transmit ultrasonic waves in the path taken by the person, which upon hitting any obstacle in the path, are reflected back to the sensor. Based on the time taken by the reflected wave to reach the sensor, a distance measure is estimated and a feedback is provided to the person.
  • the feedback module upon receiving feedback instructions provides the feedback to the person in the form of audio feedback or haptic feedback.
  • FIG. 1 is an illustration of the exemplary system 100 for aiding a visually impaired person to navigate, in accordance with an embodiment.
  • the system 100 can be developed on a single board 103 (FIG. 1A), such as, a Raspberry Pi.
  • the processing device 104 can be an integral part of the board 103.
  • the image capturing device 102 and the feedback module 108 are peripheral devices that can be integrated on the circuit board 103 by means of communication protocols, such as, Serial Peripheral Interface (SPI) bus, Universal Asynchronous Receiver/Transmitter (UART) and Universal Serial Bus (USB), among other such protocols. Slots are provided for integration of the peripheral devices on the circuit board 103.
  • FIG. 1A illustrates an exemplary system 100, showing the circuit board 103 with the processing device 104 as a part of it and other units as peripheral devices, in accordance with an embodiment.
  • the processing device 104 can be a part of a separate peripheral unit, such as, for example, a smart phone.
  • the image capturing device 102 and the feedback module 108 are integrated on the circuit board 103.
  • the image capturing device 102 and the feedback module 108 communicate with the processing device 104 by means of well-known wired or wireless communication protocols, such as IEEE 802.15.1 and WLAN, among others.
  • FIG. 1B illustrates an exemplary system 100, showing the circuit board 103 with the peripheral devices, while the processing device 104 is provided as a separate unit, in accordance with an embodiment.
  • all units of the system 100 can be a part of a single device, such as, a smart phone and a tablet, among other such devices.
  • the image capturing device 102, the processing device 104 and the feedback module 108 can be separate units connected together by means of well known communication protocols.
  • the system 100 can be attached to the cane of a visually impaired person.
  • the system 100 can be a wearable device. Examples of wearable devices include, wrist worn devices and head mounted devices, among others.
  • the system 100 can be a handheld or hand-mounted device.
  • FIG. 2 is an illustration of the exemplary system 100 that can be worn on the dorsal side of the hand between the fingers and wrist, in accordance with an embodiment.
  • the image capturing device 102 is a camera.
  • the image capturing device 102 captures a series of images, which may be referred to as videos, of the immediate environment in front of the visually impaired person.
  • the videos are captured and communicated to the processing device 104 for processing.
  • the image capturing device 102 can be an infrared camera.
  • control for the image capturing device 102 is provided in a user interface of the processing device 104.
  • a user can press or touch a control key for a predefined duration, which in turn, turns on the image capturing device 102.
  • the image capturing device 102 starts capturing images of the environment in front of the user.
  • the processing device 104 receives video streams from the image capturing device 102 for processing.
  • the videos are received and processed with various techniques.
  • Each video is a set of sequentially ordered image frames that are processed according to the instructions with which the processing device 104 is configured.
  • the processing device 104 is programmed such that, the images are processed to identify the presence of obstacles in the path taken by the visually impaired person and determine their position. Identification of obstacles further includes determining distance and boundaries of obstacles in the image with respect to the user.
  • the processing device 104 comprises sets of instructions to perform various operations such as optical flow, edge detection and template matching among other such techniques, for processing the images.
  • template matching is used for face recognition.
  • the processing device, upon recognising the existence of a face, may prompt the user to provide an identifier, such as a name, for that face. Faces that have been assigned identifiers are stored in memory. Subsequently, if the system detects a face that has been identified previously, the system provides a feedback to the user.
  • the feedback can be an audio feedback indicating the presence of a person, who was previously identified.
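For illustration only (this sketch is not part of the patent disclosure), the template matching mentioned above can be approximated with a minimal normalized cross-correlation matcher over plain 2-D lists; the function name, the 0.9 threshold, and the toy data are all assumptions, and a practical face recogniser would be far more involved:

```python
import math

def match_template(frame, template, threshold=0.9):
    """Slide `template` over `frame` (2-D lists of numbers) and return the
    top-left position of the best match if its normalized cross-correlation
    exceeds `threshold`, else None."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    t_mean = sum(map(sum, template)) / (th * tw)
    t = [[v - t_mean for v in row] for row in template]
    t_norm = math.sqrt(sum(v * v for row in t for v in row))
    best_score, best_pos = -1.0, None
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            win = [row[x:x + tw] for row in frame[y:y + th]]
            w_mean = sum(map(sum, win)) / (th * tw)
            wz = [[v - w_mean for v in row] for row in win]
            denom = math.sqrt(sum(v * v for row in wz for v in row)) * t_norm
            if denom == 0:
                continue  # flat window: correlation undefined, skip
            score = sum(wz[i][j] * t[i][j]
                        for i in range(th) for j in range(tw)) / denom
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos if best_score >= threshold else None
```

A stored face template that matches above the threshold would then trigger the audio feedback described above.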
  • the processing device 104 gathers spatial information about the environment in the immediate image to identify non-uniformity in pixel intensity levels in the image.
  • the processing device 104, upon detecting non-uniformity, can define a boundary for an object.
  • the processing device 104 defines the boundary of an object by detecting edges in the images. Edges are detected from abrupt changes in the intensity of neighbouring pixels; these intensity discontinuities are used to extract features such as corners, lines and curves from the image. The extracted features can thus be used to determine the presence of obstacles in the path.
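As a hedged sketch of the edge detection described above (the patent does not specify an operator; the Sobel kernels, function name and threshold below are illustrative assumptions), boundary pixels can be flagged where the local intensity gradient is large:

```python
def edge_map(image, threshold=1.0):
    """Return a 2-D boolean map marking pixels of `image` (a 2-D list)
    where the Sobel gradient magnitude exceeds `threshold`."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]  # horizontal gradient kernel
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]  # vertical gradient kernel
    h, w = len(image), len(image[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[i][j] * image[y - 1 + i][x - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(ky[i][j] * image[y - 1 + i][x - 1 + j]
                     for i in range(3) for j in range(3))
            edges[y][x] = (gx * gx + gy * gy) ** 0.5 > threshold
    return edges
```

Connected runs of flagged pixels would correspond to the obstacle boundaries the patent refers to.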
  • the processing device 104 is further programmed to locate objects and detect movements of objects. Sequences of ordered images allow the estimation of motion of objects with respect to the visually impaired person.
  • the processing device 104 forms a pattern of motion of objects captured in the sequence of image frames caused by the relative motion between the object and the user.
  • the apparent motion of objects is based on the assumption that pixels in an image of an object taken at time t will be displaced in the image of the same object taken at time t+1. For example, if the pixel coordinates in an image captured at t are (x, y, z), the pixel coordinates in the image taken at time t+1 will be (x+1, y+1, z+1) due to the relative motion of the object and the user.
  • a change in the spatial arrangement in the subsequent frames with respect to a previous frame(s) in a sequence of image frames can give a measure of the distance of object and the position of the object with respect to the user.
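The inter-frame displacement described above can be illustrated with a crude global block-matching estimate; this is an assumed simplification (the patent does not give an algorithm), minimising the mean absolute difference over small integer shifts:

```python
def estimate_shift(prev, curr, max_shift=2):
    """Estimate the (dy, dx) displacement of frame `curr` relative to
    frame `prev` (2-D lists) by minimising the mean absolute difference
    over small integer shifts; out-of-frame pixels are ignored."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = n = 0
            for y in range(h):
                for x in range(w):
                    if 0 <= y + dy < h and 0 <= x + dx < w:
                        err += abs(curr[y + dy][x + dx] - prev[y][x])
                        n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dy, dx)
    return best
```

A growing displacement of an object's pixels between consecutive frames would indicate relative motion toward or past the user.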
  • the system 100 includes a sensor 302.
  • FIG. 3 illustrates an exemplary system 100, with the sensor 302 in addition to the image capturing device 102, the processing device 104 and the feedback module 108, in accordance with an embodiment.
  • the sensor 302 can be a peripheral device that can be integrated on the circuit board 103 by means of well-known communication protocols.
  • the sensor 302 can be a part of a single device, such as, a smart phone or a tablet, among other devices.
  • the sensor 302 can be a separate unit connected to the other units of the system 100 by means of well known communication protocols.
  • the sensor 302 is configured to estimate the distance of obstacles from the user, in the direction of motion of the visually impaired person.
  • the sensor 302 transmits ultrasonic waves 402 in the path taken by the person at pre-programmed instances.
  • the ultrasonic waves 402 reflect back upon hitting any object on the path.
  • the time required by the reflected wave 404 to reach the sensor 302 is used to measure the distance between the object 406 and the user.
  • the reflected waves 404 are used to determine the shape and nature of the objects.
  • FIG. 4 is an illustration of the exemplary sensor 302 transmitting ultrasound 402 and receiving reflected wave 404 to enable distance measurement, in accordance with an embodiment.
  • In an embodiment, the interval at which the ultrasonic waves are radiated can be 50 milliseconds.
  • the sensor 302 may require trigger pulses to be applied on a periodic basis for the ultrasound to be radiated for distance measurement.
  • the trigger pulses can be applied by external devices, such as an IC 555 timer connected to the sensor 302 through GPIO pins provided on the circuit board 103.
  • the sensor 302 can detect any obstacle in the direction of motion of the user up to a distance of 3 meters from the sensor 302.
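The time-of-flight computation implied by the sensor description is straightforward: the round-trip echo time, halved and multiplied by the speed of sound, gives the one-way distance. The sketch below assumes sound travels at roughly 343 m/s in air (a figure the patent does not state), and the function names are illustrative:

```python
SPEED_OF_SOUND_M_S = 343.0  # assumed: speed of sound in air at ~20 °C

def echo_to_distance(echo_time_s):
    """Convert the round-trip time of a reflected ultrasonic pulse
    into the one-way distance to the obstacle, in meters."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

def within_range(echo_time_s, max_range_m=3.0):
    """True if the obstacle lies inside the patent's stated ~3 m range."""
    return echo_to_distance(echo_time_s) <= max_range_m
```

For example, a 10 ms echo corresponds to an obstacle roughly 1.7 m away, well within the 3 m detection range.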
  • the sensor 302 is configured to detect obstacles lying in line with the sensor 302.
  • the sensor 302 detects obstacles in the direction of motion of the user, where the obstacle lies within an angle of 15 degrees to 20 degrees with respect to the line of alignment of the sensor 302.
  • the feedback module 108 is configured to output an alert signal to guide the visually impaired person to change direction and avoid collision if any obstacle is detected on the path taken by the visually impaired person.
  • the feedback module 108 receives signal from the processing device 104 and the sensor 302 based on obstacles identified on the path.
  • the feedback module 108 can output alert signal based on instructions received from the processing device 104.
  • the feedback module 108 is an earphone connected to the processing device 104 by means of wired or wireless communication protocols.
  • Other feedback modules 108 can be headsets and headphones.
  • the feedback provided to the user with the help of the earpiece is received in the form of audio signals.
  • FIG. 5 illustrates the exemplary feedback module 108, connected with the processing device 104 by means of well known communication protocol, outputting audio signals, in accordance with an embodiment.
  • the feedback module 108 is a vibration motor (haptic feedback) that produces vibration to alert users of obstacles.
  • vibration motors are integrated on the circuit board connected to the processing device 104 and the sensor 302 through GPIO pins on the circuit board 103.
  • the intensity of the alert signal increases as the obstacle approaches the visually impaired person and the intensity may be maximum when the obstacle is closest to the person.
  • FIG. 6 illustrates an exemplary vibration motor 108 mounted on the circuit board 103 connected with well known communication protocols through the slots provided on the circuit board 103, in accordance with an embodiment.
  • the processing device 104 can send a feedback to the left vibration motor to output the alert signal and similarly for obstacles detected on the right of the visually impaired person, the processing device 104 can send a feedback to the right vibration motor to output the alert signal.
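The left/right routing and distance-scaled intensity described above can be sketched as a small mapping function; the linear intensity ramp and all names below are assumptions for illustration, not part of the disclosure:

```python
def haptic_feedback(distance_m, side, max_range_m=3.0):
    """Map an obstacle's distance and side to a (motor, intensity) pair.
    Intensity grows linearly as the obstacle approaches, reaching its
    maximum (1.0) when the obstacle is closest; out-of-range -> no alert."""
    if distance_m > max_range_m:
        return None  # beyond detection range: no vibration
    intensity = 1.0 - distance_m / max_range_m  # 0.0 far .. 1.0 touching
    motor = "left" if side == "left" else "right"
    return motor, round(intensity, 2)
```

An obstacle detected 1.5 m away on the right would thus drive the right motor at half intensity.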
  • FIG. 7 is an illustration of the exemplary method 700 for aiding a visually impaired person to navigate, in accordance with an embodiment.
  • the method 700 includes, at step 702, capturing a video through an image capturing device; at step 704, processing the video received from the image capturing device at a processing device; at step 706, identifying obstacles in the path taken by the person based on processing; and at step 708, providing feedback to the person at least based on the identified obstacles.
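The four steps of method 700 can be sketched as one loop iteration over stand-in components; the stub classes below (Camera, Processor, Feedback, and their toy obstacle test) are hypothetical placeholders for devices 102, 104 and 108, invented purely to make the control flow concrete:

```python
class Camera:
    """Stand-in for the image capturing device 102 (hypothetical stub)."""
    def capture(self):
        return [[0, 0, 9], [0, 0, 9]]  # toy frame with a bright region

class Processor:
    """Stand-in for the processing device 104: flags bright pixels."""
    def identify(self, frame):
        return [(y, x) for y, row in enumerate(frame)
                for x, v in enumerate(row) if v > 5]

class Feedback:
    """Stand-in for the feedback module 108: records alerts issued."""
    def __init__(self):
        self.alerts = []
    def alert(self, obstacles):
        self.alerts.append(obstacles)

def navigate_step(camera, processor, feedback):
    """One pass of method 700: capture (702), process and identify
    obstacles (704-706), and provide feedback (708)."""
    frame = camera.capture()               # step 702
    obstacles = processor.identify(frame)  # steps 704-706
    if obstacles:
        feedback.alert(obstacles)          # step 708
    return obstacles
```

Repeating `navigate_step` at the capture rate would give the continuous guidance the method describes.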
  • the video is captured at step 702 using the image capturing device 102.
  • the image capturing device 102 captures videos of the immediate environment in front of the user.
  • the videos are captured and transferred to the processing device 104 for detecting obstacles.
  • the image capturing device 102 is controlled by the processing device 104 through a control key provided in the user interface of processing device 104. A user can press or touch the control key, which in turn, activates the image capturing device 102. Once activated, the video mode of the image capturing device 102 starts capturing videos of the environment in front of the user.
  • the captured videos are communicated to the processing device 104.
  • the videos are received and processed at 704.
  • Each video is a set of sequentially ordered image frames.
  • Each image frame is processed sequentially at step 704 to perform operations on the image frames.
  • the processing device 104 is programmed such that it identifies obstacles at step 706 in the path taken by the user.
  • the step 706 further includes determining the position of obstacles at step 802, determining the distance of obstacles from the user at step 804 and defining boundaries of the obstacles at step 806 in the images with respect to the user. This information about the obstacle is communicated to the feedback module 108.
  • FIG. 8 is an illustration of the exemplary method of identification of obstacles at step 706, in accordance with an embodiment.
  • the processing device 104 comprises sets of instructions to perform various operations, such as, optical flow, edge detection and template matching, among other such techniques, for processing the images.
  • template matching is used for face recognition.
  • the processing device 104 upon recognising existence of a face, may prompt the user to provide an identifier, such as name, for that face. Faces, which have been provided identifiers, are stored in memory. Subsequently, if the system 100 detects existence of a face, which has been identified previously, then the system provides a feedback to the user.
  • the feedback can be an audio feedback indicating the presence of a person, who was previously identified.
  • identifying obstacles at step 706 comprises gathering information about the spatial arrangement of the immediate image to identify non-uniformity in pixel intensity levels in the image.
  • the processing device 104 gathers such information and, upon detection of non-uniformity, a boundary for an obstacle can be defined.
  • boundaries are defined by detecting edges.
  • Edges are detected from abrupt changes in the intensity of neighbouring pixels in an image.
  • these intensity discontinuities in an image are used to extract features such as corners, lines and curves from its edges.
  • the features extracted can thus be used to determine presence of obstacles in the path.
  • the processing device 104 is further programmed to locate objects and detect movements in objects in an image in addition to detecting objects. Sequences of ordered images allow the estimation of motion of objects with respect to the visually impaired person.
  • the processing device 104 forms a pattern of the motion of objects captured in the sequence of image frames, caused by the relative motion between the object and the user.
  • the apparent motion of objects is based on the assumption that objects in an image taken at time t will be displaced in the image taken at time t+1, but will generally be still there in the image. For example, if the coordinates of any object in an image captured at t are (x, y, z), the coordinates of the same object taken at time t+1 will be (x+1, y+1, z+1) due to the relative motion of the object and the user.
  • a change in the spatial arrangement in the subsequent frames with respect to a first frame in a sequence of image frames can give a measure of the distance of object and the position of the object with respect to the user.
  • the sensor 302 estimates a distance measure for obstacles with the sensor 302, in the direction of motion of the user.
  • Ultrasonic waves in the path taken by the person are transmitted at regular intervals, such as, every 50 milliseconds.
  • the ultrasonic waves reflect back upon hitting any object on the path.
  • the time required by the reflected wave to reach the transmitting device is used to measure the distance between the object and the person.
  • the sensor 302 requires trigger pulses to be applied on a periodic basis for the ultrasound to be radiated for distance measurement.
  • the trigger pulses can be applied by external devices, such as, IC 555 timer connected through GPIO pins provided on the circuit board 103.
  • the sensor 302 can detect any obstacle in the direction of motion of the user up to a distance of 3 meters from the sensor 302.
  • the sensor 302 detects obstacles lying in line with the sensor 302.
  • the sensor 302 can also detect obstacles in the direction of motion of the user where the obstacle lies within an angle of 15 degrees to 30 degrees with respect to the line of motion of the sensor 302.
  • the feedback module 108 provides a feedback at step 708 by alerting the visually impaired person to change direction and avoid collision if any obstacle is detected on the path taken by the user.
  • the feedback module 108 receives instructions from the processing device 104 and the sensor 302 based on obstacles identified on the path.
  • the feedback module 108 can output an alert signal based on instructions received from the processing device 104 regarding the distance and position of an obstacle.
  • the feedback provided at step 708 to the user can be in the form of audio signals.
  • the feedback provided at step 708 to the user can be in the form of vibration by using vibration motors.
  • vibration motors are integrated on the circuit board connected to the processing device 104 and the sensor 302 through the GPIO pins on the circuit board 103.
  • the intensity of the alert signal increases as the obstacle approaches the visually impaired person and the intensity is maximum when the obstacle is closest to the visually impaired person.
  • the processing device 104 or the sensor 302 can send a feedback to the left vibration motor to output the alert signal and similarly for obstacles detected on the right of the visually impaired person, the processing device 104 or the sensor 302 can send a feedback to the right vibration motor to output the alert signal.


Abstract

System and method for aiding a visually impaired person to navigate are provided. The system (100) includes an image capturing device (102), a processing device (104) and a feedback module (108). The image capturing device (102) is configured to capture a series of images. The processing device (104) is configured to receive input from the image capturing device, identify obstacles in the path taken by the person by processing the input received from the image capturing device (102), and determine feedback to be provided to the person, at least based on the identified obstacles. The feedback module (108) is configured to provide the feedback to the person.

Description

SYSTEM AND METHOD FOR AIDING A VISUALLY IMPAIRED PERSON TO
NAVIGATE
BACKGROUND
Field
[0001] The disclosed subject matter relates to the field of navigation and more particularly but not exclusively to navigational aids for visually impaired people.
Discussion of related field
[0002] Of all sensory activities, sight has always been considered to have the greatest influence on perception. For the visually impaired, the challenge lies in performing routine tasks, such as mobility, way finding, and interacting with the environment and people, among others. Measures such as teaching visually impaired people the use of canes and guide dogs for mobility, and Braille codes for communication, to live independent lives have proved successful to a certain extent over the decades, but the dependency of a visually impaired person on other individuals has not been completely eliminated.
[0003] Research for designing and developing assisting devices for the visually impaired has its focus mainly on areas of navigational support, as problems related to mobility are challenging.
[0004] Some conventional systems are designed to transmit ultrasound in the direction of motion of the user, which is reflected back and received by the transmitting device upon hitting an obstacle in the path, up to a range of 3 meters. These systems produce audio signals of different frequencies as an alert. The intensity of the alert signal upon detecting an obstacle remains uniform as the obstacle approaches the user. As a result, proper distance information is missing in these systems. Further, this technique has been found to be less effective in detecting obstacles which are angularly located with respect to such systems.
[0005] Some conventional systems use technologies, such as GPS, to assist an individual in navigation. Such systems can only provide navigation assistance at a macro level. However, a visually impaired person requires assistance at a micro level, such as feedback on obstacles present in his or her path.
[0006] In light of the foregoing discussion, there is a need for a technique that can help a visually impaired person navigate more effectively.
SUMMARY
[0007] An embodiment discloses a system for aiding a visually impaired person to navigate. The system includes at least one image capturing device, a processing device, and at least one feedback module. The image capturing device captures a series of images and communicates the images to the processing device. The processing device processes the images and identifies obstacles in the path taken by the person. The processing device further determines the feedback to be provided to the person based on the identified obstacles. The system further includes a sensor that transmits ultrasonic waves in the path taken by the person, which upon hitting any obstacle in the path, are reflected back to the sensor. Based on the time required by the reflected wave to reach the sensor, distance between the person and the obstacles is estimated, and a feedback is provided to the person. The feedback module provides the feedback to the person in the form of audio feedback or haptic feedback.
[0008] An embodiment discloses a method for aiding a visually impaired person to navigate. The method includes capturing a series of images, processing the images to identify obstacles in the path taken by the person, and determining feedback to be provided to the person, at least based on the identified obstacles. In addition, distance of the obstacles from the person, and size of the obstacles, are determined by transmitting ultrasonic waves and receiving reflected waves. Such determination is used to refine the feedback provided to the person.
BRIEF DESCRIPTION OF DRAWINGS
[0009] Embodiments are illustrated by way of example and not limitation in the Figures of the accompanying drawings, in which like references indicate similar elements and in which:
[0010] FIG. 1 is an illustration of the exemplary system 100 for aiding a visually impaired person to navigate, in accordance with an embodiment;
[0011] FIG. 1A illustrates an exemplary system 100, showing the circuit board 103 with the processing device 104 as a part of it and other units as peripheral devices, in accordance with an embodiment;
[0012] FIG. IB illustrates an exemplary system 100, showing the circuit board 103 with the peripheral devices, while the processing device 104 is provided as a separate unit, in accordance with an embodiment;
[0013] FIG. 2 is an illustration of the exemplary system 100 that can be worn on the dorsal side of the hand between the fingers and wrist, in accordance with an embodiment;
[0014] FIG. 3 illustrates an exemplary system 100, with the sensor 302 in addition to the image capturing device 102, the processing device 104 and the feedback module 108, in accordance with an embodiment;
[0015] FIG. 4 is an illustration of the exemplary sensor 302 transmitting ultrasound 402 and receiving reflected wave 404 to enable distance measurement, in accordance with an embodiment;
[0016] FIG. 5 illustrates the exemplary feedback module 108, connected with the processing device 104 by means of well known communication protocol, outputting audio signals, in accordance with an embodiment;
[0017] FIG. 6 illustrates an exemplary vibration motor 108 mounted on the circuit board 103 connected with well known communication protocols through the slots provided on the circuit board 103, in accordance with an embodiment;
[0018] FIG. 7 is an illustration of the exemplary method 700 for aiding a visually impaired person to navigate, in accordance with an embodiment; and
[0019] FIG. 8 is an illustration of the exemplary method of identification of obstacles at step 706, in accordance with an embodiment.
DETAILED DESCRIPTION
I. OVERVIEW
II. EXEMPLARY SYSTEM
III. EXEMPLARY METHOD
IV. CONCLUSION
I. OVERVIEW
[0020] An embodiment discloses a technique for aiding a visually impaired person to navigate. The system includes at least one image capturing device, a processing device and at least one feedback module. The image capturing device captures a series of images which can be referred to as videos, and transfers the series of images to the processing device. The processing device processes the series of image frames and identifies obstacles in the path taken by the person. The processing device determines the distance and position of the identified obstacle from the spatial arrangement of the obstacle at any particular instant, wherein the spatial arrangement at a particular instant changes due to the relative motion of the obstacle and the person. Additionally, the processing device also defines boundaries of obstacles in the path. Based on the processing, feedback to be provided is determined by the processing device. The system further includes a sensor that is configured to transmit ultrasonic waves in the path taken by the person, which upon hitting any obstacle in the path, are reflected back to the sensor. Based on the time taken by the reflected wave to reach the sensor, a distance measure is estimated and a feedback is provided to the person. The feedback module upon receiving feedback instructions provides the feedback to the person in the form of audio feedback or haptic feedback.
[0021] The following detailed description includes references to the accompanying drawings, which form part of the detailed description. The drawings show illustrations in accordance with example embodiments. These example embodiments are described in enough detail to enable those skilled in the art to practice the present subject matter. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments. The embodiments can be combined, other embodiments can be utilized or structural, logical, and electrical changes can be made without departing from the scope of the invention. The following detailed description is, therefore, not to be taken as a limiting sense.
[0022] In this document, the terms "a" or "an" are used, as is common in patent documents, to include one or more than one. In this document, the term "or" is used to refer to a nonexclusive "or," such that "A or B" includes "A but not B," "B but not A," and "A and B," unless otherwise indicated.
II. EXEMPLARY SYSTEM
[0023] An embodiment discloses a system 100 for aiding a visually impaired person to navigate. The system 100 includes an image capturing device 102, a processing device 104 and a feedback module 108. FIG. 1 is an illustration of the exemplary system 100 for aiding a visually impaired person to navigate, in accordance with an embodiment.
[0024] In an embodiment, the system 100 can be developed on a single board 103 (FIG. 1A), such as, a Raspberry Pi. The processing device 104 can be an integral part of the board 103. The image capturing device 102 and the feedback module 108 are peripheral devices that can be integrated on the circuit board 103 by means of communication protocols, such as, Serial Peripheral Interface (SPI) bus, Universal Asynchronous Receiver/Transmitter (UART) and Universal Serial Bus (USB), among other such protocols. Slots are provided for integration of the peripheral devices on the circuit board 103. FIG. 1A illustrates an exemplary system 100, showing the circuit board 103 with the processing device 104 as a part of it and other units as peripheral devices, in accordance with an embodiment.
[0025] In an embodiment, the processing device 104 can be a part of a separate peripheral unit, such as, for example, a smart phone. The image capturing device 102 and the feedback module 108 are integrated on the circuit board 103. The image capturing device 102 and the feedback module 108 communicate with the processing device 104 by means of well known wired or wireless communication protocols, such as IEEE 802.15.1 and WLAN, among others. FIG. IB illustrates an exemplary system 100, showing the circuit board 103 with the peripheral devices, while the processing device 104 is provided as a separate unit, in accordance with an embodiment.
[0026] In another embodiment, all units of the system 100 can be a part of a single device, such as, a smart phone and a tablet, among other such devices.
[0027] In another embodiment, the image capturing device 102, the processing device 104 and the feedback module 108 can be separate units connected together by means of well known communication protocols.
[0028] In an embodiment, the system 100 can be attached to the cane of a visually impaired person. In another embodiment, the system 100 can be a wearable device. Examples of wearable devices include, wrist worn devices and head mounted devices, among others. In another embodiment, the system 100 can be a handheld or hand-mounted device. FIG. 2 is an illustration of the exemplary system 100 that can be worn on the dorsal side of the hand between the fingers and wrist, in accordance with an embodiment.
[0029] In an embodiment, the image capturing device 102 is a camera. The image capturing device 102 captures a series of images, which may be referred to as videos, of the immediate environment in front of the visually impaired person. The videos are captured and communicated to the processing device 104 for processing. The image capturing device 102 can be an infrared camera.
[0030] In an embodiment, the control for the image capturing device 102 is provided in a user interface of the processing device 104. A user can press or touch a control key for a predefined duration, which in turn, turns on the image capturing device 102. Once activated, the image capturing device 102 starts capturing images of the environment in front of the user.
[0031] The processing device 104 receives video streams from the image capturing device 102 for processing. The videos are received and processed using various techniques. Each video is a set of sequentially ordered image frames, which are processed according to the instructions the processing device 104 is configured with.
[0032] In an embodiment, the processing device 104 is programmed such that, the images are processed to identify the presence of obstacles in the path taken by the visually impaired person and determine their position. Identification of obstacles further includes determining distance and boundaries of obstacles in the image with respect to the user. The processing device 104 comprises sets of instructions to perform various operations such as optical flow, edge detection and template matching among other such techniques, for processing the images.
[0033] In an embodiment, template matching is used for face recognition. The processing device, upon recognising existence of a face, may prompt the user to provide an identifier, such as name, for that face. Faces, which have been provided identifiers, are stored in memory. Subsequently, if the system detects existence of a face, which has been identified previously, then the system provides a feedback to the user. The feedback can be an audio feedback indicating the presence of a person, who was previously identified.
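The template-matching approach to face recognition described above can be pictured with a minimal sketch. The disclosure names no library or data structure, so the normalized cross-correlation below, the `known_faces` dictionary, the function names and the 0.9 threshold are all illustrative assumptions rather than the disclosed implementation:

```python
import numpy as np

known_faces = {}  # identifier -> 2-D grayscale template array (assumed store)

def ncc(patch, template):
    """Normalized cross-correlation between two equal-shape arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0

def match_known_faces(frame, threshold=0.9):
    """Slide each stored template over the frame; return the identifiers
    whose best normalized correlation exceeds the threshold."""
    found = []
    h, w = frame.shape
    for name, tpl in known_faces.items():
        th, tw = tpl.shape
        best = 0.0
        for y in range(h - th + 1):
            for x in range(w - tw + 1):
                best = max(best, ncc(frame[y:y + th, x:x + tw], tpl))
        if best >= threshold:
            found.append(name)
    return found
```

In practice the nested loops would be replaced by a library routine such as OpenCV's `cv2.matchTemplate`, and a matched identifier would trigger the audio feedback announcing the previously identified person.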
[0034] In an embodiment, the processing device 104 gathers spatial information about the environment from the immediate image to identify non-uniformity in pixel intensity levels in the image. Upon detecting non-uniformity, the processing device 104 can define a boundary for an object.
[0035] The processing device 104 defines a boundary for an object by detecting edges in the images. Edges are detected from abrupt changes in the intensity of neighbouring pixels in an image. The changes in pixel intensity reveal the discontinuities in an image, from which features such as corners, lines and curves are extracted. The features extracted can thus be used to determine the presence of obstacles in the path.
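The edge-detection step can be illustrated with a minimal gradient sketch. The disclosure does not fix an operator, so the central-difference scheme, the function name and the 100-level threshold below are assumptions:

```python
import numpy as np

def gradient_edges(img, threshold=100.0):
    """Boolean edge mask: mark pixels where intensity changes abruptly
    between neighbouring pixels, i.e. candidate object boundaries."""
    img = np.asarray(img, dtype=float)        # avoid unsigned-int wraparound
    gx = np.zeros(img.shape, dtype=float)
    gy = np.zeros(img.shape, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]    # horizontal central difference
    gy[1:-1, :] = img[2:, :] - img[:-2, :]    # vertical central difference
    magnitude = np.hypot(gx, gy)              # gradient magnitude per pixel
    return magnitude > threshold
```

Corners, lines and curves would then be extracted from this mask; a deployed system would more likely use Sobel or Canny operators from an image-processing library.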
[0036] In an embodiment, the processing device 104 is further programmed to locate objects and detect movements of objects. Sequences of ordered images allow the estimation of motion of objects with respect to the visually impaired person. The processing device 104 forms a pattern of motion of objects captured in the sequence of image frames caused by the relative motion between the object and the user.
[0037] The apparent motion of objects is based on the assumption that pixels in an image of an object taken at time t will be displaced in the image of the same object taken at time t+1. For example, if the pixel coordinates in an image captured at t are (x, y, z), the pixel coordinates in the image taken at time t+1 will be (x+1, y+1, z+1) due to the relative motion of the object and the user. A change in the spatial arrangement in the subsequent frames with respect to a previous frame(s) in a sequence of image frames can give a measure of the distance of object and the position of the object with respect to the user.
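The frame-to-frame displacement idea above can be shown with a deliberately brute-force sketch: search for the translation that best aligns two consecutive frames. Real optical-flow methods are far more refined, and the function name, search radius and wrap-around handling here are toy assumptions:

```python
import numpy as np

def estimate_shift(frame_t, frame_t1, max_shift=3):
    """Exhaustively search for the (dy, dx) translation that best maps
    the frame at time t onto the frame at time t+1 (sum of absolute
    differences).  Wrap-around from np.roll is ignored for this toy."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(frame_t, dy, axis=0), dx, axis=1)
            err = np.abs(shifted - frame_t1).sum()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best
```

The recovered shift per frame pair is the "pattern of motion" from which the relative position and approach of an obstacle can be inferred.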
[0038] In an embodiment, the system 100 includes a sensor 302. FIG. 3 illustrates an exemplary system 100, with the sensor 302 in addition to the image capturing device 102, the processing device 104 and the feedback module 108, in accordance with an embodiment.
[0039] In an embodiment, the sensor 302 can be a peripheral device that can be integrated on the circuit board 103 by means of well known communication protocols. In another embodiment, the sensor 302 can be a part of a single device, such as, a smart phone or a tablet, among other devices. In yet another embodiment, the sensor 302 can be a separate unit connected to the other units of the system 100 by means of well known communication protocols.
[0040] In an embodiment, the sensor 302 is configured to estimate a distance at which the obstacles are from the user, in the direction of motion of the visually impaired person. The sensor 302 transmits ultrasonic waves 402 in the path taken by the person at pre-programmed instances. The ultrasonic waves 402 reflect back upon hitting any object on the path. The time required by the reflected wave 404 to reach the sensor 302 is used to measure the distance between the object 406 and the user. In an embodiment, the reflected waves 404 are used to determine the shape and nature of the objects. FIG. 4 is an illustration of the exemplary sensor 302 transmitting ultrasound 402 and receiving reflected wave 404 to enable distance measurement, in accordance with an embodiment.
[0041] In an embodiment, the interval at which the ultrasonic waves are radiated can be 50 milliseconds. The sensor 302 may require trigger pulses to be applied on a periodic basis for the ultrasound to be radiated for distance measurement. The trigger pulses can be applied by external devices, such as, an IC 555 timer connected to the sensor 302 through GPIO pins provided on the circuit board 103.
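The time-of-flight arithmetic behind the distance measurement is simple: the pulse travels to the obstacle and back, so the one-way distance is half the round trip. A sketch, assuming sound travels at roughly 343 m/s in air at room temperature (the specific sensor module and its GPIO wiring are not fixed by the disclosure, so the names below are illustrative):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at ~20 degrees C (assumed)

def echo_time_to_distance_m(round_trip_s):
    """Distance to the obstacle from the echo round-trip time: the
    ultrasonic pulse travels out and back, so halve the path."""
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

def within_range(round_trip_s, max_range_m=3.0):
    """The disclosure states a 3-metre detection range for sensor 302."""
    return echo_time_to_distance_m(round_trip_s) <= max_range_m
```

On a Raspberry Pi, the round-trip time would typically be measured by timing the echo pin of an HC-SR04-style module after pulsing its trigger pin, e.g. via the GPIO pins mentioned above.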
[0042] In an embodiment, the sensor 302 can detect any obstacle in the direction of motion of the user up to a distance of 3 meters from the sensor 302. The sensor 302 is configured to detect obstacles lying in line with the sensor 302.
[0043] In an embodiment, the sensor 302 detects obstacles in the direction of motion of the user, where the obstacle lies within an angle of 15 degrees to 20 degrees with respect to the line of alignment of the sensor 302.
[0044] In an embodiment, the feedback module 108 is configured to output an alert signal to guide the visually impaired person to change direction and avoid collision if any obstacle is detected on the path taken by the visually impaired person.
[0045] The feedback module 108 receives signals from the processing device 104 and the sensor 302 based on obstacles identified on the path. The feedback module 108 can output an alert signal based on instructions received from the processing device 104.
[0046] In an embodiment, the feedback module 108 is an earphone connected to the processing device 104 by means of wired or wireless communication protocols. Other feedback modules 108 can be headsets and headphones. The feedback provided to the user with the help of the earpiece is received in the form of audio signals. FIG. 5 illustrates the exemplary feedback module 108, connected with the processing device 104 by means of well known communication protocol, outputting audio signals, in accordance with an embodiment.
[0047] In another embodiment, the feedback module 108 is a vibration motor (haptic feedback) that produces vibration to alert users of obstacles. These vibration motors are integrated on the circuit board connected to the processing device 104 and the sensor 302 through GPIO pins on the circuit board 103. In an embodiment, the intensity of the alert signal increases as the obstacle approaches the visually impaired person and the intensity may be maximum when the obstacle is closest to the person. FIG. 6 illustrates an exemplary vibration motor 108 mounted on the circuit board 103 connected with well known communication protocols through the slots provided on the circuit board 103, in accordance with an embodiment.
[0048] In an embodiment, there can be at least two vibration motors, one on the left and the other on the right in the system 100. For obstacles detected on the left of the visually impaired person, the processing device 104 can send a feedback to the left vibration motor to output the alert signal and similarly for obstacles detected on the right of the visually impaired person, the processing device 104 can send a feedback to the right vibration motor to output the alert signal.
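The left/right routing and distance-scaled intensity described in the two paragraphs above can be sketched as a pure mapping from an obstacle's position and distance to a motor choice and a PWM duty cycle. The normalised-position convention, the linear intensity ramp and the function name are illustrative assumptions; driving the actual motors would go through the GPIO pins on the circuit board 103:

```python
def vibration_command(obstacle_x_norm, distance_m, max_range_m=3.0):
    """Map an obstacle's horizontal position (0.0 = far left of the
    frame, 1.0 = far right) and its distance to a motor selection and
    a PWM duty cycle.  Intensity grows as the obstacle nears, peaking
    when it is closest, and falls to zero at the edge of range."""
    motor = "left" if obstacle_x_norm < 0.5 else "right"
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_range_m))
    duty_cycle = round(100 * closeness)  # 0 (far) .. 100 (closest)
    return motor, duty_cycle
```

The returned duty cycle could then be fed to a PWM output (e.g. `RPi.GPIO`'s `ChangeDutyCycle`) on the pin wired to the chosen vibration motor.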
III. EXEMPLARY METHOD
[0049] An embodiment discloses a method 700 for aiding a visually impaired person to navigate. FIG. 7 is an illustration of the exemplary method 700 for aiding a visually impaired person to navigate, in accordance with an embodiment. The method 700 includes, at step 702, capturing a video through an image capturing device; at step 704, processing the video received from the image capturing device at a processing device; at step 706, identifying obstacles in the path taken by the person based on processing; and at step 708, providing feedback to the person at least based on the identified obstacles.
[0050] In an embodiment, the video is captured at step 702 using the image capturing device 102. The image capturing device 102 captures videos of the immediate environment in front of the user. The videos are captured and transferred to the processing device 104 for detecting obstacles.
[0051] In an embodiment, the image capturing device 102 is controlled by the processing device 104 through a control key provided in the user interface of processing device 104. A user can press or touch the control key, which in turn, activates the image capturing device 102. Once activated, the video mode of the image capturing device 102 starts capturing videos of the environment in front of the user.
[0052] In an embodiment, the captured videos are communicated to the processing device 104. The videos are received and processed at 704. Each video is a set of sequentially ordered image frames. Each image frame is processed sequentially at step 704 to perform operations on the image frames.
[0053] In an embodiment, the processing device 104 is programmed such that it identifies obstacles at step 706 in the path taken by the user. The step 706 further includes determining the position of obstacles at step 802, determining the distance of obstacles from the user at step 804 and defining boundaries of the obstacles at step 806 in the images with respect to the user. This information about the obstacle is communicated to the feedback module 108. FIG. 8 is an illustration of the exemplary method of identification of obstacles at step 706, in accordance with an embodiment. The processing device 104 comprises sets of instructions to perform various operations, such as, optical flow, edge detection and template matching, among other such techniques, for processing the images.
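The sub-steps of step 706 (position at step 802, distance at step 804, boundaries at step 806) can be tied together in a per-frame loop. The skeleton below takes the edge detector and motion estimator as parameters, since the disclosure does not fix particular algorithms, and the report format it produces is an assumption:

```python
def identify_obstacles(frames, detect_edges, estimate_shift):
    """Per-frame sketch of step 706: define boundaries (step 806) via
    edge detection, and derive a position/distance cue (steps 802, 804)
    from the apparent motion between consecutive frames."""
    reports = []
    for prev, curr in zip(frames, frames[1:]):   # consecutive frame pairs
        boundary_mask = detect_edges(curr)       # step 806: obstacle boundaries
        shift = estimate_shift(prev, curr)       # steps 802/804: motion cue
        has_obstacle = any(any(row) for row in boundary_mask)
        reports.append({"has_obstacle": has_obstacle, "shift": shift})
    return reports
```

Each report would then feed the feedback determination of step 708, where the feedback module 108 is instructed accordingly.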
[0054] In an embodiment, template matching is used for face recognition. The processing device 104, upon recognising existence of a face, may prompt the user to provide an identifier, such as name, for that face. Faces, which have been provided identifiers, are stored in memory. Subsequently, if the system 100 detects existence of a face, which has been identified previously, then the system provides a feedback to the user. The feedback can be an audio feedback indicating the presence of a person, who was previously identified.
[0055] In an embodiment, identifying obstacles at step 706 comprises gathering information about the spatial arrangement of the immediate image to identify non-uniformity in pixel intensity levels in the image. The processing device 104 gathers such information, and upon detection of non-uniformity, a boundary for an obstacle can be defined.
[0056] In an embodiment, boundaries are defined by detecting edges. Edges are detected from abrupt changes in the intensity of neighbouring pixels in an image. The changes in pixel intensity reveal the discontinuities in an image, from which features such as corners, lines and curves are extracted. The features extracted can thus be used to determine the presence of obstacles in the path.
[0057] In an embodiment, the processing device 104 is further programmed to locate objects in an image and to detect their movements. Sequences of ordered images allow the estimation of motion of objects with respect to the visually impaired person. The processing device 104 forms a pattern of motion of the objects captured in the sequence of image frames, caused by the relative motion between the object and the user.
[0058] The apparent motion of objects is based on the assumption that objects in an image taken at time t will be displaced in the image taken at time t+1, but will generally still be present in the image. For example, if the coordinates of any object in an image captured at t are (x, y, z), the coordinates of the same object taken at time t+1 will be (x+1, y+1, z+1) due to the relative motion of the object and the user. A change in the spatial arrangement in the subsequent frames with respect to a first frame in a sequence of image frames can give a measure of the distance of the object and the position of the object with respect to the user.
[0059] In an embodiment, the sensor 302 estimates a distance measure for obstacles in the direction of motion of the user. Ultrasonic waves in the path taken by the person are transmitted at regular intervals, such as, every 50 milliseconds. The ultrasonic waves reflect back upon hitting any object on the path. The time required by the reflected wave to reach the transmitting device is used to measure the distance between the object and the person. The sensor 302 requires trigger pulses to be applied on a periodic basis for the ultrasound to be radiated for distance measurement. The trigger pulses can be applied by external devices, such as, an IC 555 timer connected through GPIO pins provided on the circuit board 103.
[0060] In an embodiment, the sensor 302 can detect any obstacle in the direction of motion of the user up to a distance of 3 meters from the sensor 302. The sensor 302 detects obstacles lying in line with the sensor 302. The sensor 302 can also detect obstacles in the direction of motion of the user where the obstacle lies within an angle of 15 degrees to 30 degrees with respect to the line of motion of the sensor 302.
[0061] In an embodiment, the feedback module 108 provides a feedback at step 708 by alerting the visually impaired person to change direction and avoid collision if any obstacle is detected on the path taken by the user.
[0062] The feedback module 108 receives instructions from the processing device 104 and the sensor 302 based on obstacles identified on the path. The feedback module 108 can output an alert signal based on instructions received from the processing device 104 regarding the distance and position of an obstacle. The feedback provided at step 708 to the user can be in the form of audio signals.
[0063] In another embodiment, the feedback provided at step 708 to the user can be in the form of vibration by using vibration motors. These vibration motors are integrated on the circuit board connected to the processing device 104 and the sensor 302 through the GPIO pins on the circuit board 103. The intensity of the alert signal increases as the obstacle approaches the visually impaired person and the intensity is maximum when the obstacle is closest to the visually impaired person.
[0064] In an embodiment, there can be at least two vibration motors one on the left and the other on the right in the system 100. For obstacles detected on the left of the visually impaired person, the processing device 104 or the sensor 302 can send a feedback to the left vibration motor to output the alert signal and similarly for obstacles detected on the right of the visually impaired person, the processing device 104 or the sensor 302 can send a feedback to the right vibration motor to output the alert signal.
IV. CONCLUSION
[0065] The technique disclosed in the embodiments enables navigation of a visually impaired person.
[0066] It shall be noted that while the processes described above are presented as sequences of steps, this was done solely for the sake of illustration. Accordingly, it is contemplated that some steps may be added, some steps may be omitted, the order of the steps may be rearranged, or some steps may be performed simultaneously.
[0067] Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the system and method described herein. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.
[0068] Many alterations and modifications of the present invention will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. While the description above contains many specifics, these should not be construed as limiting the scope of the invention, but merely as providing illustrations of some of the presently preferred embodiments of this invention. Thus the scope of the invention should be determined by the appended claims and their legal equivalents rather than by the examples given.

Claims

CLAIMS
I claim:
1. A system for aiding a visually impaired person to navigate, the system comprising: at least one image capturing device configured to capture a series of images;
a processing device configured to:
receive input from the image capturing device;
identify obstacles in the path taken by the person by processing the input received from the image capturing device; and
determine feedback to be provided to the person, at least based on the identified obstacles; and
at least one feedback module configured to provide the feedback to the person.
2. The system according to claim 1, wherein the processing device determines the position and distance of the obstacles in the path by considering spatial arrangement of the obstacles, wherein the spatial arrangement is determined by considering the apparent motion of pixels in the images, wherein the apparent motion of pixels is due to the relative motion of an obstacle and the person.
3. The system according to claim 1, wherein the processing device detects edges in the images by determining sudden changes in pixel intensity, wherein the changes in pixel intensity defines boundaries of obstacles.
4. The system according to claim 1, wherein the processing device further identifies faces by comparing at least a portion of facial features in an immediate image with features in at least one existing template.
5. The system according to claim 1, further comprising at least one sensor configured to: transmit ultrasonic waves in the path taken by the person;
receive reflected waves; and
estimate the distance between the obstacle and the person based on the received reflected waves.
6. The system according to claim 5, wherein the time taken by the reflected wave to reach the sensor is used to measure the distance between the obstacle and the person.
7. The system according to claim 1, wherein the feedback is at least one of audio feedback and haptic feedback.
8. A method for aiding a visually impaired person to navigate, the method comprising: capturing a series of images in a path taken by the person;
identifying obstacles in the path by processing the series of images; and
providing feedback to the person, at least based on the identified obstacles.
9. The method according to claim 8, wherein identifying obstacles comprises determining the position and distance of the obstacles in the path by considering spatial arrangement of the obstacles, wherein the spatial arrangement is determined by considering the apparent motion of pixels in the images, wherein the apparent motion of pixels is due to the relative motion of an obstacle and the person.
10. The method according to claim 8, wherein identifying obstacles comprises detecting edges in the images by locating sudden changes in pixel intensity, wherein the change in pixel intensity defines boundaries of obstacles.
11. The method according to claim 8, further comprising identifying faces by comparing at least a portion of facial features in an immediate image with features in at least one existing template.
12. The method according to claim 8, wherein identifying obstacles comprises determining the distance between the obstacles and the person by transmitting ultrasonic waves in the path taken by the person and receiving waves that are reflected back upon hitting one or more obstacles.
13. The method according to claim 8, wherein providing feedback comprises providing at least one of audio feedback and haptic feedback.
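Claims 2 and 9 tie obstacle distance to the apparent motion of pixels between successive frames. A minimal motion-parallax sketch of that geometry: for a sideways camera translation t between two frames, a point at depth Z shifts by d pixels in the image, with Z = f·t/d (the same relation as stereo disparity). The function name, walking-translation input, and focal length are all illustrative assumptions, not part of the application.

```python
def depth_from_pixel_motion(flow_px, translation_m, focal_px):
    """Motion-parallax sketch: for lateral camera translation
    `translation_m` (metres) between frames, an obstacle whose image
    shifts by `flow_px` pixels lies at depth Z = f * t / d.
    Larger apparent motion therefore means a nearer obstacle."""
    if flow_px <= 0:
        raise ValueError("apparent pixel motion must be positive")
    return focal_px * translation_m / flow_px
```

With an assumed focal length of 800 px and a 0.5 m step between frames, a 20 px shift places the obstacle at 20 m, while a 40 px shift places it at 10 m, matching the claims' premise that nearer obstacles produce larger apparent motion.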
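Claims 3 and 10 define edges as sudden changes in pixel intensity. A one-dimensional sketch of that idea, thresholding the difference between neighbouring pixels in a single image row (the function name and threshold are assumptions for illustration):

```python
def find_edges(row, threshold):
    """Return indices in a row of pixel intensities where the jump
    between neighbouring pixels exceeds `threshold` -- a 1-D sketch
    of gradient-based edge detection marking obstacle boundaries."""
    return [i for i in range(1, len(row))
            if abs(row[i] - row[i - 1]) > threshold]
```

For the row [10, 10, 10, 200, 200, 40, 40] with threshold 50, the jumps at positions 3 and 5 mark the two boundaries of a bright object against a darker background.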
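Claims 4 and 11 describe face identification by comparing facial features against stored templates. A nearest-template sketch under the assumption that faces are already reduced to numeric feature vectors; the function, the distance metric, and the acceptance threshold are all illustrative choices, not the application's method:

```python
import math

def match_face(features, templates, max_distance):
    """Compare a feature vector against stored templates
    ({name: feature_vector}) and return the closest name,
    or None when no template is within `max_distance`."""
    best_name, best_dist = None, float("inf")
    for name, template in templates.items():
        dist = math.dist(features, template)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None
```

Returning None for out-of-threshold matches lets the system announce "unknown person" rather than misidentify a stranger.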
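Claims 5, 6 and 12 estimate distance from the time an ultrasonic echo takes to return. The underlying arithmetic: the wave covers the obstacle distance twice (out and back), so distance = speed of sound × round-trip time / 2. A minimal sketch, assuming dry air at roughly 20 °C:

```python
SPEED_OF_SOUND_MPS = 343.0  # dry air at ~20 degrees C (assumption)

def distance_from_echo(round_trip_s):
    """Time-of-flight ranging: the echo travels to the obstacle
    and back, so halve the total path length."""
    return SPEED_OF_SOUND_MPS * round_trip_s / 2.0
```

A 10 ms round trip therefore corresponds to an obstacle about 1.7 m away, comfortably within the warning range such an aid would need.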
PCT/IB2015/051156 2014-02-17 2015-02-17 System and method for aiding a visually impaired person to navigate Ceased WO2015121846A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN745CH2014 2014-02-17
IN745/CHE/2014 2014-02-17

Publications (1)

Publication Number Publication Date
WO2015121846A1 true WO2015121846A1 (en) 2015-08-20

Family

ID=53799650

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/051156 Ceased WO2015121846A1 (en) 2014-02-17 2015-02-17 System and method for aiding a visually impaired person to navigate

Country Status (1)

Country Link
WO (1) WO2015121846A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080170118A1 (en) * 2007-01-12 2008-07-17 Albertson Jacob C Assisting a vision-impaired user with navigation based on a 3d captured image stream
US20080226134A1 (en) * 2007-03-12 2008-09-18 Stetten George Dewitt Fingertip visual haptic sensor controller
US20130131985A1 (en) * 2011-04-11 2013-05-23 James D. Weiland Wearable electronic image acquisition and enhancement system and method for image acquisition and visual enhancement

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9942701B2 (en) 2016-04-07 2018-04-10 At&T Intellectual Property I, L.P. Apparatus and method for detecting objects and navigation
US10917747B2 (en) 2016-04-07 2021-02-09 At&T Intellectual Property I, L.P. Apparatus and method for detecting objects and navigation
CN111084710A (en) * 2018-10-24 2020-05-01 上海博泰悦臻网络技术服务有限公司 Method and system for providing navigation for special user
CN111084710B (en) * 2018-10-24 2023-02-28 上海博泰悦臻网络技术服务有限公司 Method and system for providing navigation for special user
WO2022061380A1 (en) 2020-09-22 2022-03-31 Thomas Scheu Guide apparatus for persons with impaired vision
WO2023116998A1 (en) * 2021-12-20 2023-06-29 Unwired Things Aps A modular system for aiding visually or cognitively impaired or blind individuals in perceiving their surroundings
US20250057675A1 (en) * 2021-12-20 2025-02-20 Unwired Things Aps A modular system for aiding visually or cognitively impaired or blind individuals in perceiving their surroundings

Similar Documents

Publication Publication Date Title
Patel et al. Multisensor-based object detection in indoor environment for visually impaired people
Patel et al. Assistive device using computer vision and image processing for visually impaired; review and current status
TWI474173B (en) Assistance system and assistance method
CN105362048A (en) Mobile equipment and barrier information prompting method and device based on mobile equipment
WO2015121846A1 (en) System and method for aiding a visually impaired person to navigate
Bharathi et al. Effective navigation for visually impaired by wearable obstacle avoidance system
US20240130922A1 (en) Haptic guiding system
KR20150097043A (en) Smart System for a person who is visually impaired using eyeglasses with camera and a cane with control module
Mattoccia et al. 3D glasses as mobility aid for visually impaired people
KR20190111262A (en) Portable device for measuring distance from obstacle for blind person
Sharma et al. A review on obstacle detection and vision (International Journal of Engineering Sciences & Research Technology)
Chinchole et al. Artificial intelligence and sensors based assistive system for the visually impaired people
Ilag et al. Design review of smart stick for the blind equipped with obstacle detection and identification using artificial intelligence
TWI652656B (en) Visual assistant system and wearable device having the same
Khanom et al. A comparative study of walking assistance tools developed for the visually impaired people
Liu et al. Electronic travel aids for the blind based on sensory substitution
KR102092212B1 (en) Wearable aids for the visually impaired
KR20200076170A (en) Assistance system and method for a person who is visually impaired using smart glasses
WO2017038248A1 (en) Instrument operation device, instrument operation method, and electronic instrument system
US10593058B2 (en) Human radar
JP2016099939A (en) Face recognition device
Bal et al. NAVIX: a wearable navigation system for visually impaired persons
KR20120088320A (en) Object recognition and for the visually impaired pedestrian guidance system
KR101788307B1 (en) Motion signal alarm system for pet
AU2020101563A4 (en) An artificial intelligence based system to assist blind person

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15748533

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15748533

Country of ref document: EP

Kind code of ref document: A1