
EP4629877A1 - Apparatus and method for measuring functional vision in low-vision patients - Google Patents

Apparatus and method for measuring functional vision in low-vision patients

Info

Publication number
EP4629877A1
EP4629877A1
Authority
EP
European Patent Office
Prior art keywords
light
subject
vision
objects
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP23901706.4A
Other languages
German (de)
English (en)
Inventor
Samarendra Kumar Mohanty
Sanghoon Kim
Michael Carlson
Subrata Batabyal
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanoscope Instruments Inc
Original Assignee
Nanoscope Instruments Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanoscope Instruments Inc filed Critical Nanoscope Instruments Inc
Publication of EP4629877A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/0016: Operational features thereof
    • A61B 3/0033: Operational features thereof characterised by user input arrangements
    • A61B 3/0041: Operational features thereof characterised by display arrangements
    • A61B 3/0058: Operational features thereof characterised by display arrangements for multiple images
    • A61B 3/02: Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/028: Subjective types for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B 3/032: Devices for presenting test symbols or characters, e.g. test chart projectors
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types for determining or recording eye movement
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/003: Navigation within 3D models or images
    • G06T 19/006: Mixed reality

Definitions

  • the present invention relates to vision quantification systems and methods designed to evaluate functional vision. Specifically, the systems are designed to assess a user's vision using mobility assessments and shape- and optical-flow-recognition devices. More specifically, the invention relates to the design and application of these systems in functional vision assessments of low vision patients.
  • the assessment of functional vision is characterized by measurement of multiple and varying parameters captured under complex, real-life conditions.
  • RP: Retinitis Pigmentosa
  • the present invention provides a device and method for assessing functional vision in low vision patients, mimicking daily activities at varying light intensities, such as walking toward a lighted window or doorway while avoiding obstacles, as well as picking up objects on a table.
  • a subject with normal or better vision is expected to perform all of the low vision tests without difficulty.
  • the present invention encompasses a method of evaluating functional vision in low vision subjects (which may be worse than 20/200 for instance) comprising at least one of the steps of:
  • a Visually Guided Mobility Test comprising a single or multiple Light panel(s) for emitting light at different intensity levels, or lighted object(s) at different intensities; providing a single or multiple randomly-selected starting point(s) for a subject to find at least one of the Light panel(s) that is emitting light; providing a variable number of obstacle(s) positioned at different locations in the path to the Light panel(s) or the lighted object(s) to assess the ability of the subject to avoid them; providing at least one video camera for recording the mobility of the subject; providing a computer for switching at least one of the Light panel(s) or light shining on the object(s) ON/OFF, and varying the intensity and color of the Light panel(s) or of the light shining on the object(s) by integrated software for directing and recording the performance of the Visually Guided Mobility Test; wherein the ability of the subject to detect and freely navigate towards the at least one of the Light panel(s) that is emitting light or lighted object(s) and avoid the obstacles is evaluated, without any other visual cues for direction such as arrows or markers;
  • a Visually guided Dexterity Test comprising a pre-calibrated Light panel for controlled illumination; providing differently shaped Three-dimensional (3D) objects that are stationary or moving; lighting the objects by the Light panel; detecting when the objects are placed or displaced; providing a control board communicating with a computer for controlling the light intensity levels of the Light panel; providing integrated software for providing instructions to the subject in a randomized order and for recording the performance of the subject; wherein the Visually guided Dexterity Test evaluates the ability of a subject to detect and discriminate an object and/or motion from a collection of differently sized/shaped/colored stationary/moving objects for near vision evaluation in three dimensions; and/or
  • providing the video camera for recording the mobility of the subject may include an infrared LED illumination and may be mounted on a tripod and/or ceiling to record in low-light conditions (less than approximately 1 lux), for example.
  • a scoring system may have a minimum threshold, and performance may be evaluated as a pass or a failure depending on whether that minimum threshold is reached.
  • the pre-calibrated Light panel is mounted on an apparatus for controlled illumination. This may provide consistent and uniform testing for all subjects - a standardized test which provides a consistent light level in every procedure.
  • the differently shaped Three-dimensional (3D) objects that are stationary or moving may be placed at a stationary or moving base of such an apparatus, for example.
  • Pressure sensor(s) may be attached to the base of such an apparatus for detecting a change in the pressure when the objects are placed or displaced, for example.
  • control board communicating with the computer may also read and communicate with the pressure sensor(s).
  • the Visually guided Dexterity Test may comprise a Three-dimensional Shape Discrimination which evaluates the ability of a subject (without requiring to be mobile) to detect and discriminate an object and/or motion from a collection of differently sized/shaped/colored stationary/moving objects for near vision evaluation in three dimensions.
  • the screen or Light panel may be a touchscreen.
  • the Visually guided Dexterity Test may comprise a Two-dimensional (2D) Shape Discrimination Test.
  • the light and dark moving stripes/rings of different frequencies and intensities may move at different speeds and randomized directions.
  • Evaluating may involve an Optical Flow Test.
  • the ability of the subject to detect the direction of motion of the pattern may involve a pass or failure (binary).
  • the subject may have normal vision, or visual impairment in one or both eyes.
  • the method may comprise the step of evaluating the functional vision of at least one eye of the subject.
  • the method may comprise the step of assessing the functional vision quantitatively in multiple light intensity levels in subjects with low vision.
  • the Light panel(s) or light shining on object(s) provides varying light intensities ranging from 0.1 lux (for example, moonless night) to 100 lux or above (for example, bright outdoor lighting) for evaluation of real-life visually guided mobility and/or dexterity vision in rod driven (at night, for example) and cone driven (during day light, for example) conditions.
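This dynamic range is typically sampled in roughly half-log steps; the example luminance levels used later in this document are 0.3, 1, 3, 10, 32, and 100 lux. A minimal sketch of generating such steps, assuming a simple half-log spacing (the function name and rounding are illustrative, not from the patent):

```python
# Illustrative sketch: generate roughly half-log-spaced luminance steps (lux)
# spanning ~0.3 lux (moonless night) to 100 lux (bright outdoor lighting).
def half_log_levels(start_exp=-0.5, stop_exp=2.0, step=0.5):
    levels = []
    e = start_exp
    while e <= stop_exp:
        levels.append(round(10 ** e, 1))  # 10^e, rounded to one decimal
        e += step
    return levels

print(half_log_levels())  # [0.3, 1.0, 3.2, 10.0, 31.6, 100.0]
```

Note that 10^0.5 ≈ 3.2 and 10^1.5 ≈ 31.6 correspond to the nominal 3 and 32 lux steps used in the test examples.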
  • the method may comprise the step of assessing the individual S, M or L cone based functional (color) vision by varying the color of light emitted by the Light panel.
  • the method may comprise the step of evaluating the change in functional vision by scoring the subject's visually guided mobility and dexterity test performance at varying light intensities, with the highest score for passing the test at the lowest light intensity and the lowest score for not passing the test at the highest light intensity.
  • the method may comprise the step of evaluating functional vision based on the success of the completion of a task based on the accuracy of the mobility and dexterity tasks, including (i) touching the Light panel while avoiding the obstacles, (ii) touching/picking up the target object, and/or (iii) detecting the correct direction of motion (Optical Flow).
  • the method may comprise the step of using multiple Light panels or starting positions, increasing number of obstacles or objects, and/or randomizing the positioning of the objects, obstacles, and subject to minimize a learning effect of the subject while performing the visually guided mobility and dexterity tests.
  • the method may comprise the step of adjusting the difficulty level of the visually guided mobility and dexterity tests to evaluate subjects with a specific or broad range of ocular diseases based on their functional vision status.
  • the specific diseases may comprise, for example, central, peripheral, or pan-retinal vision loss.
  • the present invention contemplates an apparatus configured for performing functional vision tests in low vision subjects comprising: a single or multiple Light panel(s) for emitting light or shining light on object(s) at different intensity levels; a single or multiple randomly-selected starting point(s) for a subject to find at least one of the Light panel(s) that is emitting light or the light shining on object(s); a variable number of obstacle(s) positioned at different locations in the path to the Light panel(s) or the light shining on object(s); a video camera for recording the mobility of the subject; a computer for switching at least one of the Light panel(s) ON/OFF, and integrated software operable to vary the light intensity and color of the Light panel(s), wherein the size, shape, and number of Light panels or the light shining on object(s) is selectable; wherein the Light panel(s) or the light shining on object(s) comprises an LED display operable to generate LED patterns at specific frequencies.
  • the distance between adjacent Light panel(s) or lighted object(s), and distance between the starting point to the Light panel(s) or the lighted object(s) is selectable; wherein control of dynamic range of light intensity from the at least one Light panel(s) or the light shining on object(s) is adjustable; wherein the position of the single or multiple obstacle(s) between the starting point and the at least one Light panel(s) or the light shining on object(s) is adjustable; and a control module for providing instructions to the subject.
  • Any of the apparatus defined herein may be configured for facilitating any of the methods of evaluating functional vision in low vision subjects as defined herein.
  • the video camera for recording the mobility of the subject may include an infrared LED illumination and may be mounted on a tripod and/or ceiling to record in low-light conditions (less than approximately 1 lux), for example.
  • the panel(s) are adjustable using different types of LEDs, polarizers and neutral density filters, for example.
  • goggles with different neutral density filters may be used by the subject to further attenuate the light intensity reaching the eye.
  • the height of the Light panel(s) on the tripod can be adjusted to account for different heights of the subjects' eye level.
  • the position of the single or multiple obstacle(s) between the starting point and the at least one Light panel(s) may be adjustable in different arrangements to adjust the difficulty level of the mobility test;
  • the obstacle(s) of different shape, size, and color may have different reflectivity, for instance.
  • the reflectivity from the obstacle(s) may be changed using a polarizing film.
  • motion sensors may be mounted on/under the obstacle(s) and/or LED display for automatic detection of obstacle hits by the subject performing the test.
  • alternatively, obstacle hits may be detected/recorded by a human.
  • a height adjuster may be used to change the height of the obstacle(s) to account for different heights of the subjects performing the test.
  • the control module for providing instructions to the subject may also direct and record the test; this may be conducted on a PC, tablet, or smart devices.
  • the accuracy score is calculated based on completion of the Light panel(s) touching task, with penalties assigned for each hit of the obstacle(s), and/or the time taken to complete the task.
  • the integrated software may enable directing, making announcement, and/or recording of the performance of test.
  • the present invention contemplates an apparatus configured for functional vision tests in low vision subjects comprising: a pre-calibrated Light panel for controlled illumination; differently shaped Three-dimensional (3D) objects positioned at the stationary or moving portion of the apparatus; a pressure sensor(s) attached to the apparatus to detect change in the pressure when the objects are placed or displaced; a control board operable to communicate with a computer for controlling the light intensity levels of the Light panel and for reading the pressure sensor(s); integrated software for providing instructions to the subject in randomized order and for recording the performance of test, wherein the distance between adjacent objects, and distance between the mounted Light panel and objects is selectable; wherein the size, shape of Light panel is selectable; wherein control of dynamic range of light intensity from the LED panel is adjustable; and a control module for providing instructions to the subject.
  • the panel(s) are adjustable using different types of LEDs, polarizers and neutral density filters, for example.
  • the obstacle(s) of different shape, size, and color have different reflectivity, for instance.
  • the reflectivity from the obstacle(s) may be changed using a polarizing film.
  • the 3D objects may be selected from, but are not limited to, a cube, pyramid, and sphere.
  • the pressure sensors are mounted on/under the object(s) for automatic detection of the correct organization of the objects based on their different weights, as well as to identify and record the object picked up by the subject performing the test.
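The pressure-sensor pickup detection described above can be sketched as follows; this is a minimal illustration, and the slot identifiers, reading units, and noise threshold are all assumptions rather than the patent's implementation:

```python
# Detect which object(s) were picked up by comparing baseline pressure
# readings against current readings: a drop larger than a noise threshold
# indicates the object was lifted from its slot.
def detect_pickup(baseline, current, noise=0.05):
    """baseline, current: dicts mapping slot id -> sensor reading (arbitrary units)."""
    picked = []
    for slot, base in baseline.items():
        if base - current.get(slot, 0.0) > noise:
            picked.append(slot)
    return picked

# Example: the object in slot 2 is lifted (reading drops from 1.2 to 0.0).
baseline = {1: 1.0, 2: 1.2, 3: 0.9}
current = {1: 1.0, 2: 0.0, 3: 0.9}
print(detect_pickup(baseline, current))  # [2]
```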
  • the control module for providing instructions to the subject may also direct and record the test; this may be conducted on a PC, tablet, or smart devices.
  • the accuracy score and time score are calculated based on correctness of shape determination, and/or time it takes to complete the task.
  • the integrated software may enable directing, making announcement, and recording the performance of the test.
  • the present invention comprehends an apparatus for functional vision tests in low vision subjects comprising multiple types of objects that are displayed at different intensity levels against a background on a screen or Light panel in randomized order, wherein the intensity and color of the objects and background is adjustable; wherein the objects displayed are stationary or floating within the screen or Light panel(s); and wherein a touch sensor on the screen or Light panel records the screen touch by the subject for analysis.
  • a Two-dimensional (2D) Shape Discrimination Test evaluates the ability of a subject to discriminate 2D objects of different sizes/shapes displayed at pre-allocated random locations on the touchscreen Light panel, wherein the 2D objects may be selected from, but are not limited to, a square, triangle, and circle.
  • the intensity and color of the objects and background may be adjustable to probe different aspects of vision.
  • the present invention envisages an apparatus for functional vision tests in low vision subjects comprising: multiple types of objects that are displayed at different intensity levels against a background on a screen or Light panel in randomized order, wherein the intensity and color of the objects and background is adjustable; wherein the objects displayed are stationary or floating within the screen or Light panel; and wherein a touch sensor on the screen or Light panel records the screen touch by the subject for analysis;
  • the screen or Light panel(s) may be a touchscreen.
  • a Two-dimensional (2D) Shape Discrimination Test evaluates the ability of a subject to discriminate 2D objects of different sizes/shapes displayed at pre-allocated random locations on the touchscreen Light panel(s), wherein the 2D objects may be selected from, but are not limited to, a square, triangle, and circle.
  • the intensity and color of the objects and background may be adjustable to probe different aspects of vision.
  • the present invention comprehends an apparatus for functional vision tests in low vision subjects comprising: light and dark moving stripes/rings of different frequencies and intensities that are displayed on a screen or Light panel(s) in randomized direction, wherein the intensity and color of the light and dark moving stripes/rings is adjustable; wherein the speed of the light and dark moving stripes/rings is adjustable; and wherein a touch sensor on the screen or Light panel(s) records the screen touch by the subject for analysis.
  • the screen or Light panel(s) may be a touchscreen.
  • the present invention describes a device and method of a Visually Guided Mobility Test evaluating the ability of a subject to detect and freely navigate toward a lighted panel at different light intensities while avoiding the obstacles. Without any other visual cues for direction, such as arrows or markers, this emulates the performance of various mobility routines of daily living.
  • the present invention also provides a device and method to evaluate near vision (near-vision dexterity task) in low-vision subjects (without requiring to be mobile) via discrimination of shape and motion of objects in a two-dimensional (2D) and/or three-dimensional (3D) environment having different light intensities.
  • the present invention describes a device and method wherein the near vision testing is conducted by displaying 2D Optical Flow (light and dark stripes/rings of different frequencies and intensities moving at different speeds and directions) on a screen wherein the subject is required to detect the direction of motion of the pattern.
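A drifting-grating stimulus of the kind described (stripes of a given spatial frequency moving at a given speed in a randomized direction) can be sketched as a one-dimensional luminance profile; all names and parameter values here are illustrative assumptions, not the patent's display implementation:

```python
import math
import random

# Sketch of a drifting grating (light/dark stripes): a 1D luminance profile
# with spatial frequency f (cycles per pixel) drifting at speed v (pixels per
# time unit) in a randomly chosen direction the subject must report.
def grating_row(width, f, v, t, direction, contrast=1.0, mean=0.5):
    return [mean + 0.5 * contrast * math.sin(2 * math.pi * f * (x - direction * v * t))
            for x in range(width)]

direction = random.choice([-1, 1])  # randomized drift direction
row_t0 = grating_row(64, f=0.1, v=2.0, t=0.0, direction=direction)
row_t1 = grating_row(64, f=0.1, v=2.0, t=1.0, direction=direction)
# After one time unit the whole pattern has shifted by v pixels in `direction`.
```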
  • the present invention describes a method of performing the visually guided mobility task and the near-vision dexterity task repeatedly at the same light intensity level (with randomly arranged shapes and directions of movement), determining the proportion of pass events to decide pass/fail at that light level based on a predefined threshold.
  • the pass/fail criteria may include completing the test with a score higher than the threshold score and within a pre-determined cut-off time.
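The per-trial pass/fail criterion just described combines a score threshold with a time cutoff; a minimal sketch, in which the threshold score and cutoff time are illustrative assumptions:

```python
# Per-trial pass/fail check: a trial passes only if the accuracy score
# exceeds the threshold AND the task finishes within the cutoff time.
# Threshold and cutoff values here are placeholders, not from the patent.
def trial_passes(accuracy_score, elapsed_s, threshold=70, cutoff_s=120):
    return accuracy_score > threshold and elapsed_s <= cutoff_s

print(trial_passes(85, 90))   # True: above threshold, within cutoff
print(trial_passes(85, 150))  # False: score is fine, but over the time cutoff
```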
  • the present invention describes a device and method of varying the difficulty of the low vision tests: by changing the number of light panels, light intensity levels, number of obstacles, and/or reflectivity of the obstacles in the mobility test; by changing the light intensity, number, shape, and size of the objects in the 2D/3D Shape Discrimination tests; and by altering the frequency and speed of the displayed stripes in the 2D Optical Flow test.
  • the test can be adapted to evaluate specific as well as broad range of low-vision patients.
  • Fig. 1 is a diagram showing a Visually Guided Mobility Test Setup.
  • 1001: LED panel 1;
  • 1002: LED panel 2;
  • 1003: Center obstacle(s);
  • 1004: Left obstacle(s);
  • 1005: Right obstacle(s);
  • 1006: Subject start position;
  • 1008: Video camera;
  • Fig. 2A is a diagram showing a Visually Guided Mobility Test component - LED panel. Different sizes and shapes of LED panel are used, and the LEDs emit different colors. Also, different elements or arrays of LEDs within the LED panel may be lighted to generate LED stripes and patterns. For further control of the dynamic range of light intensity from the LEDs, a polarizer (1001) and/or neutral density filter (1002) is used;
  • Fig. 2B is a diagram showing a Visually Guided Mobility Test component - Obstacle(s). Obstacle(s) of different colors, reflectivities, sizes and shapes are used. Polarizing film (1001) is utilized to further change the reflectivity of the obstacle(s);
  • Fig. 2C is a diagram showing a Visually Guided Mobility Test component - Control Module. A PC or tablet for synchronization and control of the test;
  • Fig. 3A is a diagram showing a Configuration 2 of Visually Guided Mobility Test Setup.
  • 1001: LED panel 1;
  • 1002: LED panel 2;
  • 1003: LED panel 3;
  • 1004: Left obstacle(s);
  • 1005: Center obstacle(s);
  • 1006: Right obstacle(s);
  • 1007: Subject start position;
  • 1008: Video camera;
  • 1009: PC and monitor/laptop/tablet;
  • Fig. 3B is a diagram showing a Configuration 3 of Visually Guided Mobility Test Setup.
  • 1001: LED panel 1; 1002: LED panel 2; 1003: Left obstacle; 1004: Right obstacle; 1005: Left obstacle(s); 1006: Center obstacle(s); 1007: Right obstacle(s); 1008: Subject start position; 1009: Video camera; 1010: PC and monitor/laptop/tablet;
  • Fig. 3C is a picture showing the arrangement of obstacles and 3 light panels as seen in Fig. 3A;
  • Fig. 3D is an image of a low-vision subject performing the test, navigating through the obstacle course shown in Fig. 3C, ending with touching the (randomly)-lit panel;
  • Fig. 3E is a picture showing the arrangement of obstacles and 2 light panels as seen in Fig. 3B;
  • Fig. 3F is a picture of a low-vision subject (CF T) performing the test, navigating through the obstacle course shown in Fig. 3E, and ending with touching the (randomly)-lit panel;
  • CF T: low-vision subject
  • Fig. 3G is a picture showing a variation in the arrangement of obstacles shown in the Visually Guided Mobility Test with 2 light panels;
  • Fig. 3H is a picture of a low-vision subject performing the test, navigating through the obstacle course shown in Fig. 3G, and ending with touching the (randomly)-lit panel;
  • Fig. 4A is an image of the Visually Guided Mobility Test procedure: beginning of the test; the subject is positioned at the start line, and an LED panel is randomly lighted;
  • Fig. 4B is an image taken at the end of the test; the subject finds/touches the lighted LED panel after navigating through the obstacles;
  • Fig. 4C is a scattered-light map of the obstacles at the starting position under different luminance levels: 0.3 Lux measured at the starting position at eye level, with the visibility data collected while the left LED panel was ON;
  • Fig. 4D is a scattered-light map of the obstacles at the starting position under different luminance levels: 1 Lux measured at the starting position at eye level, with the visibility data collected while the left LED panel was ON;
  • Fig. 4E is a scattered-light map of the obstacles at the starting position under different luminance levels: 3 Lux measured at the starting position at eye level, with the visibility data collected while the left LED panel was ON;
  • Fig. 4F is a scattered-light map of the obstacles at the starting position under different luminance levels: 10 Lux measured at the starting position at eye level, with the visibility data collected while the left LED panel was ON;
  • Fig. 4G is a scattered-light map of the obstacles at the starting position under different luminance levels: 32 Lux measured at the starting position at eye level, with the visibility data collected while the left LED panel was ON;
  • Fig. 4H is a scattered-light map of the obstacles at the starting position under different luminance levels: 100 Lux measured at the starting position at eye level, with the visibility data collected while the left LED panel was ON;
  • Fig. 4I is an image of the luminance distribution in the Visually Guided Mobility Test as measured by a light meter: a map of the luminance in the room from the light source and scattered light from the obstacles as seen by the subject, showing the non-uniformity of the illumination path (created by obstacles and their reflections) for the light level of 0.3 Lux. Scale bar: Lux;
  • Fig. 4J is an image of the luminance distribution in the Visually Guided Mobility Test as measured by a light meter: a map of the luminance in the room from the light source and scattered light from the obstacles as seen by the subject, showing the non-uniformity of the illumination path (created by obstacles and their reflections) for the light level of 1 Lux. Scale bar: Lux;
  • Fig. 4K is an image of the luminance distribution in the Visually Guided Mobility Test as measured by a light meter: a map of the luminance in the room from the light source and scattered light from the obstacles as seen by the subject, showing the non-uniformity of the illumination path (created by obstacles and their reflections) for the light level of 3 Lux. Scale bar: Lux;
  • Fig. 4L is an image of the luminance distribution in the Visually Guided Mobility Test as measured by a light meter: a map of the luminance in the room from the light source and scattered light from the obstacles as seen by the subject, showing the non-uniformity of the illumination path (created by obstacles and their reflections) for the light level of 10 Lux. Scale bar: Lux;
  • Fig. 4M is an image of the luminance distribution in the Visually Guided Mobility Test as measured by a light meter: a map of the luminance in the room from the light source and scattered light from the obstacles as seen by the subject, showing the non-uniformity of the illumination path (created by obstacles and their reflections) for the light level of 32 Lux. Scale bar: Lux;
  • Fig. 4N is an image of the luminance distribution in the Visually Guided Mobility Test as measured by a light meter: a map of the luminance in the room from the light source and scattered light from the obstacles as seen by the subject, showing the non-uniformity of the illumination path (created by obstacles and their reflections) for the light level of 100 Lux. Scale bar: Lux;
  • Fig. 5 is an image of the Graphical User Interface for the Visually Guided Mobility Test. Testing Eye Tab: input which eye (OS/OD/OU) is to be tested; Light Control Tab: turn ON/OFF a specific LED panel (left or right) or random LED panels for testing; Intensity Control Tab: input the light intensity level for the LED panel; Test Control Tab: start and end the test (inbuilt timer to track the time elapsed, synchronized with the video camera recording); Obstacle Tab: record which obstacles were hit during the test; Save Tab: save the test parameters and result;
  • Fig. 6 is a flowchart of the Visually Guided Mobility Test.
  • 1001: Test instruction provided to subject;
  • 1002: Light intensity level selection;
  • 1003: Start the Visually Guided Mobility Test;
  • 1004: Synchronized video recording and timer start;
  • 1005: Single randomized LED panel lights up;
  • 1006: Record any obstacle(s) hit in software during the test; the test ends if the subject finds the lighted LED panel or the timer runs out;
  • 1007: Repeat the test N times at the same light intensity level;
  • 1008: Light level increase;
  • 1009: Repeat the test protocol for M intensities;
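The flowchart steps above (N trials at each light level, repeated over M ascending intensity levels, ending with the lowest illumination level the subject passes) reduce to a nested loop. A minimal sketch, in which run_trial, the trial count, and the required pass proportion are illustrative assumptions:

```python
# Run the mobility protocol: n_trials randomized trials at each light level,
# ascending from dimmest to brightest, and return the lowest illumination
# level at which the subject's pass proportion meets the criterion.
def lowest_passing_level(levels, run_trial, n_trials=4, min_proportion=0.75):
    for lux in sorted(levels):            # ascend from dimmest to brightest
        passes = sum(run_trial(lux) for _ in range(n_trials))
        if passes / n_trials >= min_proportion:
            return lux                    # lowest light level passed
    return None                           # failed at every tested level

# Example with a stand-in subject who passes at 10 lux and above.
print(lowest_passing_level([0.3, 1, 3, 10, 32, 100], lambda lux: lux >= 10))  # 10
```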
  • Fig. 7A is an image of the scoring system of the Visually Guided Mobility Test.
  • 1001: Assign penalty weights for obstacle hit(s), repositioning, cut-off test time, and pass/fail threshold score;
  • 1002: Compute the final score (Accuracy score) from data collected within a trial and the assigned weights;
  • 1003: Compare with the pass/fail threshold score to determine pass or fail for the trial;
  • 1004: Determine if the subject passes the light level by comparing against the proportion-of-trials-passed criterion set for overall passing of a light level;
  • 1005: Determine the lowest level of illumination the subject passes;
  • Fig. 7B is an image of an example of a scoring system for the Visually Guided Mobility Test. Penalty weights for any obstacles hit, going out of bounds (such as hitting the wall of the room), and repositioning are assigned differently. If the subject touches the correct LED light panel, the subject earns a certain number of points (e.g., 100). A pass/fail threshold value is set as the pass criterion for each trial. The subject is considered to pass if the final calculated score (Accuracy score) is higher than the pass threshold value;
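The Fig. 7B accuracy score can be sketched as follows; the 100-point completion credit comes from the figure description, while the individual penalty weights are illustrative assumptions:

```python
# Accuracy score in the style of Fig. 7B: credit for touching the correct
# LED panel, minus differently weighted penalties for obstacle hits, going
# out of bounds (e.g. hitting a wall), and repositioning.
# Penalty weights (w_hit, w_oob, w_repos) are placeholders, not from the patent.
def accuracy_score(touched_panel, obstacle_hits, out_of_bounds, repositions,
                   completion_points=100, w_hit=5, w_oob=10, w_repos=3):
    score = completion_points if touched_panel else 0
    score -= w_hit * obstacle_hits + w_oob * out_of_bounds + w_repos * repositions
    return score

# Subject touches the panel, hits 2 obstacles, repositions once: 100 - 10 - 3.
print(accuracy_score(True, obstacle_hits=2, out_of_bounds=0, repositions=1))  # 87
```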
  • Fig. 8A is a diagram showing a Visually Guided Mobility Test Scenario 1.
  • 1001: LED panel 1 (on); 1002: LED panel 2 (off); 1003: Center obstacle(s); 1004: Left obstacle(s); 1005: Right obstacle(s); 1006: Subject start position; 1007: PC and monitor/laptop/tablet; 1008: Video camera.
  • the subject navigates and touches the lighted LED panel without bumping any obstacles.
  • Fig. 8B is a diagram showing a Visually Guided Mobility Test Scenario 2.
  • 1001: LED panel 1 (on); 1002: LED panel 2 (off); 1003: Center obstacle(s); 1004: Left obstacle(s); 1005: Right obstacle(s); 1006: Subject start position; 1007: PC and monitor/laptop/tablet; 1008: Video camera.
  • the subject bumps left obstacle(s) once, then navigates to touch the lighted LED panel without bumping any additional obstacle(s).
  • Fig. 8C is a diagram showing a Visually Guided Mobility Test Scenario 3.
  • 1001: LED panel 1 (off); 1002: LED panel 2 (on); 1003: Center obstacle(s); 1004: Left obstacle(s); 1005: Right obstacle(s); 1006: Subject start position; 1007: PC and monitor/laptop/tablet; 1008: Video camera.
  • the subject bumps the left obstacle(s) once, then bumps into the center obstacle(s) once, and never finds/touches the lighted LED panel.
  • Fig. 9A is a plot of accuracy score vs. time score in low vision subjects. The straight lines represent fits for determining the correlation between accuracy and time scores;
  • Fig. 9B is a plot of accuracy score vs. time score in better vision subjects;
  • Fig. 10A. is a diagram showing a 3D Shape Discrimination Test.
  • 1001: Flat LED panel (intensity set by user);
  • 1002: Base of the 3D Shape Discrimination unit;
  • 1003: Object slot 1 and pressure sensor;
  • 1004: Object slot 2 and pressure sensor;
  • 1005: Object slot 3 and pressure sensor; 1006: Object 1;
  • 1007: Object 2;
  • 1008: Object 3;
  • 1009: Control board for adjusting LED light and communicating with pressure sensor(s);
  • 1010: PC communicating with the control board to take inputs from the user and display which object(s) is (are) picked up based on the pressure sensor;
  • Fig. 10B is an image of a 3D Shape Discrimination Test setup configured with an assortment of 6 large-sized real-world objects that are equivalent to the geometric shapes pyramid, donut, brick, cube, cylinder, and sphere;
  • Fig. 11A is a diagram of an Example of 3D Shape discrimination setup.
  • Proctor (1001) of the test sits on the opposite side of the subject (1002), and the graphical user interface (GUI) is displayed on the monitor screen (1003), showing the order of objects that need to be placed in the 3D Shape Discrimination platform (1004).
  • GUI graphical user interface
  • the test ends when the subject picks up any object regardless of correctness. Then the test is repeated in the same light level for a predetermined set of times before moving on to tests with higher illumination intensity;
  • Fig. 11B is a picture of a 3D Shape Discrimination Test setup configured with an assortment of 6 large-sized objects (pyramid, donut, brick, cube, cylinder and sphere);
  • Fig. 11C is a picture of a 3D Shape Discrimination Test setup configured with an assortment of 6 large-sized objects (pyramid, donut, brick, cube, cylinder and sphere) in rows of two separate heights.
  • Fig. 11D is a picture of a 3D Shape Discrimination Test setup configured with an assortment of 6 medium-sized objects (pyramid, donut, brick, cube, cylinder and sphere). Reduction of object size increases difficulty;
  • Fig. 11E is a picture of a 3D Shape Discrimination Test setup configured with an assortment of 6 small-sized objects (pyramid, donut, brick, cube, cylinder and sphere). Reduction of object size increases difficulty;
  • Fig. 12A is a picture of a 3D shape discrimination framework and procedure example. Three different types of objects are placed on the base of the 3D Shape Discrimination apparatus, which is mounted with pressure sensors. The flat LED panel at the top of the device controls the light intensity level;
  • Fig. 12B is a picture of an Example of a subject picking up an instructed object in the 3D shape discrimination test
  • Fig. 12C is an image of an example of output from the 3D Shape Discrimination assay. Light intensity, object position information, target shape and location, shape of object picked up by a subject, and correctness of the 3D Shape Discrimination are provided;
  • Fig. 12D is a picture of a setup configured with an assortment of 3 large-sized objects (pyramid, cube, and sphere), showing object rotation during testing. The platform providing rotation is hidden beneath the objects. Position 1 during rotation is shown;
  • Fig. 12E is a picture of a setup configured with an assortment of 3 large-sized objects (pyramid, cube, and sphere), showing object rotation during testing. The platform providing rotation is hidden beneath the objects. Position 2 during rotation is shown;
  • Fig. 12F is a picture of a setup configured with an assortment of 3 large-sized objects (pyramid, cube, and sphere), showing object rotation during testing. The platform providing rotation is hidden beneath the objects. Position 3 during rotation is shown;
  • Fig. 12G is a picture of a setup configured with an assortment of 3 large-sized objects (pyramid, cube, and sphere), showing object rotation during testing. The platform providing rotation is hidden beneath the objects. Position 4 during rotation is shown;
  • Fig. 13A is a diagram of 2D shape discrimination: three different types of objects are displayed on the touchscreen in a random arrangement order;
  • Fig. 13B is a diagram of another display option with floating objects arranged in a randomized manner. The object to be selected is announced (by the proctor according to the GUI, or by automated voice in the software). The test ends when the subject touches the touchscreen regardless of correctness;
  • Fig. 13C is a chart of the correlation of the 2D size threshold score with BCVA (measured by Koch Acuity) in different groups of low-vision patients;
  • Fig. 13D is a plot of a Correlation of 2D shape Discrimination accuracy with BCVA
  • Fig. 13E is an Example of output from the 2D Shape Discrimination assay. Light intensity, shape position information, target shape and location in XY coordinates, shape of object selected by a subject, XY coordinates of the touch input, elapsed time and correctness of the shape discrimination are recorded;
  • Fig. 14A is an Optic flow moving towards the left;
  • Fig. 14B is an optic flow moving upward (in an upwardly direction) in the 2D optical flow discrimination test. Black and white stripes at different spatial frequencies that flow in random directions are displayed on the touchscreen. The subject is asked to tell which direction the flow is moving or to touch the side of the touchscreen toward which the flow is moving;
  • Fig. 14C is an image showing an Example of output from the 2D optical flow determination assay. Flow direction, XY coordinates of the touch input, elapsed time and correctness of the flow direction determinations are provided;
  • Fig. 15A is a picture of a 3D and 2D Shape Discrimination setup
  • Fig. 15B is an image of a Graphical user interface for 2D / 3D shape recognition and optical flow test
  • Fig. 16A is an image of a Triangle shape displayed in the LED panel
  • Fig. 16B is an image of a Circle shape displayed in the LED panel
  • Fig. 16C is an image of a Square shape displayed in the LED panel
  • Fig. 16D is an image of an Optical flow pattern displayed in the LED panel
  • Fig. 16E displays a Graphical user interface for LED-based 2D shape recognition and optical flow. Multiple display parameters, such as intensity, shape, color, and size of the objects, and the direction and speed at which objects and optical flows move, are provided; and
  • Fig. 16F is a graph showing the association of the speed threshold of Optical Flow with Patient Reported Outcome (PRO, measured by NEI-VFQ) in low-vision patients. Pearson correlation between speed threshold and PRO for two different groups.
  • ETDRS Early Treatment Diabetic Retinopathy Study
  • a standard ETDRS visual acuity chart is designed for visual acuities of 20/800 (logMAR 1.0) to 20/10 (Snellen visual acuity equivalent).
  • Traditional vision testing charts are not able to measure vision below a certain level. Vision in this range is classified as counting fingers (CF), hand movements (HM) and light perception (LP). These measures are not very accurate or easily quantifiable.
  • Multi-Luminance Mobility Test (MLMT) 1 has been used to measure functional vision in low vision patients, but cannot be used for very low vision (BCVA worse than 20/200) patients. Further, evaluating mobility performance within a set time may not be feasible for low-vision patients with difficulty in mobility.
  • the present invention provides a device and method for assessing functional vision in low vision patients, mimicking daily activities at varying light intensities such as walking towards a lighted window or doorway, avoiding obstacles, as well as picking up objects on a table.
  • the present invention describes a device and method of evaluating the ability of a subject to detect and freely navigate towards a lighted panel at different light intensities avoiding the obstacles, without any other visual cues for direction such as arrows or markers.
  • this emulates performance of various mobility routines of daily living.
  • the present invention describes a device and method of evaluating navigation performance wherein the functional vision of each eye as well as both eyes is evaluated based on the subject’s ability to find a randomly selected lighted panel that provides varying light intensities ranging from a moonless night (<0.5 lux) to bright outdoor lighting (>100 lux), measured at the starting point at the eye level of the subject.
  • the discrete light intensities can be standardized (via calibrated lux meter) to have a semi-log difference and may include one or more of 0.3 lux, 1 lux, 3 lux, 10 lux, 32 lux, 100 lux, and/or other intensity levels.
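The semi-log spacing of these discrete levels can be checked with a short sketch (the level list is taken from this disclosure; the ±0.03 log-unit tolerance is an illustrative assumption):

```python
import math

# Discrete luminance levels (lux) from the disclosure, calibrated via lux meter.
levels = [0.3, 1, 3, 10, 32, 100]

# Semi-log spacing: successive levels differ by roughly 0.5 log10 units.
steps = [math.log10(b / a) for a, b in zip(levels, levels[1:])]
print([round(s, 3) for s in steps])  # each step is close to 0.5
```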
  • the present invention allows evaluation of real-life navigational vision in night (rod driven) and day light (cone driven) conditions.
  • the present invention describes a device and method of increasing the dynamic range of the light intensities by use of LED panels with different sizes and colors, control of the LED current, use of polarizers and neutral density filters in the mobility assessment module. Therefore, in addition to evaluating overall cone functions, individual S, M or L cone-based functional (color) vision can be assessed.
  • the present invention describes a device and method of varying the difficulty of the mobility test by changing the number of light panels, light intensity levels, number of obstacles, and/or reflectivity of the obstacles.
  • the test can be adapted to evaluate specific as well as a broad range of low-vision patients.
  • the present invention describes a near-vision evaluation device and method based on the ability of a subject (without requiring to be mobile) to detect and discriminate an object and/or motion from a collection of differently shaped stationary/moving objects using one eye or both eyes, wherein the color and reflectivity or contrast of the objects along with the color and intensity of the illuminating light may also be altered. Therefore, in addition to evaluating overall cone functions, individual S, M or L cone-based functional (color) near vision can be assessed.
  • the present invention describes a device and method wherein the near vision testing includes evaluation of size threshold for correctly detecting different shapes.
  • the device and method involve the size of displayed objects being varied from 1–30 degrees (at the subject’s eye). The ability of the subject to accurately detect smaller objects implies better vision.
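The 1–30 degree range refers to visual angle at the eye; for a given object extent and viewing distance it can be computed as below (the example size and distance are illustrative values, not ones specified in this disclosure):

```python
import math

def angular_size_deg(object_size_cm: float, distance_cm: float) -> float:
    # Visual angle subtended at the subject's eye by an object of given extent.
    return math.degrees(2 * math.atan(object_size_cm / (2 * distance_cm)))

# e.g., a 5 cm shape viewed from 40 cm subtends roughly 7 degrees.
```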
  • the present invention describes a device and method which is capable of discriminating subjects with different levels of near-vision and capable of monitoring changes in functional vision in low-vision subjects at different time points and/or after therapeutic intervention.
  • the present invention describes a method of performing the near-vision dexterity task repeatedly at the same light intensity level (with randomly arranged shapes and directions of movement) for determining the proportion of pass events, to determine pass/fail at that light level based on a predefined threshold.
  • the pass/fail criteria may include completing the test with a score higher than the threshold score and within a pre-determined cut-off time.
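The two-part pass criterion (score threshold plus time cut-off), combined with the proportion-of-pass-events rule described above, could be sketched as follows; the numeric thresholds here are placeholders, not values specified in this disclosure:

```python
def trial_passes(score, elapsed_s, score_threshold=50, cutoff_s=120):
    # A trial passes only if both the accuracy-score and the time criteria are met.
    return score >= score_threshold and elapsed_s <= cutoff_s

def level_passes(trials, min_pass_fraction=0.75):
    # trials: (accuracy_score, elapsed_seconds) pairs recorded at one light level.
    n_pass = sum(trial_passes(score, t) for score, t in trials)
    return n_pass / len(trials) >= min_pass_fraction
```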
  • the present invention describes a method of determining change in functional vision by longitudinally scoring the subject’s mobility and dexterity performance at varying light intensities.
  • the present invention describes a device and method for quantitative measurements of vision level that can be correlated with low- vision subject’s real-life visual perception and interaction.
  • the terms “a” and “an” are defined as one or more unless this disclosure explicitly requires otherwise.
  • the term “substantially” is defined as largely but not necessarily wholly what is specified (and includes what is specified; e.g., substantially 90 degrees includes 90 degrees and substantially parallel includes parallel), as understood by a person of ordinary skill in the art.
  • the terms “substantially,” “approximately,” and “about” may be substituted with “within [a percentage] of” what is specified, where the percentage includes 0.1, 1, 5, and 10 percent.
  • the term “about” is used to indicate that a value includes the standard deviation of error for the device or method being employed to determine the value.
  • a device or method that is configured in a certain way is configured in at least that way, but it can also be configured in other ways than those specifically described.
  • any embodiment of any of the apparatuses, devices, systems, and methods can consist of or consist essentially of - rather than comprise/include/contain/have - any of the described steps, elements, and/or features.
  • the term “consisting of” or “consisting essentially of” can be substituted for any of the open-ended linking verbs recited above, in order to change the scope of a given claim from what it would otherwise be using the open-ended linking verb.
  • the feature or features of one embodiment may be applied to other embodiments, even though not described or illustrated, unless expressly prohibited by this disclosure or the nature of the embodiments.
  • Example 1 The Visually Guided Mobility Test measures changes in functional vision, as assessed by the ability of a subject to navigate accurately and at a reasonable pace at different levels of illumination.
  • An example of the visually guided mobility test setup is shown in Fig. 1.
  • the subject (1006) is asked to walk towards a Light Emitting Diode (LED) panel (1001 or 1002) lighted at different levels of illumination while avoiding obstacles (1003, 1004, 1005).
  • Videos of subjects undergoing visually guided mobility test are recorded using a camera(s) (1008) that is/are mounted on a tripod for front view and/or in the ceiling for top-view.
  • the height of the camera position on the tripod is adjustable to account for different heights of the subjects performing the test.
  • the camera is equipped with infrared LEDs to enable imaging in a low luminance (or dark) environment.
  • the synchronized video recording, switching ON/OFF of the LED panels, and LED panel light intensity are controlled by a PC (1007) (a control module).
  • Example 2 Each component of the Visually Guided Mobility Test is chosen to have different properties. As shown in Fig. 2A, different sizes and shapes of LED panels are used, and the LEDs emit different colors (red, green, blue and white) that allow the assessment of color-specific functional vision. Also, different elements or arrays of LEDs within the LED panel are lighted to generate specific spatial frequencies of LED stripes and patterns. For further control of the dynamic range of light intensity from the LED panel (Fig. 2A), a polarizer (1001) and/or neutral density filter (1002) is used to attenuate the light. The height of the LED panel position on the tripod can be adjusted to account for different heights of the subjects performing the test. Fig. 2B shows obstacle(s) of different shapes, sizes, and colors having different reflectivity.
  • the reflectivity from the obstacles could be changed by use of a polarizing film (1001 ), for instance.
  • the obstacles are mounted on motion sensors (1002) for automatic detection of the obstacle-hit by the subjects performing the test.
  • a height adjuster (1003) allows changing the height of the obstacles to account for different heights of the subjects performing the test.
  • the control module is selected from a PC or portable tablet for synchronization and control of the LED lights (intensity, ON/OFF, color), and camera recording as shown in Fig. 2C.
  • Example 3 Multiple possible configurations of the Visually Guided Mobility Test are shown in Fig. 3A and Fig. 3B. Additional LED panels (1001, 1002, 1003) are placed as shown in Fig. 3A. Further, different arrangements and numbers of obstacle(s), to adjust the difficulty level of the mobility test, are placed as demonstrated in Fig. 3B.
  • the visually guided mobility setup design is adapted to the vision status and mobility of the subject population. A practical setup based on the three-LED configuration from Fig. 3A is visible in Fig. 3C. A low-vision subject is shown navigating the obstacle setup to touch the randomly turned-on LED panel in Fig. 3D. The complexity of the Visually Guided Mobility Test setup was increased by additional obstacles and by positioning the obstacles near the light sources, as shown in Fig.
  • Example 4 Fig. 4A and Fig. 4B illustrate examples of visually guided mobility test performance.
  • the subject is positioned at the starting position and a random LED panel is turned on to produce desired light intensity level as shown in Fig. 4A.
  • the subject navigates around the obstacles and the test ends when the subject touches the lighted LED panel as shown in Fig. 4B.
  • the proctor/motion sensors record the obstacles that are hit during the test.
  • the video is also recorded for verification.
  • translucent obstacles are positioned left, right and center of the test area with two LED panels at the end (as in Fig. 1), while maintaining a controlled low-luminance environment for the testing (<1 lux ambient room light).
  • FIG. 4C illustrates a visual top-down representation of objects based on the configuration shown in Fig. 1.
  • Each obstacle reflects light corresponding to the activated LED panel during the test.
  • the visibility of the obstacles remains poor, and the visibility increases as the LED intensity is increased.
  • the left-side obstacles’ visibility is higher than that of the right obstacles when the left LED panel is ON, and the converse is true when the right LED panel is ON.
  • the graphic can be vertically mirrored to depict the scenario when the right LED panel is active.
  • Fig. 4D shows a slightly more defined layout at 1 lux.
  • Fig. 4E shows fully defined left and middle obstacles with the right side being dimly lit at 3 lux.
  • Fig. 4F shows a brightly lit middle and left side with the right side being clearly defined at 10 lux.
  • Fig. 4G shows the fully illuminated left and middle obstacles at 32 lux.
  • Fig. 4H shows the fully illuminated obstacles at 100 lux.
  • a total of 9 × 10 measurements per luminance level were made, excluding the top row where the light panel is located.
  • the light intensity map and the obstacle scattering map were overlaid to represent what a subject would see at a given position navigating through the Visually Guided Mobility Test, as shown in Fig. 4I.
  • Example 5 shows the Graphical User Interface for Visually Guided Mobility Test.
  • the proctor provides input on which eye-open condition of the subject (monocular: OS, OD; or binocular: OU) is to be tested.
  • the proctor manually checks whether the LED panels are functioning. According to the test protocol, the desired light intensity and light color are selected in the “Intensity Control” and “Color Control” tabs, respectively.
  • the upper tabs are designed to include the controllable parameters of individual test trial by a proctor. Once the subject is positioned at the starting line, the proctor starts the test in the Test Control tab.
  • the timer starts as well as recording of the video as shown in the bottom right tab.
  • the proctor inputs which obstacle(s) are hit by the subject in the Obstacle tab.
  • the timer ends when End button in the Test Control tab is pressed, and the proctor records the input parameters and results in Save tab.
  • Example 6 illustrates a sequence (flow chart) of the procedure in a Visually Guided Mobility Test.
  • the test instruction is provided to the subject (1001 ), and lowest light intensity is selected (1002).
  • the test starts (1003)
  • the synchronized video and timer starts (1004) and the randomly selected LED panel is lighted (1005).
  • the subject navigates around the obstacles to touch the lighted LED panel, while the proctor records in the software any obstacle(s) hit during the test.
  • the motion sensor(s) associated with the obstacle(s) also records obstacle(s) hit in software independently and/or alternatively.
  • the test ends when the subject finds the lighted (correct) LED panel or the pre-set timer runs out (1006).
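The trial sequence above (1001–1006) can be sketched as control logic; this is a minimal sketch in which hardware control of the LED panels, camera, timer and motion sensors is represented by plain data rather than real device calls:

```python
import random

def run_mobility_trial(light_level_lux, panels=("left", "right"), timeout_s=180):
    """Sketch of one Visually Guided Mobility Test trial; the panel names and
    timeout value are illustrative placeholders."""
    lit_panel = random.choice(panels)   # step 1005: a randomly selected panel is lighted
    record = {
        "level_lux": light_level_lux,
        "lit_panel": lit_panel,
        "obstacle_hits": [],            # filled in by proctor input / motion sensors
        "timed_out": False,             # True if the pre-set timer runs out (1006)
    }
    return record

trial = run_mobility_trial(0.3)
```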
  • Example 7 This example shows how the visually guided mobility test is implemented to assess and discriminate subjects with different levels of functional vision by adjusting the levels of difficulty in performing/passing the test. This includes adjustment of the range (lowest and highest) of light intensity levels and location of different obstacles within the test as described in Example 3.
  • various scoring systems are assigned to adjust sensitivity as well as dynamic range of the assay considering different functional vision level.
  • Fig. 7A demonstrates a scoring system of Visually Guided Mobility Test. First, individual penalty weights for obstacle-hit(s), repositioning (required when subject is completely off-the mobility course), cut-off test time as well as pass/fail threshold score are assigned (1001 ).
  • Repositioning of the subject is conducted when there is a safety issue and/or when the subject has completely lost the visually guided task and touches the non-lighted (wrong) panel.
  • final score is computed from data collected within each trial (1002).
  • the computed score of each trial is compared to the pass/fail threshold score (1003).
  • the subject’s trial-performance is evaluated for pass/fail of the specific light level (1004).
  • the lowest light illumination level (1005) that subject passed (light intensity level in which the proportion of trial pass criteria is met) is determined.
  • Fig. 7B shows example of scoring system for the Visually Guided Mobility Test (Fig. 1).
  • the penalty weights of any obstacle(s) hit, going out of boundary such as hitting the boundary of the test area (such as wall of the room), and repositioning are assigned differently.
  • left and right obstacle(s) have weights C1 and C2, respectively
  • center obstacle(s) have a weight C3
  • going out of boundary has a weight C4
  • repositioning has a weight C5.
  • N1 through N5 represent number of hits with corresponding obstacle(s) and number of other penalties during the test.
  • Screen Touch Score (e.g., 100 points)
  • Example 8 Fig. 8A, Fig. 8B, and Fig. 8C depict several scenarios of scoring by the subject in the Visually Guided Mobility Test.
  • the left LED panel is lighted (lit), and the subject navigates and touches the lighted LED panel without bumping any obstacles.
  • the left LED panel is lighted, and the subject bumps left obstacle(s) once, then navigates to touch the lighted LED panel without bumping any additional obstacle(s).
  • the right LED panel is lighted; however, the subject bumps left obstacle(s) once, then bumps into the center obstacles(s) once and never find/touch the lighted LED panel.
  • left and right obstacle(s) weights are assigned the same here, but different weights may be assigned for left and right obstacle(s), depending on which LED panel is lighted, in other mobility test configurations. If the value of the side-obstacle weight C1 (= C2) is 20, the value of the center-obstacle weight C3 is 30, and the pass threshold for the accuracy score is 50 points, then the subject in Fig. 8A passes the trial with an accuracy score of 100, the subject in Fig. 8B passes the trial with a score of 80, but the subject in Fig. 8C fails the trial with an accuracy score of -50.
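The arithmetic of these three scenarios can be reproduced directly. Taking the side-obstacle weight as 20 and the center-obstacle weight as 30 (per this example), with 100 points for touching the lighted panel, the formula shape (touch points minus weighted penalties) follows the scoring system of Fig. 7B:

```python
def accuracy_score(touched_lit_panel, hits, weights, touch_points=100):
    # Fig. 7B scoring: screen-touch points minus the weighted sum of penalties
    # (obstacle hits, out-of-boundary events, repositioning).
    base = touch_points if touched_lit_panel else 0
    return base - sum(weights[kind] * n for kind, n in hits.items())

W = {"left": 20, "right": 20, "center": 30}  # side weight 20, center weight 30
print(accuracy_score(True, {}, W))                         # Fig. 8A scenario: 100
print(accuracy_score(True, {"left": 1}, W))                # Fig. 8B scenario: 80
print(accuracy_score(False, {"left": 1, "center": 1}, W))  # Fig. 8C scenario: -50
```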
  • Example 9 Visually Guided Mobility Test has been validated with subjects with different visual impairments.
  • Fig. 9A shows plot of accuracy score vs. time score in low vision subjects.
  • Fig. 9B summarizes plot of accuracy score vs. time score in better vision subjects.
  • the dashed horizontal and vertical lines represent accuracy and time score passing criteria. As shown, all the better vision subjects were found to pass the test meeting both the accuracy and time pass criteria.
  • Fig. 9C shows the sensitivity analysis of the Visually Guided Mobility Test in normal vision and low-vision subjects having different levels of vision.
  • the measured Visually Guided Mobility Test score as a function of BCVA (at baseline) of eyes demonstrates that a change of 2 light levels was associated with a change of 0.3 logMAR (measured by FrACT) in individuals with severe vision loss due to RP.
  • Example 10 Near vision (objects within hand-reach distance) of low-vision subjects is evaluated via discrimination of static/moving objects of varying shapes (and/or different sizes) in two-dimensional (2D) and three-dimensional (3D) configurations, as well as determination of the direction of moving patterns (optical flow), at multiple luminance levels.
  • the 2D shape discrimination emulates a subject using smart display devices (used in daily activities, such as cell phones, tablets and computers), and the 3D shape discrimination of objects illuminated at different luminance levels emulates different lighting conditions while performing activities of daily living. A subject with normal vision or better is expected to perform all the low-vision tests without difficulty.
  • An example of the 3D shape discrimination test setup is shown in Fig. 10A.
  • the 3D shape discrimination apparatus has a flat LED panel (1001 ) mounted at the top of the apparatus.
  • the LED panel is set to pre-calibrated light intensity levels to illuminate different objects (1006, 1007, 1008) placed at the base (1002) of the 3D Shape Discrimination unit.
  • the control board (1009) communicates with a PC/tablet (1010) to take input from the user to control the light intensity levels and provides feedback from pressure sensor to display which object(s) is(are) picked up based on pressure sensor reading.
  • the objects in Fig. 10B are the real-world objects that were used as a base to design the geometric shapes for test use.
  • the arrangement includes two rows.
  • Example 11 illustrates an example of the 3D Shape Discrimination setup, including the positions of the subject, the proctor and the display.
  • Proctor (1001 ) of the test sitting on the opposite side of the subject (1002) has easy access to the graphical user interface (GUI), displayed in the monitor screen (1003).
  • GUI graphical user interface
  • the subject does not have access to the GUI and the display on the monitor screen.
  • the display monitor light intensity is dimmed to maintain a controlled low-luminance environment (e.g., <1 lux ambient room light) for the testing.
  • the GUI displays the order of objects, such as sphere, cube and pyramid, that need to be placed in the 3D Shape Discrimination platform (1004).
  • the object to be picked up is announced (by proctor according to the procedure, or by automated voice in the GUI).
  • the pressure sensor or camera sensor
  • the subject is instructed to pick up the object within a time cut-off, and not allowed to change his/her decision once (s)he touched an object (as the texture of the object can influence the decision).
  • the trial concludes when the subject picks up any object regardless of correctness.
  • the pressure sensor in the 3D Shape Discrimination apparatus provides feedback to the software and records the shape that was picked up in that trial. The test is repeated at the same light level for a preset number of times before moving on to a higher illumination intensity.
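A minimal sketch of this trial logic is given below; the pressure-sensor read is abstracted as a callback, and the shape names are illustrative:

```python
def run_shape_trial(target_shape, read_picked_shape):
    # The trial concludes with the first object lifted, correct or not;
    # read_picked_shape stands in for the pressure-sensor feedback.
    picked = read_picked_shape()
    return {"target": target_shape, "picked": picked,
            "correct": picked == target_shape}

def run_light_level(target_sequence, read_picked_shape):
    # Repeat trials a preset number of times at one light level
    # before moving on to a higher illumination intensity.
    return [run_shape_trial(t, read_picked_shape) for t in target_sequence]
```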
  • In Fig. 11B, the six geometrical objects are arranged in a three-by-two format to increase the number of choices available per trial and to increase difficulty. The back row allows objects to be stored for additional difficulty or simultaneously used as requested objects to increase the number of possible objects. When only first-row objects are options, this adds variability to the objects presented to the subject. To further evaluate subjects' discrimination ability, multiple heights are introduced, as seen in Fig. 11C. The multiple heights introduce controlled variability in light reflectance, giving two stages of possible difficulty within each light level.
  • The addition of medium-sized objects gives rise to another measuring point of the system, as seen in Fig. 11D. Reduction of size, similar to reduction of font size in writing, lowers visual clarity, raising the difficulty of the test while maintaining scalability.
  • Fig. 11E demonstrates the smallest size, chosen to showcase the ability of the test to increase in difficulty and to extend even to those with 20/20 vision.
  • Example 12 illustrates an example of a 3D Shape Discrimination framework under controlled low-luminance environments.
  • the flat LED panel (with diffuser) provides uniform illumination to objects and reflections of the objects to the subject’s eye.
  • three different types of objects are placed on the base of the 3D Shape Discrimination apparatus mounted with pressure sensor.
  • Fig. 12B demonstrates how a subject would reach the announced object and pick it up. The pressure sensor recognizes the object that is picked up and the location.
  • Software output from the 3D Shape Discrimination assay is summarized in Fig. 12C. Light intensity (measured in lux), object position information, target shape and location, shape of object picked up by a subject, and correctness of the shape discrimination are provided.
  • Utilizing movement, Fig. 12D serves as an additional tool to enhance object silhouettes, offering an alternative detection method for low-vision subjects beyond contrast enhancement.
  • Fig. 12E demonstrates different silhouettes on a pyramid and a cube.
  • Fig. 12F demonstrates different silhouettes on a pyramid and a cube.
  • Fig. 12G demonstrates different silhouettes on a pyramid and cube.
  • Example 13 For 2D Shape Discrimination, multiple types of shapes are displayed at different intensity levels on a tablet touchscreen in randomized order. The intensity level is controlled by the monitor brightness while maintaining a controlled low-luminance environment. The target object shape is announced by the automated voice output of the tablet speaker, and the subject is instructed to touch the target object with his/her finger, as shown in Fig. 13A. The objects have different shapes and sizes, and the randomized objects are either stationary or floating, as shown in Fig. 13B. After announcement of the target shape, the test ends when the subject touches the touchscreen regardless of correctness.
  • Fig. 13C shows a correlation plot of the size threshold score with best corrected visual acuity (BCVA) in two different groups of low-vision patients.
  • the test was conducted to evaluate various vision-related parameters, such as size detection threshold and shape recognition ability, in low-vision patients.
  • the objects were shaped into either circular, square, or triangular shapes.
  • the intensities of the display were varied to determine the intensity threshold of recognizing shapes by the low-vision subjects.
  • 2D Shape Discrimination tests revealed that the intensity threshold to detect different static shapes is different for Counting Finger vision subjects vs. subjects with hand motion vision. Further, with increase in intensity level, the shape discrimination accuracy increased significantly.
  • the size threshold of the 2D Shape Discrimination Test was found to be correlated with BCVA only for subjects having ≤1.95 logMAR.
  • the poor correlation in profoundly impaired low-vision subjects (BCVA > 1.95) is due to the inability of the subjects to detect optotypes (in Freiburg Acuity) owing to the low luminance of the Koch Acuity measuring display device.
  • Fig. 13D shows correlation plot of 2D Shape Discrimination accuracy with measured BCVA. The straight line represents fit for determining correlation between BCVA and 2D Shape Discrimination accuracy.
  • the intensities of the 2D Shape Discrimination display panel were varied to determine the intensity threshold for recognizing shapes.
  • Fig. 13E summarizes the software output from the 2D Shape Discrimination assay. The light intensity, shape position information, target shape and location (XY coordinates of the center of mass) of the shapes, shape of the object selected by the subject, XY coordinates of the subject’s touch input on the touchscreen, distance between the subject’s input and the center of mass of the target object, elapsed time to touch the target object, and correctness of the shape discrimination are recorded.
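The per-trial fields recorded by the assay can be sketched as a short program. This is a minimal illustration, not the actual assay software: the shape set, coordinate range, and the scoring rule (the selected shape is taken as the one whose center of mass lies nearest the touch point) are assumptions made for the example.

```python
import math
import random
import time

SHAPES = ["circle", "square", "triangle"]

def run_trial(intensity, touch_xy, announce=print, rng=random):
    """Run one hypothetical 2D Shape Discrimination trial.

    A target shape is announced, shapes are placed at random positions,
    and the subject's touch is scored against the target's center of mass.
    """
    target = rng.choice(SHAPES)
    announce(f"Touch the {target}")
    # Random center-of-mass coordinates for each displayed shape.
    positions = {s: (rng.uniform(0, 100), rng.uniform(0, 100)) for s in SHAPES}
    start = time.monotonic()
    # ... the subject responds; here touch_xy is supplied by the caller ...
    elapsed = time.monotonic() - start

    def dist(p):
        return math.hypot(touch_xy[0] - p[0], touch_xy[1] - p[1])

    # Assumed scoring rule: nearest center of mass to the touch point.
    selected = min(positions, key=lambda s: dist(positions[s]))
    return {
        "intensity": intensity,
        "target_shape": target,
        "target_xy": positions[target],
        "selected_shape": selected,
        "touch_xy": touch_xy,
        "distance_to_target": dist(positions[target]),
        "elapsed_s": elapsed,
        "correct": selected == target,
    }
```

In practice the touch coordinates would come from the touchscreen driver, e.g. `run_trial(50, touch_xy=(42.0, 17.5))`.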
  • Example 14 The optical flow test emulates activities of daily living, such as (i) watching a movie on a smart device and (ii) following moving objects (cars, people).
  • In the 2D Optical Flow test, black and white stripes of different spatial frequencies, flowing in random directions, are displayed on the touchscreen.
  • Fig. 14A illustrates flow moving towards the left, and Fig. 14B shows flow moving upward.
  • The subject is asked to state which direction the flow is moving, or to touch the side of the touchscreen toward which the flow is moving.
  • Fig. 14C shows a summary of the output from the 2D optical flow determination assay. The flow direction, XY coordinates of the touch input, elapsed time, and correctness of the flow-direction determinations are provided.
  • Optical flow direction detection accuracy and the upper speed threshold for correctly detecting optical flow were determined in low-vision patients. Laterally or radially moving (inward or outward) vs. stationary illumination patterns were displayed, and the subjects were asked to recognize the optical flow. Testing was performed monocularly in counterbalanced order. The upper speed threshold (for accurately detecting the direction of optical flow) was found to depend on visual acuity. Further, as the display intensity level increased, the shape discrimination accuracy increased significantly.
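An upper speed threshold of the kind described above could, for illustration, be estimated with a simple adaptive staircase: the stripe speed is raised after each correct direction report and lowered after each error, and the threshold is taken from the reversal points. This is a hypothetical sketch, not the procedure disclosed here; the step ratio, starting speed, and reversal count are assumed values.

```python
import math
import random

def speed_threshold(respond, start=2.0, step=1.25, n_reversals=6):
    """Estimate the upper speed threshold for optical-flow direction
    detection with a 1-up/1-down multiplicative staircase.

    `respond(speed, direction)` returns the direction reported by the
    subject for stripes drifting at `speed` in `direction`.
    """
    speed, going_up, reversals = start, True, []
    while len(reversals) < n_reversals:
        direction = random.choice(["left", "right", "up", "down"])
        correct = respond(speed, direction) == direction
        if correct != going_up:  # response pattern flipped: a reversal
            reversals.append(speed)
            going_up = correct
        # Correct answers push the speed up; errors bring it back down.
        speed = speed * step if correct else speed / step
    # Geometric mean of the reversal speeds approximates the threshold.
    return math.exp(sum(math.log(s) for s in reversals) / len(reversals))
```

In practice `respond` would present the moving stripe pattern on the display and collect the subject’s touch or verbal report.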
  • Example 15 shows the 3D Shape Discrimination and 2D Shape Discrimination apparatuses in a low-illumination environment. Both devices are connected to a single control device (PC/laptop/tablet). The software interface for tests, including 2D/3D Shape Discrimination and Optical Flow, is shown in Fig. 15B.
  • The low-vision test systems are portable, with a small form factor, and imitate real-life interactions in order to correlate visual functions with quantitative measurements in low-vision subjects.
  • Example 16 shows an image of a triangle displayed on the 2D Shape Discrimination apparatus using LED arrays. The arrays display various 2D shapes in different colors, along with stripes (for the optical flow test). A circle is displayed in Fig. 16B, a square in Fig. 16C, and an optical flow pattern in Fig. 16D. Fig. 16E shows the graphical user interface for the LED-array-based 2D shape recognition and optical flow tests. Controls for multiple display parameters, such as the intensity, shape, color, and size of the objects, as well as the direction and speed of optical flows, are provided in the GUI. Fig. 16F shows the association of the measured speed threshold of Optical Flow with the Patient Reported Outcome (measured by NEI-VFQ) in low-vision patients, with Pearson correlation between speed threshold and PRO for two different groups.
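The Pearson correlations reported above (speed threshold vs. NEI-VFQ patient-reported outcome, and accuracy or size threshold vs. BCVA) reduce to the standard sample correlation coefficient, which can be computed directly. The function below is a generic sketch; the paired data values are made up for illustration and are not measurements from the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (made-up) paired data: per-subject optical-flow speed
# thresholds and NEI-VFQ composite scores.
thresholds = [2.1, 3.4, 5.0, 6.2, 7.8]
vfq_scores = [31.0, 40.5, 52.0, 58.5, 70.0]
r = pearson_r(thresholds, vfq_scores)
```

A value of r near +1 or −1 indicates a strong linear association between the functional-vision measure and the patient-reported outcome.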

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ophthalmology & Optometry (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to vision quantification systems and methods designed to assess functional vision. More particularly, the systems are designed to track a subject's vision using mobility and dexterity assessments. More particularly, at least one aspect of the invention relates to the design and application of the systems in functional vision assessments in low-vision patients.
EP23901706.4A 2022-12-08 2023-12-08 Apparatus and method for measurement of functional vision in patients with low vision Pending EP4629877A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263386644P 2022-12-08 2022-12-08
PCT/US2023/083247 WO2024124214A1 (fr) 2022-12-08 2023-12-08 Apparatus and method for measurement of functional vision in patients with low vision

Publications (1)

Publication Number Publication Date
EP4629877A1 true EP4629877A1 (fr) 2025-10-15

Family

ID=91380317

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23901706.4A Pending EP4629877A1 (fr) Apparatus and method for measurement of functional vision in patients with low vision

Country Status (3)

Country Link
US (1) US20240188817A1 (fr)
EP (1) EP4629877A1 (fr)
WO (1) WO2024124214A1 (fr)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7455407B2 (en) * 2000-02-11 2008-11-25 Amo Wavefront Sciences, Llc System and method of measuring and mapping three dimensional structures
DE10329165A1 (de) * 2003-06-27 2005-01-13 Carl Zeiss Meditec Ag Device for determining the refractive error of an optical system
US20060143022A1 (en) * 2004-01-13 2006-06-29 Betty Bird Consumer care management method and system
SG10201402138PA (en) * 2009-05-09 2014-07-30 Vital Art And Science Llc Shape discrimination vision assessment and tracking system
US8967809B2 (en) * 2010-03-01 2015-03-03 Alcon Research, Ltd. Methods and systems for intelligent visual function assessments
JP5643078B2 (ja) * 2010-12-27 2014-12-17 Nidek Co., Ltd. Eye refractive power measurement apparatus
CN108354581B (zh) * 2013-10-25 2020-12-01 费城儿童医院 在不同的亮度级别测试视功能和功能性视力的装置和方法
WO2016084086A1 (fr) * 2014-11-26 2016-06-02 Eyekon E.R.D. Ltd. Détermination d'affections physiques liées à la vision à partir de tests de vision à paramètres croisés
KR102355455B1 (ko) * 2016-06-20 2022-01-24 Magic Leap, Inc. Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions
WO2019169322A1 (fr) * 2018-03-02 2019-09-06 Ohio State Innovation Foundation Systèmes et procédés de mesure de cartes de la fonction visuelle
US11500461B2 (en) * 2019-11-01 2022-11-15 Evolution Optiks Limited Light field vision-based testing device, system and method
WO2021177968A1 (fr) * 2020-03-05 2021-09-10 I-Lumen Scientific, Inc. Système et procédé de test et de traitement de vision
AU2022217152A1 (en) * 2021-02-03 2023-08-24 Jcyte, Inc. Low luminance mobility test
CN115429213B (zh) * 2022-08-26 2025-09-23 Shanghai General Hospital Multi-luminance visual navigation test course

Also Published As

Publication number Publication date
WO2024124214A1 (fr) 2024-06-13
US20240188817A1 (en) 2024-06-13

Similar Documents

Publication Publication Date Title
US10610093B2 (en) Method and system for automatic eyesight diagnosis
US8500278B2 (en) Apparatus and method for objective perimetry visual field test
JP5421146B2 (ja) 視野検査システム
US8337019B2 (en) Testing vision
JP2025128370A Systems and methods for visual field analysis
JP6731850B2 Testing and determination of thresholds
JP2009542368A Diagnosis and treatment system for eccentric vision
JP2021519195A Vision testing using a mobile device
US20190216311A1 (en) Systems, Methods and Devices for Monitoring Eye Movement to Test A Visual Field
US12137975B2 (en) Systems, methods, and program products for performing on-off perimetry visual field tests
Bennett et al. Optimization and validation of a virtual reality orientation and mobility test for inherited retinal degenerations
CN110381811A (zh) 视觉表现评估
US20240188817A1 (en) Apparatus and method for measurement of functional vision in patients with low vision
EP4649883A1 Novel method for locating a person's blind spot and other ophthalmological methods calibrated by this located blind spot
Kim Multi-Luminance Y-Mobility Test for Assessment of Functional Vision in Patients with Severe Vision Impairment
Roberts Visual Fields
Ledford et al. VISUAL FIELDS
RU2576798C2 Method for determining vision characteristics and device for implementing same
CN118284357A Determining the visual performance of an eye of a person
Bailey Visual Fields and Functionality
Djiallis Variability of the automated perimetric threshold response

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20250515

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR