AU2024292037B2 - A digital display device field of vision testing system - Google Patents
A digital display device field of vision testing system
- Publication number
- AU2024292037B2
- Authority
- AU
- Australia
- Prior art keywords
- screen
- brightness
- camera
- eye
- detect
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0041—Operational features thereof characterised by display arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/024—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for determining the visual field, e.g. perimeter types
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/028—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
- A61B3/032—Devices for presenting test symbols or characters, e.g. test chart projectors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0489—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using dedicated keyboard keys or combinations thereof
- G06F3/04897—Special input arrangements or commands for improving display capability
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/22—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of characters or indicia using display control signals derived from coded signals representing the characters or indicia, e.g. with a character-code memory
- G09G5/30—Control of display attribute
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/39—Control of the bit-mapped memory
- G09G5/391—Resolution modifying circuits, e.g. variable screen formats
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/30—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0285—Improving the quality of display appearance using tables for spatial correction of display data
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0626—Adjustment of display parameters for control of overall brightness
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0693—Calibration of display systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/04—Changes in size, position or resolution of an image
- G09G2340/0407—Resolution change, inclusive of the use of different resolutions for different screen areas
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/08—Biomedical applications
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Heart & Thoracic Surgery (AREA)
- Ophthalmology & Optometry (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Computer Hardware Design (AREA)
- Eye Examination Apparatus (AREA)
Abstract
The present system is designed for home-based, self-administered visual field testing without the need for expert involvement, utilising common computing devices. The system adapts to differences in various technology platforms, ensuring accuracy and reliability of the tests by using calibration for screen size and brightness, real-time viewing distance monitoring, adjustments for display curvature, gaze stability and eye occlusion detection using a front-facing camera according to various described embodiments.
Description
WO 2025/015374 PCT/AU2024/050758
A Digital Display Device Field of Vision Testing System
Field of the Invention
[0001] The invention is a system designed to assess and monitor visual field
thresholds using common digital display devices enabling accurate testing and even
self-testing. The system is designed to replace traditional specialised bowl-type visual
field-testing machines whilst offering accurate vision field testing and progression
monitoring using a wide range of digital display consumer electronic devices.
Background of the Invention
[0002] Glaucoma, a leading cause of blindness globally, is characterised by the
degeneration of the optic nerve, often associated with cupping of the optic disc and
elevated intraocular pressure (IOP). Glaucoma typically damages vision starting in
the peripheral regions. Thus, visual field (VF) tests, covering a wide area (e.g., 30
degrees), are standard for diagnosing glaucoma. These tests, known as perimetry or
automated perimetry, use machines with a bowl-shaped surface projecting varying
intensity light spots. The patient focuses on a central spot and responds to peripheral
light spots, with the machine measuring retinal sensitivity and creating a graphical
representation of deviations from normal values, aiding in diagnosing and monitoring
glaucoma.
[0003] Alternative digital display-based methods for eye testing have been proposed
including US20230165460A1 (Shousha, 2023-06-01) which uses a wearable device
which displays stimuli at various field locations of the visual field and use spatial
information to select appropriate display locations to collect feedback based on the
eye's response using sensors, thereby assisting in diagnosing ocular anomalies.
[0004] US20130155376A1 (Huang et al., 2013-06-20) involves a video game
designed to map a test subject's peripheral vision. The game includes a moving visual
fixation point confirmed by the subject's action and requires the subject to locate a
briefly presented visual stimulus. The game runs on a platform comprising a video
display, user input device, and video camera to map the subject's visual perception
thresholds, comparable with age-stratified normative data.
[0005] However, there is a pressing need for home-based vision testing that can be
self-administered without professional supervision on conventional computing
devices. Variations in display characteristics, such as screen resolution and
brightness, as well as improper use and variable environmental factors, can render
vision testing on these devices relatively inaccurate.
Summary of the Disclosure
[0006] The described field of vision testing system is designed for home-based, self-
administered visual field testing using common digital display devices. In
embodiments, it employs calibration steps to accommodate different screen sizes and
brightness levels, as well as methods to ensure accurate testing, including stabilising
viewing distance, detecting improper use, and accounting for digital display curvature.
[0007] Similar to standard perimetry, the test subject focuses on a fixation point and
responds to stimuli displayed at peripheral locations.
[0008] The system may account for screen size variability by displaying calibration
markers and offset adjustment controls to adjust the distance between them to match
a fixed physical object. This allows the system to calculate a screen size scaling
factor according to the distance and subsequently scale the screen size accordingly.
This procedure can be performed accurately by non-specialist home users on
conventional computer devices.
[0009] To account for screen brightness variability, the system may display a first
stimulus at a brightness below a known visual threshold (the lowest contrast level
that is just detectable by the human eye) and a second stimulus at a brightness just
above the threshold, calculating screen brightness based on the frequency of
responses to these stimuli. The system determines the screen brightness by
analysing the number of responses received for each stimulus.
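The decision logic described in [0009] might be sketched as follows. The function name, the 50% cut-offs and the trial counts are illustrative assumptions, not values taken from the specification:

```python
def classify_screen_brightness(dim_responses, bright_responses, trials):
    """Classify screen brightness from response frequencies to two stimuli.

    dim_responses / bright_responses: how often the user reported seeing the
    below-threshold and above-threshold stimulus respectively, over `trials`
    presentations each. Cut-offs are illustrative, not from the patent.
    """
    dim_rate = dim_responses / trials       # expected to be low when calibrated
    bright_rate = bright_responses / trials # expected to be high when calibrated
    if dim_rate > 0.5:
        return "too bright"   # user can see the sub-threshold stimulus
    if bright_rate < 0.5:
        return "too dark"     # user misses the supra-threshold stimulus
    return "calibrated"
```

On a "too bright" or "too dark" result, the system would instruct the user to lower or raise the screen brightness respectively, as described for the screen brightness calibration controller.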
[0010] The system may ensure proper viewing distance through viewing distance
calibration, which involves positioning the digital display a set distance from the
user's face and analysing image data obtained from a camera to set a reference
calibrated facial metric. Real-time viewing distance monitoring is achieved by
continually analysing image data to determine a real-time facial metric and comparing it to the calibrated metric to determine the real-time viewing distance. The system can warn the user if the viewing distance deviates from the calibrated distance.
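The distance estimate described above can be illustrated with a simple pinhole-camera model, under which a facial metric measured in pixels is inversely proportional to viewing distance. The function names, the 30 cm target and the tolerance band are illustrative assumptions:

```python
def estimate_viewing_distance(calibrated_width_px, current_width_px,
                              calibrated_distance_cm=30.0):
    """Estimate real-time viewing distance from the apparent facial width.

    Pinhole model: pixel width of the face is inversely proportional to its
    distance from the camera, so distance scales by the width ratio.
    """
    return calibrated_distance_cm * calibrated_width_px / current_width_px

def distance_warning(distance_cm, target_cm=30.0, tolerance_cm=3.0):
    """Return a warning label if the user drifts outside a tolerated band."""
    if distance_cm < target_cm - tolerance_cm:
        return "too close"
    if distance_cm > target_cm + tolerance_cm:
        return "too far"
    return "ok"
```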
[0011] For smaller screens, the system may adjust the position of the fixation point to
the four corners of the screen to enable testing of field locations at greater
eccentricities than are possible from a central fixation point. The system may also
reduce threshold variability by applying a scaling factor to spot size to account for the
tangent effect of digital displays, particularly at peripheral locations.
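The tangent effect mentioned above can be illustrated as follows: on a flat display at viewing distance d, a stimulus at visual angle theta sits at an offset of d * tan(theta), and keeping a constant subtended angle requires the spot to grow roughly as sec²(theta). This is a common approximation, not necessarily the patent's exact scaling:

```python
import math

def spot_scale_factor(eccentricity_deg):
    """Scaling factor for stimulus spot size at a given eccentricity on a flat
    display, so the spot subtends roughly the same visual angle as a spot at
    the centre. Uses the sec^2 approximation (the derivative of the tangent
    mapping x = d * tan(theta)); an illustrative model only.
    """
    theta = math.radians(eccentricity_deg)
    return 1.0 / math.cos(theta) ** 2

def stimulus_position_px(eccentricity_deg, viewing_distance_px):
    """On-screen offset (in pixels) of a stimulus at a given visual angle."""
    return viewing_distance_px * math.tan(math.radians(eccentricity_deg))
```

At 30 degrees of eccentricity the spot is scaled by about 1.33; the correction grows rapidly toward the screen edges, which is why it matters most at peripheral locations.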
[0012] Gaze stability may be monitored by using a front-facing camera to capture
images of the eye and analyse pixel intensity of key features to assess gaze stability,
ensuring reliable visual field testing. The system may also verify the occlusion of
the non-tested eye by capturing images of the eyes and ensuring the correct eye is
occluded, with warning messages provided if the incorrect eye is occluded.
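One simple way to realise the pixel-intensity analysis described above is to track the centroid of dark pixels (pupil and iris) within the eye region across frames and flag large drifts. The drift threshold and function names are illustrative assumptions:

```python
def pupil_centroid(eye_region):
    """Locate the pupil as the darkness-weighted centroid of a greyscale eye
    region (list of rows of 0-255 values). A simplified sketch of the pixel
    intensity analysis of key eye features described in the text.
    """
    total = cx = cy = 0.0
    for y, row in enumerate(eye_region):
        for x, v in enumerate(row):
            w = 255 - v          # weight dark pixels (pupil/iris) heavily
            total += w
            cx += w * x
            cy += w * y
    return (cx / total, cy / total)

def gaze_is_stable(centroids, max_drift_px=2.0):
    """Gaze is deemed stable if successive pupil centroids stay within a small
    drift radius of the first frame (threshold is illustrative)."""
    x0, y0 = centroids[0]
    return all(((x - x0) ** 2 + (y - y0) ** 2) ** 0.5 <= max_drift_px
               for x, y in centroids[1:])
```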
[0013] In embodiments, the system translates a stimulus position from flat screen
geometry to curved screen geometry and detects background environmental
brightness by analysing general and localised brightness within image data obtained
from the camera.
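A minimal sketch of the flat-to-curved translation, assuming the curved screen is approximated as a cylindrical arc centred on the viewer and that the stimulus should subtend the same visual angle in both geometries (the specification may use a different model):

```python
import math

def flat_to_curved_x(x_flat, viewing_distance, curve_radius):
    """Translate a horizontal stimulus offset from flat-screen geometry to a
    position along the arc of a cylindrically curved screen.

    Illustrative model: compute the visual angle the stimulus subtends on the
    flat screen, then place it at the same angle along an arc of radius
    `curve_radius` centred on the viewer.
    """
    theta = math.atan2(x_flat, viewing_distance)  # visual angle of the stimulus
    return curve_radius * theta                   # arc length at equal angle
```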
[0014] For vision field testing, the system may employ Bayesian probability prediction
to perform a binary search for the threshold contrast at each location, using a
population probability density function (PDF) that includes individuals with normal
eyes and those with disease. This allows for fast and accurate estimation of
thresholds based on the subject's response matrix. Neighbourhood logic checks for
errors in responses and recovers from potential corruptions in threshold logic, refining
the prediction of initial seeding intensity for each test location based on neighbouring
location endpoints. Furthermore, the system may employ adaptive responses by
adjusting the speed of the test "wait-window" based on the average speed of response
to stimuli for each individual subject. If a subject responds quickly, the software
reduces the interval between stimuli, and if they respond slowly, it increases the
interval. The algorithm dynamically adjusts the timing based on a running average of
past responses.
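The adaptive wait-window adjustment described above might be sketched as follows. The initial window, clamping bounds, margin multiplier and history length are illustrative assumptions:

```python
class AdaptiveWaitWindow:
    """Adjust the inter-stimulus 'wait-window' from a running average of the
    subject's response times; constants are illustrative assumptions.
    """
    def __init__(self, initial_ms=1500, min_ms=800, max_ms=3000, margin=2.0):
        self.window_ms = initial_ms
        self.min_ms, self.max_ms = min_ms, max_ms
        self.margin = margin          # window = margin * average response time
        self._times = []

    def record_response(self, response_ms, history=10):
        self._times.append(response_ms)
        self._times = self._times[-history:]   # running window of responses
        avg = sum(self._times) / len(self._times)
        # fast responders get a shorter interval, slow responders a longer one
        self.window_ms = min(self.max_ms, max(self.min_ms, self.margin * avg))
        return self.window_ms
```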
[0015] Overall, the present system allows for visual field testing to be performed
conveniently and efficiently on commonly available devices, providing accurate
results while addressing technical considerations associated with the transition from
curved bowl surfaces to digital displays.
[0016] Other aspects of the invention are also disclosed.
Brief Description of the Drawings
[0017] Notwithstanding any other forms which may fall within the scope of the present
invention, preferred embodiments of the disclosure will now be described, by way of
example only, with reference to the accompanying drawings in which:
[0018] Figure 1 shows the key components and functionality of a field of vision testing
system, which includes a personal computing device with a digital display used for
displaying a field of vision testing user interface.
[0019] Figure 2 illustrates the processing carried out by the screen size calibration
controller.
[0020] Figure 3 shows a screen size calibration user interface presented by the screen
size calibration controller. The interface includes a fixation point and stimuli that are
scaled according to the calculated screen size scaling factor.
[0021] Figure 4 demonstrates the processing performed by the screen brightness
calibration controller. The figure shows three scenarios: calibrated screen brightness
detection, excessive brightness detection, and too dark brightness detection.
[0022] Figure 5 depicts the three screen brightness detections: calibrated screen
brightness detection, excessive brightness detection, and too dark brightness
detection.
[0023] Figure 6 illustrates the processing conducted by the gaze stability detection
controller. It shows the analysis of eye image data to detect gaze stability.
[0024] Figure 7 shows the operation of the gaze stability detection controller,
detecting image properties across the eye region to assess gaze stability.
[0025] Figure 8 represents the processing carried out by the eye occlusion detection
controller.
[0026] Figure 9 shows the operation of the eye occlusion detection controller,
detecting an occluded eye region and a non-occluded eye region in the facial image
data.
[0027] Figure 10 displays the processing performed by the viewing distance
calibration controller.
[0028] Figures 11A - D illustrate the exemplary image processing conducted by the
viewing distance calibration controller, including the detection of facial width and
boundary analysis.
[0029] Figure 12 illustrates mapping of flat screen geometry to curved screen
geometry.
[0030] Figure 13 shows processing for bright spot detection.
Description of Embodiments
[0031] Figure 1 shows some key components 101 and functionality 102 of a field of
vision testing system 100, which comprises a personal computing device (such as a tablet PC) having a digital display 103 configured to display a field of vision testing
user interface thereon. In accordance with a preferred embodiment, the present
system 100 employs a Web server architecture wherein the personal computing
device is in operable communication with a Web server via the Internet and the
personal computing device has a web browser application 104 configured to send
HTTP requests to the Web server and to render webpages served by the Web server
in response thereto. The personal computing device comprises a processor for
processing digital data which is in operable communication with a memory device,
the memory device configured to store digital data including computer program code
instructions. In use, the processor fetches these computer program code instructions
and associated data from the memory device for interpretation and execution of the
functionality herein. The computer program code instructions may be logically divided
into a plurality of computer program code instruction controllers 105 as will be
described in further detail below. The computer program code instruction controllers
may comprise a stimulus testing controller which is configured to perform an
assessment of visual field function 106 by displaying a fixation point and peripheral
stimuli on the digital display 103 and recording responses to the peripheral stimuli. In
embodiments, the system 100 may comprise a front-facing camera 107 wherein
computer vision analysis 108 is performed on image data obtained by the camera 107
for various purposes as will be described in further detail below. The controllers 105
may comprise a screen size calibration controller 109 which calculates a screen size
scaling factor to ensure that the actual distance between the stimuli and the fixation
point is uniform across different types of digital displays irrespective of their screen
size and/or screen resolution.
[0032] Figure 2 shows processing 110 by the screen size calibration controller and
Figure 3 shows a screen size calibration user interface displayed by the controller. At
step 111, the controller 109 is configured to display calibration markers 115 (in this
case calibration lines) and offset adjustment controls 116 (in this case decrement and
increment buttons) which are controllable to adjust the distance 117 between the
markers 115. The user is instructed to place a physical object of set size, such as a
ruler 118, between the markers 115 and to use the offset adjustment controls 116 to
increase or decrease the display distance 117 between the markers 115 until the
distance between the markers 115 matches a set length of the ruler 118, such as 5
cm. The system 100 calculates the screen size scaling factor according to the
distance between the markers 115 and thereafter scales the screen size according to
the screen size scaling factor to ensure that the actual distance between the stimuli
and the fixation point is uniform irrespective of screen size and/or screen resolution
variability of different types of digital displays 103. At step 113, the controller 109
may calculate the distance between the markers 115 as pixels per unit length to take
into account screen resolution.
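The pixels-per-unit-length calculation in step 113 can be sketched as follows, assuming the 5 cm reference length mentioned above; the optional target-density parameter is an illustrative addition, not from the specification:

```python
def screen_scale_factor(marker_distance_px, reference_length_cm=5.0,
                        target_px_per_cm=None):
    """Compute pixels-per-centimetre from the user-adjusted calibration
    markers and, optionally, a scaling factor relative to a target density.
    """
    px_per_cm = marker_distance_px / reference_length_cm
    if target_px_per_cm is None:
        return px_per_cm
    return px_per_cm / target_px_per_cm  # >1: display denser than the target

def scale_offset_cm_to_px(offset_cm, px_per_cm):
    """Convert a desired physical stimulus offset into device pixels."""
    return offset_cm * px_per_cm
```

Computing this separately for the X and Y axes would account for differences in horizontal and vertical screen resolution, as noted for Figure 3c.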
[0033] Figure 3c shows the field of vision testing user interface 119 having the fixation
point 120 and stimuli 121 which are scaled from a small scale 122 to a larger scale
123 according to the calculated screen size scaling factor. It should be noted that the
scaling factor may be calibrated both across the X axis and the Y axis of the digital
display 103 to account for differences in X and Y screen resolution accordingly. With
reference to Figure 1, the controllers 105 may further comprise a screen brightness calibration controller 124 configured to calibrate the brightness of the digital display
103 to account for any screen brightness variability between different types of
personal computing devices. The screen brightness calibration controller 124 may
display instructions to increase or decrease the brightness of the screen or may
alternatively automatically adjust the brightness of the screen.
[0034] Figure 4 shows processing 125 by the screen brightness calibration controller
124 resulting in calibrated screen brightness detection 126, excessive brightness
detection 127, and inadequate brightness detection 128. Specifically, the screen
brightness calibration controller 124 is configured to display a less bright stimulus
129A at a brightness below a threshold and a brighter stimulus 129B at a brightness
above the threshold. The stimuli 129 may take the form of greyscale dots displayed
adjacently on the screen. The user is instructed to respond to the display of the stimuli
129A and 129B at step 130 depending on their visibility. For example, the user would
respond to both stimuli 129A and 129B if both were visible, only to the brighter
stimulus 129B if only the brighter stimulus 129B was visible, and not respond if neither
stimulus 129 were visible.
[0035] Figure 4 shows detection of a high-frequency response 131A to the less bright
stimulus 129A (e.g., greater than 50%) and a low-frequency response 131B (i.e., less
than 50%) to the brighter stimulus 129B. Calibrated screen brightness 126 is detected
when both responses 131A and 131B are equal as shown in Figure 5. Excessive
screen brightness 127 is detected when a high-frequency response 131A to the low
brightness stimulus 129A is detected, and the user may be instructed to lower the
brightness of the screen. Inadequate screen brightness (too dark) 128 is detected
when a low-frequency response 131B is detected for the brighter stimulus 129B.
Similarly, the user may be instructed to increase the brightness of the screen.
[0036] With reference to Figure 1, the controllers 105 may further comprise a viewing
distance controller 133 configured for viewing distance calibration by analysing image
data obtained from the camera to determine a reference calibrated facial metric when
the digital display 103 is positioned a set distance from a user's face. The user may
be instructed to position the digital display 103 appropriately using an object of set length, such as a 30 cm ruler. Then, during use, the viewing distance controller 133 may be configured for real-time viewing distance monitoring by continually monitoring image data obtained from the camera to determine a real-time facial metric and comparing the calibrated reference and real-time facial metrics to determine a real-time viewing distance. Viewing distance controller 133 may be configured to warn the user if the display 103 is too close or too far away. The viewing distance controller
133 may be configured to calculate the calibrated facial metric measured as a pixel
width, which may represent facial width, distance between the eyes, nose width, and
jawline width. Furthermore, the viewing distance controller 133 may be configured for
employing at least one of boundary analysis, contrast differential analysis, and colour
differential analysis to determine the calibrated facial metric.
[0037] Figure 10 shows processing 135 by the viewing distance controller 133 and
Figures 11A - D shows exemplary image processing by the viewing distance controller
133, which may be individualised to each user's facial features to allow for small or
large faces to be used by the viewing distance controller 133. At step 137, the viewing
distance controller 133 is configured to capture an image 143 of the user's face as
shown in Figure 11A and, at step 138, perform computer vision analysis 108 thereon.
The computer vision analysis 108 may comprise detecting a facial width 144 as shown
in Figure 11B. The facial width 144 may be calculated in pixels. The facial width 144
may comprise detection of the width of a visually detected facial boundary 145. The
boundary 145 may be determined using boundary detection analysis, including
intensity and colour differential analysis.
[0038] Intensity differential analysis detects variations in pixel intensity to detect
edges. For instance, in the facial image, the boundary 145 between the face and the
background often exhibits significant changes in brightness. By analysing these
intensity gradients, the viewing distance controller 133 can effectively outline the
facial boundary 145. Colour differential analysis assesses differences in colour
between adjacent pixels. In a facial image, the skin tone may contrast
with the background or other facial features, such as the eyes or hair. By evaluating these colour variations, the viewing distance controller 133 can accurately identify the edges of the face.
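Intensity differential analysis on a single image row might be sketched as below: the facial width is taken as the span between the outermost strong brightness transitions. The gradient threshold and function name are illustrative assumptions:

```python
def facial_width_px(row, gradient_threshold=40):
    """Estimate facial width on one greyscale image row (list of 0-255 values)
    by finding the left-most and right-most strong intensity transitions,
    i.e. the face/background edges. Returns 0 if no boundary pair is found.
    """
    edges = [i for i in range(1, len(row))
             if abs(row[i] - row[i - 1]) >= gradient_threshold]
    if len(edges) < 2:
        return 0
    return edges[-1] - edges[0]  # span between outermost edges, in pixels
```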
[0039] The computer vision analysis 108 may employ edge detection algorithms, such
as the Canny edge detector. This method uses a multi-stage process, including
gradient calculation, non-maximum suppression, and edge tracking by hysteresis, to
produce a precise edge map of the image. Additionally, boundary detection by the
viewing distance controller 133 can be enhanced with machine learning approaches.
Convolutional neural networks (CNNs) can be trained to recognise and delineate
boundaries by learning from annotated datasets.
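A simplified, illustrative sketch of the thresholding and edge-tracking-by-hysteresis stages of such an edge detector follows; Gaussian smoothing and full non-maximum suppression are omitted for brevity, and the threshold values are assumptions:

```python
import numpy as np

# Simplified sketch of the Canny pipeline's double-threshold and hysteresis
# stages (Gaussian smoothing and non-maximum suppression omitted).
def hysteresis_edges(image, low, high):
    gy, gx = np.gradient(image.astype(float))
    mag = np.hypot(gx, gy)                      # gradient magnitude
    strong = mag >= high
    weak = (mag >= low) & ~strong
    edges = strong.copy()
    changed = True
    while changed:
        # Grow current edges by one pixel in the four axis directions,
        # promoting weak pixels that touch a strong edge pixel.
        grown = edges.copy()
        grown[1:, :] |= edges[:-1, :]
        grown[:-1, :] |= edges[1:, :]
        grown[:, 1:] |= edges[:, :-1]
        grown[:, :-1] |= edges[:, 1:]
        new_edges = edges | (weak & grown)
        changed = bool((new_edges != edges).any())
        edges = new_edges
    return edges

img = np.zeros((8, 8)); img[:, 4:] = 255.0      # vertical step edge
e = hysteresis_edges(img, low=20.0, high=100.0)
print(bool(e[:, 3:5].all()))                    # True: edge found at the step
```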
[0040] In alternative embodiments, the computer vision analysis 108 is configured to
detect the width between the outer edges of both eyes. The computer vision analysis
108 may detect the width between the eyes in a facial image using Haar Cascades,
which are pre-trained classifiers that detect specific facial features based on edge or
line detection. This approach involves using a series of stages, each containing a set
of Haar-like features, to identify the eyes within the image. Once the eyes are
detected, the distance between their centres can be measured in pixels by the viewing
distance controller 133.
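Purely as an illustration of the measurement step, the inter-eye distance can be computed from the centres of two detected bounding boxes; the boxes below are hypothetical Haar-cascade outputs (x, y, w, h), not values from the specification:

```python
import math

# Sketch of the measurement step after eye detection. The bounding boxes are
# assumed detections of the kind a pre-trained Haar cascade would return.
def inter_eye_distance(eye_boxes):
    """Distance in pixels between the centres of two detected eye boxes."""
    (x1, y1, w1, h1), (x2, y2, w2, h2) = eye_boxes
    c1 = (x1 + w1 / 2, y1 + h1 / 2)
    c2 = (x2 + w2 / 2, y2 + h2 / 2)
    return math.dist(c1, c2)

boxes = [(100, 200, 40, 40), (220, 200, 40, 40)]   # hypothetical detections
print(inter_eye_distance(boxes))                   # 120.0
```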
[0041] The computer vision analysis 108 may alternatively use convolutional neural
networks (CNNs) specifically trained for facial landmark detection to accurately locate
key points on the face, such as the corners of the eyes. For example, the facial
landmark detection model might identify the coordinates of the inner and outer
corners of each eye, allowing for precise calculation of the inter-eye distance. This
approach benefits from the robustness and accuracy of deep learning models,
especially when trained on large datasets of annotated facial images.
[0042] The computer vision analysis 108 may alternatively use a Histogram of
Oriented Gradients (HOG) feature descriptor for eye detection which focuses on the
structure of local gradients, which are indicative of edges and textures. By extracting
HOG features from a facial image and applying a sliding window technique, the
viewing distance controller 133 can detect the presence and location of eyes. Once
identified, the width between the detected eye regions can be computed.
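As an illustrative sketch only, the local-gradient summarisation at the heart of a HOG descriptor can be shown for a single cell; a complete detector would additionally tile many cells, block-normalise them, and slide a window over the image:

```python
import numpy as np

# Minimal sketch of the HOG idea: one orientation histogram for one cell,
# weighted by gradient magnitude. Bin count and cell size are assumptions.
def cell_histogram(cell, bins=9):
    gy, gx = np.gradient(cell.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180      # unsigned orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0, 180), weights=mag)
    return hist

cell = np.tile(np.arange(8.0), (8, 1))              # pure horizontal ramp
h = cell_histogram(cell)
print(int(np.argmax(h)))                            # 0: all gradients near 0 degrees
```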
[0043] The viewing distance controller 133 may be calibrated at step 139 wherein the
user is instructed to hold the digital display 103 a set distance from the face, such as
30 cm. During such calibration, the viewing distance controller 133 calculates a
reference width 144 representing a calibrated distance, whereafter the real-time
viewing distance is calculated at step 140 according to the real-time width
144 and the calibrated reference width 144. At step 141, the viewing distance
controller 133 is configured to determine if the actual viewing distance exceeds the
reference distance by a threshold, such as 10%, and, if so, at step 142, the viewing
distance controller 133 may be configured to display a warning to the user to hold the
digital display 103 closer or further away as the case may be. Figure 11C shows
wherein the user is too far away from the digital display 103 and therefore the visually
detected facial boundary 145 is smaller than the calibrated reference width 144
whereas Figure 11D shows wherein the user is too close to the digital display 103
such that the visually detected facial boundary is wider than the calibrated width 144.
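The calibration arithmetic of this paragraph can be illustrated as follows; the inverse proportionality between apparent width and distance is a standard pinhole-camera assumption implied rather than stated in the text, and the 30 cm and 10% figures are the example values given above:

```python
# Illustrative sketch of the real-time distance check in [0043].
CALIBRATED_CM = 30.0

def viewing_distance_cm(width_px, reference_width_px):
    # Apparent facial width scales inversely with distance (pinhole model).
    return CALIBRATED_CM * reference_width_px / width_px

def distance_warning(width_px, reference_width_px, threshold=0.10):
    d = viewing_distance_cm(width_px, reference_width_px)
    if d > CALIBRATED_CM * (1 + threshold):
        return "hold the display closer"
    if d < CALIBRATED_CM * (1 - threshold):
        return "hold the display further away"
    return None

print(viewing_distance_cm(150, 300))   # 60.0 cm: face appears half as wide
print(distance_warning(150, 300))      # hold the display closer
print(distance_warning(300, 300))      # None: at the calibrated distance
```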
[0044] In embodiments, the stimulus testing controller is configured to translate
eccentricities and angular locations of the stimuli 121 to equivalent Cartesian X-Y
values on the digital display 103. Furthermore, the stimulus testing controller may be
configured to apply a scaling factor to the display of stimuli 121 to account for a visible
tangent effect of the display of the stimuli on the digital display. In other words, the
stimuli 121 may be displayed larger and with elliptical distortion further away from the
fixation point 120.
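One possible, illustrative translation from eccentricity and angular location to Cartesian X-Y under this tangent effect is sketched below; the viewing distance and pixels-per-centimetre values are hypothetical:

```python
import math

# Sketch of mapping a stimulus's visual-field coordinates (eccentricity,
# angular location) to screen X-Y for a flat display at distance d. The
# tan() term is the tangent effect: peripheral stimuli land proportionally
# further from the fixation point 120.
def stimulus_xy(ecc_deg, theta_deg, d_cm=30.0, px_per_cm=40.0):
    r_cm = d_cm * math.tan(math.radians(ecc_deg))   # radial offset on screen
    x = r_cm * math.cos(math.radians(theta_deg)) * px_per_cm
    y = r_cm * math.sin(math.radians(theta_deg)) * px_per_cm
    return x, y   # pixels relative to the fixation point

x, y = stimulus_xy(10, 0)
print(round(x, 1), round(y, 1))   # 211.6 0.0
```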
[0045] With reference to Figure 1, the controllers 105 may further comprise a gaze
stability detection controller 146 configured to detect gaze stability by analysing eye
image data obtained by the camera 107. Figure 6 shows processing 134 by the gaze
stability detection controller 146 and Figure 7 illustrates exemplary operation thereof.
At step 148, the gaze stability detection controller 146 performs gaze detection on
eye image data obtained from the eye region 152 of the user captured at step 149.
At step 150, the gaze stability detection controller 146 detects gaze stability and, if
detecting gaze instability, displays a warning message of unstable gaze to the user
at step 151.
[0046] Figure 7 shows that the gaze stability detection controller 146 may be
configured to detect image properties 153 across the eye region 152. These image
properties 153 typically represent changes in image brightness, with large brightness
changes detected at the edge of the iris and/or pupil. Alternatively, the image
properties 153 may be colour properties. As shown, the image properties 153 may
comprise image properties 153A detected across an X axis of the eye region 152 and
image properties 153B across a Y axis of the eye region 152.
[0047] Figure 7 illustrates a horizontal gaze change wherein the image properties
153A change across the X axis. The gaze stability detection controller 146 may further
detect gaze changes vertically using the image properties 153B across the Y axis of
the eye region 152.
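An illustrative sketch of how such brightness properties across the X axis could flag a horizontal gaze change follows; the shift threshold and pixel values are assumptions:

```python
import numpy as np

# Sketch of image properties 153A: the column-wise brightness profile dips
# where the dark iris/pupil sits, so a shift of the dip between frames
# suggests the gaze moved horizontally.
def darkest_column(eye_region):
    profile = eye_region.mean(axis=0)    # brightness across the X axis
    return int(np.argmin(profile))

def gaze_unstable(frame_a, frame_b, max_shift_px=3):
    return abs(darkest_column(frame_a) - darkest_column(frame_b)) > max_shift_px

eye = np.full((10, 30), 200.0)
eye[:, 10:14] = 30.0                      # dark iris around columns 10-13
moved = np.full((10, 30), 200.0)
moved[:, 20:24] = 30.0                    # iris shifted right
print(gaze_unstable(eye, moved))          # True
print(gaze_unstable(eye, eye))            # False
```

Applying the same comparison to the row-wise profile (image properties 153B) would cover vertical gaze changes.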
[0048] With reference to Figure 1, the controllers 105 may further comprise an eye
occlusion detection controller 154 configured to detect eye occlusion by analysing
facial image data 143 obtained from the camera 107. The eye occlusion detection
controller 154 may be used by the system 100 to ensure that the non-tested eye is
closed or occluded. Figure 8 shows processing 156 by the eye occlusion detection
controller 154 and Figure 9 shows exemplary operation thereof. With reference to
Figure 9, the facial image data 143 may have an occluded eye region 152A (which is
occluded with a patch in this embodiment) and a non-occluded eye region 152B. At
step 157, the eye occlusion detection controller 154 captures the eye regions 152
from the facial image data 143 captured by the camera 107. Image processing is
performed thereon at step 158 wherein, at step 159, the eye occlusion detection
controller 154 detects an occluded eye. Detection of the occluded eye may comprise
detection of the colour of the eyepatch or alternatively absence of eye feature
recognition, or alternatively high homogeneity of pixels within the detection area due
to absence of eye feature. In further embodiments, detection of an occluded eye may
comprise the eye recognition only recognising one eye within the facial image data
143. At step 159, the eye occlusion detection controller 154 detects if the eye being
tested is occluded and, if so, at step 160, displays a warning message to the user.
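The pixel-homogeneity cue can be sketched as follows; the variance threshold is an assumed illustrative value, not one given in the specification:

```python
import numpy as np

# Sketch of the homogeneity cue from [0048]: an eyepatch yields near-uniform
# pixels in the eye region, whereas an open eye has high contrast between
# sclera, iris and lashes.
def eye_occluded(eye_region, var_threshold=100.0):
    return float(np.var(eye_region)) < var_threshold

patched = np.full((20, 40), 60.0)             # uniform dark eyepatch
open_eye = np.full((20, 40), 220.0)
open_eye[5:15, 15:25] = 40.0                  # dark iris on bright sclera
print(eye_occluded(patched))                  # True
print(eye_occluded(open_eye))                 # False
```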
[0049] In embodiments, the controllers may comprise a screen geometry translation
controller which, with reference to Figure 12, maps a stimulus 121 display from a flat
digital screen to a curved digital screen, specifically focusing on mapping point A on
the flat screen to point B on the curved screen. The setup involves a viewer positioned
at point C, with the viewing distance denoted as d, which is the distance from point C
to the flat screen. The curved screen is represented as an arc with a radius R,
originating from the centre of curvature. Two primary planes are shown: the plane of
the flat digital screen where point A is located and the plane of the curved digital
screen where point B is located. The mapping involves calculating the new
coordinates on the curved screen based on the viewing geometry. The line of sight
from point C intersects the flat screen at point A and the curved screen at point B.
The equation y = -(d/L1)x + d represents the line of sight and x^2 + (y - R)^2 = R^2
describes the curved screen. Solving these equations simultaneously gives the
coordinates of point B on the curved screen.
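The simultaneous solution reduces to a quadratic in x. An illustrative sketch follows; the sample values of d, L1 and R are hypothetical, and taking the root nearer the flat-screen intersection is an assumption about the geometry:

```python
import math

# Sketch of the flat-to-curved mapping in [0049]: intersect the line of
# sight y = -(d/L1)x + d with the curved-screen circle x^2 + (y-R)^2 = R^2.
def curved_screen_point(d, L1, R):
    m, c = -d / L1, d                       # line of sight: y = m*x + c
    # Substituting into the circle gives a*x^2 + b*x + k = 0.
    a = 1 + m * m
    b = 2 * m * (c - R)
    k = (c - R) ** 2 - R * R
    disc = b * b - 4 * a * k
    x = (-b + math.sqrt(disc)) / (2 * a)    # root nearer point A (assumed)
    return x, m * x + c                     # point B on the curved screen

x, y = curved_screen_point(d=30.0, L1=10.0, R=100.0)
print(round(x, 2), round(y, 2))             # 9.84 0.49
```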
[0050] In embodiments, the controllers may comprise a bright spot detection controller
configured to implement the process 170 shown in Figure 13, which begins with the
front-facing camera scanning an image of a background scene at step 171. The
system 100 then checks if any pixel's brightness exceeds a predetermined threshold
level at step 172. If such a pixel is detected, the system 100 identifies the beginning
of a cluster of bright pixels at step 173. This cluster is then expanded at step 174 by
including adjacent pixels that also exceed the brightness threshold until the entire
cluster is identified at step 175. The system 100 then assesses whether the cluster
exceeds a specified size at decision 176. If the cluster is larger than the specified
size, it is deemed a bright spot at step 177, and a warning message is displayed to
the user. If the cluster does not exceed the specified size, the system continues
scanning at step 178 for other potential bright spots.
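Process 170 can be sketched as a threshold-and-flood-fill pass over the image; the brightness threshold and cluster size below are assumed illustrative values:

```python
from collections import deque

# Sketch of process 170: threshold pixels (step 172), grow a cluster of
# adjacent bright pixels (steps 173-175), and flag it as a bright spot if it
# exceeds a size limit (steps 176-177).
def bright_spot(image, brightness_threshold=240, min_cluster_size=4):
    rows, cols = len(image), len(image[0])
    seen = set()
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or image[r][c] < brightness_threshold:
                continue
            cluster, queue = set(), deque([(r, c)])   # seed of a cluster
            while queue:
                y, x = queue.popleft()
                if (y, x) in cluster or not (0 <= y < rows and 0 <= x < cols):
                    continue
                if image[y][x] < brightness_threshold:
                    continue
                cluster.add((y, x))
                queue.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
            seen |= cluster
            if len(cluster) >= min_cluster_size:
                return True                 # step 177: warn the user
    return False                            # step 178: keep scanning

scene = [[10] * 6 for _ in range(6)]
for yy in range(2, 5):
    for xx in range(2, 5):
        scene[yy][xx] = 255                 # a 3x3 glare patch
print(bright_spot(scene))                   # True
```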
[0051] The foregoing description, for purposes of explanation, used specific
nomenclature to provide a thorough understanding of the invention. However, it will
be apparent to one skilled in the art that specific details are not required in order to
practise the invention. Thus, the foregoing descriptions of specific embodiments of the invention are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the invention to the precise forms disclosed as obviously many modifications and variations are possible in view of the above details.
The embodiments were chosen and described in order to best explain the principles
of the invention and its practical applications, thereby enabling others skilled in the
art to best utilize the invention and various embodiments with various modifications
as are suited to the particular use contemplated. It is intended that the following
claims and their equivalents define the scope of the invention.
Claims (30)
- 1. A field of vision testing system comprising a digital display and configured for recording responses to stimuli displayed on the digital display, wherein the system is configured for screen size scaling comprising: displaying calibration markers and offset adjustment controls configured to adjust a distance therebetween; calculating a screen size scaling factor according to the distance; and scaling a screen size according to the screen size scaling factor.
- 2. The system as claimed in claim 1, wherein the system is configured to calculate screen pixels per unit length according to the distance.
- 3. The system as claimed in claim 2, wherein the system is configured for calculating a screen size scaling factor for both X and Y axes of the digital display and scaling the screen size along the X and Y axes according to the screen size scaling factors respectively.
- 4. The system as claimed in claim 1, wherein the system is configured for screen brightness calibration comprising: displaying a first stimulus at a brightness below a visual threshold and a second stimulus at a brightness above the threshold; and calculating a screen brightness according to responses to the stimuli.
- 5. The system as claimed in claim 4, wherein the screen brightness is classified as calibrated, inadequate and excessive.
- 6. The system as claimed in claim 4, wherein excessive screen brightness is detected by frequency of responses to the first stimulus exceeding a threshold.
- 7. The system as claimed in claim 4, wherein inadequate screen brightness is detected by frequency of responses to the second stimulus under a threshold.
- 8. The system as claimed in claim 4, wherein calibrated screen brightness is detected by approximately equal responses to both stimuli.
- 9. The system as claimed in claim 1, wherein the system further comprises a camera and wherein the system is configured for: viewing distance calibration comprising: positioning the digital display a set distance from a user's face; analysing image data obtained from the camera to determine a calibrated facial metric; real-time viewing distance monitoring comprising: continually monitoring image data obtained from the camera to determine a real-time facial metric; comparing the calibrated and real-time facial metrics to determine a real-time viewing distance.
- 10. The system as claimed in claim 9, wherein the calibrated facial metric is measured in pixel width.
- 11. The system as claimed in claim 9, wherein the calibrated facial metric comprises at least one of facial width, distance between the eyes, nose width and jawline width.
- 12. The system as claimed in claim 11, wherein the system is configured for employing at least one of boundary analysis, contrast differential analysis and colour differential analysis to determine the calibrated facial metric.
- 13. The system as claimed in claim 1, wherein the stimulus testing controller is configured to adjust a fixation point display location on the digital display.
- 14. The system as claimed in claim 13, wherein the stimulus testing controller is configured to adjust the fixation point display location depending on a size of the digital display.
- 15. The system as claimed in claim 14, wherein the stimulus testing controller is configured to position the fixation point display location at peripheral or corner regions of the digital display.
- 16. The system as claimed in claim 1, wherein the system is configured for displaying a fixation point and scaling geometry of the stimuli proportionately according to their display distance from the fixation point.
- 17. The system as claimed in claim 16, wherein the geometry scaling of the stimuli is elliptical distortion.
- 18. The system as claimed in claim 17, wherein stimuli are displayed with greaterelliptical distortion further away from the fixation point.
- 19. The system as claimed in claim 1, wherein the system further comprises a camera and wherein the system is further configured to detect gaze stability by analysing image data of a user's eye obtained from the camera.
- 20. The system as claimed in claim 19, wherein the system is configured to detect image properties across at least one of an X and Y axis of an eye region.
- 21. The system as claimed in claim 20, wherein the image properties are at least one of brightness levels, contrast levels and colour values.
- 22. The system as claimed in claim 20, wherein the system is configured to detect changes in the image properties across the X axis of an eye region to detect gaze instability.
- 23. The system as claimed in claim 22, wherein the system is further configured to detect changes in the image properties across the Y axis of an eye region to detect gaze instability.
- 24. The system as claimed in claim 1, wherein the system further comprises a camera and wherein the system is further configured to detect eye occlusion by analysing facial image data obtained from the camera.
- 25. The system as claimed in claim 24, wherein the system employs eye image recognition and recognition of only one eye.
- 26. The system as claimed in claim 1, wherein the system is configured to translate a stimulus position from flatscreen geometry to curved screen geometry.
- 27. The system as claimed in claim 26, wherein the system is configured to calculate line of sight intersections of the flatscreen geometry and the curved screen geometry.
- 28. The system as claimed in claim 1, wherein the system further comprises a front facing camera and wherein the system is further configured to detect bright spots within image data obtained from the camera.
- 29. The system as claimed in claim 28, wherein the system is configured to: scan an image of the background scene; determine if any pixel's brightness exceeds a predetermined threshold; if a bright pixel is detected, identify a start of a cluster of bright pixels; expand the cluster by including adjacent pixels exceeding the brightness threshold until the entire cluster is identified; and determine if the cluster exceeds a specified size.
- 30. The system as claimed in claim 29, wherein the system is further configured to display a warning message if the cluster exceeds the specified size.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU2023902320 | 2023-07-20 | ||
| AU2023902320A AU2023902320A0 (en) | 2023-07-20 | A flatscreen device field of vision testing system | |
| PCT/AU2024/050758 WO2025015374A1 (en) | 2023-07-20 | 2024-07-16 | A digital display device field of vision testing system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| AU2024292037A1 AU2024292037A1 (en) | 2025-04-24 |
| AU2024292037B2 true AU2024292037B2 (en) | 2025-06-26 |
Family
ID=94281035
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| AU2024292037A Active AU2024292037B2 (en) | 2023-07-20 | 2024-07-16 | A digital display device field of vision testing system |
Country Status (2)
| Country | Link |
|---|---|
| AU (1) | AU2024292037B2 (en) |
| WO (1) | WO2025015374A1 (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130155376A1 (en) * | 2011-12-20 | 2013-06-20 | Icheck Health Connection, Inc. | Video game to monitor visual field loss in glaucoma |
| US20230165460A1 (en) * | 2021-11-30 | 2023-06-01 | Heru Inc. | Visual field map expansion |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB0709405D0 (en) * | 2007-05-16 | 2007-06-27 | Univ Edinburgh | Testing vision |
| US8132916B2 (en) * | 2008-12-12 | 2012-03-13 | Carl Zeiss Meditec, Inc. | High precision contrast ratio display for visual stimulus |
| US20210228074A1 (en) * | 2016-03-01 | 2021-07-29 | Nova Southeastern University | Perimetry testing using multimedia |
-
2024
- 2024-07-16 WO PCT/AU2024/050758 patent/WO2025015374A1/en active Pending
- 2024-07-16 AU AU2024292037A patent/AU2024292037B2/en active Active
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130155376A1 (en) * | 2011-12-20 | 2013-06-20 | Icheck Health Connection, Inc. | Video game to monitor visual field loss in glaucoma |
| US20230165460A1 (en) * | 2021-11-30 | 2023-06-01 | Heru Inc. | Visual field map expansion |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025015374A1 (en) | 2025-01-23 |
| AU2024292037A1 (en) | 2025-04-24 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20250009222A1 (en) | Head-mounted device for presenting image content and generating a dry-eye-related indication via infrared sensor data | |
| JP4895847B2 (en) | Eyelid detection device and program | |
| US10878237B2 (en) | Systems and methods for performing eye gaze tracking | |
| CN108427503B (en) | Human eye tracking method and human eye tracking device | |
| CN112384127B (en) | Eyelid ptosis detection method and system | |
| KR102863293B1 (en) | Visual defect determination and improvement | |
| US11659987B2 (en) | Vision testing via prediction-based setting of initial stimuli characteristics for user interface locations | |
| CN111511318A (en) | digital therapy corrective glasses | |
| JP4655235B2 (en) | Information processing apparatus and method, and program | |
| Lupascu et al. | Automated detection of optic disc location in retinal images | |
| Chin et al. | Automatic fovea location in retinal images using anatomical priors and vessel density | |
| CN110211021B (en) | Image processing apparatus, image processing method, and storage medium | |
| CN110428421A (en) | Method and device for macular image region segmentation | |
| EP4236756A1 (en) | Vision testing via prediction-based setting of initial stimuli characteristics for user interface locations | |
| KR20190083155A (en) | Apparatus and method for detecting state of vehicle driver | |
| JPWO2018078857A1 (en) | Gaze estimation apparatus, gaze estimation method, and program recording medium | |
| CN116999017B (en) | Auxiliary eye care intelligent control system based on data analysis | |
| US11966511B2 (en) | Method, system and computer program product for mapping a visual field | |
| AU2024292037B2 (en) | A digital display device field of vision testing system | |
| CN115756173A (en) | Eye tracking method, system, storage medium and computing device | |
| US12400480B2 (en) | Iris detection method, iris detection apparatus, and non-transitory computer-readable recording medium recording iris detection program | |
| CN111588345A (en) | Eye disease detection method, AR glasses and readable storage medium | |
| US20230404394A1 (en) | Device And Method For Determining Glaucoma | |
| CN114093018B (en) | Vision screening equipment and system based on pupil positioning | |
| TWI673034B (en) | Methods and system for detecting blepharoptosis |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| FGA | Letters patent sealed or granted (standard patent) |