WO1999025239A1 - Automated photorefractive screening - Google Patents
Automated photorefractive screening
- Publication number
- WO1999025239A1 WO1999025239A1 PCT/US1998/024275 US9824275W WO9925239A1 WO 1999025239 A1 WO1999025239 A1 WO 1999025239A1 US 9824275 W US9824275 W US 9824275W WO 9925239 A1 WO9925239 A1 WO 9925239A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- eye
- reflexes
- eyes
- model
- patient
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0025—Operational features thereof characterised by electronic signal processing, e.g. eye models
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/103—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes
Definitions
- This invention relates to instruments for measuring characteristics of eyes, and more particularly to a system and method for locating and modeling eyes in imagery for automated photorefractive screening, and for enabling determination of the presence of anomalies in the patient's visual anatomy.
- a well-known eye examination consists of shining a light into the interior of the eye and visually inspecting the uniformity of the reflected light, normally visible as a red color filling the pupil. Any deviation from uniformity, either in one eye or between the pair of eyes, indicates a potential problem in the patient's visual anatomy. Such light refractive screening is thus a useful tool in patient eye care.
- Ocular disorders such as strabismus, various forms of refractive errors (myopia, hyperopia, and astigmatism) and opacities of the ocular media are the leading causes of amblyopia, or vision loss. These combine to cause amblyopia in approximately 5% of the population. However, some form of visual problem, not necessarily leading to amblyopia, may be present in over 20% of children.
- The present invention is directed to a system and method for locating and modeling eyes in imagery for automated photorefractive screening, and for enabling determination of the presence of anomalies in the patient's visual anatomy.
- Pattern Recognition 19(1), 1986, pp. 77-84
- Eye tracking may be accomplished via a constrained updating of an eye model (for instance, see X. Xie, R. Sudhakar and H. Zhuang, "On improving eye feature extraction using deformable templates", Pattern Recognition 27(6), 1994, pp. 791-799; A. Yuille and P. Hallinan, "Deformable Templates", Chapter 2 in Active Vision, MIT Press, 1992, pp. 21-38).
- Face recognition: for instance, see International Conference on Automatic Face and Gesture Recognition, proceedings of, edited by M.
- A system and method for locating and modeling eyes in imagery for automated photorefractive screening includes a digital camera having a lens-mounted flash for obtaining a digital image of the face of an individual, and a suitably programmed processor, such as a general purpose digital computer, for locating an eye of the individual in the digital image, modeling structures in the eye, analyzing the digitized eyes of the individual for eye disease, and providing a recommendation for treatment.
- The invention includes a system and method for locating a patient's eyes in a digital image that includes each eye as illuminated by a near-axis flash, including automatically finding light reflexes in the digital images as indicative of the location of each eye.
- Automatically finding light reflexes includes analyzing such light reflexes to determine possible pupil and sclera borders.
- the invention further includes automatically fitting a corresponding model to such possible pupil and sclera borders, analyzing the model of each eye to determine possible abnormalities in each eye; and outputting a possible diagnosis for each eye based on such analyzing.
- The invention includes a system and method for locating and modeling a patient's eyes in a digital image that includes each eye as illuminated by a near-axis flash, including finding and indicating bright spots in the digital images as possible corneal reflexes of each eye; finding and indicating red-black and black-white gradients, each comprising a set of gradient points, around such bright spots as possible pupil and sclera borders, respectively, of each eye; fitting a plurality of circles to subsets of such gradient points as possible models for each eye, each eye model having an associated strength; and sorting the eye models for each eye by strengths, and indicating the strongest corresponding eye model as best representing each eye.
- Another aspect of the invention includes measuring red reflexes and corneal reflexes from the indicated eye models as an indicator of anomalies in the patient's eyes.
- Yet another aspect of the invention includes generating a digital image of each of a patient's eyes with a camera having a flash positioned near to a center axis of a lens of the camera so as to generate images with bright, sharp light reflexes.
- The target audience for the invention is previously unreached large populations, such as school children. In order to keep the imaging process simple and cheap, no special apparatus is used.
- The eyes may appear anywhere in the image of the patient's face (simplifying use of the invention by eyecare practitioners), the eyes of the patient need not fill the camera image frame, the camera may be at a distance from the patient that is not intrusive to the patient, and the patient's head need not be tightly constrained.
- FIG. 1a is a block diagram showing the preferred physical components of the invention as used for imaging an eye of a patient.
- FIG. 1b is a perspective view of a lens 14 and flash 12 in accordance with the invention.
- FIG. 2a is a diagram showing the light path from a flash to a pair of normal eyes and back to a camera, as well as the appearance of the corneal reflexes from the camera view.
- FIG. 2b is a diagram showing the light path from a flash to a pair of eyes, one of which is abnormal, and back to a camera, as well as the appearance of the corneal reflexes from the camera view.
- FIG. 2c is a diagram of the retinal reflex 26 for a pair of normal eyes, showing uniform reflectance equal in both eyes and no refractive error.
- FIG. 2d is a diagram of the retinal reflex for a pair of eyes, one of which has a retinal pathology that causes lesser reflectance.
- FIG. 2e is a diagram of the retinal reflex for a pair of eyes, each showing a crescent- shaped reflectance, which indicates the same type of refractive error in both eyes.
- FIG. 2f is a diagram of the retinal reflex for two eyes, showing different crescent reflectances, which indicates differing amounts of refractive error in the eyes.
- FIG. 3a is a close-up photograph of a patient's eye.
- FIG. 3b is a close-up photograph of a patient's eye with a superimposed graphical representation of the eye model used in the invention.
- FIG. 4a is a close-up photograph of a patient's eye containing an abnormal pupil area
- FIG. 4b is a close-up photograph of a patient's eye containing an abnormal pupil area with a superimposed graphical representation of the eye model used in the invention.
- FIG. 5 is a flowchart showing an overview of the steps of the image processing of the preferred embodiment of the invention.
- FIG. 6a is a close-up photograph of a patient's eye without a superimposed model.
- FIG. 6b is a close-up photograph of a patient's eye, showing black-to-white gradient points located for the image in FIG. 6a.
- FIG. 6c is a close-up photograph of a patient's eye, illustrating the result of applying constraints to the black-to-white gradient points shown in Fig. 6b.
- FIG. 6d is a close-up photograph of a patient's eye, showing the highest scoring subset of black-to-white gradient points from the set shown in FIG. 6c.
- FIG. 7 is a flowchart of a preferred test sequence using the invention, once an image is acquired and the eyes are located and modeled.
- FIG. 1a is a block diagram showing the preferred physical components of the invention as used for imaging the eyes of a patient.
- a digital camera 10 is modified to include a "near- axis" flash 12 positioned slightly off the optical axis of the lens 14 of the camera 10.
- the flash 12 may be, for example, a ring flash or one or more small flash units attached near the optical axis of the lens 14 of the camera 10.
- a suitable camera is the Model DC120 Digital Camera from Kodak Corporation, with an attached 6X telephoto lens 14 and a small flash unit 12 mounted to the front of the telephoto lens, slightly off-center.
- The telephoto lens narrows the field of view of the camera so that the patient's eye fills more of the frame, thus increasing the number of pixels available for analysis.
- FIG. 1b is a perspective view of a lens 14 and flash 12 in accordance with the invention.
- the flash 12 in the illustrated embodiment uses visible light.
- A flash that principally emits non-visible light, such as an infrared flash, may be used instead with a suitable camera 10.
- the camera 10 is used to capture and directly download a digital image of the face of a patient 16 to a conventional computer 18 for processing.
- The camera 10 is used to obtain a high quality digital image of the patient's face and eyes, preferably a full-color (e.g., RGB), high resolution (e.g., 8 bits per pixel for each color) image at least about 640x480 pixels in size, and preferably at least about 1024x768 pixels in size.
- the techniques of the invention can be applied to gray-scale images. However, for purposes of this explanation, a color digital image will be assumed.
- the camera 10 may be a conventional film camera, such as an instant photography camera, the images of which are optically scanned into the computer 18.
- processing of the image in accordance with the invention is done within a specially programmed digital camera 10, allowing for an integrated photoscreening system.
- Photoscreening works by photographing two distinct light reflexes from the eye, a corneal reflex and a retinal reflex.
- Light from a flash 12 is bounced off of the air-tear film interface on the cornea of the eye. Since the corneal surface is essentially spherical, the closest point of the cornea to the camera 10 will reflect the light flash back to the camera as a corneal reflex. If the eye faces the camera, the light reflection point is normally centered in the pupil of the eye. If the eye has strabismus or a related defect, the light reflection point is not centered on the pupil.
- Several corneal reflex conditions are shown in FIGS. 2a-2b.
- FIG. 2a is a diagram showing the light path from a flash to a pair of normal eyes and back to a camera, as well as the appearance of the corneal reflexes from the camera view.
- The corneal light reflex 20 is essentially centered within the outline of the iris 22.
- FIG. 2b is a diagram showing the light path from a flash to two eyes, one of which is abnormal, and back to a camera, as well as the appearance of the corneal reflexes from the camera view.
- The corneal light reflex 20 is essentially centered within the outline of the iris 22 for the left eye, but the corneal light reflex 24 for the right eye is off-center. Note that the tolerance of "centered" depends in part on the displacement of the flash 12 from the optical axis of the camera lens 14.
- the retinal light reflex gives information on ocular pathologies and the refractive state of the eye. If one eye has any pathology (e.g., a tumor, blood, cataract, etc.), the retina does not reflect as much light as in a normal eye. The difference in reflectance between an abnormal retina and a normal retina is noticeable. Also, in an eye without refractive error, the retinal light reflex appears as a uniformly bright circle. If the eye has any refractive error, a crescent will appear in the retinal light reflex. The relative size of the crescent to the pupil diameter is generally related to the refractive error.
- Several retinal reflex conditions are shown in FIGS. 2c-2f.
- FIG. 2c is a diagram of the retinal reflex 26 for a pair of normal eyes, showing uniform reflectance equal in both eyes and no refractive error.
- FIG. 2d is a diagram of the retinal reflex for a pair of eyes, one of which has a retinal pathology that causes lesser reflectance 28.
- FIG. 2e is a diagram of the retinal reflex for a pair of eyes, each showing a crescent-shaped reflectance 30, which indicates the same type of refractive error in both eyes.
- FIG. 2f is a diagram of the retinal reflex for a pair of eyes, showing different crescent reflectances 32, 34, which indicates differing amounts of refractive error in the eyes.
- A model of the eyes in an image of a patient's face must be generated.
- the frontal projection of an eye is modeled by a pair of concentric circles:
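- The equations themselves are not reproduced in this extracted text, but the concentric-circle model referred to later as (A, B, C_1) and (A, B, C_2) (Eq. 1-2) can be written out as below; this is a reconstruction from those later references rather than a verbatim copy of the original equations.

```latex
% Reconstructed form of the eye model: both circles share the center (A, B).
% Eq. 1: pupil-iris boundary with radius C_1; Eq. 2: iris-sclera boundary with radius C_2.
\begin{align}
(x - A)^2 + (y - B)^2 &= C_1^2 && \text{(pupil--iris boundary, Eq.~1)} \\
(x - A)^2 + (y - B)^2 &= C_2^2 && \text{(iris--sclera boundary, Eq.~2)}
\end{align}
```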
- FIG. 3a is a close-up photograph of a patient's eye.
- FIG. 3b is a close-up photograph of a patient's eye with a superimposed graphical representation of the eye model used in the invention.
- FIG. 4a is a close-up photograph of a patient's eye containing an abnormal pupil area.
- FIG. 4b is a close-up photograph of a patient's eye containing an abnormal pupil area with a superimposed graphical representation of the eye model used in the invention.
- a set of reflexes is measured for each eye
- The corneal reflex is the reflection of light from the front surface of the eye (the cornea). It typically appears as a bright spot.
- The corneal reflex CR is modeled as a four-connected region (defined below).
- A four-connected region is defined as follows. Formally, let I denote an image, consisting of a two-dimensional raster of pixels, organized on an integer grid of R rows and C columns. Let (r, c) denote a pixel location in image I. A four-connected path P between two pixels (r_1, c_1) and (r_n, c_n) is defined as a sequence of pixels (r_1, c_1), (r_2, c_2), ..., (r_n, c_n) in I such that consecutive pixels are vertical or horizontal neighbors, i.e., |r_i − r_{i+1}| + |c_i − c_{i+1}| = 1 for i = 1, ..., n−1.
- A four-connected region S is defined as a set of pixels in I such that every pair of pixels in S is joined by at least one four-connected path lying entirely within S (Eq. 4).
- Eq.4 describes any set of pixels in I contiguously connected, such that any pixel in the set may be reached from any other pixel in the set by at least one path of pixels also in the set.
- The possible as well as actual corneal reflexes, crescent reflexes, and other abnormal pupil areas, as described in this work, are all modeled as four-connected regions.
- One aspect of the invention is to derive necessary parameters from a digitized image of a patient's face and eyes sufficient to locate each eye and generate a model for each eye as described above.
- the invention advantageously utilizes the characteristics of a particular illuminating flash, which produces light reflexes not seen in normal intensity imagery.
- the preferred embodiment of the invention specifically takes advantage of such light reflexes using special image processing to locate and model the eye so as to enable automated photorefractive screening.
- Each eye is assumed to contain a corneal reflex CR, which appears as a saturated (bright) spot somewhere inside the pupil.
- Each pupil is assumed to contain some amount of red shading, which creates a prominent gradient in the red band at the pupil-iris boundary. (This is referred to below as the "red-black" gradient for convenience; a similar boundary can be ascertained in gray-scale images as distinct differences in gray-scale gradients.)
- Each sclera is assumed to appear as a shade of white, which creates a prominent gradient in full color at the iris-sclera boundary.
- FIG. 5 is a flowchart showing an overview of the steps of the image processing of the preferred embodiment of the invention.
- The steps of the preferred process for locating, modeling, analyzing, and diagnosing a patient's eyes are as follows, each of which is discussed more fully below:
- The camera 10 and near-axis flash 12 are used to capture an image of a patient's face and eyes. If the camera 10 is digital, the image data may be directly downloaded to the computer 18 for processing as a digital image. If the camera 10 is a conventional film camera, such as an instant photography camera, the images are optically scanned into the computer 18 and stored as a digital image. The unprocessed images may be displayed on a computer monitor or printed (monochrome or color), and may be annotated using suitable conventional graphics software.
- Each three-band (i.e., RGB) image input into the computer 18 is preferably thresholded in each band to locate areas that are possible corneal reflexes.
- the images are logically AND'd together to produce a binary image of bright areas.
- Bright pixels are then spatially grouped into four-connected regions using a queue-based paint-fill algorithm.
- Bright regions with areas within a selected size range are labeled as possible or hypothesized corneal reflexes.
- The input image I is thresholded to produce a binary image I_bright indicating which pixels are bright:
- I_red, I_green, and I_blue are the individual band values of the input image
- I_bright = 1 indicates a pixel is bright
- T_B is a selected threshold value.
- A queue-based paint-fill algorithm is then applied to spatially group bright pixels into four-connected regions.
- T_S and T_L are selectable threshold values.
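- The bright-spot detection just described (per-band thresholding, a logical AND, a queue-based paint fill into four-connected regions, and an area filter) can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the function name and the default values standing in for T_B, T_S, and T_L are placeholders.

```python
from collections import deque
import numpy as np

def find_corneal_reflex_candidates(image, t_b=240, t_s=4, t_l=400):
    """Return lists of pixel coordinates for bright four-connected regions
    whose area lies in [t_s, t_l]; these are hypothesized corneal reflexes.

    image: HxWx3 array (R, G, B). t_b, t_s, and t_l are placeholder defaults
    standing in for the selectable thresholds T_B, T_S, and T_L in the text.
    """
    # Threshold each band and AND the results into one binary "bright" image.
    bright = np.all(image >= t_b, axis=2)

    visited = np.zeros_like(bright, dtype=bool)
    regions = []
    rows, cols = bright.shape
    for r0 in range(rows):
        for c0 in range(cols):
            if not bright[r0, c0] or visited[r0, c0]:
                continue
            # Queue-based paint fill: grow one four-connected bright region.
            region, queue = [], deque([(r0, c0)])
            visited[r0, c0] = True
            while queue:
                r, c = queue.popleft()
                region.append((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols \
                            and bright[rr, cc] and not visited[rr, cc]:
                        visited[rr, cc] = True
                        queue.append((rr, cc))
            # Keep only regions within the expected corneal-reflex size range.
            if t_s <= len(region) <= t_l:
                regions.append(region)
    return regions
```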
- For each possible corneal reflex (region) LABEL, a set of pixels is found circularly around the corneal reflex, at distances within the expected range of pupil and iris radii, that exhibit strong gradients. This is accomplished by considering the centroid of the region, together with pixels along an evenly distributed set of compass directions from the centroid, as forming a set of rays. A linear gradient filter is applied to the pixels along a segment of each ray. For each segment, the pixel with the highest black-to-white gradient and the pixel with the highest red-to-black gradient are noted.
- The centroid (x_c, y_c) of the four-connected region with the value LABEL is found as:
- The centroid, together with an equiangular set of compass directions, forms a set of rays:
- T_θ (degrees) is a selectable algorithm parameter that controls the angular resolution of the set.
- The pixels (integer coordinates) along each ray θ are enumerated between:
- T_1 and T_4 are algorithm parameters describing the minimum and maximum expected radius (in pixels) of the pupil and iris, respectively.
- N_θ is the number of points in the list returned by the above pseudocode.
- A 1×7 linear gradient filter, shown in Table 1 above, is convolved with the points along each ray to compute a black-to-white gradient BW_θ:i (Eq. 12) and a red-to-black gradient RB_θ:i (Eq. 13) estimate at each pixel on each ray:
- T_2 and T_3 are algorithm parameters describing the maximum and minimum expected radius (in pixels) of the pupil and iris, respectively.
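- A sketch of the ray-based gradient search is given below. The 1×7 filter taps, the radius limits standing in for T_1-T_4, and the angular step standing in for T_θ are placeholders, since Table 1 and Eq. 9-13 are not reproduced here. The same routine would be run on an intensity band for the black-to-white peaks and, with the filter sign flipped, on the red band for the red-to-black peaks (the sign convention is an assumption).

```python
import numpy as np

# Placeholder 1x7 linear gradient filter; the actual taps come from Table 1,
# which is not reproduced in the extracted text.
GRADIENT_FILTER = np.array([-1.0, -1.0, -1.0, 0.0, 1.0, 1.0, 1.0])

def ray_gradient_points(band, centroid, t_theta=10.0, r_min=5, r_max=60):
    """For each ray cast from the centroid (one ray every t_theta degrees),
    return the pixel with the strongest positive filter response between
    radii r_min and r_max (stand-ins for the T_1..T_4 radius parameters)."""
    yc, xc = centroid
    rows, cols = band.shape
    half = len(GRADIENT_FILTER) // 2
    peaks = []
    for angle in np.arange(0.0, 360.0, t_theta):
        dy, dx = np.sin(np.radians(angle)), np.cos(np.radians(angle))
        radii = np.arange(r_min, r_max + 1)
        ys = np.round(yc + radii * dy).astype(int)
        xs = np.round(xc + radii * dx).astype(int)
        keep = (ys >= 0) & (ys < rows) & (xs >= 0) & (xs < cols)
        ys, xs = ys[keep], xs[keep]
        if len(ys) < len(GRADIENT_FILTER):
            continue  # ray segment too short to filter
        values = band[ys, xs].astype(float)
        # Correlate (not convolve) so the filter is applied as written,
        # responding to intensity that increases outward along the ray.
        response = np.correlate(values, GRADIENT_FILTER, mode="valid")
        k = int(np.argmax(response)) + half
        peaks.append((ys[k], xs[k]))
    return peaks
```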
- Circle equations are fit to the gradient points discovered in the previous step. For each possible corneal reflex, one circle is fit to the black-to-white gradient pixels and one circle is fit to the red-to-black gradient points.
- The preferred embodiment of the invention uses a novel method for eliminating outliers in the circle fit, based upon subarcs. The removal of outliers as arc subsets follows naturally from the problem context. If an eye is imaged in a non-forward orientation, or if an eyelid covers some portion of the iris, or if the retinal reflex fills only part of the pupil, then fitting a circle to the strongest large subarc generally yields the most reliable model.
- a circle equation is fit to eight subsets of each type of gradient point. Each subset covers 270° of arc, starting at 45° increments.
- Three strength measures are computed for each circle fit: residual, average gradient, and arc coverage. These values are normalized and summed as a strength score.
- The circle (out of 8 possible, in the illustrated embodiment) with the overall best strength score is taken as the model for the given hypothesized corneal reflex.
- a, b and r are the model parameters and N is the number of points to be modeled.
- The following procedure is used to derive the pupil-iris boundary (model (A, B, C_1), Eq. 1) from the set of red-to-black gradient points (RAY_θ:r, Eq. 17) and the iris-sclera boundary (model (A, B, C_2), Eq. 2) from the set of black-to-white gradient points (RAY_θ:w, Eq. 16).
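- The subarc fitting step can be sketched as below. The standard algebraic (Kasa) least-squares circle fit is used here as a stand-in for the patent's fitting equations, which are not reproduced, and the best subarc is chosen by residual alone rather than by the full three-part strength score described later.

```python
import numpy as np

def fit_circle(points):
    """Algebraic least-squares circle fit returning (a, b, r).
    The exact fitting equations in the text are not reproduced, so the
    standard Kasa fit is used as a stand-in."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(M, rhs, rcond=None)
    r = np.sqrt(c + a ** 2 + b ** 2)
    return a, b, r

def best_subarc_circle(gradient_points, angles_deg):
    """Fit circles to eight 270-degree subarcs (starting every 45 degrees)
    and return the fit with the smallest mean radial residual.
    gradient_points[i] is the point found on the ray at angles_deg[i]."""
    angles = np.asarray(angles_deg) % 360.0
    pts = np.asarray(gradient_points, dtype=float)
    best = None
    for start in range(0, 360, 45):
        # Select the points whose ray angle falls inside [start, start + 270).
        offset = (angles - start) % 360.0
        subset = pts[offset < 270.0]
        if len(subset) < 3:
            continue  # not enough points to constrain a circle
        a, b, r = fit_circle(subset)
        residual = np.mean(np.abs(np.hypot(subset[:, 0] - a,
                                           subset[:, 1] - b) - r))
        if best is None or residual < best[0]:
            best = (residual, (a, b, r))
    return best  # (residual, (a, b, r)) or None
```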
- FIGS. 6a-6d are close-up photographs of a patient's eye, showing various stages in the removal of outliers.
- FIG. 6a is a close-up photograph of a patient's eye without a superimposed model.
- FIG. 6b shows the black-to-white gradient points 60 located for the image in FIG. 6a. These pixels illustrate a spurious response to an eyelid as well as some unstructured outliers.
- a popular method for fitting in the presence of outliers is the least-median-of-squares method (see, e.g., P.
- The preferred embodiment uses several constraints combined with a novel method to eliminate outliers in O(n) time:
- the residual, average gradient strength, and arc coverage are used to calculate a score.
- The score values all depend upon the number of inliers used for fitting the circle. This number, denoted N_in, is computed as 360/T_θ (the number of pixels in the original set RAY_θ:w or RAY_θ:r) minus the count of pixels discarded or omitted by any of the appropriate steps given in Eq. 27-33.
- The residual is computed as:
  residual = (1 / N_in) · Σ_{(x_i, y_i) ∈ inliers} | √((x_i − a)² + (y_i − b)²) − r |
- The average gradient strength for each SUBARC is computed as:
- the maximum score in each case is 3.
- the weighting factors were derived experimentally.
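- A sketch of combining the three measures into a single strength score (maximum 3) is shown below; the normalization scales and the equal default weights are placeholders, since the experimentally derived weighting factors are not reproduced here.

```python
def strength_score(residual, avg_gradient, arc_coverage,
                   residual_scale=1.0, gradient_scale=255.0,
                   weights=(1.0, 1.0, 1.0)):
    """Combine the three circle-fit measures into one score (maximum 3.0).
    The normalization scales and equal weights are placeholders; the patent's
    weighting factors were derived experimentally and are not reproduced."""
    # Lower residual is better, so it is inverted into [0, 1].
    norm_residual = max(0.0, 1.0 - residual / residual_scale)
    norm_gradient = min(1.0, avg_gradient / gradient_scale)  # gradient is non-negative
    norm_coverage = min(1.0, arc_coverage)                   # fraction of the full circle
    w1, w2, w3 = weights
    return w1 * norm_residual + w2 * norm_gradient + w3 * norm_coverage
```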
- A possible iris-sclera boundary radius C_2 (Eq. 2) is calculated as the mode radius of BW_θ:w, where the mode is determined as the highest count at an integer radius, where the count at each radius is computed as:
- the diminished score reflects the diminished confidence in the model.
- Each hypothesized corneal reflex now has an overall model strength (i.e., likelihood of being an eye).
- The set of strengths is sorted.
- the two strongest models are selected as the correct eye models.
- The correct eye models may be indicated by output from a computer, including by superimposing a graphical representation of each model on images of the corresponding eye.
- Each possible corneal reflex LABEL that resulted in a possible eye model has an associated score (Eq. 43).
- The two possible eye models with the highest total score are taken to be the actual eye models, denoted as: (A_left, B_left, C_1,left, C_2,left) and (A_right, B_right, C_1,right, C_2,right)
- the centers of the eye models must be at least two eye diameters separated from each other:
- The orientation of the patient's face in the image is assumed to be either vertical or horizontal, but in either case level with one of the parallel sets of image boundaries. This may be verified by:
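- The selection of the two strongest, mutually consistent eye models can be sketched as follows; the separation and alignment checks are stand-ins for the verification equations, which are not reproduced here, and the alignment tolerance and the use of iris diameter as the "eye diameter" are assumptions.

```python
def select_eye_pair(models):
    """models: list of dicts with keys 'center' ((x, y)), 'iris_radius', 'score'.
    Return the highest-scoring pair whose centers are at least two eye
    diameters apart and roughly level (horizontally or vertically aligned).
    The alignment tolerance is an assumed placeholder."""
    ranked = sorted(models, key=lambda m: m["score"], reverse=True)
    for i in range(len(ranked)):
        for j in range(i + 1, len(ranked)):
            m1, m2 = ranked[i], ranked[j]
            dx = abs(m1["center"][0] - m2["center"][0])
            dy = abs(m1["center"][1] - m2["center"][1])
            # Two eye diameters, taking the iris diameter as the eye diameter.
            min_sep = 2 * 2 * max(m1["iris_radius"], m2["iris_radius"])
            if dx ** 2 + dy ** 2 < min_sep ** 2:
                continue  # too close together to be a left/right eye pair
            if min(dx, dy) > 0.25 * max(dx, dy):  # assumed alignment tolerance
                continue  # neither roughly horizontal nor roughly vertical
            return m1, m2
    return None
```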
- Test A: In normal eyes, the corneal reflexes will be only slightly off-center (the slight offset is caused by the distance between the camera flash and the center of the camera lens). In this test, for each eye, the offset of the corneal reflex is measured as the ratio of the distance between the corneal reflex centroid and the center of the eye, to the radius of the iris:
- If both corneal reflexes are off-center, then the patient was not looking directly into the camera, and the computer system so indicates. If only one corneal reflex is off-center, the patient has a tropia and the computer system indicates that the patient should be referred to a medical eyecare specialist. Formally, these conditions are tested as:
- T_Pain is a selectable threshold.
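- A sketch of Test A follows; `t_offset` is a stand-in for the selectable threshold above, and its 0.2 default is an assumed placeholder.

```python
import math

def corneal_reflex_offset(reflex_centroid, eye_center, iris_radius):
    """Ratio of the corneal-reflex displacement to the iris radius (Test A)."""
    dx = reflex_centroid[0] - eye_center[0]
    dy = reflex_centroid[1] - eye_center[1]
    return math.hypot(dx, dy) / iris_radius

def test_a(left, right, t_offset=0.2):
    """left/right: (reflex_centroid, eye_center, iris_radius) for each eye.
    t_offset is a placeholder for the selectable threshold in the text."""
    off_l = corneal_reflex_offset(*left) > t_offset
    off_r = corneal_reflex_offset(*right) > t_offset
    if off_l and off_r:
        return "patient not looking at camera; retake image"
    if off_l or off_r:
        return "possible tropia; refer to an eyecare specialist"
    return "corneal reflexes centered"
```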
- Test B: In normal eyes, the retinal reflexes will appear equal and uniform in both eyes. To test for such uniformity, note that for each eye model (left and right) there is a corresponding pair of concentric circles modeled by (A, B, C_1, C_2) (Eq. 1-2), and a corneal reflex modeled by CR (Eq. 4) and its associated centroid modeled by (x_c, y_c) (Eq. 8).
- The retinal reflex RR of each eye model preferably is measured as the average red value of pixels within the iris-pupil boundary, excluding pixels labeled as belonging to the corneal reflex and its two-deep surrounding border (the border preferably is excluded to minimize error).
- PR is the set of pixels in the pupil to be used for computing the retinal reflex.
- T_Range is a selectable threshold.
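- A sketch of Test B follows; the dilation-based exclusion of the two-deep border and the relative comparison against `t_range` (standing in for T_Range) are assumptions about details not spelled out in the extracted text.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def retinal_reflex(red_band, pupil_mask, reflex_mask):
    """Average red value inside the pupil, excluding the corneal reflex and a
    two-pixel-deep border around it (the Test B measurement)."""
    # Two dilations with the default 4-connected structuring element exclude
    # the reflex plus its two-deep surrounding border.
    excluded = binary_dilation(reflex_mask, iterations=2)
    usable = pupil_mask & ~excluded
    return float(red_band[usable].mean())

def test_b(rr_left, rr_right, t_range=0.2):
    """Compare the two retinal reflexes; t_range stands in for the selectable
    threshold T_Range, and the relative-difference rule is an assumption."""
    if abs(rr_left - rr_right) / max(rr_left, rr_right) > t_range:
        return "unequal retinal reflexes; refer to an eyecare specialist"
    return "retinal reflexes comparable"
```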
- Test C: Pixels inside the pupillary boundary (excluding the corneal reflex) that are brighter than the average intensity inside the pupil are segmented into regions. Any region of sufficient size, with sufficient perimeter adjacent to the pupillary boundary, is labeled as a crescent reflex. In normal eyes, no crescent reflexes will be seen. If either eye exhibits a crescent reflex, then the patient has a refractive error and the computer system indicates that the patient should be referred to a medical eyecare specialist. Any region of sufficient size, but not adjacent to the pupillary boundary, is labeled as an abnormal pupil area (e.g., the region may represent a cataract), and the computer system indicates that the patient should be referred to a medical eyecare specialist. Note that if the pupil-iris boundary is not located for an eye, then the crescent reflex and abnormal pupil area tests are undefined in the preferred embodiment of the invention.
- The average blue intensity inside the pupil, denoted BR, is calculated as:
- The perimeter of each bright region is calculated as the number of bright pixels adjacent to non-bright pixels:
- The adjacency of each bright region with the pupillary boundary is calculated as the number of bright pixels within a small distance of the pupil-iris circle:
- Bright regions are classified according to the following criteria:
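- A sketch of the Test C classification follows; the area, adjacency, and distance thresholds are assumed placeholders for the criteria, which are not reproduced in the extracted text.

```python
import numpy as np

def classify_bright_region(region_pixels, pupil_center, pupil_radius,
                           min_area=20, boundary_tol=2.0, min_boundary_frac=0.25):
    """Label a bright four-connected region inside the pupil (Test C).
    min_area, boundary_tol, and min_boundary_frac are assumed placeholders
    for the size and adjacency criteria described in the text."""
    if len(region_pixels) < min_area:
        return "ignore"  # too small to be significant
    pts = np.asarray(region_pixels, dtype=float)
    # Count region pixels lying within a small distance of the pupil-iris circle.
    dist = np.abs(np.hypot(pts[:, 0] - pupil_center[0],
                           pts[:, 1] - pupil_center[1]) - pupil_radius)
    on_boundary = int(np.sum(dist <= boundary_tol))
    if on_boundary >= min_boundary_frac * len(pts):
        return "crescent reflex (possible refractive error); refer"
    return "abnormal pupil area (possible media opacity); refer"
```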
- FIG. 7 is a flowchart of a preferred examination sequence using the invention, once an image is acquired and the eyes are located, modeled, and analyzed.
- A determination is made as to whether the corneal and retinal light reflexes are visible in the image taken by the camera 10 (STEP 100). If not, then the examination should be repeated (i.e., another image obtained) (STEP 102). If so, then a determination is made as to whether the corneal light reflexes are centered (STEP 104). If not, then an indication is given that the patient may have strabismus, and should be referred to further examination (STEP 106). Otherwise, a determination is made as to whether the retinal (red) light reflex in both eyes is equally bright (STEP 108).
- If not, then an indication is given that the patient may have a retinal problem, and should be referred to further examination (STEP 110). Otherwise, a determination is made as to whether a retinal crescent light reflex exists (STEP 112). If so, then an indication is given that the patient may have a refractive error, and should be referred to further examination (STEP 114). Otherwise, a determination is made as to whether other abnormal areas exist in the retinal light reflexes (STEP 116). If so, then an indication is given that the patient may have a possible media opacity, and should be referred to further examination (STEP 118). If not, the test sequence ends (STEP 120). Note that other test sequences may also be devised, and the tests described may be done in other orders and/or terminated after any tentative diagnosis step.
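- The FIG. 7 decision sequence can be summarized in a few lines of code; the boolean inputs would come from the measurements of Tests A-C above, and the returned strings are illustrative placeholders.

```python
def screening_sequence(reflexes_visible, reflexes_centered, reflexes_equal,
                       crescent_present, other_abnormal_area):
    """Walk the FIG. 7 decision sequence and return a screening recommendation."""
    if not reflexes_visible:
        return "reflexes not visible; repeat examination"          # STEP 102
    if not reflexes_centered:
        return "possible strabismus; refer for examination"        # STEP 106
    if not reflexes_equal:
        return "possible retinal problem; refer for examination"   # STEP 110
    if crescent_present:
        return "possible refractive error; refer for examination"  # STEP 114
    if other_abnormal_area:
        return "possible media opacity; refer for examination"     # STEP 118
    return "no anomaly indicated by this sequence"                  # STEP 120
```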
- the patient 16 is expected to be directly facing the camera, photographed at a roughly known distance from the camera, using a known lens 14. This consistent configuration yields an expected range of image-apparent sizes of visual anatomical features.
- The constraint that the patient 16 not be too far from the camera 10 does not necessarily reflect on the ability of the inventive system to locate and model the eyes. Rather, this constraint is imposed to ensure the eye regions are imaged by a sufficient number of pixels for making statistically sound measurements for making photoscreening decisions.
- the patient's eyes should be dilated. Typically, the normal dilation that occurs after three to five minutes in a darkened room is acceptable.
- the ambient lighting level in the room in which the picture is taken should be as dark as possible, to preserve dilation.
- Although the embodiment of the invention described herein is designed to be robust in the presence of a confusing background, the image is expected to be reasonably free of clutter.
- the background should not contain life-size pictures of people (particularly faces). Normal jewelry is acceptable, but anything which has an eye-like appearance should be removed.
- the invention may be implemented in hardware or software, or a combination of both (e.g., programmable logic arrays). Unless otherwise specified, the algorithms included as part of the invention are not inherently related to any particular computer or other apparatus. In particular, various general purpose machines may be used with programs written in accordance with the teachings herein, or it may be more convenient to construct more specialized apparatus to perform the required method steps. However, preferably, the invention is implemented in one or more computer programs executing on programmable systems each comprising at least one processor, at least one data storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Program code is applied to input data to perform the functions described herein and generate output information. The output information is applied to one or more output devices, in known fashion.
- Each such program may be implemented in any desired computer language (including machine, assembly, high level procedural, or object oriented programming languages) to communicate with a computer system.
- the language may be a compiled or interpreted language.
- Each such computer program is preferably stored on a storage media or device (e.g., ROM, CD-ROM, or magnetic or optical media) readable by a general or special purpose programmable computer, for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.
- the inventive system may also be considered to be implemented as a computer-readable storage medium, configured with a computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner to perform the functions described herein.
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Surgery (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
Claims
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| AU15240/99A AU1524099A (en) | 1997-11-14 | 1998-11-13 | Automated photorefractive screening |
| EP98959444A EP1052928A1 (en) | 1997-11-14 | 1998-11-13 | Automated photorefractive screening |
| KR1020007005253A KR20010032112A (en) | 1997-11-14 | 1998-11-13 | Automated photorefractive screening |
| JP2000520683A JP2001522679A (en) | 1997-11-14 | 1998-11-13 | Automatic light reflection screening |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US6553797P | 1997-11-14 | 1997-11-14 | |
| US60/065,537 | 1997-11-14 | ||
| US09/173,571 | 1998-10-15 | ||
| US09/173,571 US6089715A (en) | 1998-10-15 | 1998-10-15 | Automated photorefractive screening |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO1999025239A1 true WO1999025239A1 (en) | 1999-05-27 |
Family
ID=26745703
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US1998/024275 Ceased WO1999025239A1 (en) | 1997-11-14 | 1998-11-13 | Automated photorefractive screening |
Country Status (5)
| Country | Link |
|---|---|
| EP (1) | EP1052928A1 (en) |
| JP (1) | JP2001522679A (en) |
| KR (1) | KR20010032112A (en) |
| AU (1) | AU1524099A (en) |
| WO (1) | WO1999025239A1 (en) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7255443B2 (en) | 2002-03-05 | 2007-08-14 | Chul Myung Choe | Quantitative analysis apparatus for phenomena of glare and the method for the same |
| WO2010011785A1 (en) * | 2008-07-23 | 2010-01-28 | Indiana University Research & Technology Corporation | System and method for a non-cooperative iris image acquisition system |
| CN111832344A (en) * | 2019-04-17 | 2020-10-27 | 深圳熙卓科技有限公司 | Dynamic pupil detection method and device |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004005167A (en) * | 2002-05-31 | 2004-01-08 | Matsushita Electric Ind Co Ltd | Eye position specifying method and apparatus |
| JP4179606B2 (en) * | 2003-06-09 | 2008-11-12 | 株式会社コーナン・メディカル | Photorefractor |
| IL215883A0 (en) * | 2011-10-24 | 2012-03-01 | Iriss Medical Technologies Ltd | System and method for indentifying eye conditions |
| JP6774136B2 (en) * | 2015-01-20 | 2020-10-21 | グリーン シー.テック リミテッド | Methods and systems for automatic vision diagnosis |
| US10042181B2 (en) * | 2016-01-27 | 2018-08-07 | Johnson & Johnson Vision Care, Inc. | Ametropia treatment tracking methods and system |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5218387A (en) * | 1990-05-21 | 1993-06-08 | Nissan Motor Co., Ltd. | Eye position detecting apparatus |
-
1998
- 1998-11-13 WO PCT/US1998/024275 patent/WO1999025239A1/en not_active Ceased
- 1998-11-13 JP JP2000520683A patent/JP2001522679A/en active Pending
- 1998-11-13 EP EP98959444A patent/EP1052928A1/en not_active Withdrawn
- 1998-11-13 AU AU15240/99A patent/AU1524099A/en not_active Abandoned
- 1998-11-13 KR KR1020007005253A patent/KR20010032112A/en not_active Ceased
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5218387A (en) * | 1990-05-21 | 1993-06-08 | Nissan Motor Co., Ltd. | Eye position detecting apparatus |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7255443B2 (en) | 2002-03-05 | 2007-08-14 | Chul Myung Choe | Quantitative analysis apparatus for phenomena of glare and the method for the same |
| WO2010011785A1 (en) * | 2008-07-23 | 2010-01-28 | Indiana University Research & Technology Corporation | System and method for a non-cooperative iris image acquisition system |
| US8644565B2 (en) | 2008-07-23 | 2014-02-04 | Indiana University Research And Technology Corp. | System and method for non-cooperative iris image acquisition |
| CN111832344A (en) * | 2019-04-17 | 2020-10-27 | 深圳熙卓科技有限公司 | Dynamic pupil detection method and device |
| CN111832344B (en) * | 2019-04-17 | 2023-10-24 | 深圳熙卓科技有限公司 | Dynamic pupil detection method and device |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1052928A1 (en) | 2000-11-22 |
| JP2001522679A (en) | 2001-11-20 |
| KR20010032112A (en) | 2001-04-16 |
| AU1524099A (en) | 1999-06-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US6089715A (en) | Automated photorefractive screening | |
| US12475561B2 (en) | Methods and systems for ocular imaging, diagnosis and prognosis | |
| Tobin et al. | Detection of anatomic structures in human retinal imagery | |
| Patton et al. | Retinal image analysis: concepts, applications and potential | |
| Niemeijer et al. | Segmentation of the optic disc, macula and vascular arch in fundus photographs | |
| US10413180B1 (en) | System and methods for automatic processing of digital retinal images in conjunction with an imaging device | |
| AU2019204611B2 (en) | Assessment of fundus images | |
| US11967075B2 (en) | Application to determine reading/working distance | |
| US20220198831A1 (en) | System for determining one or more characteristics of a user based on an image of their eye using an ar/vr headset | |
| Fritzsche et al. | Automated model based segmentation, tracing and analysis of retinal vasculature from digital fundus images | |
| CN110623629A (en) | Visual attention detection method and system based on eyeball motion | |
| KR20190022216A (en) | Eye image analysis method | |
| US6616277B1 (en) | Sequential eye screening method and apparatus | |
| Narasimha-Iyer et al. | Integrated analysis of vascular and nonvascular changes from color retinal fundus image sequences | |
| EP4006833B1 (en) | Image processing system and image processing method | |
| EP3769283B1 (en) | Pupil edge detection in digital imaging | |
| Gairola et al. | Smartkc: Smartphone-based corneal topographer for keratoconus detection | |
| Consejo et al. | Detection of subclinical keratoconus with a validated alternative method to corneal densitometry | |
| CN109215039B (en) | Method for processing fundus picture based on neural network | |
| WO1999025239A1 (en) | Automated photorefractive screening | |
| Giancardo | Automated fundus images analysis techniques to screen retinal diseases in diabetic patients | |
| Valencia | Automatic detection of diabetic related retina disease in fundus color images | |
| US20240277224A1 (en) | Optical coherence tomography (oct) self-testing system, optical coherence tomography method, and eye disease monitoring system | |
| Kwok et al. | Democratizing Optometric Care: A Vision-Based, Data-Driven Approach to Automatic Refractive Error Measurement for Vision Screening | |
| Wang | Investigation of image processing and computer-assisted diagnosis system for automatic video vision development assessment |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AK | Designated states |
Kind code of ref document: A1 Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GD GE GH GM HR HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG UZ VN YU ZW |
|
| AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 1020007005253 Country of ref document: KR |
|
| ENP | Entry into the national phase |
Ref country code: JP Ref document number: 2000 520683 Kind code of ref document: A Format of ref document f/p: F |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 1998959444 Country of ref document: EP |
|
| REG | Reference to national code |
Ref country code: DE Ref legal event code: 8642 |
|
| WWP | Wipo information: published in national office |
Ref document number: 1998959444 Country of ref document: EP |
|
| WWP | Wipo information: published in national office |
Ref document number: 1020007005253 Country of ref document: KR |
|
| NENP | Non-entry into the national phase |
Ref country code: CA |
|
| WWW | Wipo information: withdrawn in national office |
Ref document number: 1998959444 Country of ref document: EP |
|
| WWR | Wipo information: refused in national office |
Ref document number: 1020007005253 Country of ref document: KR |