US11141056B2 - Ophthalmic device - Google Patents
- Publication number
- US11141056B2 (application US16/427,709 / US201916427709A)
- Authority
- US
- United States
- Prior art keywords
- image
- nucleus
- tissues
- abnormality
- subject eye
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/102—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for optical coherence tomography [OCT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0016—Operational features thereof
- A61B3/0025—Operational features thereof characterised by electronic signal processing, e.g. eye models
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/0008—Apparatus for testing the eyes; Instruments for examining the eyes provided with illuminating means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/117—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for examining the anterior chamber or the anterior chamber angle, e.g. gonioscopes
- A61B3/1173—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for examining the anterior chamber or the anterior chamber angle, e.g. gonioscopes for examining the eye lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/13—Ophthalmic microscopes
- A61B3/135—Slit-lamp microscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10072—Tomographic images
- G06T2207/10101—Optical tomography; Optical coherence tomography [OCT]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30041—Eye; Retina; Ophthalmic
Definitions
- the technique disclosed herein relates to an ophthalmic device. To be more precise, it relates to an ophthalmic device configured to capture a tomographic image of a crystalline lens of a subject eye.
- In ophthalmic examinations, observation of the crystalline lens is performed.
- An observation method using a slit lamp microscope is generally known for observing the crystalline lens.
- In this method, an examiner such as a doctor directly observes the condition of the crystalline lens by irradiating the subject eye with light from a slit lamp, and diagnoses the condition of the crystalline lens of the subject eye.
- Because the examiner manually adjusts the position irradiated with the light from the slit lamp, it is difficult to irradiate the same position in every examination, and thus there has been a problem that diagnosis results vary from examination to examination.
- The disclosure herein provides a technique for accurately analyzing a crystalline lens of a subject eye.
- An ophthalmic device disclosed herein may comprise: an image capturing unit configured to capture a tomographic image of a crystalline lens of a subject eye; a processor; and a memory storing computer-readable instructions therein.
- The computer-readable instructions, when executed by the processor, may cause the processor to execute: detecting a boundary between tissues in a tomographic image of the crystalline lens of the subject eye captured by the image capturing unit; determining whether the tissues defined by the boundary include an abnormality; and analyzing an analysis item associated with the abnormality.
- FIG. 1 shows a schematic configuration of an optical system of an ophthalmic device according to an embodiment.
- FIG. 2 shows a schematic configuration of a scanning-alignment optical system.
- FIG. 3 is a block diagram showing a control system of the ophthalmic device according to the embodiment.
- FIG. 4 is a flowchart showing an example of a process of analyzing a crystalline lens of a subject eye.
- FIGS. 5A and 5B are diagrams for explaining a radial scanning scheme.
- FIGS. 6A and 6B are diagrams for explaining a raster scanning scheme.
- FIG. 7 is a schematic diagram showing a state in which boundaries between tissues in the crystalline lens are detected in a tomographic image.
- FIGS. 8A and 8B are diagrams for explaining a procedure for creating an En-face image of a front cortex, where FIG. 8A shows a tomographic image of the crystalline lens and FIG. 8B shows the En-face image of the front cortex.
- FIG. 9 is a schematic diagram showing a two-dimensional tomographic image of a colored nucleus.
- FIG. 10 is a diagram showing an example of analysis results displayed on a touch panel.
- An ophthalmic device disclosed herein may comprise: an image capturing unit configured to capture a tomographic image of a crystalline lens of a subject eye; a processor; and a memory storing computer-readable instructions therein.
- The computer-readable instructions, when executed by the processor, may cause the processor to execute: detecting a boundary between tissues in a tomographic image of the crystalline lens of the subject eye captured by the image capturing unit; determining whether the tissues defined by the boundary include an abnormality; and analyzing an analysis item associated with the abnormality.
- The above ophthalmic device analyzes the crystalline lens by using the tomographic image of the crystalline lens of the subject eye captured by the image capturing unit. Due to this, by positioning the image capturing unit relative to the subject eye, image capture and analysis can be executed for the same portion of the crystalline lens, by which variations in diagnosis results among examinations can be avoided. Further, detecting the boundary between tissues enables analysis for each of the tissues. Further, by analyzing only the analysis item associated with the detected abnormality, analysis of analysis items that are not associated with the detected abnormality can be eliminated, by which the processing speed of the analysis can be increased. Due to this, diagnosis time can be shortened, and the burden on the subject can be reduced.
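- As an illustration of the claimed processing chain (boundary detection, abnormality determination, and analysis of only the associated analysis items), a minimal sketch in Python is shown below. The function names and data structures are hypothetical and chosen for illustration only; they do not correspond to any implementation described in the patent.

```python
import numpy as np

def analyze_lens(tomogram, detect_boundaries, check_abnormality, analyzers):
    """Hypothetical outline of the claimed processing chain.

    tomogram          : 2-D or 3-D array of OCT luminance values
    detect_boundaries : callable returning {tissue_name: boolean_mask}
    check_abnormality : callable (tissue_name, pixel_values) -> bool
    analyzers         : {tissue_name: callable} analysis items per tissue
    """
    results = {}
    regions = detect_boundaries(tomogram)                    # detect boundaries between tissues
    for name, mask in regions.items():
        pixels = tomogram[mask]
        if check_abnormality(name, pixels):                  # determine whether the tissue includes an abnormality
            results[name] = analyzers[name](tomogram, mask)  # analyze only the associated analysis item
    return results                                           # results can then be shown on the display unit
```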
- the ophthalmic device disclosed herein may further comprise a display unit configured to display an analysis result.
- the display unit may be configured to display an analysis result of the analysis item. According to such a configuration, the analysis result for the analysis item associated with the detected abnormality is displayed on the display unit, by which details of the detected abnormality can be notified to an examiner.
- When the abnormality is detected in a nucleus of the crystalline lens, the display unit may be configured to display the tomographic image of the crystalline lens captured by the image capturing unit with the nucleus of the crystalline lens colored in a plurality of different colors based on luminance in the tomographic image of the crystalline lens.
- Such a configuration allows the examiner to easily recognize a condition of the nucleus if the abnormality is detected in the nucleus of the crystalline lens.
- The ophthalmic device 1 is configured to capture tomographic images of an anterior eye part of a subject eye E by using optical coherence tomography (OCT).
- the ophthalmic device 1 includes a light source 10 , an interference optical system 14 configured to cause reflected light reflected from the subject eye E and reference light to interfere with each other, and a K-clock generator 50 configured to generate K-clock signals.
- The light source 10 is a wavelength-sweeping light source, and is configured to change the wavelength of the light emitted therefrom in a predetermined cycle.
- Among the reflected light from the respective parts of the subject eye E in the depth direction, the position from which the light that interferes with the reference light is reflected changes in the depth direction of the subject eye E in accordance with the wavelength of the emitted light. Due to this, it is possible to specify the positions of the respective parts (such as a cornea and a crystalline lens) inside the subject eye E by measuring the interference light while changing the wavelength of the emitted light.
- the light outputted from the light source 10 is inputted to a fiber coupler 12 through an optical fiber.
- the light inputted to the fiber coupler 12 is split in the fiber coupler 12 , and the split light is outputted to a fiber coupler 16 and the K-clock generator 50 through optical fibers.
- the K-clock generator 50 will be described later.
- the interference optical system 14 includes a measurement optical system configured to irradiate inside of the subject eye E with light from the light source 10 and generate reflected light therefrom, a reference optical system configured to generate reference light from the light of the light source 10 , and a balance detector 40 configured to detect interference light that is a combination of the reflected light guided by the measurement optical system and the reference light guided by the reference optical system.
- the measurement optical system is constituted of the fiber coupler 16 , a circulator 18 , and a scanning-alignment optical system 20 .
- The light outputted from the light source 10 and inputted to the fiber coupler 16 through the fiber coupler 12 is split in the fiber coupler 16 into measurement light and reference light, and the two light beams are outputted therefrom.
- the measurement light outputted from the fiber coupler 16 is inputted to the circulator 18 through an optical fiber.
- the measurement light inputted to the circulator 18 is outputted to the scanning-alignment optical system 20 .
- the scanning-alignment optical system 20 is configured to irradiate the subject eye E with the measurement light outputted from the circulator 18 and to output reflected light from the subject eye E to the circulator 18 .
- the reflected light inputted to the circulator 18 is inputted to one of inputs of a fiber coupler 38 .
- the scanning-alignment optical system 20 will be described later in detail.
- the reference optical system is constituted of the fiber coupler 16 , a circulator 22 , and a reference unit 24 .
- the reference light outputted from the fiber coupler 16 is inputted to the circulator 22 through an optical fiber.
- the reference light inputted to the circulator 22 is outputted to the reference unit 24 .
- the reference unit 24 is constituted of collimator lenses 26 , 28 and a reference mirror 30 .
- the reference light outputted to the reference unit 24 is reflected by the reference mirror 30 through the collimator lenses 26 , 28 , and is outputted from the reference unit 24 through the collimator lenses 26 , 28 again.
- the reference light outputted from the reference unit 24 is outputted to the circulator 22 .
- the collimator lens 28 and the reference mirror 30 are each configured to be moved forward and rearward relative to the collimator lens 26 by a second driver 54 (see FIG. 3 ).
- When the second driver 54 moves the collimator lens 28 and the reference mirror 30, the optical path length of the reference optical system changes. Due to this, the optical path length of the reference optical system can be adjusted to be substantially equal to the optical path length of the measurement optical system.
- the reference light inputted to the circulator 22 is inputted to another input of the fiber coupler 38 through a polarized wave controller 36 .
- the polarized wave controller 36 is an element configured to control polarization of the reference light to be inputted to the fiber coupler 38 .
- As the polarized wave controller 36, a configuration such as a paddle type or an in-line type used in known ophthalmic devices can be used; thus a detailed description thereof is omitted.
- the fiber coupler 38 is configured to combine the reflected light from the subject eye E and the reference light that were inputted thereto to generate interference light.
- The fiber coupler 38 is further configured to split the generated interference light into two interference light beams having phases that differ by 180 degrees from each other, and to input them to the balance detector 40.
- The balance detector 40 is configured to execute a differential amplification process and a noise reduction process on the two interference light beams having the phases that differ by 180 degrees, which were inputted from the fiber coupler 38, to convert them into electric signals (interference signals).
- the balance detector 40 is configured to output the interference signals to a processor 60 .
- the scanning-alignment optical system 20 includes a scanning optical system, an anterior-eye-part image capturing system, a fixation target optical system, and an alignment optical system.
- the scanning optical system includes a collimator lens 102 , a Galvano scanner 104 , a hot mirror 106 , and an object lens 108 .
- the measurement light outputted from the circulator 18 (see FIG. 1 ) is emitted to the Galvano scanner 104 through the collimator lens 102 .
- the Galvano scanner 104 is configured to be tilted by a first driver 52 (see FIG. 3 ), and a position irradiated with the measurement light in the subject eye E is scanned by the first driver 52 tilting the Galvano scanner 104 .
- The measurement light emitted from the Galvano scanner 104 strikes the hot mirror 106 and is reflected there at an angle of 90 degrees.
- The measurement light reflected by the hot mirror 106 is directed to the subject eye E through the object lens 108. Reflected light from the subject eye E is inputted to the circulator 18 after passing through the object lens 108, the hot mirror 106, the Galvano scanner 104, and the collimator lens 102 along the reverse of the above path.
- the anterior-eye-part image capturing system includes two illuminating light sources 110 , the object lens 108 , the hot mirror 106 , a cold mirror 112 , an imaging lens 114 , a CCD camera 116 , and an optical controller 118 .
- the two illuminating light sources 110 are configured to irradiate a front side of the subject eye E with illumination light in a visible range. Reflected light from the subject eye E travels through the object lens 108 , the hot mirror 106 , the cold mirror 112 and the imaging lens 114 and is inputted to the CCD camera 116 . Due to this, a front image of the subject eye E is captured. Data of the captured image is subjected to image processing by the optical controller 118 and is displayed on a touch panel 56 .
- the fixation target optical system includes a fixation target light source 120 , cold mirrors 122 , 124 , a relay lens 126 , a half mirror 128 , the cold mirror 112 , the hot mirror 106 , and the object lens 108 .
- Light from the fixation target light source 120 travels through the cold mirrors 122 , 124 , the relay lens 126 and the half mirror 128 , and is reflected on the cold mirror 112 .
- the light reflected on the cold mirror 112 travels through the hot mirror 106 and the object lens 108 , and the subject eye E is irradiated with the light.
- The subject gazes at this fixation light, whereby the eyeball (that is, the subject eye E) is kept fixated.
- the alignment optical system is constituted of an XY-direction position detection system and a Z-direction position detection system.
- the XY-direction position detection system is used to detect positions of the subject eye E (to be more precise, a corneal apex thereof) in XY directions (that is, positional displacements thereof in up-down and right-left directions relative to the ophthalmic device 1 ).
- the Z-direction position detection system is used to detect a position of the corneal apex of the subject eye E in a front-rear direction (a Z direction).
- the XY-direction position detection system includes an XY-position detection light source 130 , the cold mirror 124 , the relay lens 126 , the half mirror 128 , the cold mirror 112 , the hot mirror 106 , the object lens 108 , an imaging lens 132 , and a position sensor 134 .
- the XY-position detection light source 130 is configured to emit alignment light for position detection.
- the alignment light emitted from the XY-position detection light source 130 is reflected on the cold mirror 124 , travels through the relay lens 126 and the half mirror 128 , and is reflected on the cold mirror 112 .
- the light reflected on the cold mirror 112 travels through the hot mirror 106 and the object lens 108 , and the anterior eye part (cornea) of the subject eye E is irradiated with the light.
- the alignment light is reflected on the corneal surface so as to form a bright spot image on an inner side with respect to the corneal apex of the subject eye E.
- the reflected light from this corneal surface enters the object lens 108 and is reflected on the cold mirror 112 through the hot mirror 106 .
- the reflected light reflected on the cold mirror 112 is reflected on the half mirror 128 and is inputted to the position sensor 134 through the imaging lens 132 .
- a position of the corneal apex (that is, its position in X and Y directions) is detected by the position sensor 134 detecting a position of the bright spot.
- the detection signal of the position sensor 134 is inputted to the processor 60 through the optical controller 118 .
- alignment is set between the position sensor 134 and the anterior-eye-part image capturing system, and a predetermined (regular) image acquisition position for the corneal apex (a position thereof to be tracked upon acquiring tomographic images) is set.
- the regular image acquisition position for the corneal apex is, for example, a point that matches a center position of an image captured by the CCD camera 116 .
- the processor 60 is configured to calculate positional displacement amounts of the detected corneal apex (bright point) in the X and Y directions relative to the regular image acquisition position based on the detection of the position sensor 134 .
- the Z-direction position detection system includes a Z-position detection light source 140 , an imaging lens 142 , and a line sensor 144 .
- the Z-position detection light source 140 is configured to irradiate the subject eye E with light for detection (slit light or spot light) from an oblique direction with respect to the subject eye E. Reflected light in the oblique direction from the cornea of the subject eye E enters the line sensor 144 through the imaging lens 142 . At this occasion, an incident position of the reflected light entering the line sensor 144 varies depending on the position of the subject eye E in the front-rear direction (Z direction) relative to the ophthalmic device 1 . Due to this, the position of the subject eye E in the Z direction relative to the ophthalmic device 1 is detected by detecting the incident position of the reflected light. The detection signal of the line sensor 144 is inputted to the processor 60 .
- The K-clock generator 50 (see FIG. 1) is configured to optically generate sample clock (K-clock) signals from the light of the light source 10 so that the interference signals are sampled at regular frequency intervals (intervals that are equally spaced with respect to optical frequency). The generated K-clock signals are outputted to the processor 60. Due to this, the processor 60 samples the interference signals based on the K-clock signals, by which distortion in the interference signals can be suppressed and deterioration in resolution can be prevented. In the present embodiment, the interference signals sampled at timings defined by the K-clock signals are inputted to the processor 60; however, no limitation is placed on this configuration.
- Alternatively, the processor 60 may execute a process of rescaling data sampled at a predetermined time interval by using a function indicating frequency with respect to a preset sweep time, or a sweep profile acquired simultaneously with the measurement.
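- As a concrete illustration of this rescaling, the sketch below resamples interference data acquired at uniform time intervals onto a grid that is uniform in optical frequency by using a sweep profile. This is a generic swept-source OCT technique assumed here for illustration, not code from the patent; software interpolation stands in for the hardware K-clock sampling.

```python
import numpy as np

def resample_to_k_space(signal, sweep_profile):
    """Resample an interference signal sampled at uniform time intervals
    onto samples that are uniform in optical frequency.

    signal        : 1-D array, interference intensity per time sample
    sweep_profile : 1-D array, optical frequency of the source at each sample
                    (assumed monotonically increasing)
    """
    k_uniform = np.linspace(sweep_profile[0], sweep_profile[-1], signal.size)
    # Interpolate the measured signal at uniformly spaced optical frequencies.
    return np.interp(k_uniform, sweep_profile, signal)
```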
- the interference optical system 14 and the K-clock generator 50 are an example of “image capturing unit”.
- the ophthalmic device 1 is controlled by the processor 60 .
- The processor 60 is constituted of a microcomputer (microprocessor) including a CPU, a ROM, a RAM, and the like.
- the processor 60 is connected with the light source 10 , the first driver 52 , the second driver 54 , the illuminating light sources 110 , the fixation target light source 120 , the XY-position detection light source 130 , the Z-position detection light source 140 , the optical controller 118 , the line sensor 144 , the balance detector 40 , the K-clock generator 50 , and the touch panel 56 .
- the processor 60 is configured to control on/off of the light source 10 and to drive the Galvano scanner 104 and the reference unit 24 by controlling the first driver 52 and the second driver 54 . Further, the interference signals corresponding to intensities of the interference light detected by the balance detector 40 and the K-clock signals generated by the K-clock generator 50 are inputted to the processor 60 .
- the processor 60 is configured to sample the interference signals from the balance detector 40 based on the K-clock signals. Further, the processor 60 executes Fourier transform on the sampled interference signals to specify positions of respective parts (such as the cornea, an anterior chamber, and a crystalline lens) and tissues (such as a nucleus, a cortex, and a capsule of the crystalline lens) of the subject eye E. Data and calculation results inputted to the processor 60 are stored in a memory (not shown).
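- The Fourier transform step can be sketched as follows: reflections at different depths appear as peaks at different frequencies of the interference signal, so the magnitude of the transform gives a depth profile (A-scan). The window function and the use of a real-input FFT are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def a_scan_from_interference(k_signal):
    """Convert a k-space interference signal into a depth profile (A-scan).

    k_signal : 1-D array sampled uniformly in optical frequency.
    Returns the magnitude spectrum, in which peaks correspond to
    reflecting boundaries (cornea, lens surfaces, etc.) at different depths.
    """
    windowed = k_signal * np.hanning(k_signal.size)  # suppress side lobes
    return np.abs(np.fft.rfft(windowed))
```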
- the processor 60 is configured to control on/off of the illuminating light sources 110 , the fixation target light source 120 , and the XY-position detection light source 130 .
- the front image of the subject eye E captured by the CCD camera 116 and processed by the optical controller 118 and the position of the corneal apex (bright point) detected by the position sensor 134 via the optical controller 118 are inputted to the processor 60 .
- the processor 60 is configured to calculate the displacement amounts of the corneal apex (bright point) in the XY directions based on the front image of the subject eye E and the position of the corneal apex (bright point) that were inputted.
- the detection signal of the line sensor 144 is inputted to the processor 60 , and the processor 60 is configured to calculate the displacement amount of the subject eye E in the Z direction relative to the ophthalmic device 1 .
- the processor 60 controls a main driver (not shown) such that the aforementioned positional displacement amounts all become 0 and moves a main body of the ophthalmic device 1 relative to a stage (not shown).
- the processor 60 is configured to control the touch panel 56 .
- the touch panel 56 is a display device configured to provide various types of information related to measurement results and analysis results of the subject eye E to the examiner and is also a user interface configured to accept instructions and information from the examiner.
- the touch panel 56 is configured to display images and analysis results of respective tissues of the crystalline lens of the subject eye E that were generated by the processor 60 .
- Further, various settings of the ophthalmic device 1 can be inputted via the touch panel 56.
- the ophthalmic device 1 of the present embodiment is provided with the touch panel 56 , however, no limitation is placed on this configuration.
- the ophthalmic device 1 simply needs to be able to display and input the aforementioned information, and may be provided with a monitor and an input device (for example, a mouse and a keyboard).
- a process of analyzing the crystalline lens of the subject eye E will be described with reference to FIG. 4 .
- the processor 60 acquires tomographic images of the anterior eye part of the subject eye E (S 12 ).
- the process of acquiring tomographic images of the anterior eye part of the subject eye E is executed according to the following procedure.
- the processor 60 executes alignment between the subject eye E and the ophthalmic device 1 .
- the alignment is executed based on the displacement amounts in the XY directions and the Z direction detected by the alignment optical system.
- the processor 60 moves the main body of the ophthalmic device 1 relative to the stage (not shown) so that the positional displacement amounts of the corneal apex (bright point) in the X and Y directions detected by the XY-direction position detection system and the positional displacement amount of the subject eye E in the Z direction detected by the Z-direction position detection system all become 0.
- the processor 60 captures tomographic images of the anterior eye part of the subject eye E.
- the measurement of the anterior eye part of the subject eye E in step S 12 is executed according to a radial scanning scheme. Due to this, the tomographic images of the anterior eye part are acquired over an entire region thereof. That is, as shown in FIG. 5 , the tomographic images are captured with B-scan directions set in radial directions from the corneal apex of the subject eye E and a C-scan direction set in a circumferential direction thereof. In this embodiment, the tomographic images are captured in 128 directions radially (specifically, in 128 directions at regular intervals in the circumferential direction) according to the radial scanning scheme.
- the processor 60 records data of the acquired (captured) tomographic images in the memory.
- a method of capturing tomographic images of the crystalline lens is not limited to the radial scanning scheme. Any method may be adopted so long as it is able to acquire tomographic images of the crystalline lens over an entire region thereof, and for example, the images may be captured according to a raster scanning scheme. That is, as shown in FIG. 6 , the tomographic images may be captured with the B-scan direction in a horizontal direction and the C-scan direction set in a vertical direction relative to the subject eye E.
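- For reference, the sketch below generates the two scan patterns mentioned here: a radial pattern (B-scans through the corneal apex at regular angular intervals, 128 directions in the embodiment) and a raster pattern (parallel horizontal B-scans). The coordinate convention, scan width, and point counts are assumptions for illustration.

```python
import numpy as np

def radial_scan_points(n_directions=128, half_width=1.0, n_points=512):
    """Galvano targets for a radial scan: each B-scan passes through the apex."""
    angles = np.arange(n_directions) * np.pi / n_directions  # regular angular intervals
    t = np.linspace(-half_width, half_width, n_points)       # positions along one scan line
    return [np.column_stack((t * np.cos(a), t * np.sin(a))) for a in angles]

def raster_scan_points(width=2.0, n_lines=256, n_points=512):
    """Galvano targets for a raster scan: horizontal B-scans stacked vertically."""
    xs = np.linspace(-width / 2, width / 2, n_points)
    ys = np.linspace(-width / 2, width / 2, n_lines)
    return [np.column_stack((xs, np.full(n_points, y))) for y in ys]
```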
- The processor 60 detects boundaries between tissues in the crystalline lens based on luminance information included in the respective interference signal information (S14). As shown in FIG. 7, the processor 60 detects, in the crystalline lens, a boundary L1 between an anterior capsule and the anterior chamber (that is, a front surface of the crystalline lens), a boundary L2 between the anterior capsule and the cortex, boundaries L3, L4 between the cortex and the nucleus, a boundary L5 between the cortex and a posterior capsule, and a boundary L6 between the posterior capsule and a vitreous body (that is, a rear surface of the crystalline lens).
- The interference signal information includes components of the reflected light that were reflected at these boundaries L1 to L6.
- The boundaries L1 to L6 between the tissues in the crystalline lens are detected based on those signal components included in the interference signal information.
- the cortex is often divided into an anterior chamber side (which is an upper side in FIG. 7 ) and a vitreous body side (which is a lower side in FIG. 7 ) by the nucleus.
- a portion of the cortex on the anterior chamber side may be termed “front cortex” and a part thereof on the vitreous body side may be termed “rear cortex”.
- By detecting the boundaries, the tissues, namely the anterior capsule (a region between the boundaries L1 and L2), the front cortex (a region between the boundaries L2 and L3), the nucleus (a region between the boundaries L3 and L4), the rear cortex (a region between the boundaries L4 and L5), and the posterior capsule (a region between the boundaries L5 and L6), can be specified.
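- A minimal sketch of boundary detection along a single A-scan is given below. It assumes that the boundaries L1 to L6 appear as the strongest luminance peaks along the depth direction and simply picks peak positions; the patent only states that the detection is based on luminance information, so the peak criterion and parameter values here are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_boundaries_in_ascan(a_scan, n_boundaries=6, min_separation=20):
    """Estimate depth indices of tissue boundaries (e.g. L1 to L6) in one A-scan.

    a_scan         : 1-D array of luminance values along the depth direction
    n_boundaries   : number of boundaries expected within the crystalline lens
    min_separation : minimum distance (in samples) between detected peaks
    """
    peaks, props = find_peaks(a_scan, distance=min_separation, height=0.0)
    # Keep the strongest peaks and return them ordered from front to rear.
    strongest = peaks[np.argsort(props["peak_heights"])[-n_boundaries:]]
    return np.sort(strongest)
```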
- Next, the processor 60 determines whether or not each of the tissues specified in step S14 includes an abnormality (S16). Whether an abnormality is included is determined based on the luminance information of the pixels constituting each tissue. For example, different thresholds for the luminance information are preset for the respective tissues, the luminance of each pixel constituting a tissue is compared with the threshold set for that tissue, and it is determined that an abnormal portion is present in the tissue when the luminance exceeds the threshold. If opacity is present in a tissue, the optical component reflected at the opacified portion is large, which results in high luminance at the opacified portion.
- The thresholds for the luminance information can be set based on the luminance of tomographic images including portions diagnosed as opacified (or portions suspected to be opacified). The luminance at which opacity is diagnosed differs among the tissues; due to this, the thresholds are set to different values for the respective tissues.
- The processor 60 determines, for each of the tissues, whether or not the tissue includes a portion with luminance higher than the threshold corresponding to the tissue. Then, in a case where the tissue includes a portion with luminance higher than the threshold, the processor 60 determines that an abnormality is present in the tissue.
- Note that the processor 60 determines that an abnormality is present in the tissue when the number of pixels having luminance higher than the threshold exceeds a set number.
- The set number used in the abnormality determination may differ among the tissues.
- In the above, whether an abnormality is present is determined by comparing the luminance of each pixel in the tissue with the threshold; however, no limitation is placed on this method.
- For example, whether an abnormality is present in a tissue may be determined in a simplified manner by comparing the average luminance of the pixels constituting the tissue with the threshold.
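- The abnormality determination just described (a per-tissue luminance threshold, a per-tissue count of pixels that must exceed it, and the simplified mean-luminance variant) can be sketched as follows; the threshold and count values are placeholders, not values taken from the patent.

```python
import numpy as np

def tissue_has_abnormality(pixels, luminance_threshold, min_count=1):
    """Return True if enough pixels in one tissue exceed its luminance threshold.

    pixels              : 1-D array of luminance values belonging to the tissue
    luminance_threshold : threshold preset for this tissue
    min_count           : required number of above-threshold pixels (the "set number")
    """
    return int(np.count_nonzero(pixels > luminance_threshold)) >= min_count

def tissue_has_abnormality_simple(pixels, luminance_threshold):
    """Simplified variant: compare the average luminance of the tissue with the threshold."""
    return float(np.mean(pixels)) > luminance_threshold
```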
- Next, for each tissue in which an abnormality is detected, the processor 60 creates a two-dimensional image of the tissue (S18).
- For the tissues other than the nucleus, the two-dimensional image is created as a front image in which only the tissue is extracted.
- For the nucleus, the two-dimensional image is created as a tomographic image.
- Conventionally, the examiner observes the condition of the nucleus by using a slit lamp; in this case, the examiner observes the nucleus in a manner similar to observing its tomographic image.
- The tissues other than the nucleus are conventionally observed by retroillumination, in which an ocular fundus of the subject eye E is irradiated with illumination light and the observation is performed by the light reflected from the ocular fundus.
- That is, the examiner observes the tissues other than the nucleus (the anterior capsule, the front cortex, the rear cortex, and the posterior capsule) in a manner similar to observing their front images.
- Accordingly, by creating the two-dimensional images as described above, the examiner can diagnose the tissues in a manner similar to the conventional observation methods.
- Creation of the front image will be described with the front cortex as an example.
- Here, it is assumed that an abnormality is detected in the front cortex in step S16; that is, a portion having luminance higher than the value set as the threshold for the luminance information of the front cortex is detected in the region specifying the front cortex (the region between the boundaries L2 and L3).
- In this case, the region specifying the front cortex is extracted from each of the tomographic images, and a front image constituted only of the front cortex is created.
- The front image is, for example, an En-face image. Specifically, a maximum value or an average value in the depth direction is calculated for each A-scan in the three-dimensional data, and the three-dimensional data is thereby compressed into a two-dimensional En-face image.
- As shown in FIG. 8A, the processor 60 averages, in the region between the boundaries L2 and L3 specifying the front cortex, the luminance in the depth direction (shown by the arrows) for each A-scan. Then, as shown in FIG. 8B, the processor 60 displays the averaged luminance as dots to create the En-face image. For each of the anterior capsule, the rear cortex, and the posterior capsule as well, in a case where an abnormality is detected therein in step S16, the processor 60 creates an En-face image that displays only that tissue according to a procedure similar to that for the front cortex.
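- The En-face construction described here, projecting each A-scan between the two boundaries enclosing the tissue (L2 and L3 for the front cortex) onto a single value, can be sketched as below. The array layout (B-scans × A-scans × depth) and the choice between mean and maximum projection are assumptions for illustration.

```python
import numpy as np

def en_face_image(volume, upper_boundary, lower_boundary, projection="mean"):
    """Compress a 3-D OCT volume into a 2-D En-face image of one tissue.

    volume         : array shaped (n_bscans, n_ascans, n_depth) of luminance values
    upper_boundary : array (n_bscans, n_ascans) of depth indices, e.g. boundary L2
    lower_boundary : array (n_bscans, n_ascans) of depth indices, e.g. boundary L3
    projection     : "mean" or "max" projection along the depth direction
    """
    n_b, n_a, _ = volume.shape
    image = np.zeros((n_b, n_a))
    for b in range(n_b):
        for a in range(n_a):
            segment = volume[b, a, upper_boundary[b, a]:lower_boundary[b, a]]
            if segment.size:
                image[b, a] = segment.max() if projection == "max" else segment.mean()
    return image
```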
- When an abnormality is detected in the nucleus, a tomographic image of the nucleus is colored in plural colors based on the luminance information.
- The tomographic image used here may have had speckle noise removed by executing signal averaging on a tomographic image in the horizontal direction and a plurality of tomographic images adjacent to it in the circumferential direction (in this embodiment, a total of four tomographic images captured in directions of ±1.4 degrees and ±2.8 degrees with respect to the aforementioned tomographic image).
- The luminance information of each pixel is replaced with a hue such that the color changes as the luminance increases.
- Specifically, green is applied in a case where the luminance is lower than the threshold for the luminance information used for detecting an abnormality in the nucleus in step S16, and colors are applied such that they gradually change from green to yellow as the luminance increases. Colors are further applied such that they gradually change from yellow to red as the luminance increases further.
- Pixels with the highest luminance are replaced with red.
- Pixels having the same luminance information in the tomographic image are colored in the same color.
- Opacity tends to occur in the nucleus at its central portion rather than at its outer periphery in a cross section. Due to this, when the pixels are replaced with the aforementioned hues in a tomographic image of a subject eye E in which cataract has progressed, the central portion of the nucleus is colored red and the other portions are colored such that the colors gradually change toward green from the central portion toward the outer periphery. For example, in the case where cataract has progressed in the nucleus as shown in FIG. 9:
- a region R1 located at the centermost position in the tomographic image is colored red,
- a region R2 adjacent outside the region R1 is colored orange,
- a region R3 adjacent outside the region R2 is colored yellow, and
- a region R4 located on the outermost side is colored green.
- In observation using a slit lamp microscope, the nucleus is observed in different colors depending on the progression state of cataract. That is, the nucleus is observed in a color close to white when the degree of progression of cataract is low. As the degree of progression becomes higher, the color changes from white toward yellow, and then toward brown.
- By coloring the tomographic image as described above, the examiner can recognize the condition of the nucleus in colors close to those in the conventional observation methods.
- In the above, the luminance information of each pixel is replaced with a hue; however, each pixel may instead be replaced with a set hue based on an average of the luminance information that includes the luminance information of its surrounding pixels.
- Further, in the above, the pixels are colored using hues that change from green to red; however, the colors used when replacing the luminance information are not particularly limited.
- For example, the replacement may use the colors observed using conventional slit lamp microscopes (that is, white, yellow, and brown) to convert the tomographic image into an image equivalent to one observed using a slit lamp microscope.
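- The luminance-to-hue replacement for the nucleus can be sketched as a simple color map: pixels below the abnormality threshold are shown in green, and above it the hue shifts continuously through yellow to red. The particular RGB interpolation and the luminance ceiling are assumptions for illustration; the description above only specifies the green-yellow-red ordering (or, alternatively, white-yellow-brown).

```python
import numpy as np

def colorize_nucleus(tomogram, threshold, max_luminance):
    """Map a grayscale nucleus tomogram to an RGB image (channel values in 0..1).

    Below `threshold` a pixel is pure green; between `threshold` and
    `max_luminance` the hue shifts linearly from green through yellow to red.
    """
    scale = max(max_luminance - threshold, 1e-12)
    t = np.clip((tomogram - threshold) / scale, 0.0, 1.0)
    rgb = np.zeros(tomogram.shape + (3,))
    rgb[..., 0] = np.minimum(1.0, 2.0 * t)          # red rises from green to yellow
    rgb[..., 1] = np.minimum(1.0, 2.0 * (1.0 - t))  # green falls from yellow to red
    return rgb                                      # blue channel stays 0 in this palette
```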
- Next, grading of the tissues is executed based on the two-dimensional images created in step S18 (S20). For example, for each of the tissues in which an abnormality is detected in step S16, the grading is executed based on the WHO classification.
- In the WHO classification, the cortex is classified based on the ratio (%) of opacity occupying its circumference in the En-face image. Further, opacity at the center of the cortex is classified depending on whether opacity is present within a range of 3 mm from the pupil center. For example, in the En-face image 74 (see FIG. 10) of the front cortex, it is assumed that the ratio of opacity occupying the circumference is calculated as 30%. In the WHO classification, if the ratio of opacity occupying the circumference of the cortex is 25% or more and 50% or less, the cortex is classified as grade 2.
- Further, it is assumed that opacity is present within the range of 3 mm from the pupil center in the En-face image 74 of the front cortex.
- Thus, in step S20, the front cortex is classified as grade 2 and is further classified as having opacity at the center of the front cortex.
- the nucleus is classified according to a grading method in which determination is made based on comparison with standard photos of the WHO classification.
- the nucleus is graded based on which of the standard photos of the WHO classification corresponds to the luminance information (that is, the colors applied in step S 18 ) of the nucleus in the tomographic image. For example, in a tomographic image 82 of a colored nucleus (see FIG. 10 ), it is assumed that the central portion of the nucleus is colored in yellow.
- the pixels are determined as corresponding to a color of an opacified portion in a standard photo 2 (grade 1) among the standard photos of the WHO classification. In this case, the nucleus is classified as grade 1 in step S 20 .
- the posterior capsule is classified based on a size (mm) of opacity. For example, in an En-face image 78 (see FIG. 10 ) of the posterior capsule, it is assumed that the size of opacity is calculated as about 4 mm. In the WHO classification, if the size of opacity is 3 mm or more, it is classified as grade 3. Thus, in step S 20 , the posterior capsule is classified as grade 3.
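- The grading logic illustrated by these examples (the cortex graded by the percentage of its circumference that is opacified plus a central-3-mm check, and the posterior capsule graded by opacity size) can be sketched as below. Only the bands quoted in the text (25-50% for cortex grade 2, 3 mm or more for posterior-capsule grade 3) come from the description; the remaining bands and the helper names are illustrative assumptions, and the nucleus grading is omitted because it depends on comparison with the standard photographs.

```python
def grade_cortex(opacity_ratio_percent, opacity_within_3mm_of_center):
    """Grade cortical opacity from its share of the circumference in the En-face image."""
    if opacity_ratio_percent < 25:
        grade = 1                 # illustrative placeholder band
    elif opacity_ratio_percent <= 50:
        grade = 2                 # quoted band: 25% or more and 50% or less
    else:
        grade = 3                 # illustrative placeholder band
    return grade, opacity_within_3mm_of_center

def grade_posterior_capsule(opacity_size_mm):
    """Grade posterior-capsule opacity by its size in millimetres."""
    if opacity_size_mm >= 3.0:
        return 3                  # quoted band: 3 mm or more is grade 3
    return 1 if opacity_size_mm < 1.0 else 2   # illustrative placeholder bands

# Values matching the example: 30% of the circumference, opacity at the center, 4 mm PSC opacity.
print(grade_cortex(30, True))          # -> (2, True)
print(grade_posterior_capsule(4.0))    # -> 3
```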
- results of analyses executed in steps S 18 and S 20 are outputted to the touch panel 56 (S 22 ).
- the two-dimensional images 72 to 78 and 82 created in step S 18 and the grades of the respective tissues classified in step S 20 are displayed on the touch panel 56 . That is, for the tissues in which an abnormality is detected, the touch panel 56 displays their two-dimensional images (the En-face images or the colored tomographic image) in which those tissues are displayed, and the grades of those tissues.
- FIG. 10 shows an example of the analysis results displayed on the touch panel 56 .
- the touch panel 56 displays the En-face image 72 of the anterior capsule, the En-face image 74 of the front cortex, the En-face image 76 of the rear cortex, the En-face image 78 of the posterior capsule, the tomographic image 82 of the colored nucleus, and a table 84 indicating their grading results.
- the touch panel 56 displays the tomographic image 80 of the crystalline lens.
- the touch panel 56 displays the two-dimensional images 72 , 74 , 76 , 78 , 82 of the tissues that are created in step S 18 due to an abnormality being detected in step S 16 and the table 84 indicating the analysis results thereof.
- In this way, the two-dimensional images of the tissues in which an abnormality is detected are displayed on the touch panel 56. In the example of FIG. 10, an abnormality is detected in all the tissues, so the two-dimensional images of all the tissues are displayed; however, two-dimensional images are not displayed for tissues in which no abnormality is detected in step S16. Further, as shown in FIG. 10, the tomographic image 80 acquired in step S12 may be displayed together with the two-dimensional images created in step S18.
- The table 84 shows the grading results of the respective tissues, together with the numerical values calculated for use in the grading determinations. For example, FIG. 10 indicates, in regard to the opacity of the cortex (COR), the value (30) of the ratio (Total (%)) of the opacity calculated in step S20 and the presence of opacity (Yes) at the center (Center), as well as the grades (2 and +) determined based on these values. In regard to the opacity of the nucleus (NUC), the closest-resembling standard photo (≈Standard_2), determined based on the colors applied in step S18 (that is, based on the luminance information), and the corresponding grade (1) are indicated. In regard to the opacity of the posterior capsule (PSC), the value (4) of the size (mm) of the opacity calculated in step S20 and the grade (3) determined based on that value are indicated.
- the examiner can easily recognize details of the conditions (that is, the images and the grades) of the tissues in which an abnormality is detected.
- As described above, the respective tissues in the crystalline lens of the subject eye E are analyzed based on the tomographic images of the subject eye E captured by using the OCT. Due to this, when the subject eye E is captured, the same position of the subject eye E can always be captured, and variations in the analysis results among examinations can be avoided; thus the crystalline lens of the subject eye E can be analyzed accurately. Further, in the conventional methods using slit lamp microscopes, the slit lamp is used for observing the nucleus and retroillumination is used for observing the tissues other than the nucleus, so in order to observe all the tissues in the crystalline lens, the subject eye E has to be irradiated with the illumination light over a long period of time.
- In contrast, in the ophthalmic device 1, tomographic images of the respective tissues of the crystalline lens can be acquired at once by capturing the tomographic images of the anterior eye part of the subject eye E. Due to this, the image-capturing time for the anterior eye part of the subject eye E can be shortened. Further, since infrared light is used for the image capture in the OCT, the subject is not dazzled by the light from the light source during the image capture; the burden on the subject can thereby be reduced.
- Further, the creation of two-dimensional images in step S18 and the analysis in step S20 are executed only for the tissues in which an abnormality is detected in step S16. Due to this, the processes of steps S18 and S20 can be omitted for the tissues in which no abnormality is detected, and thus the processing speed of the processor 60 can be increased.
- In the embodiment above, the tissues are graded in step S20 based on the WHO classification; however, the classification standard used in the grading is not particularly limited.
- Further, in the embodiment above, the two-dimensional images are created only for the tissues in which an abnormality is detected; however, no limitation is placed on this configuration. Two-dimensional images may be created not only for tissues in which an abnormality is detected but also for tissues in which no abnormality is detected.
- the radial scanning scheme and the raster scanning scheme may be combined to independently capture tomographic images for creating the two-dimensional tomographic image of the nucleus and tomographic images for creating the front images of the tissues other than the nucleus (the anterior capsule, the front cortex, the rear cortex, and the posterior capsule).
- the anterior eye part of the subject eye E is captured by the radial scanning scheme for creating the two-dimensional tomographic image of the nucleus.
- For example, tomographic images in 8 radial directions are captured.
- The scan is executed plural times (for example, four times) at the same position in each direction, and signal averaging is executed.
- the anterior eye part of the subject eye E is captured by the raster scanning scheme for creating the front images of the tissues other than the nucleus (the anterior capsule, the front cortex, the rear cortex, and the posterior capsule).
- For example, tomographic images in 256 parallel cross sections are captured.
- In this case, the scan needs to be executed only once.
- When tomographic images captured by different schemes are used for the two-dimensional image (tomographic image) of the nucleus and for the two-dimensional images (front images) of the tissues other than the nucleus, the two-dimensional images can each be created with high accuracy. Since the two-dimensional image created for the nucleus is a tomographic image, the radial scan, which captures tomographic images such that each of them includes the corneal apex, is employed.
- Since the two-dimensional images created for the tissues other than the nucleus are En-face images, it is sufficient to capture the entireties of those tissues.
- The entireties of the tissues can be captured with either the radial scanning scheme or the raster scanning scheme.
- However, time is required from the start of the image capture until the end thereof, and during that time the subject eye E might move. If the subject eye E moves while being captured according to the radial scanning scheme, displacements might occur between before and after the movement of the subject eye E in the created En-face images.
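- Both speckle-reduction steps mentioned in this description, averaging a B-scan with circumferentially adjacent B-scans (e.g. at ±1.4 and ±2.8 degrees) and averaging repeated scans of the same cross section, amount to taking the mean of a stack of co-located frames. The sketch below assumes the frames are already registered to one another.

```python
import numpy as np

def average_bscan_stack(frames):
    """Average a stack of co-located B-scans to reduce speckle noise.

    frames : array shaped (n_frames, n_ascans, n_depth), e.g. four repeated
             scans of one radial direction, or a B-scan plus its neighbours.
    """
    return np.asarray(frames, dtype=float).mean(axis=0)
```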
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Public Health (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Ophthalmology & Optometry (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Signal Processing (AREA)
- Quality & Reliability (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Eye Examination Apparatus (AREA)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018-107209 | 2018-06-04 | | |
| JP2018107209A JP7148113B2 (en) | 2018-06-04 | 2018-06-04 | ophthalmic equipment |
| JPJP2018-107209 | 2018-06-04 | | |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20190365218A1 (en) | 2019-12-05 |
| US11141056B2 (en) | 2021-10-12 |
Family
Family ID: 66690283
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/427,709 (granted as US11141056B2; active, anticipated expiration 2039-07-29) | Ophthalmic device | 2018-06-04 | 2019-05-31 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US11141056B2 (en) |
| EP (1) | EP3578093B1 (en) |
| JP (1) | JP7148113B2 (en) |
| CN (1) | CN110547761B (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230000346A1 (en) * | 2019-09-10 | 2023-01-05 | Topcon Corporation | Slit lamp microscope, ophthalmic information processing apparatus, ophthalmic system, method of controlling slit lamp microscope, and recording medium |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021040855A (en) * | 2019-09-10 | 2021-03-18 | 株式会社トプコン | Slit lamp microscope, ophthalmic information processing device, ophthalmic system, method for controlling slit lamp microscope, program, and recording medium |
| JP6894648B2 (en) * | 2019-12-18 | 2021-06-30 | 株式会社中京メディカル | Lens data acquisition method, ophthalmic device and program for ophthalmic device |
| JP7517903B2 (en) | 2020-08-20 | 2024-07-17 | 株式会社トプコン | Slit Lamp Microscope System |
| CN116958550A (en) * | 2023-07-21 | 2023-10-27 | 视微影像(河南)科技有限公司 | Optical disc area structure hierarchy determination method, device and electronic equipment |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020186875A1 (en) * | 2001-04-09 | 2002-12-12 | Burmer Glenna C. | Computer methods for image pattern recognition in organic material |
| JP2011024930A (en) | 2009-07-29 | 2011-02-10 | Topcon Corp | Ophthalmological observation device |
| US20120083667A1 (en) | 2010-09-30 | 2012-04-05 | Nidek Co., Ltd. | Method of observing a three-dimensional image of examinee's eye |
| US20120127428A1 (en) * | 2009-09-30 | 2012-05-24 | Nidek Co., Ltd. | Ophthalmic photographing apparatus |
| US20140167762A1 (en) * | 2011-08-09 | 2014-06-19 | Hitachi Metals, Ltd. | Coil device and magnetic resonance imaging apparatus |
| US20160360962A1 (en) * | 2015-06-11 | 2016-12-15 | Tomey Corporation | Anterior Ocular Segment Optical Coherence Tomographic Imaging Device and Anterior Ocular Segment Optical Coherence Tomographic Imaging Method |
| JP2017093854A (en) | 2015-11-25 | 2017-06-01 | 株式会社トプコン | Ophthalmic photographing apparatus and ophthalmic image display apparatus |
| JP2018051071A (en) | 2016-09-29 | 2018-04-05 | 株式会社ニデック | Slit lamp |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3730554B2 (en) | 2001-10-02 | 2006-01-05 | 株式会社ニデック | Anterior segment imaging device |
| JP5636550B2 (en) | 2010-11-09 | 2014-12-10 | 株式会社コーナン・メディカル | Lens image analyzer |
| WO2013187361A1 (en) * | 2012-06-14 | 2013-12-19 | 学校法人北里研究所 | Method and system for estimating postoperative intraocular lens position |
| CN105682538B (en) * | 2013-10-29 | 2018-07-03 | 尼德克株式会社 | Ophthalmology observes device and the storage medium of ophthalmology observation program |
| JP2016029968A (en) * | 2014-07-25 | 2016-03-07 | キヤノン株式会社 | Image processing apparatus, image processing method, program, and toric intraocular lens |
| JP2016214781A (en) * | 2015-05-26 | 2016-12-22 | ソニー株式会社 | Surgical system, and image processing apparatus and method |
| JP2015157182A (en) * | 2015-06-10 | 2015-09-03 | 株式会社ニデック | Ophthalmologic observation system |
| WO2017065018A1 (en) * | 2015-10-15 | 2017-04-20 | ソニー株式会社 | Image processing device, image processing method, and surgical microscope |
| JP6436888B2 (en) * | 2015-10-19 | 2018-12-12 | 株式会社トーメーコーポレーション | Intraocular lens power determination device |
| JP2018083106A (en) * | 2018-01-24 | 2018-05-31 | 株式会社ニデック | Ophthalmic photographing apparatus and photographing control program |
- 2018-06-04: JP application JP2018107209A filed (granted as JP7148113B2), active
- 2019-05-31: US application US16/427,709 filed (granted as US11141056B2), active
- 2019-06-03: EP application EP19177946.1A filed (granted as EP3578093B1), active
- 2019-06-04: CN application CN201910481408.8A filed (granted as CN110547761B), active
Patent Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20020186875A1 (en) * | 2001-04-09 | 2002-12-12 | Burmer Glenna C. | Computer methods for image pattern recognition in organic material |
| JP2011024930A (en) | 2009-07-29 | 2011-02-10 | Topcon Corp | Ophthalmological observation device |
| US20120121158A1 (en) | 2009-07-29 | 2012-05-17 | Kabushiki Kaisha Topcon | Ophthalmic observation apparatus |
| US20140078466A1 (en) | 2009-07-29 | 2014-03-20 | Kabushiki Kaisha Topcon | Method of controlling ophthalmic observation apparatus and ophthalmic observation apparatus |
| US20120127428A1 (en) * | 2009-09-30 | 2012-05-24 | Nidek Co., Ltd. | Ophthalmic photographing apparatus |
| US20120083667A1 (en) | 2010-09-30 | 2012-04-05 | Nidek Co., Ltd. | Method of observing a three-dimensional image of examinee's eye |
| US20140167762A1 (en) * | 2011-08-09 | 2014-06-19 | Hitachi Metals, Ltd. | Coil device and magnetic resonance imaging apparatus |
| US20160360962A1 (en) * | 2015-06-11 | 2016-12-15 | Tomey Corporation | Anterior Ocular Segment Optical Coherence Tomographic Imaging Device and Anterior Ocular Segment Optical Coherence Tomographic Imaging Method |
| JP2017093854A (en) | 2015-11-25 | 2017-06-01 | 株式会社トプコン | Ophthalmic photographing apparatus and ophthalmic image display apparatus |
| JP2018051071A (en) | 2016-09-29 | 2018-04-05 | 株式会社ニデック | Slit lamp |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230000346A1 (en) * | 2019-09-10 | 2023-01-05 | Topcon Corporation | Slit lamp microscope, ophthalmic information processing apparatus, ophthalmic system, method of controlling slit lamp microscope, and recording medium |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2019208844A (en) | 2019-12-12 |
| EP3578093B1 (en) | 2024-09-11 |
| US20190365218A1 (en) | 2019-12-05 |
| EP3578093A1 (en) | 2019-12-11 |
| JP7148113B2 (en) | 2022-10-05 |
| CN110547761A (en) | 2019-12-10 |
| CN110547761B (en) | 2024-10-22 |
Similar Documents
| Publication | Title |
|---|---|
| US11141056B2 (en) | Ophthalmic device |
| US11659992B2 (en) | Ophthalmic apparatus |
| EP2701575B1 (en) | Systems and methods for improved ophthalmic imaging |
| JP5728302B2 (en) | Ophthalmic apparatus, ophthalmic system, ophthalmic apparatus control method, and program for the control method |
| US11826102B2 (en) | Ophthalmic device, control method therefor, and recording medium |
| US10561311B2 (en) | Ophthalmic imaging apparatus and ophthalmic information processing apparatus |
| JP6607346B2 (en) | Anterior segment optical coherence tomography apparatus and anterior segment optical coherence tomography method |
| EP3424404A1 (en) | Ophthalmologic imaging apparatus |
| JP2017136216A (en) | Ophthalmic apparatus and ophthalmic examination system |
| US11903646B2 (en) | Ophthalmic apparatus, method of controlling the same, method of ophthalmic optical coherence tomography, and recording medium |
| JP6603545B2 (en) | Ophthalmic equipment |
| US11857255B2 (en) | Ophthalmic apparatus |
| JP7164679B2 (en) | Ophthalmic device and its control method |
| JP7148114B2 (en) | Ophthalmic equipment |
| JP6839310B2 (en) | Optical tomography imaging device, its control method, and program |
| JP7613500B2 (en) | Image processing method, image processing program, and image processing apparatus |
| WO2023008385A1 (en) | Anterior eye part analysis device, anterior eye part analysis method, and program |
| JP2019198724A (en) | Ophthalmologic apparatus |
| JP2019130046A (en) | Ophthalmologic apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: TOMEY CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: OKAMOTO, KEIICHIRO; HIGASHITA, RISA; HSIN YUAN, CHUANG. REEL/FRAME: 049337/0730. Effective date: 20190523 |
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | FEPP | Fee payment procedure | ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| | STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | PATENTED CASE |
| | MAFP | Maintenance fee payment | PAYMENT OF MAINTENANCE FEE, 4TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY. Year of fee payment: 4 |