
WO2023042577A1 - Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program - Google Patents


Info

Publication number
WO2023042577A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
eye
image
information processing
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2022/030396
Other languages
English (en)
Japanese (ja)
Inventor
Tatsuo YAMAGUCHI
Yoko HIROHARA
Masahiro AKIBA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Topcon Corp
Original Assignee
Topcon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Topcon Corp filed Critical Topcon Corp
Publication of WO2023042577A1


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/14 Arrangements specially adapted for eye photography

Definitions

  • the present invention relates to an ophthalmic information processing device, an ophthalmic device, an ophthalmic information processing method, and a program.
  • fundus observation is useful for diagnosing fundus diseases and for estimating the state of systemic arteriosclerosis (especially of cerebral blood vessels).
  • a fundus image acquired by an ophthalmologic apparatus such as a fundus camera or a scanning laser ophthalmoscope (SLO) is used.
  • Patent Literature 1 and Patent Literature 2 disclose an ophthalmologic apparatus that acquires a spectral fundus image.
  • Non-Patent Document 1 and Non-Patent Document 2 disclose methods that apply hyperspectral imaging to the retina to obtain spectral fundus images.
  • Patent Literature 3 discloses a method of accurately identifying a site based on spectral characteristics from a spectral fundus image.
  • Spectral distribution data such as spectral images are acquired based on the reflected light of the illumination light from the measurement target site. Since the detected reflected light includes reflected and scattered light from various tissues along the depth direction of the measurement target site, it is unclear from which tissue in the measurement target site the light was reflected. If it were possible to specify which tissue in the measurement target site the reflected light comes from, a more detailed analysis of the spectral distribution data would become possible.
  • the present invention has been made in view of such circumstances, and one of its purposes is to provide a new technique for analyzing spectral distribution data in more detail.
  • a first aspect according to the embodiment is an ophthalmic information processing apparatus including a characteristic region identifying unit that identifies a characteristic region in spectral distribution data acquired by receiving return light in a predetermined wavelength range from an eye to be inspected illuminated with illumination light, and a depth information specifying unit that specifies depth information of the characteristic region based on measurement data of the eye to be inspected, which has a higher resolution in the depth direction than the spectral distribution data.
  • In a second aspect according to the embodiment, the characteristic region identifying unit identifies the characteristic region in any one of a plurality of spectral distribution data obtained by illuminating the eye to be inspected with illumination light and receiving return light from the eye in mutually different wavelength ranges.
  • In a third aspect according to the embodiment, the measurement data is OCT data obtained by performing optical coherence tomography on the eye to be examined.
  • a fourth aspect according to the embodiment is the third aspect, in which the depth information specifying unit includes a search unit that searches, from among a plurality of front images formed based on the OCT data and having different depth positions, for the front image having the highest degree of correlation with the spectral distribution data, and the depth information is specified based on the front image searched by the search unit.
  • a fifth aspect according to the embodiment is the third aspect, in which the depth information specifying unit includes a search unit that searches, from among a plurality of front images formed based on the OCT data and having different depth positions, for a front image containing an image region having the highest degree of correlation with an image including the characteristic region, and the depth information is specified based on the front image searched by the search unit.
  • a sixth aspect according to the embodiment is the fourth aspect or the fifth aspect, including an estimating unit that estimates the presence or absence of a disease, the probability of the disease, or the type of the disease based on the front image searched by the searching unit.
  • a seventh aspect according to the embodiment includes, in the sixth aspect, a display control unit that causes the display means to display disease information including the presence or absence of the disease, the probability of the disease, or the type of the disease estimated by the estimation unit.
  • An eighth aspect according to the embodiment includes, in the fourth aspect or the fifth aspect, a display control unit that causes a display unit to display the front image and the depth information searched by the searching unit.
  • a ninth aspect according to the embodiment, in the fourth aspect or the fifth aspect, includes a display control unit that superimposes the spectral distribution data on the front image searched by the searching unit and displays it on a display unit.
  • In a tenth aspect according to the embodiment, the display control unit causes the display means to identifiably display an area in the front image that corresponds to the characteristic region.
  • An eleventh aspect according to the embodiment, in any one of the first to seventh aspects, includes a display control unit that displays the spectral distribution data and the depth information on the display means.
  • In a twelfth aspect according to the embodiment, the depth information contains at least one of information representing a depth position, information representing a depth range, and information representing a layer region, each with reference to a reference portion of the subject's eye.
  • a thirteenth aspect according to the embodiment is an ophthalmologic apparatus including an illumination optical system that illuminates the eye to be inspected with illumination light, a light receiving optical system that receives return light of the illumination light from the eye in mutually different wavelength ranges, an acquisition unit that acquires the measurement data of the eye to be inspected, and the ophthalmic information processing apparatus according to any one of the first to twelfth aspects.
  • a fourteenth aspect according to the embodiment is an ophthalmologic information processing method including a characteristic region identifying step of identifying a characteristic region in spectral distribution data obtained by receiving return light in a predetermined wavelength range from an eye to be inspected illuminated with illumination light, and a depth information specifying step of specifying depth information of the characteristic region based on measurement data of the eye to be inspected, which has higher resolution in the depth direction than the spectral distribution data.
  • In a fifteenth aspect according to the embodiment, the characteristic region identifying step identifies the characteristic region in any one of a plurality of spectral distribution data obtained by illuminating the eye to be inspected with illumination light and receiving return light from the eye in mutually different wavelength ranges.
  • In a sixteenth aspect according to the embodiment, the measurement data is OCT data obtained by performing optical coherence tomography on the eye to be examined.
  • a seventeenth aspect according to the embodiment is the sixteenth aspect, in which the depth information specifying step includes a search step of searching, from among a plurality of front images formed based on the OCT data and having different depth positions, for the front image having the highest degree of correlation with the spectral distribution data, and the depth information is specified based on the front image searched in the search step.
  • an eighteenth aspect according to the embodiment is the sixteenth aspect, in which the depth information specifying step includes a search step of searching, from among a plurality of front images formed based on the OCT data and having different depth positions, for a front image containing an image region having the highest degree of correlation with an image including the characteristic region, and the depth information is specified based on the front image searched in the search step.
  • a nineteenth aspect according to the embodiment is the seventeenth aspect or the eighteenth aspect, including an estimation step of estimating the presence or absence of a disease, the probability of the disease, or the type of the disease based on the front image searched in the searching step.
  • a twentieth aspect according to the embodiment, in the nineteenth aspect, includes a display control step of causing the display means to display the disease information including the presence or absence of the disease, the probability of the disease, or the type of the disease estimated in the estimation step.
  • a twenty-first aspect according to the embodiment in the seventeenth aspect or the eighteenth aspect, includes a display control step of displaying the front image and the depth information searched in the searching step on a display means.
  • a twenty-second aspect according to the embodiment, in the seventeenth aspect or the eighteenth aspect, includes a display control step of superimposing the spectral distribution data on the front image searched in the searching step and displaying it on a display means.
  • In a twenty-third aspect according to the embodiment, the display control step causes the display means to identifiably display an area in the front image that corresponds to the characteristic region.
  • a twenty-fourth aspect according to the embodiment, in any one of the fourteenth to nineteenth aspects, includes a display control step of displaying the spectral distribution data and the depth information on a display means.
  • In a twenty-fifth aspect according to the embodiment, the depth information contains at least one of information representing a depth position, information representing a depth range, and information representing a layer region, each with reference to a reference portion of the subject's eye.
  • a twenty-sixth aspect according to the embodiment is a program that causes a computer to execute each step of the ophthalmologic information processing method according to any one of the fourteenth to twenty-fifth aspects.
  • Brief description of the drawings: schematic diagrams showing examples of the configuration of the control system of the ophthalmologic apparatus according to the embodiment, and schematic diagrams for explaining the operation of the ophthalmologic apparatus according to the embodiment.
  • An ophthalmologic information processing apparatus according to an embodiment acquires spectral distribution data of an eye to be examined, and specifies information representing the depth of the spectral distribution data (depth information) based on measurement data of the eye that has higher resolution in the depth direction than the spectral distribution data.
  • In some embodiments, the ophthalmologic information processing apparatus identifies a characteristic region in the spectral distribution data of the subject's eye, and can specify information representing the depth of the identified characteristic region based on measurement data of the subject's eye having higher resolution in the depth direction than the spectral distribution data.
  • the spectral distribution data is obtained by receiving return light in a predetermined wavelength range from the subject's eye (for example, fundus, anterior segment) illuminated with illumination light.
  • spectral distribution data include a spectral image (spectral fundus image, spectral anterior segment image) as a two-dimensional spectral distribution.
  • spectral images include hyperspectral images, multispectral images, and RGB color images.
  • characteristic regions include blood vessels, optic discs, diseased regions, and abnormal regions.
  • In some embodiments, an eye to be inspected is illuminated with illumination light having two or more wavelength components whose wavelength ranges differ from each other, and a plurality of spectral distribution data are acquired by selecting, from the return light from the eye, return light having wavelength components in predetermined wavelength ranges.
  • In some embodiments, the subject's eye is sequentially illuminated with illumination light having two or more wavelength components whose wavelength ranges differ from each other, and a plurality of spectral distribution data are acquired by sequentially selecting, from the return light from the subject's eye, return light having wavelength components in predetermined wavelength ranges.
  • In some embodiments, illumination light having wavelength components in a predetermined wavelength range is sequentially selected from illumination light having two or more wavelength components whose wavelength ranges differ from each other, and a plurality of spectral distribution data are acquired by sequentially illuminating the subject's eye with the selected illumination light and sequentially receiving the return light from the subject's eye.
  • In some embodiments, illumination light having two or more wavelength components with mutually different wavelength ranges is sequentially emitted using a light source whose wavelength range can be arbitrarily changed, and a plurality of spectral distribution data are acquired by sequentially illuminating the subject's eye with the emitted illumination light and sequentially receiving the return light from the subject's eye.
  • In some embodiments, a plurality of spectral distribution data are acquired by illuminating the subject's eye with illumination light, sequentially changing the wavelength range in which the light receiving device has high light receiving sensitivity, and sequentially selecting the return light from the subject's eye (a rough acquisition loop is sketched below).
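  • The following sketch illustrates the sequential acquisition modes above in Python. The device API (set_filter_band, capture_frame) and the parameter values are assumptions introduced purely for illustration, not part of this disclosure.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SpectralFrame:
    center_wavelength_nm: float  # center of the selected wavelength range
    bandwidth_nm: float          # width of the selected wavelength range
    image: np.ndarray            # 2D light-reception result (H x W)

def acquire_spectral_series(device, start_nm: float, stop_nm: float,
                            step_nm: float, bandwidth_nm: float) -> list[SpectralFrame]:
    """Step a tunable filter across the analysis wavelength region and
    capture one spectral image (light-reception result) per wavelength range."""
    frames = []
    center = start_nm
    while center <= stop_nm:
        device.set_filter_band(center, bandwidth_nm)  # hypothetical driver call
        frames.append(SpectralFrame(center, bandwidth_nm, device.capture_frame()))
        center += step_nm
    return frames
```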
  • the depth direction may be the traveling direction of illumination light that illuminates the subject's eye, the depth direction of the subject's eye, the direction from the superficial layer to the deep layer of the fundus, or the direction of the measurement optical axis (imaging optical axis) with respect to the subject's eye.
  • Examples of the measurement data include OCT (optical coherence tomography) data and measurement data acquired using adaptive optics (AO), such as AO-SLO data.
  • the OCT data is obtained, for example, by dividing light from an OCT light source into measurement light and reference light, projecting the measurement light onto the eye to be inspected, and detecting interference light generated by superposing the return light of the measurement light from the eye and the reference light that has passed through the reference optical path.
  • the ophthalmic information processing device is configured to acquire OCT data obtained by an externally provided OCT device.
  • the functionality of the ophthalmic information processing device is implemented by an ophthalmic device capable of acquiring OCT data.
  • the ophthalmic information processing device is configured to acquire measurement data obtained by an externally provided AO-SLO device.
  • the functionality of the ophthalmic information processing device is implemented by an ophthalmic device having AO-SLO functionality.
  • a plurality of spectral distribution data are acquired by sequentially receiving return light of illumination light with different wavelength ranges in a predetermined analysis wavelength region. Regarding the first returned light and the second returned light whose wavelength ranges are adjacent to each other among the sequentially received returned lights, part of the wavelength range of the first returned light may overlap with the wavelength range of the second returned light.
  • characteristic region identification processing is executed for each of the plurality of spectral distribution data. For example, the depth information of the characteristic region in the spectral distribution data that can specify the characteristic region with the highest accuracy among the plurality of spectral distribution data is obtained. For example, depth information of a characteristic region in desired spectral distribution data selected by a user or the like from a plurality of spectral distribution data is obtained.
  • the ophthalmologic information processing apparatus searches a plurality of front images (en-face images, C-scan images, projection images, OCT angiography images) formed based on OCT data of the subject's eye and projected or integrated over mutually different depth ranges for the front image having the highest degree of correlation with the spectral distribution data, and identifies the depth information of the searched front image as the depth information of the spectral distribution data (see the sketch below).
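  • A minimal sketch of this correlation search, in Python with NumPy. The use of Pearson correlation as the similarity measure and the function names are assumptions for illustration; the patent does not fix a particular correlation metric here.

```python
import numpy as np

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two images of equal shape."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def find_best_depth(spectral_image: np.ndarray,
                    en_face_stack: np.ndarray,
                    depth_ranges: list[tuple[float, float]]):
    """en_face_stack: (N, H, W) front images, each projected or integrated
    over a different depth range; depth_ranges: the (start, end) depth of
    each front image. Returns the depth range whose front image correlates
    best with the spectral image, plus the index and the score."""
    scores = [normalized_correlation(spectral_image, en_face_stack[i])
              for i in range(en_face_stack.shape[0])]
    best = int(np.argmax(scores))
    return depth_ranges[best], best, scores[best]
```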
  • An ophthalmologic information processing method includes one or more steps executed by the ophthalmologic information processing apparatus described above.
  • a program according to an embodiment causes a computer (processor) to execute each step of an ophthalmologic information processing method according to an embodiment.
  • a recording medium according to the embodiment is a non-transitory recording medium (storage medium) in which the program according to the embodiment is recorded.
  • a processor is, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an ASIC (Application Specific Integrated Circuit), or a programmable logic device (for example, an SPLD (Simple Programmable Logic Device), a CPLD (Complex Programmable Logic Device), or an FPGA (Field Programmable Gate Array)).
  • the processor implements the functions according to the embodiment by, for example, reading and executing a program stored in a memory circuit or memory device.
  • a memory circuit or device may be included in the processor. Also, a memory circuit or memory device may be provided external to the processor.
  • The following mainly describes specifying depth information for a spectral fundus image serving as spectral distribution data of the fundus of the subject's eye.
  • However, the configuration according to the embodiment is not limited to this.
  • The following embodiments are also applicable to specifying depth information for a spectral anterior segment image serving as spectral distribution data of a site other than the fundus, such as the anterior segment.
  • the ophthalmologic information processing apparatus is configured to acquire spectral distribution data of an eye to be examined that is externally acquired through a communication function.
  • an ophthalmologic apparatus capable of acquiring spectral distribution data of an eye to be examined has the function of an ophthalmologic information processing apparatus.
  • An ophthalmologic apparatus including the functions of the ophthalmologic information processing apparatus according to the embodiment will be described as an example.
  • An ophthalmologic apparatus according to an embodiment includes an ophthalmologic imaging apparatus.
  • the ophthalmic imaging device included in the ophthalmic device of some embodiments is, for example, any one or more of a fundus camera, a scanning laser ophthalmoscope, a slit lamp ophthalmoscope, a surgical microscope, and the like.
  • An ophthalmic device according to some embodiments includes any one or more of an ophthalmic measurement device and an ophthalmic treatment device in addition to an ophthalmic imaging device.
  • the ophthalmic measurement device included in the ophthalmic device of some embodiments is, for example, any one or more of an eye refractometer, a tonometer, a specular microscope, a wavefront analyzer, a perimeter, a microperimeter, and the like.
  • the ophthalmic treatment device included in the ophthalmic device of some embodiments is, for example, any one or more of a laser treatment device, a surgical device, a surgical microscope, and the like.
  • In the following example, the ophthalmic device includes an optical coherence tomography (OCT) system and a fundus camera.
  • swept source OCT is applied to this optical coherence tomography
  • the type of OCT is not limited to this, and other types of OCT (spectral domain OCT, time domain OCT, en-face OCT, etc.) may be applied.
  • the x direction is the direction (horizontal direction) perpendicular to the optical axis direction of the objective lens
  • the y direction is the direction (vertical direction) perpendicular to the optical axis direction of the objective lens.
  • the z-direction is assumed to be the optical axis direction of the objective lens.
  • the ophthalmologic apparatus 1 includes a fundus camera unit 2, an OCT unit 100, and an arithmetic control unit 200.
  • the retinal camera unit 2 is provided with an optical system and a mechanism for acquiring a front image of the eye E to be examined.
  • the OCT unit 100 is provided with a part of an optical system and a mechanism for performing OCT. Another part of the optical system and mechanism for performing OCT is provided in the fundus camera unit 2.
  • the arithmetic control unit 200 includes one or more processors that perform various arithmetic operations and controls.
  • the ophthalmologic apparatus 1 includes a pair of anterior eye cameras 5A and 5B.
  • the fundus camera unit 2 is provided with an optical system for photographing the fundus Ef of the eye E to be examined.
  • the acquired image of the fundus Ef (referred to as a fundus image, fundus photograph, etc.) is a front image such as an observation image or a photographed image. Observation images are obtained by capturing moving images using near-infrared light.
  • the captured image is a still image using flash light or a spectral image (spectral fundus image, spectral anterior segment image).
  • the fundus camera unit 2 can photograph the anterior segment Ea of the subject's eye E to obtain a front image (anterior segment image).
  • the retinal camera unit 2 includes an illumination optical system 10 and an imaging optical system 30.
  • the illumination optical system 10 irradiates the eye E to be inspected with illumination light.
  • the imaging optical system 30 detects return light of the illumination light from the eye E to be examined.
  • the measurement light from the OCT unit 100 is guided to the subject's eye E through the optical path in the retinal camera unit 2, and its return light is guided to the OCT unit 100 through the same optical path.
  • observation illumination light output from an observation light source 11 of the illumination optical system 10 is reflected by a reflecting mirror 12 having a curved reflecting surface, passes through a condenser lens 13, and becomes near-infrared light after passing through a visible light cut filter 14. Furthermore, the observation illumination light is once converged near the photographing light source 15, reflected by the mirror 16, and passes through the relay lenses 17 and 18, the diaphragm 19, and the relay lens 20. Then, the observation illumination light is reflected by the periphery of the perforated mirror 21 (the area around the perforation), passes through the dichroic mirror 46, is refracted by the objective lens 22, and illuminates the fundus Ef (or the anterior segment Ea).
  • the return light of the observation illumination light from the subject's eye E is refracted by the objective lens 22, passes through the dichroic mirror 46, passes through the hole formed in the central region of the perforated mirror 21, passes through the photographing focusing lens 31, and is reflected by the mirror 32. Further, this return light passes through the half mirror 33A, is reflected by the dichroic mirror 33, and is imaged on the light receiving surface of the image sensor 35 by the condenser lens 34. The image sensor 35 detects the return light at a predetermined frame rate. The focus of the imaging optical system 30 is adjusted so as to match the fundus Ef or the anterior segment Ea.
  • the light (imaging illumination light) output from the imaging light source 15 irradiates the fundus oculi Ef through the same path as the observation illumination light.
  • the return light of the photographing illumination light from the subject's eye E is guided to the dichroic mirror 33 through the same path as the return light of the observation illumination light, passes through the dichroic mirror 33, is reflected by the mirror 36, and is guided to the tunable filter 80.
  • the wavelength tunable filter 80 is a filter that can select the wavelength range of transmitted light in a predetermined analysis wavelength region.
  • the wavelength range of light transmitted through the wavelength tunable filter 80 can be arbitrarily selected.
  • the tunable filter 80 is similar to the liquid crystal tunable filter disclosed in Japanese Patent Application Laid-Open No. 2006-158546, for example.
  • the wavelength tunable filter 80 can arbitrarily select the wavelength selection range of transmitted light by changing the voltage applied to the liquid crystal.
  • the wavelength tunable filter 80 may include two or more wavelength selection filters having different wavelength selection ranges for transmitted light, configured so that the two or more wavelength selection filters can be selectively placed in the optical path of the return light of the illumination light.
  • the wavelength tunable filter 80 is a filter that can select the wavelength range of reflected light in a predetermined analysis wavelength region.
  • Return light from the mirror 36 that has passed through the wavelength tunable filter 80 is imaged on the light receiving surface of the image sensor 38 by the condenser lens 37.
  • In some embodiments, the tunable filter 80 is placed between the dichroic mirror 33 and the condenser lens 37.
  • the tunable filter 80 is configured to be insertable into and removable from the optical path between the dichroic mirror 33 or mirror 36 and the condenser lens 37.
  • By sequentially obtaining the light reception results of the image sensor 38 while switching the wavelength range selected by the wavelength tunable filter 80, the ophthalmologic apparatus 1 can acquire multiple spectral fundus images.
  • In some embodiments, when the wavelength tunable filter 80 is retracted from the optical path between the dichroic mirror 33 and the condenser lens 37, the ophthalmologic apparatus 1 can acquire normal still images (fundus image, anterior segment image) from the light reception results of the image sensor 38.
  • An image (observation image) based on the fundus reflected light detected by the image sensor 35 is displayed on the display device 3.
  • the display device 3 also displays an image (captured image, spectral fundus image) based on the fundus reflected light detected by the image sensor 38 .
  • the display device 3 that displays the observed image and the display device 3 that displays the captured image may be the same or different.
  • An LCD (Liquid Crystal Display) 39 displays a fixation target and a visual acuity measurement target.
  • a part of the light flux output from the LCD 39 is reflected by the half mirror 33A, reflected by the mirror 32, passes through the photographing focusing lens 31, and passes through the aperture of the apertured mirror 21.
  • the luminous flux that has passed through the aperture of the perforated mirror 21 is transmitted through the dichroic mirror 46, refracted by the objective lens 22, and projected onto the fundus oculi Ef.
  • By changing the display position of the fixation target on the screen of the LCD 39, the fixation position of the subject's eye E can be changed.
  • Examples of fixation positions include a fixation position for acquiring an image centered on the macula, a fixation position for acquiring an image centered on the optic disc, a fixation position for acquiring an image centered on the fundus center between the macula and the optic disc, and a fixation position for acquiring an image of a site far from the macula (fundus periphery).
  • the ophthalmologic apparatus 1 includes a GUI (Graphical User Interface) or the like for designating at least one of such fixation positions.
  • the ophthalmologic apparatus 1 includes a GUI or the like for manually moving the fixation position (the display position of the fixation target).
  • a movable fixation target can be generated by selectively lighting multiple light sources in a light source array (such as a light emitting diode (LED) array). Also, one or more movable light sources can generate a movable fixation target.
  • the focus optical system 60 generates a split index used for focus adjustment of the eye E to be examined.
  • the focus optical system 60 is moved along the optical path (illumination optical path) of the illumination optical system 10 in conjunction with the movement of the photographing focusing lens 31 along the optical path (imaging optical path) of the imaging optical system 30.
  • the reflecting bar 67 can be inserted into and removed from the illumination optical path. When performing focus adjustment, the reflecting surface of the reflecting bar 67 is arranged at an angle in the illumination optical path.
  • Focus light output from the LED 61 passes through a relay lens 62, is split into two light beams by a split index plate 63, passes through a two-hole diaphragm 64, is reflected by a mirror 65, is once imaged on the reflecting surface of the reflecting rod 67 by a condenser lens 66, and is then reflected. Further, the focus light passes through the relay lens 20, is reflected by the perforated mirror 21, passes through the dichroic mirror 46, is refracted by the objective lens 22, and is projected onto the fundus Ef. The fundus reflected light of the focus light is guided to the image sensor 35 through the same path as the return light of the observation illumination light. Manual focus and autofocus can be performed based on the received light image (split index image).
  • the dichroic mirror 46 synthesizes the fundus imaging optical path and the OCT optical path.
  • the dichroic mirror 46 reflects light in the wavelength band used for OCT and transmits light for fundus imaging.
  • the optical path for OCT (the optical path of the measurement light) includes, in order from the OCT unit 100 side toward the dichroic mirror 46 side, a collimator lens unit 40, an optical path length changing section 41, an optical scanner 42, an OCT focusing lens 43, a mirror 44, and a relay lens 45 are provided.
  • the optical path length changing unit 41 is movable in the direction of the arrow shown in FIG. 1, and changes the length of the OCT optical path. This change in optical path length is used for optical path length correction according to the axial length of the eye, adjustment of the state of interference, and the like.
  • the optical path length changing section 41 includes a corner cube and a mechanism for moving it.
  • the optical scanner 42 is arranged at a position optically conjugate with the pupil of the eye E to be examined.
  • the optical scanner 42 deflects the measurement light LS passing through the OCT optical path.
  • the optical scanner 42 is, for example, a galvanometer scanner capable of two-dimensional scanning.
  • the OCT focusing lens 43 is moved along the optical path of the measurement light LS in order to adjust the focus of the OCT optical system. Movement of the imaging focusing lens 31, movement of the focusing optical system 60, and movement of the OCT focusing lens 43 can be controlled in a coordinated manner.
  • anterior segment cameras 5A and 5B are used to determine the relative position between the optical system of the ophthalmologic apparatus 1 and the subject's eye E, similar to the invention disclosed in Japanese Patent Laid-Open No. 2013-248376.
  • the anterior eye cameras 5A and 5B are provided on the surface, facing the subject's eye E, of the housing (fundus camera unit 2, etc.) that houses the optical system.
  • the ophthalmologic apparatus 1 analyzes two anterior segment images obtained substantially simultaneously from different directions by the anterior segment cameras 5A and 5B, thereby determining the three-dimensional relative relationship between the optical system and the subject's eye E. find the position.
  • the analysis of the two anterior segment images may be similar to the analysis disclosed in Japanese Patent Application Laid-Open No. 2013-248376.
  • the number of anterior segment cameras may be any number of two or more.
  • In the embodiments, the position of the eye E to be examined (that is, the relative position between the eye E and the optical system) is obtained using two or more anterior eye cameras, but the method of obtaining the position is not limited to this.
  • the position of the eye E to be examined can be obtained by analyzing a front image of the eye E to be examined (for example, an observed image of the anterior segment Ea).
  • means for projecting an index onto the cornea of the subject's eye E can be provided, and the position of the subject's eye E can be obtained based on the projection position of this index (that is, the detection state of the corneal reflected light flux of this index).
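  • The three-dimensional position calculation from two anterior segment cameras can be pictured with a small triangulation sketch (Python with NumPy). The pinhole-camera model, the projection matrices, and the function name are illustrative assumptions, not the algorithm actually disclosed in the cited publication.

```python
import numpy as np

def triangulate(P1: np.ndarray, P2: np.ndarray,
                uv1: tuple[float, float], uv2: tuple[float, float]) -> np.ndarray:
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 projection matrices of the two anterior segment cameras.
    uv1, uv2: pixel coordinates of the same feature (e.g. the pupil center)
    observed substantially simultaneously in each image.
    Returns the 3D position in the common coordinate system."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)    # nullspace of A gives the homogeneous point
    X = vt[-1]
    return X[:3] / X[3]            # dehomogenize
```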
  • the OCT unit 100 is provided with an optical system for performing swept-source OCT.
  • This optical system includes an interference optical system.
  • This interference optical system has a function of dividing light from a wavelength tunable light source (wavelength swept light source) into measurement light and reference light, a function of generating interference light by superposing the return light of the measurement light from the subject's eye E and the reference light that has passed through the reference optical path, and a function of detecting this interference light.
  • a detection result (detection signal) of the interference light obtained by the interference optical system is a signal indicating the spectrum of the interference light, and is sent to the arithmetic control unit 200.
  • the light source unit 101 includes, for example, a near-infrared tunable laser that changes the wavelength of emitted light at high speed.
  • the light L0 output from the light source unit 101 is guided to the polarization controller 103 by the optical fiber 102, and the polarization state is adjusted.
  • the light L0 whose polarization state has been adjusted is guided by the optical fiber 104 to the fiber coupler 105 and split into the measurement light LS and the reference light LR.
  • the reference light LR is guided to the collimator 111 by the optical fiber 110, converted into a parallel beam, passed through the optical path length correction member 112 and the dispersion compensation member 113, and guided to the corner cube 114.
  • the optical path length correction member 112 acts to match the optical path length of the reference light LR and the optical path length of the measurement light LS.
  • the dispersion compensation member 113 acts to match the dispersion characteristics between the reference light LR and the measurement light LS.
  • the corner cube 114 is movable in the incident direction of the reference light LR, thereby changing the optical path length of the reference light LR.
  • the reference light LR that has passed through the corner cube 114 passes through the dispersion compensation member 113 and the optical path length correction member 112, is converted by the collimator 116 from a parallel beam into a converged beam, and enters the optical fiber 117.
  • the reference light LR incident on the optical fiber 117 is guided to the polarization controller 118 to have its polarization state adjusted, guided to the attenuator 120 via the optical fiber 119 to have its light amount adjusted, and then guided to the fiber coupler 122 via the optical fiber 121.
  • the measurement light LS generated by the fiber coupler 105 is guided by the optical fiber 127, converted into a parallel light beam by the collimator lens unit 40, and reaches the dichroic mirror 46 via the optical path length changing unit 41, the optical scanner 42, the OCT focusing lens 43, the mirror 44, and the relay lens 45.
  • the measurement light LS that has passed through the relay lens 45 is reflected by the dichroic mirror 46, refracted by the objective lens 22, and enters the eye E to be examined.
  • the measurement light LS is scattered and reflected at various depth positions of the eye E to be examined.
  • the return light of the measurement light LS from the subject's eye E travels in the opposite direction along the same path as the forward path, is guided to the fiber coupler 105, and reaches the fiber coupler 122 via the optical fiber 128.
  • the end face of the optical fiber 127 from which the measurement light LS is emitted is arranged at a position substantially conjugate with the fundus Ef of the eye E to be examined.
  • the fiber coupler 122 combines (causes interference between) the measurement light LS that has entered via the optical fiber 128 and the reference light LR that has entered via the optical fiber 121 to generate interference light.
  • the fiber coupler 122 generates a pair of interference lights LC by splitting the interference lights at a predetermined splitting ratio (for example, 1:1).
  • a pair of interference lights LC are guided to detector 125 through optical fibers 123 and 124, respectively.
  • the detector 125 is, for example, a balanced photodiode.
  • a balanced photodiode includes a pair of photodetectors that respectively detect a pair of interference lights LC, and outputs a difference between a pair of detection results obtained by these photodetectors.
  • the detector 125 sends this output (detection signal) to a DAQ (Data Acquisition System) 130.
  • a clock KC is supplied from the light source unit 101 to the DAQ 130.
  • the clock KC is generated in the light source unit 101 in synchronization with the output timing of each wavelength swept within a predetermined wavelength range by the wavelength tunable light source.
  • For example, the light source unit 101 optically delays one of two branched lights obtained by branching the light L0 of each output wavelength, and generates the clock KC based on the result of detecting the combined light of the two.
  • the DAQ 130 samples the detection signal input from the detector 125 based on the clock KC.
  • the DAQ 130 sends the sampling result of the detection signal from the detector 125 to the arithmetic control unit 200.
  • An optical path length changing unit 41 for changing the length of the optical path of the measurement light LS (measurement optical path, measurement arm) and a corner cube 114 for changing the length of the optical path of the reference light LR (reference optical path, reference arm) are provided.
  • only one of the optical path length changing portion 41 and the corner cube 114 may be provided. It is also possible to change the difference between the measurement optical path length and the reference optical path length by using optical members other than these.
  • Control system: FIGS. 3 to 5 show configuration examples of the control system of the ophthalmologic apparatus 1.
  • In FIGS. 3 to 5, some of the components included in the ophthalmologic apparatus 1 are omitted.
  • the same parts as those in FIGS. 1 and 2 are denoted by the same reference numerals, and description thereof will be omitted as appropriate.
  • the control section 210, the image forming section 220 and the data processing section 230 are provided in the arithmetic control unit 200, for example.
  • Control unit 210 executes various controls.
  • Control unit 210 includes main control unit 211 and storage unit 212.
  • the main controller 211 includes a processor (eg, control processor) and controls each part of the ophthalmologic apparatus 1 (including each element shown in FIGS. 1 to 5).
  • the main control unit 211 controls each part of the optical system of the retinal camera unit 2 shown in FIGS. 1 and 2, as well as the movement mechanism 150, the image forming section 220, the data processing section 230, and the user interface (UI) 240.
  • the control over the retinal camera unit 2 includes control over the focus driving units 31A and 43A, control over the wavelength tunable filter 80, control over the image sensors 35 and 38, control over the optical path length changing unit 41, and control over the optical scanner 42.
  • the control for the focus drive unit 31A includes control for moving the photographing focus lens 31 in the optical axis direction.
  • the control for the focus drive unit 43A includes control for moving the OCT focus lens 43 in the optical axis direction.
  • the control over the wavelength tunable filter 80 includes selection control of the wavelength range of transmitted light (for example, control of voltage applied to the liquid crystal).
  • the control of the image sensors 35 and 38 includes control of the light receiving sensitivity of the imaging element, control of the frame rate (light receiving timing, exposure time), control of the light receiving area (position, shape, size), control of reading out the light reception results of the imaging element, and the like.
  • In some embodiments, the image sensors 35 and 38 are controlled by changing the exposure time according to the wavelength range of the return light so that the received light intensity is uniform in each wavelength range of the analysis wavelength region in which the plurality of spectral fundus images are acquired.
  • In some embodiments, the main controller 211 controls the intensity of the wavelength components of the illumination light in each wavelength range so that the received light intensity is uniform in each wavelength range of the analysis wavelength region in which the plurality of spectral fundus images are acquired.
  • Control over the LCD 39 includes control of the fixation position.
  • the main control unit 211 displays the fixation target at a position on the screen of the LCD 39 corresponding to the fixation position set manually or automatically. The main control unit 211 can also change the display position of the fixation target displayed on the LCD 39 (continuously or stepwise), thereby moving the fixation target (that is, changing the fixation position).
  • the display position and movement mode of the fixation target are set manually or automatically. Manual setting is performed using, for example, a GUI. Automatic setting is performed by the data processing unit 230, for example.
  • the control over the optical path length changing unit 41 includes control for changing the optical path length of the measurement light LS.
  • the main control unit 211 moves the optical path length changing unit 41 along the optical path of the measurement light LS by controlling the driving unit that drives the corner cube of the optical path length changing unit 41, thereby changing the optical path length of the measurement light LS.
  • Control of the optical scanner 42 includes control of scan mode, scan range (scan start position, scan end position), scan speed, and the like.
  • the main control unit 211 can perform an OCT scan with the measurement light LS on a desired region of the measurement site (imaging site).
  • the main control unit 211 also controls the observation light source 11, the photographing light source 15, the focus optical system 60, and the like.
  • Control over the OCT unit 100 includes control over the light source unit 101, control over the reference driver 114A, control over the detector 125, and control over the DAQ 130.
  • the control of the light source unit 101 includes control of turning the light source on and off, control of the amount of light emitted from the light source, control of the wavelength sweep range and wavelength sweep speed, control of the emission timing of light of each wavelength component, and the like.
  • the control over the reference driver 114A includes control to change the optical path length of the reference light LR.
  • the main control unit 211 moves the corner cube 114 along the optical path of the reference light LR by controlling the reference driving unit 114A to change the optical path length of the reference light LR.
  • the control of the detector 125 includes control of the light receiving sensitivity of the detecting element, control of the frame rate (light receiving timing), control of the light receiving area (position, shape, size), control of reading out the detection results of the detecting element, and the like.
  • Control over the DAQ 130 includes fetch control (fetch timing, sampling timing) of the detection result of interference light obtained by the detector 125, readout control of the interference signal corresponding to the detection result of the fetched interference light, and the like.
  • the control for the anterior eye cameras 5A and 5B includes control of the light receiving sensitivity of each camera, frame rate (light receiving timing) control, synchronization control of the anterior eye cameras 5A and 5B, and the like.
  • the movement mechanism 150, for example, three-dimensionally moves at least the retinal camera unit 2 (optical system).
  • the movement mechanism 150 includes at least a mechanism for moving the retinal camera unit 2 in the x direction (horizontal direction), a mechanism for moving it in the y direction (vertical direction), and a mechanism for moving it in the z direction (depth direction, back and forth).
  • the mechanism for moving in the x-direction includes, for example, an x-stage movable in the x-direction and an x-moving mechanism for moving the x-stage.
  • the mechanism for moving in the y-direction includes, for example, a y-stage movable in the y-direction and a y-moving mechanism for moving the y-stage.
  • the mechanism for moving in the z-direction includes, for example, a z-stage movable in the z-direction and a z-moving mechanism for moving the z-stage.
  • Each movement mechanism includes a pulse motor as an actuator and operates under control from the main control unit 211.
  • the control over the moving mechanism 150 is used in alignment and tracking. Tracking is to move the apparatus optical system according to the eye movement of the eye E to be examined. Alignment and focus adjustment are performed in advance when tracking is performed. Tracking is a function of maintaining a suitable positional relationship in which alignment and focus are achieved by causing the position of the apparatus optical system to follow the movement of the eyeball. Some embodiments are configured to control movement mechanism 150 to change the optical path length of the reference beam (and thus the optical path length difference between the optical path of the measurement beam and the optical path of the reference beam).
  • the user relatively moves the optical system and the subject's eye E by operating the user interface 240 so that the displacement of the subject's eye E with respect to the optical system is cancelled.
  • the main control unit 211 controls the moving mechanism 150 to move the optical system relative to the eye E by outputting a control signal corresponding to the operation content of the user interface 240 to the moving mechanism 150 .
  • the main control unit 211 controls the movement mechanism 150 so that the displacement of the eye E to be examined with respect to the optical system is canceled, thereby moving the optical system relative to the eye E to be examined.
  • In some embodiments, arithmetic processing using trigonometry based on the positional relationship between the pair of anterior eye cameras 5A and 5B and the subject's eye E is performed, and the main control unit 211 controls the moving mechanism 150 so that the eye E has a predetermined positional relationship with respect to the optical system.
  • the main controller 211 outputs a control signal such that the optical axis of the optical system substantially coincides with the axis of the eye E to be examined and the distance of the optical system from the eye E to be examined is a predetermined working distance.
  • the working distance is a default value, also called the working distance of the objective lens 22, and corresponds to the distance between the subject's eye E and the optical system at the time of measurement (imaging) using the optical system.
  • the main control unit 211 can display various information on the display unit 240A as a display control unit.
  • the main control unit 211 causes the display unit 240A to display a plurality of spectral fundus images in association with wavelength ranges.
  • the main control unit 211 causes the display unit 240A to display analysis processing results obtained by the analysis unit 231, which will be described later.
  • the storage unit 212 stores various data.
  • the function of the storage unit 212 is implemented by a memory circuit or a storage device.
  • the data stored in the storage unit 212 includes, for example, control parameters, fundus image data, anterior segment image data, OCT data (including OCT images), spectral image data of the fundus, spectral image data of the anterior segment, information on the eye to be examined, and the like.
  • Control parameters include hyperspectral imaging control data and the like.
  • the hyperspectral imaging control data is control data for acquiring a plurality of fundus images based on return lights with different central wavelengths within a predetermined analysis wavelength range.
  • examples of the hyperspectral imaging control data include the analysis wavelength region in which the plurality of spectral fundus images are acquired, the wavelength range in which each spectral fundus image is acquired, the center wavelength, the center wavelength step, and control data for the wavelength tunable filter 80 corresponding to each center wavelength (see the sketch below).
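  • Purely as an illustration, such control data could be grouped as follows (Python; the field names, units, and the drive-value table are my assumptions, not the stored format):

```python
from dataclasses import dataclass

@dataclass
class HyperspectralControlData:
    analysis_region_nm: tuple[float, float]  # analysis wavelength region, e.g. (450.0, 900.0)
    band_width_nm: float                     # wavelength range of each spectral image
    center_step_nm: float                    # step between adjacent center wavelengths
    filter_drive: dict[float, float]         # center wavelength -> tunable-filter drive value

    def center_wavelengths(self) -> list[float]:
        """Enumerate the center wavelengths covering the analysis region."""
        start, stop = self.analysis_region_nm
        n = int((stop - start) // self.center_step_nm) + 1
        return [start + i * self.center_step_nm for i in range(n)]
```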
  • the information on the eye to be examined includes information about the subject such as patient ID and name, information about the eye such as left/right eye identification information, and electronic medical record information.
  • the storage unit 212 stores the programs executed by the various processors (control processor, image forming processor, data processing processor).
  • the image forming unit 220 includes a processor (for example, an image forming processor), and forms an OCT image (image data) of the subject's eye E based on the output from the DAQ 130 (sampling result of the detection signal). For example, as in conventional swept source OCT, the image forming unit 220 performs signal processing on the spectral distribution based on the sampling result for each A-line, forms a reflection intensity profile for each A-line, images these A-line profiles, and arranges them along the scan lines.
  • the signal processing includes noise removal (noise reduction), filtering, FFT (Fast Fourier Transform), and the like.
  • when another type of OCT is employed, the image forming section 220 performs known processing according to that type.
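  • A toy sketch of the A-line reconstruction described above (Python with NumPy). The windowing choice, array shapes, and dB scaling are illustrative assumptions, not the device's actual pipeline.

```python
import numpy as np

def form_bscan(fringes: np.ndarray) -> np.ndarray:
    """fringes: (n_alines, n_samples) sampled interference signal, one row per
    A-line (sampling assumed linear in wavenumber via the clock KC).
    Returns an (n_alines, n_samples // 2) B-scan of reflection intensity in dB."""
    fringes = fringes - fringes.mean(axis=1, keepdims=True)  # remove the DC term
    window = np.hanning(fringes.shape[1])                    # suppress FFT sidelobes
    spectrum = np.fft.fft(fringes * window, axis=1)
    alines = np.abs(spectrum[:, : fringes.shape[1] // 2])    # keep positive depths
    return 20.0 * np.log10(alines + 1e-12)                   # reflection intensity profile
```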
  • the data processing unit 230 includes a processor (for example, a data processing processor) and performs image processing and analysis processing on the image formed by the image forming unit 220 . At least two of the processor included in the main control unit 211, the processor included in the data processing unit 230, and the processor included in the image forming unit 220 may be configured by a single processor.
  • the data processing unit 230 executes known image processing such as interpolation processing for interpolating pixels between tomographic images to form image data of a three-dimensional image of the fundus oculi Ef or the anterior segment Ea.
  • image data of a three-dimensional image means image data in which pixel positions are defined by a three-dimensional coordinate system.
  • Image data of a three-dimensional image includes image data composed of voxels arranged three-dimensionally. This image data is called volume data or voxel data.
  • Image data of a pseudo three-dimensional image is formed by applying rendering processing (volume rendering, MIP (Maximum Intensity Projection), etc.) to this volume data. This pseudo three-dimensional image is displayed on a display device such as the display unit 240A.
  • Stack data of a plurality of tomographic images is another example of image data of a three-dimensional image.
  • Stacked data is image data obtained by three-dimensionally arranging a plurality of tomographic images obtained along a plurality of scan lines based on the positional relationship of the scan lines. That is, stack data is image data obtained by expressing a plurality of tomographic images, which were originally defined by individual two-dimensional coordinate systems, by one three-dimensional coordinate system (that is, embedding them in one three-dimensional space).
  • the data processing unit 230 generates a B-scan image by arranging the A-scan images in the B-scan direction. In some embodiments, the data processing unit 230 performs various renderings on the acquired three-dimensional data set (volume data, stack data, etc.) to form a B-mode image (B-scan image; longitudinal cross-sectional image, axial cross-sectional image) at an arbitrary cross section, a C-mode image (C-scan image; transverse cross-sectional image, horizontal cross-sectional image) at an arbitrary cross section, a projection image, a shadowgram, and the like.
  • An arbitrary cross-sectional image such as a B-scan image or a C-scan image, is formed by selecting pixels (pixels, voxels) on a specified cross-section from a three-dimensional data set.
  • a projection image is formed by projecting a three-dimensional data set in a predetermined direction (z direction, depth direction, axial direction).
  • a shadowgram is formed by projecting a portion of the three-dimensional data set (for example, partial data corresponding to a specific layer) in a predetermined direction. By changing the depth range in the layer direction to be integrated, it is possible to form two or more different shadowgrams.
  • An image such as a C-scan image, a projection image, or a shadowgram whose viewpoint is the front side of the subject's eye is called an en-face image.
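  • A compact sketch of the projection and shadowgram formation described above (Python with NumPy; the (Z, H, W) axis convention and the use of a mean projection are assumptions for illustration):

```python
import numpy as np

def projection_image(volume: np.ndarray) -> np.ndarray:
    """volume: (Z, H, W) three-dimensional data set. Projecting along the
    entire depth (z) axis yields a front (en-face) projection image."""
    return volume.mean(axis=0)

def shadowgram(volume: np.ndarray, z_start: int, z_end: int) -> np.ndarray:
    """Integrate only a partial depth range (e.g. a specific layer);
    changing (z_start, z_end) yields different shadowgrams."""
    return volume[z_start:z_end].mean(axis=0)
```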
  • based on data collected in time series by OCT (for example, B-scan image data), the data processing unit 230 can construct B-scan images and front images (blood vessel-enhanced images, angiograms) in which retinal vessels and choroidal vessels are emphasized.
  • time-series OCT data can be collected by repeatedly scanning substantially the same portion of the eye E to be examined.
  • the data processing unit 230 compares time-series B-scan images obtained by B-scans of substantially the same site, converts the pixel values of portions where the signal intensity changes into pixel values corresponding to the change, and thereby constructs an enhanced image in which the changed portions are emphasized.
  • the data processing unit 230 extracts information for a predetermined thickness in a desired region from the constructed multiple enhanced images and constructs an en-face image to form an OCTA (angiography) image.
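The motion-contrast idea behind such an OCTA image can be sketched as below; temporal variance across repeated scans stands in here for the change-based pixel conversion, and is an illustrative choice rather than the device's actual algorithm.

```python
import numpy as np

# Repeated scans of substantially the same site: shape (repeats, z, x, y).
repeats = np.random.rand(4, 128, 64, 64)  # synthetic stand-in data

# Motion contrast: pixels whose signal changes between repeats (e.g. flowing
# blood) have large temporal variance; static tissue has small variance.
enhanced = repeats.var(axis=0)            # shape: (z, x, y)

# En-face OCTA image: extract a predetermined thickness in the desired depth
# range and project it frontally.
z_top, z_bottom = 40, 60                  # hypothetical layer bounds
octa_enface = enhanced[z_top:z_bottom].mean(axis=0)   # shape: (x, y)
```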
  • Such a data processing unit 230 includes an analysis unit 231.
  • the analysis section 231 includes a characteristic site identification section 231A, a three-dimensional position calculation section 231B, and a spectral distribution data processing section 231C.
  • the analysis unit 231 can analyze the image (including the spectroscopic fundus image) of the subject's eye E to identify the characteristic regions depicted in the image. For example, the analysis unit 231 obtains the three-dimensional position of the subject's eye E based on the positions of the anterior eye cameras 5A and 5B and the positions of the specified characteristic regions.
  • the main control unit 211 aligns the optical system with respect to the eye to be examined E by relatively moving the optical system with respect to the eye to be examined E based on the determined three-dimensional position.
  • the analysis unit 231 can perform predetermined analysis processing on a plurality of spectral fundus images.
  • Examples of the predetermined analysis processing include comparison processing of two arbitrary images among the plurality of spectral fundus images, processing of extracting a common region or a difference region specified by the comparison processing, and processing of identifying a region of interest in at least one of the plurality of spectral fundus images.
  • In some embodiments, the analysis unit 231 identifies a characteristic region in any one of the plurality of spectral fundus images, and identifies depth information of the identified characteristic region based on OCT data as measurement data. In some embodiments, the analysis unit 231 registers the plurality of spectral fundus images based on the OCT data so that corresponding parts of the images are aligned in the z direction, and identifies a characteristic region in any of the registered spectral fundus images.
  • The characteristic site identification unit 231A analyzes each of the captured images obtained by the anterior segment cameras 5A and 5B to identify positions (referred to as characteristic positions) in the captured images corresponding to a characteristic site of the anterior segment Ea. For example, the pupil region of the subject's eye E, the pupil center position, the corneal center position, the corneal vertex position, the center position of the subject's eye, or the iris is used as the characteristic site. A specific example of processing for identifying the pupil center position of the subject's eye E is described below.
  • the characteristic part specifying unit 231A specifies an image region (pupil region) corresponding to the pupil of the subject's eye E based on the distribution of pixel values (such as luminance values) of the captured image. Since the pupil is generally drawn with lower luminance than other parts, the pupil region can be identified by searching for the low-luminance image region. At this time, the pupil region may be specified in consideration of the shape of the pupil. In other words, the pupil region can be identified by searching for a substantially circular low-brightness image region.
  • The characteristic site identification unit 231A then identifies the center position of the identified pupil region. Since the pupil is substantially circular as described above, it is possible to identify the contour of the pupil region, identify the center position of this contour (an approximate circle or approximate ellipse), and set this as the pupil center position. Alternatively, the center of gravity of the pupil region may be obtained and its position set as the pupil center position.
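A minimal sketch of the low-luminance search and center-of-gravity computation described above; the threshold and the frame are hypothetical, and a practical implementation would also test the region's circularity as noted.

```python
import numpy as np

# Synthetic anterior-segment frame; a real frame would come from camera 5A/5B.
img = np.random.randint(0, 256, (480, 640)).astype(np.uint8)

# The pupil is imaged darker than surrounding tissue, so search for a
# low-luminance region.
threshold = 40                            # hypothetical luminance threshold
mask = img < threshold                    # candidate pupil pixels

# Pupil center as the center of gravity of the candidate region.
ys, xs = np.nonzero(mask)
if xs.size > 0:
    pupil_center = (xs.mean(), ys.mean())   # (x, y) in image coordinates
```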
  • the characteristic part identifying unit 231A can sequentially identify characteristic positions corresponding to characteristic parts in the captured images sequentially obtained by the anterior eye cameras 5A and 5B. In addition, the characteristic part identification unit 231A may identify the characteristic position every one or more frames of the captured images sequentially obtained by the anterior eye cameras 5A and 5B.
  • The three-dimensional position calculation unit 231B identifies the three-dimensional position of the characteristic site of the subject's eye E based on the positions of the anterior eye cameras 5A and 5B and the characteristic positions identified by the characteristic site identification unit 231A.
  • The three-dimensional position calculation unit 231B calculates the three-dimensional position of the subject's eye E by applying a known trigonometric (triangulation) method to the known positions of the two anterior eye cameras 5A and 5B and to the characteristic positions corresponding to the characteristic site in the two captured images.
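The trigonometric calculation can be illustrated as stereo triangulation of the characteristic position seen by the two cameras. The geometry below (rectified cameras, baseline, focal length, pixel coordinates) is an assumed example, not the calibration of the actual apparatus:

```python
# Hypothetical rectified stereo geometry: two anterior-segment cameras
# separated by a known baseline along x, with the same focal length (pixels).
baseline_mm = 50.0
focal_px = 1200.0

# Characteristic position (e.g. pupil center) observed in each camera image.
xL, yL = 352.0, 240.0     # left-camera pixel coordinates
xR, yR = 310.0, 240.0     # right-camera pixel coordinates

disparity = xL - xR                          # pixel disparity between the views
z_mm = focal_px * baseline_mm / disparity    # depth by triangulation
x_mm = z_mm * xL / focal_px                  # back-projection into 3D
y_mm = z_mm * yL / focal_px

three_d_position = (x_mm, y_mm, z_mm)
```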
  • the three-dimensional position calculated by the three-dimensional position calculator 231B is sent to the main controller 211.
  • The main control unit 211 controls the moving mechanism 150 so that the x- and y-direction positions of the optical axis of the optical system match the x- and y-direction positions of the three-dimensional position, and so that the distance in the z direction becomes a predetermined working distance.
  • the spectral distribution data processing unit 231C executes a process of specifying depth information of the spectral distribution data based on the OCT data.
  • the spectral distribution data processing unit 231C executes a process of identifying a characteristic region in the spectral fundus image as the spectral distribution data and identifying depth information of the identified characteristic region.
  • the spectral distribution data processing unit 231C can estimate the presence or absence of a disease, the probability of the disease, or the type of the disease based on the characteristic regions specified by the above processing.
  • the spectral distribution data processing unit 231C can highly accurately estimate the presence or absence of a disease based on the feature region for which the depth information has been specified by the above processing.
  • the spectral distribution data processing section 231C includes a characteristic region identifying section 2311C, a depth information identifying section 2312C, and a disease estimating section 2314C.
  • the depth information specifying section 2312C includes a searching section 2313C.
  • the characteristic region identifying section 2311C identifies a characteristic region in the spectral fundus image.
  • characteristic regions include blood vessels, diseased regions, optic nerve papilla, abnormal regions, regions characterized by changes in pixel luminance, and the like.
  • the characteristic region identifying section 2311C may identify the characteristic region designated using the operation section 240B of the user interface 240 as the characteristic region in the spectral distribution data.
  • the characteristic region identifying unit 2311C identifies characteristic regions for each of a plurality of spectral fundus images.
  • Two or more characteristic regions may be identified.
  • the characteristic region identifying unit 2311C identifies characteristic regions for one or more spectral fundus images selected from a plurality of spectral fundus images.
  • the characteristic region identifying unit 2311C performs principal component analysis on the spectral fundus image and identifies characteristic regions using the principal component analysis results. For example, in the principal component analysis of a spectroscopic fundus image, principal components of one or more dimensions are sequentially identified so as to maximize the variance (variation). Each principal component reflects a characteristic region (characteristic part).
  • For example, the characteristic region identifying unit 2311C first calculates the centroid (mean) of all the data of the spectral fundus image, and identifies the direction in which the variance of the data about the calculated centroid is maximal as the first principal component. Next, it identifies the second principal component, which has the maximum variance in the direction orthogonal to the identified first principal component. Subsequently, the characteristic region identifying unit 2311C identifies the (n+1)-th principal component having the maximum variance in the direction orthogonal to the most recently identified n-th principal component (n is an integer of 2 or more), and principal components are identified sequentially in this way up to a predetermined dimension.
  • a method of specifying a characteristic region by applying principal component analysis to such a spectroscopic fundus image is exemplified in Japanese Patent Application Laid-Open No. 2007-330558, for example.
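The sequential variance-maximization described above is exactly what an off-the-shelf PCA computes when each pixel's spectrum across the wavelength-resolved images is treated as one sample. A sketch with assumed stack dimensions:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical stack of spectral fundus images: (wavelength bands, H, W).
stack = np.random.rand(16, 256, 256)
n_bands, h, w = stack.shape

# One sample per pixel, one feature per wavelength band.
samples = stack.reshape(n_bands, -1).T    # shape: (H*W, n_bands)

# PCA subtracts the centroid (mean spectrum) and then finds orthogonal
# directions of maximal variance, i.e. the principal components.
pca = PCA(n_components=5)
scores = pca.fit_transform(samples)       # per-pixel component scores

# Mapping a component's scores back onto the image grid highlights the
# structures that the component reflects (e.g. vessels).
component_maps = scores.T.reshape(5, h, w)
explained = pca.explained_variance_ratio_  # contribution rate per component
```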
  • For example, the first principal component reflects the underlying retinal shape, the second principal component reflects the choroidal vessels, the third principal component reflects the retinal veins, and the fifth principal component reflects the entire retinal blood vessels.
  • the component representing the retinal artery can be extracted by removing the third principal component representing the retinal vein from the fifth principal component representing the entire retinal blood vessel. That is, each principal component obtained by the principal component analysis reflects the characteristic region (characteristic part) in the spectral distribution data, and it is possible to specify the characteristic region in the spectral distribution data using the principal component analysis result.
  • the characteristic region specifying unit 2311C uses at least one of an eigenvalue, a contribution rate, and a cumulative contribution rate corresponding to each principal component obtained by principal component analysis of the spectral fundus image to determine the spectral fundus image. Identify feature regions in the image.
  • the characteristic region identifying unit 2311C identifies characteristic regions based on comparison results obtained by comparing a plurality of spectral fundus images. For example, the characteristic region identifying unit 2311C identifies a characteristic region by comparing two spectral fundus images whose wavelength ranges are adjacent to each other. For example, the characteristic region identifying unit 2311C identifies a characteristic region by comparing spectral fundus images in two predetermined wavelength ranges.
  • the depth information specifying section 2312C specifies depth information of the feature area specified by the feature area specifying section 2311C.
  • Examples of depth information include information representing a position in the depth direction (the direction of the measurement optical axis) relative to a predetermined reference site, information representing a range of positions in the depth direction, information representing a layer region, and information representing a tissue.
  • Examples of the predetermined reference site include the fundus surface of the eye to be examined, a predetermined layer region forming the retina of the eye to be examined, the corneal vertex of the eye to be examined, the site where the reflected light intensity of the eye to be examined is maximal, and a predetermined site of the anterior segment of the eye to be examined.
  • the depth information identifying unit 2312C identifies depth information of the characteristic region identified by the characteristic region identifying unit 2311C using OCT data with higher resolution in the depth direction than the spectral distribution data. Specifically, the depth information specifying unit 2312C searches the OCT data for a region having the highest degree of correlation with the feature region specified by the feature region specifying unit 2311C. The depth information specifying unit 2312C specifies the depth information in the searched OCT data area as the depth information of the feature area specified by the feature area specifying unit 2311C.
  • the main control unit 211 causes the display unit 240A to display the spectral fundus image (spectral distribution data) and the depth information specified by the depth information specifying unit 2312C. At this time, the main control unit 211 can display the OCT data corresponding to the depth information together with the spectral fundus image and the depth information on the display unit 240A.
  • the searching unit 2313C searches the OCT data (for example, three-dimensional OCT data) for a region having the highest degree of correlation with the spectroscopic fundus image in a predetermined wavelength range.
  • the searching unit 2313C obtains a plurality of degrees of correlation between each of the plurality of regions of the OCT data and the spectral fundus image, and selects the region of the OCT data with the highest degree of correlation from the plurality of degrees of correlation obtained. identify.
  • the searching unit 2313C obtains a plurality of degrees of correlation between each of the plurality of front images and the spectral distribution image in the predetermined wavelength range.
  • the search unit 2313C identifies the front image with the highest degree of correlation, and identifies the depth information of the identified front image as the depth information of the spectral fundus image.
  • In some embodiments, the search unit 2313C obtains the plurality of degrees of correlation with respect to a three-dimensional OCT image of the subject's eye E formed based on the OCT data of the subject's eye E, and identifies, from the obtained plurality of degrees of correlation, the region of the three-dimensional image with the highest degree of correlation.
  • the depth information specifying unit 2312C specifies depth information in the specified region of the three-dimensional image as depth information of the spectral fundus image.
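A minimal sketch of this search, assuming the spectral image has already been registered to the en-face stack; a normalized correlation is used as the degree of correlation, and all names and sizes are illustrative.

```python
import numpy as np

def ncc(a, b):
    """Normalized correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

# en_face_stack: front images at different depths, shape (n_depths, H, W);
# depth_positions: depth of each slice relative to a reference layer (um).
en_face_stack = np.random.rand(64, 256, 256)      # synthetic stand-in data
depth_positions = np.linspace(0.0, 320.0, 64)
spectral_image = np.random.rand(256, 256)         # registered spectral image

correlations = [ncc(spectral_image, ef) for ef in en_face_stack]
best = int(np.argmax(correlations))
depth_info = depth_positions[best]   # depth assigned to the spectral image
```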
  • the search unit 2313C can search the OCT data for an area that has the highest degree of correlation with the characteristic region (spectral distribution data in a broad sense) identified by the characteristic region identification unit 2311C.
  • the searching unit 2313C obtains a plurality of degrees of correlation between each of the plurality of regions of the OCT data and the characteristic region identified by the characteristic region identifying unit 2311C, and selects the highest degree of correlation from the obtained plurality of degrees of correlation. A region of OCT data with a high degree of correlation is identified.
  • the searching unit 2313C obtains, for each of the plurality of front images, a plurality of degrees of correlation between each of the plurality of regions of the front image and the feature regions specified by the feature region specifying unit 2311C.
  • The searching unit 2313C identifies, in each front image, the region having the highest degree of correlation with the characteristic region, and then identifies, from among the plurality of front images, the front image that includes the region with the highest degree of correlation overall.
  • The searching unit 2313C identifies the depth information of the identified front image as the depth information of the characteristic region identified by the characteristic region identifying unit 2311C.
  • In some embodiments, the search unit 2313C obtains the plurality of degrees of correlation with respect to a three-dimensional OCT image of the subject's eye E formed based on the OCT data of the subject's eye E, and identifies, from the obtained plurality of degrees of correlation, the region of the three-dimensional image with the highest degree of correlation.
  • the depth information specifying unit 2312C specifies depth information in the specified region of the three-dimensional image as depth information of the feature region specified by the feature region specifying unit 2311C.
  • The disease estimating unit 2314C estimates the presence or absence of disease, the probability of disease, or the type of disease based on the front image searched by the searching unit 2313C (or the region in the front image corresponding to the characteristic region identified by the characteristic region identifying unit 2311C). In some embodiments, the disease estimator 2314C estimates the presence or absence of disease, the probability of disease, or the type of disease based on two or more front images within a predetermined depth range including the searched front image.
  • In the disease estimation unit 2314C, a plurality of image patterns corresponding to disease types are registered in advance.
  • The disease estimating unit 2314C obtains the degree of correlation between the searched front image (or the above-described region in the front image) and each of the plurality of image patterns and, when the degree of correlation is equal to or greater than a predetermined threshold, generates disease information indicating that the subject's eye E is estimated to have a disease.
  • the disease estimating unit 2314C can generate disease information including the fact that it is estimated to be associated with a disease and the type of disease corresponding to an image pattern having a degree of correlation equal to or greater than a threshold.
  • When the degree of correlation is less than the predetermined threshold, the disease estimating unit 2314C generates disease information indicating that the subject's eye E is estimated not to have a disease.
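The threshold-based pattern matching could look like the following sketch; the registered patterns, their labels, and the threshold value are all hypothetical.

```python
import numpy as np

def ncc(a, b):
    """Normalized correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

# Image patterns registered in advance, keyed by disease type (all names,
# sizes, and the threshold below are hypothetical).
patterns = {
    "pattern_A": np.random.rand(64, 64),
    "pattern_B": np.random.rand(64, 64),
}
threshold = 0.6

region = np.random.rand(64, 64)   # searched front-image region

scores = {name: ncc(region, p) for name, p in patterns.items()}
best_name, best_score = max(scores.items(), key=lambda kv: kv[1])

disease_info = (
    {"has_disease": True, "type": best_name, "score": best_score}
    if best_score >= threshold
    else {"has_disease": False, "score": best_score}
)
```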
  • the main control unit 211 can cause the display unit 240A to display disease information including the presence or absence of a disease, the probability of disease, or the type of disease.
  • In some embodiments, the main control unit 211 causes the display unit 240A to display the disease information together with at least one of the searched front image (or the front image including the searched region), the spectral distribution image, and the specified depth information. Further, the main control unit 211 may superimpose the spectral distribution image on the searched front image and display it on the display unit 240A.
  • In some embodiments, the main control unit 211 causes the display unit 240A to display, in an identifiable manner, the area in the front image corresponding to the characteristic region identified by the characteristic region identifying unit 2311C.
  • User interface 240 includes a display section 240A and an operation section 240B.
  • The display unit 240A includes the display device 3.
  • the operation unit 240B includes various operation devices and input devices.
  • the user interface 240 may include a device such as a touch panel that combines a display function and an operation function. In other embodiments, at least a portion of the user interface may not be included on the ophthalmic device.
  • the display device may be an external device connected to the ophthalmic equipment.
  • the communication unit 250 has a function for communicating with an external device (not shown).
  • the communication unit 250 has a communication interface according to a connection form with an external device.
  • external devices include server devices, OCT devices, scanning optical ophthalmoscopes, slit lamp ophthalmoscopes, ophthalmic measurement devices, and ophthalmic treatment devices.
  • ophthalmic measurement devices include eye refractometers, tonometers, specular microscopes, wavefront analyzers, perimeters, microperimeters, and the like.
  • Examples of ophthalmic treatment devices include laser treatment devices, surgical devices, surgical microscopes, and the like.
  • the external device may be a device (reader) that reads information from a recording medium, or a device (writer) that writes information to a recording medium. Further, the external device may be a hospital information system (HIS) server, a DICOM (Digital Imaging and Communication in Medicine) server, a doctor terminal, a mobile terminal, a personal terminal, a cloud server, or the like.
  • the arithmetic control unit 200 (the control unit 210, the image forming unit 220, and the data processing unit 230) is an example of the "ophthalmic information processing apparatus" according to the embodiment.
  • a spectral image (spectral fundus image, spectral anterior segment image) is an example of "spectral distribution data” according to the embodiment.
  • OCT data is an example of "measurement data” according to the embodiment.
  • the disease estimator 2314C is an example of the "estimator” according to the embodiment.
  • the control unit 210 (main control unit 211) is an example of a "display control unit” according to the embodiment.
  • the imaging optical system 30 is an example of a "light receiving optical system” according to the embodiment.
  • the optical system from the OCT unit 100 to the objective lens 22 is an example of the "OCT optical system” according to the embodiment.
  • the ophthalmologic apparatus 1 acquires a plurality of spectroscopic fundus images by illuminating the fundus oculi Ef with illumination light and receiving return light from the fundus oculi Ef having different wavelength ranges within a predetermined analysis wavelength range.
  • FIG. 6 shows an example of a plurality of spectral fundus images according to the embodiment.
  • FIG. 6 shows an example of a spectral fundus image displayed on the display unit 240A.
  • The main control unit 211 causes the display unit 240A to display, arranged horizontally, the plurality of spectral fundus images acquired by sequentially receiving the return light with the image sensor 38. At this time, the main control unit 211 can cause the display unit 240A to display each of the plurality of spectral fundus images in association with its wavelength range. This makes it possible to easily grasp the spectral distribution of the fundus corresponding to each wavelength range.
  • FIG. 7 shows an explanatory diagram of an operation example of the ophthalmologic apparatus 1 according to the embodiment.
  • The spectral distribution data processing unit 231C obtains the degree of correlation between any spectral fundus image IMG1 among the plurality of spectral fundus images and each of a plurality of en-face images having different depth positions, and identifies the en-face image with the highest degree of correlation.
  • the spectral distribution data processing unit 231C specifies the depth information of the specified en-face image as the depth information of the spectral fundus image IMG1.
  • the spectral fundus image IMG1 may be a spectral fundus image to be analyzed in which a characteristic region or a region of interest is drawn.
  • Thereby, the depth position or layer region of the spectral fundus image IMG1 can be specified with high precision, and it becomes possible to analyze the spectral distribution of the spectral fundus image IMG1 while grasping the tissue, site, etc. depicted in it. Therefore, at least one of the spectral distribution and the depth position (layer region) can be used to improve the accuracy of disease estimation.
  • FIG. 8 shows an explanatory diagram of another operation example of the ophthalmologic apparatus 1 according to the embodiment.
  • The spectral distribution data processing unit 231C analyzes any one spectral fundus image IMG2 among the plurality of spectral fundus images to identify a characteristic region CS, obtains the degree of correlation between a characteristic region image IMG3 including the identified characteristic region CS and each of a plurality of en-face images having different depth positions, and identifies the en-face image with the highest degree of correlation. The spectral distribution data processing unit 231C identifies the depth information of the identified en-face image as the depth information of the characteristic region image IMG3.
  • the spectral fundus image IMG2 may be a spectral fundus image in which the characteristic region is most clearly depicted among the plurality of spectral fundus images.
  • Thereby, the depth position or layer region of the characteristic region CS in the spectral fundus image IMG2 can be specified with high accuracy, and it becomes possible to analyze the spectral distribution of the spectral fundus image IMG2 while grasping the tissue, site, etc. in the characteristic region CS. Therefore, at least one of the spectral distribution and the depth position (layer region) can be used to improve the accuracy of disease estimation.
  • the spectral fundus image specified by the spectral distribution data processing unit 231C may be superimposed on the en-face image and displayed on the display unit 240A.
  • the spectral distribution data processing unit 231C obtains the degree of correlation between the spectral fundus image and each of a plurality of en-face images with different depth positions, and identifies the en-face image with the highest degree of correlation.
  • the main control unit 211 causes the display unit 240A to display the spectroscopic fundus image superimposed on the specified en-face image.
  • the spectral fundus image may be a desired spectral fundus image among the plurality of spectral fundus images, or a spectral fundus image in which the characteristic region is most clearly rendered among the plurality of spectral fundus images.
  • In some embodiments, the en-face image corresponding to the spectral fundus image identified by the spectral distribution data processing unit 231C may be displayed on the display unit 240A in a display mode corresponding to the spectral fundus image.
  • the en-face image is displayed on the display unit 240A by assigning color information corresponding to the luminance value of the spectral fundus image to each pixel, each predetermined region, or each part.
  • The spectral distribution data processing unit 231C obtains the degree of correlation between the spectral fundus image and each of a plurality of en-face images with different depth positions, and identifies the en-face image with the highest degree of correlation.
  • the main control unit 211 assigns color information corresponding to the spectral fundus image to the specified en-face image, and causes the display unit 240A to display it.
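One way to realize such a display mode is to map the spectral image's luminance through a color map and blend the result onto the en-face image; the color map and blend weight below are display choices assumed for illustration.

```python
import numpy as np
from matplotlib import cm

en_face = np.random.rand(256, 256)    # identified en-face image (grayscale)
spectral = np.random.rand(256, 256)   # registered spectral fundus image

# Assign color information according to the spectral image's luminance values.
colors = cm.jet(spectral)[..., :3]    # per-pixel RGB from a color map

# Blend onto the grayscale en-face image; alpha is a display choice.
alpha = 0.4
base = np.repeat(en_face[..., None], 3, axis=2)
overlay = (1.0 - alpha) * base + alpha * colors
```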
  • FIGS. 9 to 11 show operation examples of the ophthalmologic apparatus 1 according to the embodiment.
  • FIG. 9 shows a flowchart of an operation example of the ophthalmologic apparatus 1 when acquiring a plurality of spectral fundus images.
  • FIG. 10 shows a flow diagram of an operation example of the ophthalmologic apparatus 1 when estimating a disease using a spectral fundus image.
  • FIG. 11 shows a flowchart of an operation example of the ophthalmologic apparatus 1 when displaying a spectral fundus image superimposed on an OCT image.
  • The storage unit 212 stores computer programs for realizing the processes shown in FIGS. 9 to 11.
  • the main control unit 211 executes the processes shown in FIGS. 9 to 11 by operating according to this computer program.
  • the main control unit 211 controls the anterior segment cameras 5A and 5B to photograph the anterior segment Ea of the subject's eye E substantially simultaneously.
  • The characteristic site identification unit 231A, under control from the main control unit 211, analyzes a pair of anterior segment images obtained substantially simultaneously by the anterior segment cameras 5A and 5B, and identifies the pupil center position of the subject's eye E as the characteristic position.
  • the three-dimensional position calculator 231B obtains the three-dimensional position of the eye E to be examined. This processing includes arithmetic processing using trigonometry based on the positional relationship between the pair of anterior eye cameras 5A and 5B and the subject's eye E, as described in Japanese Patent Application Laid-Open No. 2013-248376, for example.
  • The main control unit 211 moves the optical system, based on the three-dimensional position of the subject's eye E obtained by the three-dimensional position calculator 231B, so that the optical system (for example, the fundus camera unit 2) and the subject's eye E have a predetermined positional relationship.
  • the predetermined positional relationship is a positional relationship that enables imaging and examination of the subject's eye E using an optical system.
  • For example, a position at which the x- and y-coordinates of the optical axis of the objective lens 22 coincide with those of the subject's eye E, and at which the difference between the z-coordinate of the objective lens 22 (front lens surface) and the z-coordinate of the subject's eye E (corneal surface) equals a predetermined distance (working distance), is set as the destination of the optical system.
  • the main control unit 211 controls the focus optical system 60 to project the split index on the eye E to be examined.
  • The analysis unit 231 extracts a pair of split index images by analyzing the observation image of the fundus oculi Ef onto which the split indices are projected, and calculates the relative deviation between the pair of split index images.
  • the main control unit 211 controls the focus driving unit 31A and the focus driving unit 43A based on the calculated deviation (direction of deviation, amount of deviation).
  • the main controller 211 controls the wavelength tunable filter 80 to set the wavelength selection range of transmitted light to a predetermined wavelength range.
  • a predetermined wavelength range is an initial wavelength range when sequentially repeating the selection of wavelength ranges to cover the analysis wavelength range.
  • the main control unit 211 causes the image data of the spectral fundus image to be obtained.
  • The main control unit 211 controls the illumination optical system 10 to illuminate the subject's eye E with illumination light, captures the light reception result of the reflected illumination light obtained by the image sensor 38, and acquires the image data of the spectral fundus image.
  • the main control unit 211 determines whether or not to acquire a spectral fundus image in the next wavelength range. For example, when wavelength selection is sequentially changed in predetermined wavelength range steps within the analysis wavelength range, the main control unit 211 determines whether or not to acquire the next spectral fundus image based on the number of times the wavelength range has been changed. can judge. For example, the main control unit 211 can determine whether or not to acquire the next spectral fundus image by determining whether or not all of a plurality of predetermined wavelength ranges have been selected.
  • When it is determined in step S5 that the next spectral fundus image is to be acquired (step S5: Y), the operation of the ophthalmologic apparatus 1 proceeds to step S6.
  • When it is determined in step S5 not to acquire the next spectral fundus image (step S5: N), the operation of the ophthalmologic apparatus 1 ends (end).
  • In step S6, the main control unit 211 controls the wavelength tunable filter 80 to change the wavelength range of the transmitted light to the next range to be selected. Subsequently, the operation of the ophthalmologic apparatus 1 proceeds to step S4.
  • the ophthalmologic apparatus 1 can acquire a plurality of spectral fundus images corresponding to a plurality of wavelength ranges within a predetermined analysis wavelength range.
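The acquisition flow of steps S3 to S6 reduces to a loop over the analysis wavelength range. In the sketch below, set_wavelength_range() and capture_fundus_image() are hypothetical placeholders standing in for control of the wavelength tunable filter 80 and the image sensor 38; they are not real APIs of the apparatus.

```python
def acquire_spectral_series(set_wavelength_range, capture_fundus_image,
                            start_nm=450, stop_nm=900, step_nm=25):
    """Acquire one spectral fundus image per wavelength range (steps S3-S6)."""
    images = {}
    for lo in range(start_nm, stop_nm, step_nm):
        hi = lo + step_nm
        set_wavelength_range(lo, hi)               # select transmitted-light range
        images[(lo, hi)] = capture_fundus_image()  # illuminate and capture
    return images
```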
  • FIG. 10 shows an operation example in the case of estimating a disease using either one of a plurality of spectral fundus images acquired according to the operation example shown in FIG. 9 or a plurality of acquired spectral fundus images.
  • the main control unit 211 controls the characteristic region specifying unit 2311C to specify a characteristic region in the spectral fundus image.
  • the characteristic region identification unit 2311C performs characteristic region identification processing on the spectral fundus image as described above.
  • the characteristic region identifying unit 2311C identifies a characteristic region in a spectral fundus image selected in advance from among a plurality of spectral fundus images. In some embodiments, the characteristic region identifying section 2311C selects one characteristic region from the plurality of characteristic regions identified in each of the plurality of spectral fundus images.
  • the main controller 211 acquires an OCT image.
  • the OCT data of the eye E to be examined is obtained by performing OCT on the eye E to be examined in advance. It is also assumed that a three-dimensional OCT image or a plurality of en-face images with different depth positions are formed based on OCT data. In this case, the main controller 211 acquires a 3D OCT image or a plurality of en-face images.
  • In some embodiments, in step S12, the main control unit 211 controls the OCT unit 100 and the like to perform OCT on the subject's eye E and acquire OCT data.
  • the data processing unit 230 forms a three-dimensional OCT image or a plurality of en-face images having different depth positions based on the acquired OCT data.
  • The main control unit 211 controls the depth information specifying unit 2312C (searching unit 2313C) to search the OCT image acquired in step S12 for the image region having the highest degree of correlation with the image including the characteristic region specified in step S11, or for the en-face image containing that image region.
  • the main control unit 211 controls the depth information specifying unit 2312C to specify a part (layer region, depth position, etc.) on the fundus corresponding to the characteristic region specified in step S11.
  • The depth information specifying unit 2312C specifies the depth information from the image region having the highest correlation with the image containing the characteristic region searched in step S13, or from the en-face image that includes that image region, and identifies the site on the fundus from the depth information.
  • the main control unit 211 controls the disease estimating unit 2314C to specify the presence or absence of the disease, the probability of the disease, or the type of the disease in the site on the fundus identified in step S14.
  • the disease estimation unit 2314C performs disease estimation processing as described above.
  • The disease estimating unit 2314C identifies the presence or absence of disease, the probability of disease, or the type of disease based on the spectral distribution (spectral characteristics) of the spectral fundus image in which the characteristic region was identified in step S11, the en-face image (OCT image) searched in step S13, and the site on the fundus identified in step S14.
  • The main control unit 211 causes the display unit 240A to display at least one of: the spectral fundus image in which the characteristic region was identified in step S11, the characteristic region identified in step S11, the en-face image (OCT image) identified in step S13, the depth information obtained for the characteristic region, the site on the fundus identified in step S14, and the presence or absence of disease, the probability of disease, or the type of disease estimated in step S15.
  • In step S16, the main control unit 211 superimposes the spectral fundus image in which the characteristic region was specified in step S11 on the en-face image specified in step S13 and displays them on the display unit 240A.
  • In some embodiments, the main control unit 211 may cause the display unit 240A to display a synthetic fundus image generated by assigning color components and changeable transparency information to each of the plurality of spectral fundus images and superimposing them, as sketched below.
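Such a synthetic fundus image could be built as an additive color composite, assigning each spectral image a color component and a transparency weight; the band count, colors, and weights below are illustrative.

```python
import numpy as np

# Three registered spectral fundus images (hypothetical wavelength bands).
bands = [np.random.rand(256, 256) for _ in range(3)]

# Color component assigned to each band, plus changeable transparency weights.
colors = np.array([[1.0, 0.0, 0.0],   # band 1 -> red
                   [0.0, 1.0, 0.0],   # band 2 -> green
                   [0.0, 0.0, 1.0]])  # band 3 -> blue
alphas = [0.5, 0.3, 0.8]

composite = np.zeros((256, 256, 3))
for img, color, a in zip(bands, colors, alphas):
    composite += a * img[..., None] * color

composite = np.clip(composite / composite.max(), 0.0, 1.0)  # normalize for display
```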
  • the ophthalmologic apparatus 1 can identify a region corresponding to a characteristic region in the spectral fundus image of the eye E to be examined based on the OCT data of the eye E to be examined, and estimate a disease.
  • FIG. 11 shows an operation example in which either one of the plurality of spectral fundus images acquired according to the operation example shown in FIG. 9, or the acquired plurality of spectral fundus images, is superimposed on the OCT image and displayed.
  • the main control unit 211 controls the characteristic region identifying unit 2311C to identify a characteristic region in the spectral fundus image, as in step S11.
  • the main control unit 211 acquires an OCT image as in step S12.
  • the OCT data of the eye E to be examined is obtained by performing OCT on the eye E to be examined in advance. It is also assumed that a three-dimensional OCT image or a plurality of en-face images with different depth positions are formed based on OCT data. In this case, the main controller 211 acquires a 3D OCT image or a plurality of en-face images.
  • The main control unit 211 controls the depth information specifying unit 2312C (searching unit 2313C) to search the OCT image acquired in step S22 for the en-face image (or the image region in the three-dimensional OCT image) having the highest degree of correlation with the spectral fundus image in which the characteristic region was specified in step S21.
  • In step S24, the main control unit 211 causes the display unit 240A to display the spectral fundus image whose characteristic region was specified in step S21, superimposed on the en-face image searched in step S23.
  • In step S24, the main control unit 211 also causes the display unit 240A to display the characteristic region identified in step S21 in an identifiable manner.
  • As described above, the ophthalmologic apparatus 1 can display the spectral fundus image of the subject's eye E superimposed on the OCT data of the subject's eye E.
  • the ophthalmologic information processing apparatus (control unit 210, image forming unit 220, and data processing unit 230) according to some embodiments includes a characteristic region specifying unit (2311C) and a depth information specifying unit (2312C).
  • the characteristic region identifying unit identifies a characteristic region in the spectral distribution data acquired by receiving return light in a predetermined wavelength range from the subject's eye (E) illuminated with the illumination light.
  • the depth information specifying unit specifies the depth information of the characteristic region based on the measurement data of the subject's eye having higher resolution in the depth direction than the spectral distribution data.
  • In some embodiments, the characteristic region identifying unit identifies a characteristic region in any of a plurality of spectral distribution data acquired by illuminating the subject's eye with illumination light and receiving return light from the subject's eye in mutually different wavelength ranges.
  • Thereby, a characteristic region having a characteristic spectral distribution is identified, and it is possible to specify with high accuracy which tissue in the depth direction of the measurement site the identified characteristic region belongs to.
  • the measurement data is OCT data obtained by performing optical coherence tomography on the eye to be examined.
  • In some embodiments, the depth information specifying unit includes a search unit (2313C) that searches, from among a plurality of front images formed based on the OCT data and having different depth positions, for the front image having the highest degree of correlation with the spectral distribution data, and specifies the depth information based on the front image found by the search unit.
  • Thereby, highly accurate depth information of the spectral distribution data can be easily identified.
  • In some embodiments, the depth information specifying unit includes a searching unit (2313C) that searches, from among a plurality of front images formed based on the OCT data and having different depth positions, for the front image including the image region having the highest degree of correlation with the image including the characteristic region, and specifies the depth information based on the front image found by the searching unit.
  • the front image including the image area having the highest correlation with the image including the characteristic area in the spectral distribution data is specified by the search processing for the plurality of front images formed based on the OCT data. Therefore, it is possible to easily identify highly accurate depth information of the characteristic region.
  • Some embodiments include an estimating unit (disease estimating unit 2314C) that estimates the presence or absence of disease, the probability of disease, or the type of disease based on the front image searched by the searching unit.
  • Some embodiments include a display control unit (control unit 210, main control unit 211) that causes display means (display unit 240A) to display disease information including the presence or absence of disease, the probability of disease, or the type of disease estimated by the estimating unit.
  • Some embodiments include a display control unit (control unit 210, main control unit 211) that causes display means (display unit 240A) to display the front image and depth information searched by the search unit.
  • Some embodiments include a display control unit (control unit 210, main control unit 211) that superimposes the spectral distribution data on the front image searched by the search unit and displays it on display means (display unit 240A).
  • the spectral distribution data can be superimposed on the front image and displayed, and the distribution data and the front image can be associated with each other.
  • In some embodiments, the display control unit causes the display means to display, in an identifiable manner, the area in the front image corresponding to the characteristic region.
  • Some embodiments include a display control unit (control unit 210, main control unit 211) that displays spectral distribution data and depth information on display means (display unit 240A).
  • the depth information includes at least one of information representing a depth position, a depth range, and a layer area relative to a reference portion of the subject's eye.
  • An ophthalmologic apparatus (1) includes an illumination optical system (10) that illuminates the eye to be examined with illumination light, a light receiving optical system (imaging optical system 30) that receives return light of the illumination light from the eye in mutually different wavelength ranges, an OCT optical system (the optical system from the OCT unit 100 to the objective lens 22), and any one of the ophthalmic information processing apparatuses described above.
  • An ophthalmologic information processing method includes a characteristic region identifying step and a depth information identifying step.
  • the characteristic region identifying step identifies a characteristic region in the spectral distribution data acquired by receiving return light in a predetermined wavelength range from the eye (E) illuminated with the illumination light.
  • the depth information specifying step specifies the depth information of the characteristic region based on the measurement data of the subject's eye having higher resolution in the depth direction than the spectral distribution data.
  • In some embodiments, the characteristic region identifying step identifies a characteristic region in any of a plurality of spectral distribution data acquired by illuminating the eye to be examined with illumination light and receiving return light from the eye in mutually different wavelength ranges.
  • Thereby, a characteristic region having a characteristic spectral distribution is identified, and it is possible to specify with high accuracy which tissue in the depth direction of the measurement site the identified characteristic region belongs to.
  • the measurement data is OCT data obtained by performing optical coherence tomography on the eye to be examined.
  • In some embodiments, the depth information specifying step includes a search step of searching, from among a plurality of front images formed based on the OCT data and having different depth positions, for the front image having the highest degree of correlation with the spectral distribution data, and depth information is specified based on the front image found in the search step.
  • Thereby, highly accurate depth information of the spectral distribution data can be easily identified.
  • In some embodiments, the depth information specifying step includes a search step of searching, from among a plurality of front images formed based on the OCT data and having different depth positions, for the front image including the image region having the highest degree of correlation with the image including the characteristic region, and depth information is specified based on the front image found in the search step.
  • a front image including an image area having the highest degree of correlation with an image including a characteristic area in the spectral distribution data is specified by performing search processing on a plurality of front images formed based on OCT data. Therefore, it is possible to easily identify highly accurate depth information of the characteristic region.
  • Some embodiments include an estimation step of estimating the presence or absence of disease, the probability of disease, or the type of disease based on the front image searched in the search step.
  • Some embodiments include a display control step of displaying disease information including the presence or absence of a disease, the probability of disease, or the type of disease estimated in the estimation step on the display means (display unit 240A).
  • the disease information estimated from the spectral distribution data can be displayed and the disease information can be notified to the outside.
  • Some embodiments include a display control step of displaying the front image and depth information searched in the search step on display means (display unit 240A).
  • the front image and depth information corresponding to the spectral distribution data can be displayed, and the front image and depth information can be notified to the outside.
  • Some embodiments include a display control step of superimposing the spectral distribution data on the front image searched in the search step and displaying it on the display means (display unit 240A).
  • the spectral distribution data can be superimposed on the front image and displayed, and the distribution data and the front image can be associated with each other.
  • In some embodiments, the display control step causes the display means to display, in an identifiable manner, the area in the front image corresponding to the characteristic region.
  • Some embodiments include a display control step of displaying spectral distribution data and depth information on display means (display unit 240A).
  • the spectral distribution data and the depth information can be displayed and notified to the outside.
  • the depth information includes at least one of information representing a depth position, a depth range, and a layer area relative to a reference portion of the subject's eye.
  • a program causes a computer to execute each step of the ophthalmologic information processing method described above.
  • the storage unit 212 stores a program that causes a computer to execute the ophthalmologic information processing method.
  • a program may be stored in any computer-readable recording medium.
  • the recording medium may be electronic media using magnetism, light, magneto-optics, semiconductors, and the like.
  • recording media are magnetic tapes, magnetic disks, optical disks, magneto-optical disks, flash memories, solid state drives, and the like.
  • Reference signs: 1 ophthalmologic apparatus; 2 retinal camera unit; 10 illumination optical system; 22 objective lens; 30 imaging optical system; 80 wavelength tunable filter; 100 OCT unit; 210 control section; 211 main control section; 220 image forming section; 230 data processing section; 231 analysis section; 231A characteristic site identification section; 231B three-dimensional position calculation unit; 231C spectral distribution data processing unit; 2311C characteristic region identification unit; 2312C depth information identification unit; 2313C search unit; 2314C disease estimation unit; E eye to be examined; Ef fundus; LS measurement light.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

This ophthalmic information processing device comprises a characteristic region identifying unit and a depth information identifying unit. The characteristic region identifying unit identifies a characteristic region in spectral distribution data acquired by receiving, in a prescribed wavelength range, light that was projected as illumination light onto an eye under examination and returned from the eye. The depth information identifying unit identifies depth information of the characteristic region on the basis of measurement data of the eye under examination, the measurement data having a higher resolution in the depth direction than the spectral distribution data.
PCT/JP2022/030396 2021-09-16 2022-08-09 Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program Ceased WO2023042577A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-150693 2021-09-16
JP2021150693A JP7774410B2 (ja) Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program

Publications (1)

Publication Number Publication Date
WO2023042577A1 (fr)

Family

ID=85602752

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/030396 Ceased Ophthalmic information processing device, ophthalmic device, ophthalmic information processing method, and program WO2023042577A1 (fr)

Country Status (2)

Country Link
JP (1) JP7774410B2 (fr)
WO (1) WO2023042577A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006061328A * (ja) Ophthalmic apparatus
JP2007330557A * (ja) Spectral fundus measuring apparatus and measuring method therefor
JP2007330558A * (ja) Spectral fundus measuring apparatus and measuring method therefor
JP2009264787A * (ja) Optical image measurement apparatus


Also Published As

Publication number Publication date
JP2023043212A (ja) 2023-03-29
JP7774410B2 (ja) 2025-11-21

Similar Documents

Publication Publication Date Title
JP6426974B2 (ja) Data processing method and OCT apparatus
JP6469413B2 (ja) Data processing method and OCT apparatus
JP2023014190A (ja) Ophthalmic imaging apparatus
JP2022176282A (ja) Ophthalmic apparatus and control method therefor
JP7117873B2 (ja) Ophthalmic apparatus
JP6703839B2 (ja) Ophthalmic measurement apparatus
JP2023038280A (ja) Blood flow measurement apparatus
JP2019154988A (ja) Ophthalmic imaging apparatus, control method therefor, program, and recording medium
JP7325169B2 (ja) Ophthalmic apparatus and control method therefor
JP7166182B2 (ja) Ophthalmic information processing apparatus, ophthalmic apparatus, ophthalmic information processing method, and program
JP7199172B2 (ja) Ophthalmic apparatus and control method therefor
JP7260426B2 (ja) Optical coherence tomography apparatus, control method therefor, optical measurement method, program, and storage medium
JP7374272B2 (ja) Ophthalmic apparatus
JP7774410B2 (ja) Ophthalmic information processing apparatus, ophthalmic apparatus, ophthalmic information processing method, and program
JP7216514B2 (ja) Blood vessel analysis apparatus
JP2018192082A (ja) Ophthalmic apparatus and control method therefor
JP6942627B2 (ja) Ophthalmic imaging apparatus, control method therefor, program, and recording medium
JP7684158B2 (ja) Ophthalmic information processing apparatus, ophthalmic apparatus, ophthalmic information processing method, and program
JP7289394B2 (ja) Ophthalmic information processing apparatus, ophthalmic apparatus, ophthalmic information processing method, and program
JP2021142022A (ja) Ophthalmic information processing apparatus, ophthalmic apparatus, ophthalmic information processing method, and program
JP7769510B2 (ja) Ophthalmic information processing apparatus and ophthalmic apparatus
JP7288110B2 (ja) Ophthalmic apparatus
JP2020103405A (ja) Ophthalmic apparatus and control method therefor
WO2025004826A1 (fr) Analysis processing device, optical coherence tomography device, analysis processing method, and program
JP7221628B2 (ja) Blood flow measurement apparatus, information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22869724; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 22869724; Country of ref document: EP; Kind code of ref document: A1)