
WO2017037781A1 - Scanning observation device - Google Patents

Scanning observation device

Info

Publication number
WO2017037781A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
unit
scanning
correction amount
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/074430
Other languages
English (en)
Japanese (ja)
Inventor
西村 淳一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to PCT/JP2015/074430 priority Critical patent/WO2017037781A1/fr
Priority to JP2017537051A priority patent/JPWO2017037781A1/ja
Publication of WO2017037781A1 publication Critical patent/WO2017037781A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor

Definitions

  • the present invention relates to a scanning observation apparatus.
  • The present invention has been made in view of the above-described circumstances, and an object thereof is to provide a scanning observation apparatus capable of correcting image distortion at high speed by simple calculation.
  • In order to achieve the above object, the present invention provides a scanning observation apparatus comprising: a scanning unit that scans a beam for irradiating a subject along a predetermined scanning locus; a detection unit that detects a signal wave generated in the subject by irradiation with the beam; an image generation unit that generates an image of the subject by associating the detected signal wave with an irradiation position of the beam on the predetermined scanning locus; a display unit that displays the image generated by the image generation unit; and a correction amount setting unit that sets, based on a user operation, a distortion correction amount for the image displayed on the display unit.
  • The image generation unit calculates a shift time based on the distortion correction amount set by the correction amount setting unit, and generates the image by associating the signal wave with the irradiation position of the beam at a time later, by the shift time, than the time at which the signal wave was detected by the detection unit.
  • In this scanning observation apparatus, when the beam is scanned over the subject by the scanning unit, a signal wave is generated at the irradiation position of the beam, and this signal wave is detected by the detection unit.
  • The image generation unit generates an image of the subject by associating the detected signal wave with the irradiation position of the beam on the subject.
  • In doing so, the image generation unit associates the signal wave with the irradiation position at a time later than the detection time by the detection unit by the shift time, which is based on the distortion correction amount set by the correction amount setting unit.
  • When the shift time is equal to the delay time from the theoretical irradiation position to the actual irradiation position, the signal wave is associated with the irradiation position at which it was actually generated, and the distortion of the image caused by the delay is eliminated.
  • Based on the image displayed on the display unit, the user can therefore correct the distortion of the image by setting, in the correction amount setting unit, the correction amount that minimizes the distortion. Further, image distortion can be corrected at high speed by the simple calculation of shifting the signal wave and the irradiation position of the beam relative to each other in the time direction by the shift time.
  • The correction amount setting unit may include a GUI provided on the display unit and an input device connected to the display unit. In this way, an image in which the distortion of the image of the subject is corrected can be obtained on the screen of the display unit in real time while the user operates the GUI.
  • the scanning unit may rotationally scan the beam at a constant angular velocity along a spiral or concentric scanning trajectory.
  • In this case, the distortion of the image due to the delay is further compounded by the distortion of the scanning locus. Therefore, the image quality can be effectively improved by removing the image distortion caused by the delay.
  • the scanning unit may scan the beam along a Lissajous scanning trajectory.
  • In this case, the image of the subject is distorted in opposite directions on the two sides of the diagonal line of the image.
  • Such complicated distortion of an image can be corrected with high accuracy only by a simple calculation process.
  • The drawings include: an overall configuration diagram of a scanning observation apparatus according to an embodiment of the present invention (FIG. 1); a block diagram showing the detailed structure of the scanning observation apparatus of FIG. 1; a figure showing the structure of the scanner in the scanning observation apparatus of FIG. 1; a cross-sectional view taken along line III-III in FIG. 3A (FIG. 3B); a figure showing an example of the luminance value data; and a figure showing the shift of the coordinates by the shift time Δt.
  • A scanning observation apparatus 1 according to an embodiment of the present invention will now be described with reference to the drawings.
  • The scanning observation apparatus 1 irradiates the subject A with a laser beam (beam) L emitted from the distal end of the insertion portion 21 of the endoscope 2, scanning it along a spiral scanning locus B.
  • The scanning observation apparatus 1 includes a light source unit 3; an optical scanning unit (scanning unit) 4 that scans the laser light L output from the light source unit 3 and irradiates the subject A with it; a detection unit 5 that detects the reflected light (signal wave) L′ of the laser light L from the subject A; an image generation unit 6 that generates an image of the subject A based on the reflected light L′ detected by the detection unit 5; a control unit 7 that controls the optical scanning unit 4, the detection unit 5, and the image generation unit 6; a correction amount setting unit 8 that sets, based on a user operation, a distortion correction amount θ for the image of the subject A in the image; and a display unit 9.
  • a part of the optical scanning unit 4 and the detection unit 5, the image generation unit 6, and the control unit 7 are accommodated in a housing 10 connected to the endoscope 2.
  • the light source unit 3 includes three laser light sources (not shown) that respectively emit red (R), green (G), and blue (B) continuous wave laser beams.
  • the light source unit 3 combines the R, G, and B laser beams to generate white laser beam L, and emits the white laser beam L.
  • a semiconductor solid-state laser light source or a laser diode is used as the laser light source.
  • In this embodiment, a white light image of the subject A is acquired, but other types of images may be acquired.
  • an image of fluorescence excited by laser light may be acquired, or an infrared image or NBI (narrow band imaging) image may be acquired using laser light of a specific color.
  • The optical scanning unit 4 includes a waveform generation unit 41 that generates digital waveforms based on control signals from the control unit 7, D/A converters 42 that D/A convert the digital waveforms to generate alternating voltages, and a scanner 43 that scans the laser light L from the light source unit 3 and irradiates the subject A with it.
  • the waveform generation unit 41 generates a digital waveform having a frequency and amplitude specified by the control signal received from the control unit 7.
  • the D / A converter 42 generates an alternating voltage by converting the digital waveform generated by the waveform generator 41 into a voltage waveform.
  • the generated alternating voltage is supplied to the scanner 43 via the electric cables 13A and 13B.
  • the scanner 43 is disposed inside the distal end portion 21 a of the insertion portion 21.
  • The scanner 43 includes an illumination optical fiber 44 disposed in the insertion portion 21 along its longitudinal direction, an actuator 45 that vibrates the illumination optical fiber 44, and a fixing portion 46 that fixes the illumination optical fiber 44 and the actuator 45 to the outer cylinder of the insertion portion 21.
  • Reference numerals 11 and 12 denote condensing lenses that focus the laser light L emitted from the tip of the illumination optical fiber 44 onto the subject A.
  • the proximal end of the illumination optical fiber 44 is connected to the light source unit 3.
  • The laser light L incident on the proximal end surface of the illumination optical fiber 44 from the light source unit 3 is guided from the proximal end to the distal end of the illumination optical fiber 44 and is emitted from the distal end surface toward the region in front of the distal end of the insertion portion 21.
  • the actuator 45 includes a rectangular cylindrical elastic portion 47 and four piezoelectric elements 48A and 48B fixed to the outer peripheral surface of the elastic portion 47.
  • The illumination optical fiber 44 passes through the elastic portion 47, and the elastic portion 47 is fixed to the outer peripheral surface of the illumination optical fiber 44 at a position spaced toward the proximal end side from the distal end of the illumination optical fiber 44.
  • A portion of the elastic portion 47 closer to the proximal end side than the piezoelectric elements 48A and 48B is fixed to the outer cylinder of the insertion portion 21 via the fixing portion 46.
  • the elastic portion 47 and the distal end portion of the illumination optical fiber 44 are supported in a cantilever shape and can swing.
  • the piezoelectric elements 48A and 48B are plate-like and polarized in the thickness direction.
  • an arrow P indicates the polarization direction of the piezoelectric elements 48A and 48B.
  • One piezoelectric element is fixed to each of the four outer surfaces of the elastic portion 47 so that the polarization directions of the two piezoelectric elements 48A, and of the two piezoelectric elements 48B, facing each other across the illumination optical fiber 44 in its radial direction are the same.
  • An A-phase electrical cable 13A is connected to the two X-scanning piezoelectric elements 48A facing each other in the X direction, and a B-phase electrical cable 13B is connected to the two Y-scanning piezoelectric elements 48B facing each other in the Y direction.
  • the X direction and the Y direction are radial directions of the illumination optical fiber 44 and are directions orthogonal to each other.
  • the waveform generator 41 generates two digital waveforms of A phase and B phase.
  • the D / A converter 42 is provided for the A phase and the B phase, respectively.
  • The A-phase D/A converter 42 D/A converts the A-phase digital waveform, and the generated A-phase alternating voltage is supplied to the two X-scanning piezoelectric elements 48A via the electric cable 13A; likewise, the B-phase D/A converter 42 D/A converts the B-phase digital waveform, and the generated B-phase alternating voltage is supplied to the two Y-scanning piezoelectric elements 48B via the electric cable 13B.
  • The X-scanning piezoelectric elements 48A expand and contract in the longitudinal direction (Z direction) of the illumination optical fiber 44 when the A-phase alternating voltage is applied.
  • At this time, one of the two piezoelectric elements 48A contracts in the Z direction while the other expands in the Z direction, so that the elastic portion 47 is excited into bending vibration in the X direction with the position of the fixing portion 46 as a node.
  • The bending vibration of the elastic portion 47 is transmitted to the illumination optical fiber 44, the distal end portion of the illumination optical fiber 44 bends and vibrates in the X direction, and the laser light L emitted from the distal end is scanned in the X direction.
  • The Y-scanning piezoelectric elements 48B expand and contract in the longitudinal direction (Z direction) of the illumination optical fiber 44 when the B-phase alternating voltage is applied. At this time, one of the two piezoelectric elements 48B contracts in the Z direction while the other expands in the Z direction, so that the elastic portion 47 is excited into bending vibration in the Y direction with the position of the fixing portion 46 as a node. The bending vibration of the elastic portion 47 is transmitted to the illumination optical fiber 44.
  • the distal end portion of the illumination optical fiber 44 bends and vibrates in the Y direction, the distal end of the illumination optical fiber 44 vibrates in the Y direction, and the laser light L emitted from the distal end is scanned in the Y direction.
  • the detection unit 5 includes a light receiving unit 51 that receives the reflected light L ′ of the laser light L reflected from the subject A, and a photodetector 52 that detects the reflected light L ′ received by the light receiving unit 51.
  • the light receiving unit 51 is a light receiving optical fiber (hereinafter also referred to as “light receiving optical fiber 51”) arranged in parallel with the illumination optical fiber 44 and inside the insertion unit 21.
  • a plurality of light receiving optical fibers 51 are provided so as to surround the illumination optical fiber 44 in the circumferential direction.
  • a light receiving lens (not shown) is disposed on the front end side of each light receiving optical fiber 51, and the reflected light L ′ from the subject A enters the front end surface of the light receiving optical fiber 51 through the light receiving lens.
  • the base end of the light receiving optical fiber 51 is connected to the photodetector 52.
  • the reflected light L ′ incident on the front end surface of the light receiving optical fiber 51 is guided from the front end to the base end of the light receiving optical fiber 51 and enters the photodetector 52.
  • the photodetector 52 detects the reflected light L ′ at regular time intervals, and outputs an electrical signal corresponding to the detected intensity of the reflected light L ′ to the A / D converter 61 in the image generation unit 6.
  • The photodetector 52 includes, for example, a color separation element (not shown) that separates the white reflected light L′ from the light receiving optical fiber 51 into the three colors R, G, and B, and three photodiodes (not shown) that photoelectrically convert the R, G, and B reflected light L′ separated by the color separation element. With this configuration, the photodetector 52 detects the R, G, and B reflected light L′ separately and simultaneously, and outputs three electrical signals to the A/D converter 61.
  • The image generation unit 6 includes an A/D converter 61 that A/D converts the electrical signals output from the photodetector 52, a delay adjustment unit 62 that corrects the time delay of the digital values obtained by the A/D converter 61, and an image forming unit 63 that forms the image.
  • the A / D converter 61 obtains a digital value indicating the intensity of the reflected light L ′ by digitally converting each of the three electrical signals from the photodetector 52.
  • the obtained digital values are the R, G, and B luminance values of each pixel of the image formed by the image forming unit 63.
  • the R, G, and B luminance values detected at the time ti by the photodetector 52 are referred to as luminance values S (ti).
  • the A / D converter 61 transmits the obtained luminance value S (ti) to the delay adjustment unit 62.
  • The delay adjustment unit 62 calculates the shift time Δt from the value of the distortion correction amount θ received from the correction amount setting unit 8 and the value of the angular velocity ω (described later) of the laser light L received from the control unit 7, based on the following equation:
  • Δt = θ / ω
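  • For instance, under assumed (hypothetical) values of θ = 90° (π/2 rad) and ω = 2π × 5000 rad/s, the shift time would be Δt = (π/2) / (2π × 5000) = 50 µs; the correspondence between coordinates and detection times described below is then shifted by 50 µs.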
  • the delay adjustment unit 62 receives from the control unit 7 a coordinate data set D2 (detailed later) in which the coordinates P (ti) of each pixel for one frame of the image are associated with the detection time ti.
  • The delay adjustment unit 62 shifts the coordinates P(ti) and the detection times ti in the coordinate data set D2 relative to each other in the time direction so that each coordinate P(ti) newly corresponds to the detection time ti + Δt, which is later by the shift time Δt, thereby generating a coordinate data set D2′.
  • In this way, the correspondence between the coordinates P(ti) and the detection times ti is changed uniformly.
  • When the distortion correction amount θ is 0°, the shift time Δt is zero, and the coordinate data set D2′ is the same as the coordinate data set D2.
  • The delay adjustment unit 62 then generates an image data set by associating each coordinate P(ti + Δt) in the coordinate data set D2′ with the luminance value S(ti) at the same detection time ti in the luminance value data set.
  • the delay adjustment unit 62 transmits the generated image data set to the image forming unit 63.
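  • The delay-adjustment step described above can be summarized in a short sketch. The following Python fragment is a minimal illustration only, not an implementation from the patent; the array layout, function name, and parameters are assumptions. It computes the shift time Δt = θ/ω, expresses it as a whole number of detection samples, and pairs each luminance sample S(ti) with the coordinate P(ti + Δt).

```python
import numpy as np

def build_image_dataset(luminance, coords, theta_deg, omega, dt_sample):
    """Pair each luminance sample S(ti) with the coordinate P(ti + dt),
    where dt = theta / omega (expressed here in whole samples).

    luminance : (N, 3) array of R, G, B values S(ti)      -- data set D1
    coords    : (N, 2) array of pixel coordinates P(ti)   -- data set D2
    theta_deg : distortion correction amount in degrees
    omega     : angular velocity of the beam [rad/s]
    dt_sample : sampling interval of the photodetector [s]
    """
    dt = np.deg2rad(theta_deg) / omega          # shift time  Δt = θ / ω
    shift = int(round(dt / dt_sample))          # shift expressed in samples
    # Coordinate data set D2': each luminance sample i is paired with the
    # coordinate 'shift' samples later, i.e. with P(ti + Δt).
    idx = np.arange(len(luminance)) + shift
    valid = idx < len(coords)
    return coords[idx[valid]], luminance[valid]  # image data set (coords, values)
```

  • A rendering step would then write each luminance triple into the pixel at its paired coordinate, which corresponds to what the image forming unit 63 does with the image data set.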
  • The delay adjustment unit 62 sets the shift time Δt based on the distortion correction amount θ for each image frame, so the shift time Δt, and hence the distortion correction amount θ for the image of the subject A, can be changed in units of one image frame.
  • the image forming unit 63 forms an image by assigning each pixel a luminance value S (ti) corresponding to the coordinate P (ti + ⁇ t) of the pixel based on the image data set.
  • the image forming unit 63 transmits the formed image to the display unit 9 and causes the display unit 9 to display the image.
  • the control unit 7 controls the photodetector 52 so that the photodetector 52 detects the reflected light L ′ at regular time intervals.
  • the control unit 7 sets the frequency and amplitude of the alternating voltage, generates a control signal for generating a digital waveform having the set frequency and amplitude, and transmits the control signal to the waveform generation unit 41.
  • Specifically, the control unit 7 generates two control signals that cause the waveform generation unit 41 to generate A-phase and B-phase digital waveforms that are sinusoidal, whose amplitude changes gradually, and whose phases are shifted from each other by π/2.
  • As a result, the D/A converters 42 generate A-phase and B-phase alternating voltages whose amplitude changes gradually in a sinusoidal manner and whose phases are shifted from each other by π/2.
  • Consequently, the distal end of the illumination optical fiber 44 vibrates in a spiral, and the laser light L is scanned along the spiral scanning locus B on the subject A.
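  • As an illustration of such drive signals, the sketch below generates A-phase and B-phase waveforms that are sinusoids of the same frequency, phase-shifted by π/2, with a slowly growing amplitude; plotting one against the other traces a spiral. This is a hypothetical example: the linear amplitude envelope, function name, and parameter values are assumptions, not taken from the patent.

```python
import numpy as np

def spiral_drive_waveforms(f_scan, n_turns, samples_per_turn):
    """A-phase / B-phase drive waveforms for a spiral scan.

    f_scan           : vibration frequency of the fiber tip [Hz]
    n_turns          : number of turns of the spiral per frame
    samples_per_turn : waveform samples per turn
    """
    t = np.arange(n_turns * samples_per_turn) / (f_scan * samples_per_turn)
    envelope = t / t[-1]                                              # amplitude grows from 0 to 1
    a_phase = envelope * np.sin(2 * np.pi * f_scan * t)               # X drive
    b_phase = envelope * np.sin(2 * np.pi * f_scan * t + np.pi / 2)   # Y drive, shifted by π/2
    return t, a_phase, b_phase

# Plotting a_phase against b_phase yields a spiral scanning locus.
t, a, b = spiral_drive_waveforms(f_scan=5000, n_turns=100, samples_per_turn=64)
```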
  • control unit 7 generates the control signal so that the waveform generation unit 41 generates a digital waveform having a constant frequency.
  • the laser beam L is scanned along the scanning locus B at a constant angular velocity ⁇ .
  • the control unit 7 calculates the angular velocity ⁇ of the laser light L from the frequency of the digital waveform, and transmits the calculated value of the angular velocity ⁇ to the delay adjustment unit 62.
  • The control unit 7 calculates, based on the A-phase and B-phase control signals, the theoretical irradiation position on the scanning locus B that is irradiated with the laser light L at each detection time ti of the reflected light L′ by the photodetector 52.
  • The control unit 7 sets the theoretical irradiation position at each detection time ti as the pixel coordinate P(ti) in the image, generates a coordinate data set D2 in which the pixel coordinates P(ti) are associated with the detection times ti, as shown in FIG. 4A, and transmits the generated coordinate data set D2 to the delay adjustment unit 62.
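  • A minimal sketch of how such a coordinate data set D2 could be computed is given below. The patent only states that the positions are calculated from the A-phase and B-phase control signals; the linear radius growth, function name, and parameters here are assumptions for illustration.

```python
import numpy as np

def coordinate_dataset(detection_times, omega, r_max, t_frame, center=(0.0, 0.0)):
    """Theoretical pixel coordinates P(ti) on a spiral scanning locus.

    detection_times : 1-D array of detection times ti [s]
    omega           : angular velocity of the beam [rad/s]
    r_max           : maximum spiral radius in pixels
    t_frame         : duration of one spiral frame [s]
    """
    radius = r_max * detection_times / t_frame   # radius assumed to grow linearly over the frame
    angle = omega * detection_times              # constant angular velocity ω
    x = center[0] + radius * np.cos(angle)
    y = center[1] + radius * np.sin(angle)
    return np.column_stack([x, y])               # coordinate data set D2: one (x, y) per ti
```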
  • the above-described waveform generation unit 41, delay adjustment unit 62, image forming unit 63, and control unit 7 are realized by a computer.
  • The computer includes a central processing unit (CPU), a main storage device such as a RAM, and an auxiliary storage device.
  • The auxiliary storage device is a computer-readable non-transitory storage medium, such as a hard disk or various memories, and stores a control program for controlling the optical scanning unit 4 and the detection unit 5 and an image forming program for forming the image.
  • each part 41, 62, 63, 7 may be comprised from dedicated hardware like ASIC (application specific integrated circuit).
  • The correction amount setting unit 8 includes a GUI (graphical user interface) provided on the display unit 9 and an input device, such as a mouse, connected to the display unit 9; a graphic that allows the user to set the distortion correction amount θ is displayed on the screen of the display unit 9.
  • the distortion correction amount ⁇ corresponds to the rotation angle of the image in the image.
  • A deviation due to a delay, which will be described later, occurs between the theoretical irradiation position and the actual irradiation position of the laser light L, in the circumferential direction around the center of the scanning locus B. This deviation between the irradiation positions causes image distortion. Therefore, by rotating the image of the subject within the displayed image, the deviation between the theoretical and actual irradiation positions of the laser light L can be eliminated, and the distortion of the image can be corrected.
  • the graphic includes, for example, a scale bar 81 from 0 ° to 360 ° and a slider 82 movable within the scale bar 81 as shown in FIG.
  • the user can set the distortion correction amount ⁇ within the range from 0 ° to 360 ° by moving the slider 82 within the scale bar 81 using the input device.
  • the distortion correction amount ⁇ is initially set to 0 °.
  • the correction amount setting unit 8 transmits the set distortion correction amount ⁇ to the delay adjustment unit 62.
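  • A minimal sketch of such a slider-based correction amount setting unit is shown below, using Python's standard tkinter toolkit purely as an illustration. The patent does not specify an implementation; the widget choice, names, and the callback are assumptions.

```python
import tkinter as tk

def on_theta_changed(value):
    # In the device this value would be transmitted to the delay adjustment unit 62;
    # here we simply print the newly selected distortion correction amount θ.
    print(f"distortion correction amount θ = {float(value):.1f}°")

root = tk.Tk()
root.title("Distortion correction")
# Scale bar from 0° to 360° with a movable slider, initially set to 0°.
slider = tk.Scale(root, from_=0, to=360, resolution=1, orient="horizontal",
                  length=300, label="θ [deg]", command=on_theta_changed)
slider.set(0)
slider.pack(padx=10, pady=10)
root.mainloop()
```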
  • When the waveform generation unit 41 starts generating the digital waveforms in response to the control signals from the control unit 7, the alternating voltages are applied to the piezoelectric elements 48A and 48B of the scanner 43, so that the distal end of the illumination optical fiber 44 vibrates in a spiral.
  • the white laser light L is scanned along the spiral scanning locus B on the surface of the subject A facing the distal end surface of the insertion portion 21.
  • The reflected light L′ of the laser light L reflected from the surface of the subject A is received by the light receiving optical fiber 51 and detected by the photodetector 52, and an electrical signal corresponding to the reflected light L′ is transmitted to the image generation unit 6.
  • the electric signal is digitally converted by the A / D converter 61, whereby the luminance value S (ti) of each pixel of the image is obtained.
  • In the delay adjustment unit 62, a luminance value data set D1, which is time-series data of the luminance values S(ti) at the detection times ti, is generated, and a luminance value data set D1′ is further generated.
  • In the control unit 7, a coordinate data set D2, which is time-series data of the coordinates P(ti) at the detection times ti, is generated, and the coordinate data set D2 is transmitted to the delay adjustment unit 62.
  • The delay adjustment unit 62 generates a coordinate data set D2′ from the coordinate data set D2 and associates each coordinate P(ti + Δt) in the coordinate data set D2′ with the luminance value S(ti) at the same time ti in the luminance value data set D1, thereby generating an image data set for one image frame.
  • At this stage, with the distortion correction amount θ still at its initial value of 0°, each luminance value S(ti) is associated with the coordinate P(ti), as indicated by the white circles in the figure.
  • However, the actual scanning locus B of the laser light L on the subject A does not have an ideal spiral shape but a distorted shape, as shown in the figure.
  • the distortion of the scanning locus B is caused by a plurality of factors such as the optical characteristics of the condensing lenses 11 and 12 and the mechanical characteristics of the scanner 43.
  • the image of the subject A is further distorted by adding the deviation of the irradiation position of the laser beam L due to various delays to the distortion of the scanning locus B.
  • the delay is a mechanical response delay of the scanner 43, an electrical delay associated with signal propagation and processing, or the like. Due to such a delay, at the detection time ti, the laser beam L is actually irradiated to a position advanced in the scanning direction from the coordinate P (ti) which is a theoretical irradiation position.
  • When the distortion correction amount θ is set to an angle other than 0° in the correction amount setting unit 8, the delay adjustment unit 62 shifts the coordinates P(ti) in the coordinate data set D2 and the detection times ti relative to each other in the time direction so that each coordinate P(ti) is delayed from the detection time ti by the shift time Δt based on the distortion correction amount θ.
  • As a result, each luminance value S(ti) is associated with the coordinate P(ti + Δt), as shown by the black circles in FIG. 7, and the image distortion due to the delay is reduced.
  • The distortion of the image due to the delay is completely removed when the shift time Δt is equal to the delay time between the theoretical irradiation position and the actual irradiation position of the laser light L (the time taken for the laser light L to travel from the theoretical irradiation position to the actual irradiation position). At this time, the image distortion is minimized. Therefore, while observing the image displayed on the display unit 9, the user operates the GUI, such as the scale bar 81, with an input device such as a mouse to change the distortion correction amount θ set in the correction amount setting unit 8, and finally sets in the correction amount setting unit 8 the distortion correction amount θ that minimizes the distortion of the image. Thereby, an image in which the distortion of the image of the subject A is corrected can be obtained on the screen of the display unit 9 in real time.
  • the distortion of the image of the subject A in the image can be corrected with high accuracy. Therefore, it is possible to form an image with corrected image distortion at a high frame rate required for live video, even with an inexpensive hardware configuration without using a high-performance CPU or dedicated hardware.
  • image distortion due to delay occurs in the entire image. By correcting such image distortion, there is an advantage that the image quality can be effectively improved.
  • Associating the luminance values S(ti) with the coordinates P(ti + Δt) at the time later by the shift time Δt means that the image of the subject A within the displayed image is rotated by the angle θ. Therefore, a process of rotating the image by the angle θ in the direction opposite to the rotation accompanying the delay adjustment may be applied to the image.
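  • Because the delay adjustment is equivalent to rotating the image of the subject by θ, the compensating counter-rotation mentioned above could look like the sketch below. This is an assumption-level example using scipy's generic image rotation; the patent does not prescribe any particular rotation method, and the function name and options are illustrative.

```python
import numpy as np
from scipy.ndimage import rotate

def counter_rotate(image, theta_deg):
    """Rotate the formed image by -θ to undo the rotation introduced
    by associating S(ti) with P(ti + Δt)."""
    # reshape=False keeps the original image dimensions; the frame is rotated
    # about its centre in the direction opposite to the delay-adjustment rotation.
    return rotate(image, angle=-theta_deg, reshape=False, order=1, mode="nearest")
```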
  • In the embodiment described above, the coordinates P(ti) in the coordinate data set D2 and the detection times ti are shifted relative to each other in the time direction; instead, the luminance values S(ti) in the luminance value data set D1 and the detection times ti may be shifted in the time direction. The same effect can be obtained in this way as well.
  • the optical scanning unit 4 scans the laser light L along the spiral scanning locus B.
  • the optical scanning unit 4 may scan along the concentric scanning locus.
  • the configuration of the scanner 43 can be appropriately changed according to the shape of the scanning locus.
  • an optical scanner using a photonic crystal may be used instead of the scanner 43 using the piezoelectric actuator 45.
  • the laser light L may be scanned along a Lissajous scanning locus.
  • the scanning direction of the laser light L is opposite on both sides of the diagonal line of the substantially rectangular scanning region. Therefore, as shown by a solid line in FIG. 8, distortion occurs in the image such that the images are shifted in opposite directions on one side and the other side across the diagonal line.
  • Such complex image distortion, which has been difficult to correct by conventional methods, can be corrected by simple calculation alone.
  • the scanning unit including the optical scanning unit 4 that scans the laser light L to acquire an optical image is provided.
  • Instead, a scanning unit including an ultrasonic scanning unit that scans an ultrasonic beam over the subject A and acquires an ultrasonic image by detecting the echo (signal wave) from the subject A may be provided.
  • Furthermore, although an endoscope apparatus has been described in this embodiment, the image rotation processing described above can be applied to any observation device that acquires an image of the subject A by rotationally scanning a light or ultrasonic beam over the subject A.
  • Each laser light may be pulsed light, oscillated in sequence so that the pulses do not overlap each other in time.
  • the color image quality can be maintained well.
  • A plurality of correction amount setting units 8 may be linked to each other so that the shift times change together according to the ratio of the deviation amounts determined by the wavelength characteristics. In this way, operability for the user can be improved.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

The present invention relates to a scanning observation device (1) comprising: a scanning unit (4) that scans a beam (L) for irradiating a subject (A); a detection unit (5) that detects a signal wave (L′) from the subject (A); a display unit (9) that displays an image; a correction amount setting unit (8) that allows a user to set a distortion correction amount; and an image generation unit (6) that generates an image by associating the signal wave (L′) with the irradiation position of the beam (L) at a time later, by a shift time based on the distortion correction amount, than the detection time by the detection unit (5).
PCT/JP2015/074430 2015-08-28 2015-08-28 Dispositif d'observation de type à balayage Ceased WO2017037781A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2015/074430 WO2017037781A1 (fr) 2015-08-28 2015-08-28 Dispositif d'observation de type à balayage
JP2017537051A JPWO2017037781A1 (ja) 2015-08-28 2015-08-28 走査型観察装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/074430 WO2017037781A1 (fr) 2015-08-28 2015-08-28 Dispositif d'observation de type à balayage

Publications (1)

Publication Number Publication Date
WO2017037781A1 true WO2017037781A1 (fr) 2017-03-09

Family

ID=58187193

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/074430 Ceased WO2017037781A1 (fr) 2015-08-28 2015-08-28 Dispositif d'observation de type à balayage

Country Status (2)

Country Link
JP (1) JPWO2017037781A1 (fr)
WO (1) WO2017037781A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020095114A (ja) * 2018-12-11 2020-06-18 株式会社日立製作所 光走査装置及び光走査方法
JP2021140007A (ja) * 2020-03-04 2021-09-16 株式会社日立製作所 光走査装置及び光走査方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001356273A (ja) * 2000-06-15 2001-12-26 Olympus Optical Co Ltd 共焦点光走査プローブ装置
JP2002131646A (ja) * 2000-08-03 2002-05-09 Leica Microsystems Heidelberg Gmbh 走査型顕微鏡法における位置信号および検出信号の位相補正のための方法ならびに装置および走査型顕微鏡
JP2010515947A (ja) * 2007-01-10 2010-05-13 ユニヴァーシティ オブ ワシントン 走査ビーム装置の較正
WO2014020943A1 (fr) * 2012-07-30 2014-02-06 オリンパスメディカルシステムズ株式会社 Système d'endoscope
JP2014068704A (ja) * 2012-09-28 2014-04-21 Nidek Co Ltd 眼科撮影装置

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3552534A3 (fr) * 2004-10-01 2020-04-15 University of Washington Procédés de remappage pour réduire les distorsions dans des images
WO2007084915A2 (fr) * 2006-01-17 2007-07-26 University Of Washington Imagerie optique non linéaire à balayage par fibre optique et endoscope de spectroscopie
US8757812B2 (en) * 2008-05-19 2014-06-24 University of Washington UW TechTransfer—Invention Licensing Scanning laser projection display devices and methods for projecting one or more images onto a surface with a light-scanning optical fiber
JP5570436B2 (ja) * 2011-01-06 2014-08-13 Hoya株式会社 キャリブレーション装置、及びキャリブレーション方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001356273A (ja) * 2000-06-15 2001-12-26 Olympus Optical Co Ltd 共焦点光走査プローブ装置
JP2002131646A (ja) * 2000-08-03 2002-05-09 Leica Microsystems Heidelberg Gmbh 走査型顕微鏡法における位置信号および検出信号の位相補正のための方法ならびに装置および走査型顕微鏡
JP2010515947A (ja) * 2007-01-10 2010-05-13 ユニヴァーシティ オブ ワシントン 走査ビーム装置の較正
WO2014020943A1 (fr) * 2012-07-30 2014-02-06 オリンパスメディカルシステムズ株式会社 Système d'endoscope
JP2014068704A (ja) * 2012-09-28 2014-04-21 Nidek Co Ltd 眼科撮影装置

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020095114A (ja) * 2018-12-11 2020-06-18 株式会社日立製作所 光走査装置及び光走査方法
JP7084293B2 (ja) 2018-12-11 2022-06-14 株式会社日立製作所 光走査装置及び光走査方法
JP2021140007A (ja) * 2020-03-04 2021-09-16 株式会社日立製作所 光走査装置及び光走査方法

Also Published As

Publication number Publication date
JPWO2017037781A1 (ja) 2018-06-14

Similar Documents

Publication Publication Date Title
US9993139B2 (en) Optical scanning observation apparatus and optical scanning observation method
US9832435B2 (en) Optical scanning image forming apparatus and optical scanning image forming method
JP5242148B2 (ja) 走査型顕微鏡装置
WO2016079768A1 (fr) Dispositif d'endoscope de type à balayage optique
JP2009222973A (ja) 画像投射装置
US20160327782A1 (en) Optical scanning observation apparatus
WO2017037781A1 (fr) Dispositif d'observation de type à balayage
US10108002B2 (en) Optical scanning image forming apparatus and optical scanning image forming method
JP6445809B2 (ja) 光走査型観察装置
WO2015145826A1 (fr) Dispositif endoscopique à balayage
WO2016116963A1 (fr) Procédé et dispositif de balayage optique
WO2017037782A1 (fr) Dispositif d'observation de type à balayage
US10491873B2 (en) Scanning observation apparatus and image display method of scanning observation apparatus
WO2016116962A1 (fr) Procédé et dispositif de balayage optique
TW200909975A (en) Device and method for compensating at least one non-linearity and laser projection system
JP2015025900A (ja) レーザ走査顕微鏡
JP2012147831A (ja) 走査位置補正装置
JP2010250028A (ja) 画像投射装置
WO2017104190A1 (fr) Système d'endoscope
JP2005241321A (ja) レーザ走査装置、レーザ走査型顕微鏡および走査プログラム
WO2016116968A1 (fr) Dispositif de balayage optique
KR102346342B1 (ko) 광간섭 단층 영상 시스템을 위한 멤스 미러 제어 장치 및 그 제어 방법
JP6053449B2 (ja) 顕微鏡
WO2017109814A1 (fr) Dispositif d'observation à balayage lumineux
JP5945390B2 (ja) 観察装置、観察装置の作動方法及び画像処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15902907

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017537051

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15902907

Country of ref document: EP

Kind code of ref document: A1