
WO2016117162A1 - Optical scanning observation system - Google Patents

Optical scanning observation system

Info

Publication number
WO2016117162A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
period
scanning
unit
output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2015/075090
Other languages
English (en)
Japanese (ja)
Inventor
和真 金子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to JP2016527478A priority Critical patent/JPWO2016117162A1/ja
Publication of WO2016117162A1 publication Critical patent/WO2016117162A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances

Definitions

  • the present invention relates to an optical scanning observation system, and more particularly to an optical scanning observation system that scans a subject to acquire an image.
  • A scanning endoscope that does not include a solid-state imaging device in the portion corresponding to the above-described insertion portion, and a system including such a scanning endoscope, are known.
  • In a system including a scanning endoscope, the subject is two-dimensionally scanned in a predetermined scanning pattern by swinging the tip of an illumination optical fiber that guides light emitted from a light source unit; return light from the subject is received by a light-receiving optical fiber, and an image of the subject is generated based on the return light received by the light-receiving optical fiber.
  • an optical scanning endoscope system disclosed in Japanese Patent Application Laid-Open No. 2010-63497 is known.
  • There is a problem that the image quality of the video displayed on a display device such as a monitor may be degraded, in accordance with the scanning of the subject, due to a mismatch between the timing at which an image is generated by scanning the subject with a predetermined scanning pattern and the timing at which that image is sequentially output to the display device by a predetermined scanning method.
  • Japanese Patent Application Laid-Open No. 2010-63497 does not mention a method capable of solving the above-described problem; that is, the problem remains unsolved.
  • The present invention has been made in view of the above-described circumstances, and an object thereof is to provide an optical scanning observation system capable of improving the image quality of an image displayed on a display device in accordance with the scanning of a subject.
  • An optical scanning observation system includes: an endoscope configured to scan a subject with illumination light emitted from a light source unit and to receive return light; a scanning control unit configured to perform control for alternately scanning the subject along a first scanning path and a second scanning path; a light detection unit configured to detect the return light received by the endoscope and to generate and output a light detection signal according to the intensity of the detected return light; an image generation unit configured to generate and output one image corresponding to the light detection signal output from the light detection unit during a first scanning period, which corresponds to the period during which the subject is scanned along the first scanning path; a display control unit configured to output the one image generated by the image generation unit twice in succession; an output processing unit configured to assign the one image output twice in succession from the display control unit to the odd field and the even field of an interlace method and to sequentially output the fields to a display device one field at a time; and a synchronization control unit configured to perform control for synchronizing the operations of the image generation unit, the display control unit, and the output processing unit so that the period during which the subject is scanned once along each of the first and second scanning paths and the period during which the one image generated by the image generation unit is displayed on the display device as an image for one interlaced frame have the same length.
  • FIG. 3 is a diagram illustrating an example of a configuration of an image processing unit according to the first embodiment.
  • FIG. 6 is a diagram for explaining specific operations performed in each unit of the image processing unit according to the first embodiment.
  • A diagram showing an example of the configuration of the image processing unit according to a first modification of the first embodiment.
  • A diagram for explaining operations performed in each unit of the image processing unit according to the first modification of the first embodiment.
  • A diagram showing an example of the configuration of the image processing unit according to a second modification of the first embodiment.
  • A diagram for explaining operations performed in each unit of the image processing unit according to the second modification of the first embodiment.
  • A diagram showing an example of the signal waveform of a drive signal supplied to the actuator unit in a second embodiment.
  • FIG. 1 is a diagram illustrating a configuration of a main part of an optical scanning observation system according to an embodiment.
  • The optical scanning observation system 1 includes a scanning endoscope 2 that is inserted into a body cavity of a subject, a main body device 3 to which the endoscope 2 can be connected, a display device 4 connected to the main body device 3, and an input device 5 capable of inputting information and instructions to the main body device 3.
  • the endoscope 2 includes an insertion portion 11 formed with an elongated shape that can be inserted into a body cavity of a subject.
  • a connector portion 61 for detachably connecting the endoscope 2 to the connector receiving portion 62 of the main body device 3 is provided at the proximal end portion of the insertion portion 11.
  • an electrical connector device for electrically connecting the endoscope 2 and the main body device 3 is provided inside the connector portion 61 and the connector receiving portion 62.
  • an optical connector device for optically connecting the endoscope 2 and the main body device 3 is provided inside the connector portion 61 and the connector receiving portion 62.
  • An illumination fiber 12, which is an optical fiber that guides the illumination light emitted from the light source unit 21 of the main body device 3 to the illumination optical system 14, is provided in the portion from the proximal end portion to the distal end portion inside the insertion portion 11.
  • the incident end including the light incident surface of the illumination fiber 12 is disposed in a multiplexer 32 provided inside the main body device 3. Further, the emission end portion including the light emission surface of the illumination fiber 12 is disposed in the vicinity of the light incident surface of the lens 14 a provided at the distal end portion of the insertion portion 11.
  • the incident end portion including the light incident surface of the light receiving fiber 13 is fixedly disposed around the light emitting surface of the lens 14 b on the distal end surface of the insertion portion 11. Further, the emission end portion including the light emission surface of the light receiving fiber 13 is disposed in a photodetector 37 provided inside the main body device 3.
  • the illumination optical system 14 includes a lens 14a on which illumination light having passed through the light emission surface of the illumination fiber 12 is incident, and a lens 14b that emits illumination light having passed through the lens 14a to a subject.
  • an actuator portion 15 that is driven according to a drive signal supplied from the driver unit 22 of the main body device 3 is provided.
  • the illumination fiber 12 and the actuator unit 15 are arranged so as to have, for example, the positional relationship shown in FIG. 2 in a cross section perpendicular to the longitudinal axis direction of the insertion unit 11.
  • FIG. 2 is a cross-sectional view for explaining the configuration of the actuator unit.
  • a ferrule 41 as a joining member is disposed between the illumination fiber 12 and the actuator unit 15.
  • the ferrule 41 is made of, for example, zirconia (ceramic) or nickel.
  • The ferrule 41 is formed as a quadrangular prism and includes side surfaces 42a and 42c perpendicular to the X-axis direction, which is a first axial direction orthogonal to the longitudinal axis of the insertion portion 11, and side surfaces 42b and 42d perpendicular to the Y-axis direction, which is a second axial direction orthogonal to the longitudinal axis of the insertion portion 11.
  • the illumination fiber 12 is fixedly arranged at the center of the ferrule 41.
  • The ferrule 41 may be formed in a shape other than a quadrangular prism as long as it has a prismatic shape.
  • The actuator unit 15, which functions as an optical scanning unit, includes a piezoelectric element 15a disposed along the side surface 42a, a piezoelectric element 15b disposed along the side surface 42b, a piezoelectric element 15c disposed along the side surface 42c, and a piezoelectric element 15d disposed along the side surface 42d.
  • the piezoelectric elements 15a to 15d have polarization directions set individually in advance, and are configured to expand and contract in accordance with a drive signal supplied from the main body device 3.
  • the endoscope 2 is configured to scan the subject with the illumination light emitted from the light source unit 21 of the main body device 3 and to receive the return light from the subject through the light receiving fiber 13.
  • a memory 16 for storing endoscope information including ID information unique to each endoscope 2 is provided inside the insertion unit 11.
  • The endoscope information stored in the memory 16 is read by the controller 25 of the main body device 3 when the connector portion 61 of the endoscope 2 is connected to the connector receiving portion 62 of the main body device 3 and the power of the main body device 3 is turned on.
  • the main unit 3 includes a light source unit 21, a driver unit 22, a detection unit 23, a memory 24, and a controller 25.
  • the light source unit 21 includes a light source 31a, a light source 31b, a light source 31c, and a multiplexer 32.
  • the light source 31a includes, for example, a laser light source that emits light in a red wavelength band (hereinafter also referred to as R light).
  • the light source 31a is configured to be switched to a light emitting state (on state) or a quenching state (off state) according to control of the controller 25.
  • the light source 31a is configured to emit R light with a light amount according to the control of the controller 25 in the light emitting state.
  • The light source 31b includes, for example, a laser light source that emits light in a green wavelength band (hereinafter also referred to as G light). The light source 31b is configured to be switched to a light emitting state (on state) or a quenching state (off state) in accordance with the control of the controller 25, and to emit G light with a light amount according to the control of the controller 25 in the light emitting state.
  • the light source 31c includes, for example, a laser light source that emits light in a blue wavelength band (hereinafter also referred to as B light).
  • the light source 31c is configured to be switched to a light emitting state (on state) or a quenching state (off state) according to control of the controller 25.
  • The light source 31c is configured to emit B light with a light amount according to the control of the controller 25 in the light emitting state.
  • The multiplexer 32 is configured to multiplex the R light emitted from the light source 31a, the G light emitted from the light source 31b, and the B light emitted from the light source 31c, and to emit the multiplexed light to the light incident surface of the illumination fiber 12.
  • the driver unit 22 includes a signal generator 33, D / A converters 34a and 34b, and an amplifier 35.
  • the signal generator 33 is configured to generate and output a drive signal for swinging the emission end of the illumination fiber 12 in accordance with the control of the controller 25.
  • The signal generator 33 generates, for example, a signal having a signal waveform as shown in FIG. 3 as the drive signal DA for swinging the emission end of the illumination fiber 12 in the X-axis direction, and outputs it to the D/A converter 34a. In addition, under the control of the controller 25, the signal generator 33 generates a signal whose phase is shifted by 90° from the drive signal DA as the drive signal DB for swinging the emission end of the illumination fiber 12 in the Y-axis direction, and outputs it to the D/A converter 34b.
  • FIG. 3 is a diagram illustrating an example of a signal waveform of a drive signal supplied to the actuator unit in the first embodiment.
  • the D / A converter 34 a is configured to convert the digital drive signal DA output from the signal generator 33 into an analog drive signal DA and output the analog drive signal DA to the amplifier 35.
  • the D / A converter 34 b is configured to convert the digital drive signal DB output from the signal generator 33 into an analog drive signal DB and output the analog drive signal DB to the amplifier 35.
  • the amplifier 35 is configured to amplify the drive signals DA and DB output from the D / A converters 34 a and 34 b and output the amplified signals to the actuator unit 15.
  • FIG. 4 is a diagram illustrating an example of a spiral scanning path from the center point A to the outermost point B, which is observed when a drive signal corresponding to the signal waveform of FIG. 3 is supplied to the actuator unit.
  • FIG. 5 is a diagram illustrating an example of a spiral scanning path from the outermost point B to the center point A, which is observed when a drive signal corresponding to the signal waveform of FIG. 3 is supplied to the actuator unit.
  • the illumination light is irradiated to a position corresponding to the center point A of the irradiation position of the illumination light on the surface of the subject.
  • the irradiation position of the illumination light on the surface of the subject is displaced so as to draw the spiral scanning path SP1 outward from the center point A.
  • the illumination light is irradiated to the outermost point B of the illumination light irradiation position on the surface of the subject.
  • the irradiation position of the illumination light on the surface of the subject draws the spiral scanning path SP2 from the outermost point B as the starting point.
  • illumination light is irradiated to the center point A on the surface of the subject.
  • The actuator unit 15 swings the emission end of the illumination fiber 12 in accordance with the drive signals DA and DB supplied from the driver unit 22, thereby displacing the irradiation position of the illumination light emitted through the emission end onto the subject along the spiral scanning paths SP1 and SP2.
  • scanning is performed such that the number of turns of the spiral scanning path SP1 is larger than the number of turns of the spiral scanning path SP2.
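As a rough numerical illustration of the scanning described above: two sinusoidal drive signals 90° out of phase, with a linearly ramped amplitude envelope, trace a spiral whose direction depends on whether the envelope ramps up (outward path SP1) or down (return path SP2). The turn counts and sample counts below are illustrative assumptions, not values taken from this disclosure.

```python
import math

def spiral_scan_points(n_turns, samples_per_turn, outward=True):
    """Generate (x, y) points of a spiral scan path.

    Two sinusoids 90 degrees out of phase (drive signals DA and DB),
    multiplied by a linearly ramped amplitude envelope, trace a spiral:
    ramping up gives the outward path SP1 (center A to outermost B),
    ramping down gives the return path SP2 (B back to A).
    """
    n = n_turns * samples_per_turn
    points = []
    for i in range(n + 1):
        t = i / n                      # normalized time 0..1
        amp = t if outward else 1 - t  # amplitude envelope
        phase = 2 * math.pi * n_turns * t
        x = amp * math.cos(phase)      # drive signal DA (X axis)
        y = amp * math.sin(phase)      # DA shifted by 90 deg, i.e. DB (Y axis)
        points.append((x, y))
    return points

# SP1 with more turns than SP2, as in the scanning described above
sp1 = spiral_scan_points(n_turns=10, samples_per_turn=64, outward=True)
sp2 = spiral_scan_points(n_turns=3, samples_per_turn=64, outward=False)
```

The paths start and end where the description requires: SP1 begins at the center point A (radius 0) and ends at the outermost point B (radius 1), and SP2 returns to radius 0.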
  • the detection unit 23 is configured to detect return light received by the light receiving fiber 13 of the endoscope 2 and generate and output a light detection signal according to the intensity of the detected return light.
  • the detection unit 23 includes a photodetector 37 and an A / D converter 38.
  • The photodetector 37 includes, for example, an avalanche photodiode, detects the light (return light) emitted from the light emission surface of the light receiving fiber 13, generates an analog light detection signal according to the intensity of the detected light, and sequentially outputs the signal to the A/D converter 38.
  • the A / D converter 38 is configured to convert the analog light detection signal output from the light detector 37 into a digital light detection signal and sequentially output the digital light detection signal to the controller 25.
  • In the memory 24, information including parameters such as a signal level, a frequency, and a signal amplification factor for specifying the signal waveform of FIG. 3 is stored in advance as control information used when controlling the main body device 3.
  • the controller 25 includes, for example, an integrated circuit such as an FPGA (Field Programmable Gate Array), and is configured to be able to form a circuit corresponding to each unit described below based on a predetermined program. Further, the controller 25 detects whether or not the insertion portion 11 is electrically connected to the main body device 3 by detecting the connection state of the connector portion 61 in the connector receiving portion 62 via a signal line or the like (not shown). It is configured to be able to. The controller 25 is configured to read control information stored in the memory 24 when the power of the main body device 3 is turned on.
  • the controller 25 includes a light source control unit 25a, a scanning control unit 25b, a synchronization control unit 25c, and an image processing unit 25d.
  • the light source control unit 25a is configured to perform control for the light source unit 21 to repeatedly emit, for example, R light, G light, and B light in this order.
  • The light source control unit 25a according to the present embodiment performs the above-described control during the period in which the subject is scanned along the spiral scanning path SP1 (the period from time Ta to time Tb), and may control the light source unit 21 to stop the emission of the R light, G light, and B light during the period in which the subject is scanned along the spiral scanning path SP2 (the period from time Tb to time Tc).
  • Based on the control information read from the memory 24, the scanning control unit 25b repeatedly generates the drive signals DA and DB having the signal waveforms described above, thereby causing the driver unit 22 to perform control for alternately scanning the subject along the spiral scanning paths SP1 and SP2.
  • The synchronization control unit 25c is configured to generate and output a synchronization signal for synchronizing the operations (described later) of each unit of the image processing unit 25d so that the period during which the subject is scanned once along each of the spiral scanning paths SP1 and SP2 and the period during which one image generated by the image generation unit 51 (described later) is displayed on the display device 4 as an image for one interlaced frame have the same length.
  • The synchronization control unit 25c generates and outputs a synchronization signal for synchronizing the operation of each unit of the image processing unit 25d in accordance with a scanning period PS1, which corresponds to the period during which the subject is scanned along the spiral scanning path SP1 and is longer than one interlace field period, and a scanning period PN1, which corresponds to the period during which the subject is scanned along the spiral scanning path SP2 and is shorter than one interlace field period. That is, according to this operation of the synchronization control unit 25c, the scanning period PS1 starts at the timing corresponding to time Ta and ends at the timing corresponding to time Tb.
  • the scanning period PN1 is started at the timing corresponding to the time Tb, and the scanning period PN1 is ended at the timing corresponding to the time Tc.
  • the scanning period PS1 corresponds to a period in which the emission end of the illumination fiber 12 is swung so as to draw a spiral locus corresponding to the spiral scanning path SP1.
  • the scanning period PN1 corresponds to a period in which the emission end of the illumination fiber 12 is swung so as to draw a spiral trajectory corresponding to the spiral scanning path SP2.
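The timing constraint described above (PS1 longer than one field period, PN1 shorter, and PS1 + PN1 equal to one interlaced frame of two fields) can be checked with simple arithmetic. The concrete field rate and durations below are illustrative assumptions only, not values from this disclosure.

```python
# Illustrative timing check: one full scan cycle (outward period PS1
# plus return period PN1) must equal one interlaced frame, i.e. two
# field periods, with PS1 longer and PN1 shorter than one field.
# The concrete durations here are assumptions for illustration.

FIELD_PERIOD_MS = 1000 / 60          # one interlace field (~16.7 ms assumed)
FRAME_PERIOD_MS = 2 * FIELD_PERIOD_MS

PS1_MS = 20.0                        # outward spiral SP1 (longer than a field)
PN1_MS = FRAME_PERIOD_MS - PS1_MS    # return spiral SP2 fills the remainder

assert PS1_MS > FIELD_PERIOD_MS > PN1_MS
assert abs((PS1_MS + PN1_MS) - FRAME_PERIOD_MS) < 1e-9
```

Because the return period PN1 carries no display output of its own, making it shorter than a field (and SP1 correspondingly longer) is what lets one scan cycle line up exactly with one two-field frame.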
  • The synchronization signal is not limited to one output from the synchronization control unit 25c integrated in the controller 25; it may instead be output from a device such as a timing generator provided separately from the controller 25.
  • The image processing unit 25d generates one image corresponding to the light detection signal output from the detection unit 23 during the scanning period PS1 in accordance with the synchronization signal output from the synchronization control unit 25c, assigns the generated image to the odd field and the even field of the interlace method, and sequentially outputs the fields to the display device 4 one field at a time.
  • The image processing unit 25d includes an image generation unit 51, a buffer unit 52, a display control unit 53, and an output processing unit 54.
  • FIG. 6 is a diagram illustrating an example of the configuration of the image processing unit according to the first embodiment.
  • the image generation unit 51 includes a memory 51m that can write one image generated as described later. Further, the image generation unit 51 generates and outputs one image corresponding to the light detection signal output from the detection unit 23 during the scanning period PS1 in accordance with the synchronization signal output from the synchronization control unit 25c. It is configured.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 generates a single image by mapping the light detection signal output from the detection unit 23 as pixel information in a raster shape during the scanning period PS1, and sequentially writes the generated image to the memory 51m. Further, the image generation unit 51 is configured to read out the latest image written in the memory 51m and sequentially output it to the buffer unit 52 during the scanning period PN1, in accordance with the synchronization signal output from the synchronization control unit 25c.
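The mapping of detected samples to raster pixel positions described above can be sketched as follows. The function name and the nearest-pixel scheme are hypothetical simplifications: an actual system would use the known spiral irradiation coordinates for each sample and typically interpolate rather than overwrite.

```python
def map_to_raster(samples, positions, size):
    """Map light-detection samples onto a square raster image.

    samples   -- intensities detected along the scan path
    positions -- matching (x, y) irradiation coordinates in [-1, 1]
    size      -- side length, in pixels, of the square output image

    Each sample is written to the nearest pixel; a later sample simply
    overwrites an earlier one that lands on the same pixel.
    """
    image = [[0] * size for _ in range(size)]
    for value, (x, y) in zip(samples, positions):
        col = min(size - 1, int((x + 1) / 2 * size))
        row = min(size - 1, int((y + 1) / 2 * size))
        image[row][col] = value
    return image

img = map_to_raster([10, 20, 30], [(-1.0, -1.0), (0.0, 0.0), (1.0, 1.0)], size=4)
# img[0][0] == 10, img[2][2] == 20, img[3][3] == 30
```

Pixels never visited by the spiral keep their initial value, which is one reason the output processing stage later masks the image down to a circular region matching the scan's outermost edge.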
  • The buffer unit 52 is configured to temporarily hold the image output from the image generation unit 51 and to output it to the display control unit 53 in accordance with the synchronization signal output from the synchronization control unit 25c.
  • The buffer unit 52 includes, for example, a memory 52m to which the image output from the image generation unit 51 can be written; in accordance with the synchronization signal output from the synchronization control unit 25c, it sequentially writes the images output from the image generation unit 51 to the memory 52m, reads out the latest image written in the memory 52m, and sequentially outputs it to the display control unit 53.
  • The display control unit 53 includes a memory 53m having two storage areas MA and MB for individually writing two temporally adjacent images among the images sequentially output from the buffer unit 52.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 53 performs an operation for writing the image output from the buffer unit 52 to one storage area of the memory 53m, and an operation for reading the image written in the other storage area of the memory 53m twice in succession and outputting it to the output processing unit 54. Further, the display control unit 53 is configured to perform a scaling process for enlarging the image read from the memory 53m to a resolution corresponding to, for example, HD (high definition) image quality, and to output the processed image to the output processing unit 54.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the output processing unit 54 performs an operation for assigning the one image output twice in succession from the display control unit 53 to the interlaced odd field and even field and sequentially outputting the fields to the display device 4 one field at a time. The output processing unit 54 is also configured to output the image for one field assigned to the odd field or the even field to the display device 4 in accordance with a digital video transmission standard such as HD-SDI. In addition, the output processing unit 54 performs, on the image output from the display control unit 53, a trimming process or a masking process for displaying the interlaced image for one frame as, for example, a circular image having a shape similar to the outermost edge of the spiral scanning path SP1, and outputs the result to the display device 4. In the present embodiment, information used in the above-described trimming process or masking process may be included in the control information stored in the memory 24.
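The assignment of one frame image to the interlaced odd and even fields performed by the output processing unit can be sketched as follows. Which parity the top line belongs to is a display convention; here it is assumed, purely for illustration, that the top line starts the odd field.

```python
def split_into_fields(frame):
    """Split one frame (a list of scan lines) into two interlaced fields.

    The same frame image, output twice in succession by the display
    control unit, supplies both fields: one pass keeps lines 0, 2, 4, ...
    (assumed here to be the odd field) and the other keeps lines
    1, 3, 5, ... (the even field).
    """
    odd_field = frame[0::2]
    even_field = frame[1::2]
    return odd_field, even_field

frame = ["line0", "line1", "line2", "line3"]
odd, even = split_into_fields(frame)
# odd  -> ["line0", "line2"]
# even -> ["line1", "line3"]
```

Because both fields are taken from the same frame image, displaying them in consecutive field periods reconstructs that one image on the interlaced display without inter-field motion artifacts.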
  • the display device 4 is composed of a monitor that supports digital input, for example.
  • the display device 4 is configured to display images sequentially output from the main device 3 in an interlaced manner.
  • the input device 5 includes, for example, a keyboard or a touch panel.
  • the input device 5 may be configured as a separate device from the main body device 3 or may be configured as an interface integrated with the main body device 3.
  • After connecting each part of the optical scanning observation system 1 and turning on the power, the user instructs the controller 25 to scan a desired subject with the endoscope 2 by, for example, turning on a scanning start switch (not shown) of the input device 5.
  • When the light source control unit 25a detects that the scanning start switch of the input device 5 has been turned on, it controls the light source unit 21 to repeatedly emit the R light, G light, and B light in this order.
  • When the scanning control unit 25b detects that the scanning start switch of the input device 5 has been turned on, it repeatedly generates the drive signals DA and DB, thereby causing the driver unit 22 to perform control for alternately scanning the desired subject along the spiral scanning paths SP1 and SP2.
  • The desired subject is scanned with the R light, G light, and B light that are the illumination light emitted from the light source unit 21; the return light from the desired subject is detected by the detection unit 23, and a light detection signal corresponding to the intensity of the return light is input to the image processing unit 25d.
  • When the synchronization control unit 25c detects that the scanning start switch of the input device 5 has been turned on, it generates and outputs a synchronization signal for synchronizing the operation of each unit of the image processing unit 25d in accordance with the scanning period PS1 and the scanning period PN1.
  • FIG. 7 is a diagram for explaining specific operations performed in each unit of the image processing unit according to the first embodiment.
  • the scanning period PS1 and the scanning period PN1 are alternately switched in this order.
  • a case where an image of the Pth frame is displayed on the display device 4 by the interlace method will be described as an example.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 generates an image IS1 by mapping the light detection signal output from the detection unit 23 as pixel information in a raster shape during the scanning period PS1 from time T1, which corresponds to time Ta in FIG. 3, to time T2, which corresponds to time Tb in FIG. 3, and performs an operation for writing the generated image IS1 to the memory 51m (see the memory 51m in FIG. 7).
  • The image generation unit 51 reads the image IS1 written in the memory 51m and outputs it to the buffer unit 52 during the scanning period PN1 from time T2 to time T3, which corresponds to time Tc in FIG. 3 (see the memory 51m in FIG. 7).
  • the buffer unit 52 performs an operation for writing the image IS1 output from the image generation unit 51 in the memory 52m in the scanning period PN1 from time T2 to time T3 in accordance with the synchronization signal output from the synchronization control unit 25c. (See memory 52m in FIG. 7).
  • the buffer unit 52 reads the image IS1 written in the memory 52m in the interlaced one-field period PF1 corresponding to the period from the time T3 to the time T4 in accordance with the synchronization signal output from the synchronization control unit 25c. Then, an operation for outputting to the display control unit 53 is performed (see the memory 52m in FIG. 7).
  • The above-described time T4 is set in advance as the timing that bisects the period obtained by adding one scanning period PS1 and one scanning period PN1.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 53 performs, during the one-field period PF1 from time T3 to time T4, an operation for writing the image IS1 output from the buffer unit 52 to the storage area MA of the memory 53m (see the memory 53m / storage area MA in FIG. 7), while reading the image (different from the image IS1) written to the storage area MB of the memory 53m before the one-field period PF1 and outputting it to the output processing unit 54 (see the memory 53m / storage area MB in FIG. 7).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 53 reads the image IS1 written in the storage area MA of the memory 53m and outputs it to the output processing unit 54 in the interlaced one-field period PF1 corresponding to the period from time T4 to time T5, which is the timing corresponding to time Tc in FIG. 3 (see the memory 53m / storage area MA in FIG. 7).
  • Further, in the interlaced one-field period PF1 corresponding to the period from time T5 to time T6, the display control unit 53 reads the image IS1 written in the storage area MA of the memory 53m and outputs it to the output processing unit 54 (see the memory 53m / storage area MA in FIG. 7), while writing the image (different from the image IS1) output from the buffer unit 52 into the storage area MB of the memory 53m (see the memory 53m / storage area MB in FIG. 7).
  • Note that the time T6 described above is set in advance in the same manner as the time T4.
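The alternating roles of the storage areas MA and MB described above amount to a classic double-buffering (ping-pong) scheme: one area receives the new image while the previous image is read out of the other, and the roles swap at the cycle boundary (times T4/T6). A minimal sketch in Python; the class and method names are illustrative, not taken from the patent:

```python
class PingPongDisplayBuffer:
    """Double buffer: one area is written while the other is read,
    and the roles swap once per scan cycle (hypothetical sketch)."""

    def __init__(self):
        self.areas = {"MA": None, "MB": None}
        self.write_area = "MA"   # area receiving the new image
        self.read_area = "MB"    # area holding the previous image

    def begin_cycle(self, new_image):
        # Write the incoming image while the reader still sees the old one.
        self.areas[self.write_area] = new_image

    def read(self):
        # The reader always gets the image finished in the previous cycle.
        return self.areas[self.read_area]

    def swap(self):
        # Exchange the roles at the cycle boundary (e.g. time T4/T6 above).
        self.write_area, self.read_area = self.read_area, self.write_area
```

Because reading always targets the area written in the previous cycle, the display never sees a partially written image.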
  • the output processing unit 54 scans the image IS1 output from the display control unit 53 in an interlaced manner in one field period PF1 from time T4 to time T5 in accordance with the synchronization signal output from the synchronization control unit 25c. An image for one field is acquired, and an operation for assigning the acquired image for one field to an odd field and outputting it to the display device 4 is performed (see the output processing unit 54 in FIG. 7). Further, the output processing unit 54 scans the image IS1 output from the display control unit 53 in an interlaced manner in one field period PF1 from time T5 to time T6 according to the synchronization signal output from the synchronization control unit 25c. Thus, an image for one field is acquired, and an operation for assigning the acquired image for one field to an even field and outputting the image to the display device 4 is performed (see the output processing unit 54 in FIG. 7).
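The operation of scanning one progressive image "in an interlaced manner" to obtain an odd field and then an even field can be sketched as taking every second row of the frame. The helper below is a hypothetical illustration, not the patent's implementation:

```python
def extract_field(frame, parity):
    """Take every second row of a progressive frame.

    parity='odd'  -> rows 0, 2, 4, ... (odd field, 1-based lines 1, 3, 5, ...)
    parity='even' -> rows 1, 3, 5, ... (even field)
    """
    start = 0 if parity == "odd" else 1
    return frame[start::2]

# One generated image is output twice: once as the odd field,
# once as the even field, together forming one interlaced frame.
frame = [[r] * 4 for r in range(6)]        # 6-row dummy image
odd_field = extract_field(frame, "odd")    # rows 0, 2, 4
even_field = extract_field(frame, "even")  # rows 1, 3, 5
```

The two fields together cover every row of the frame exactly once, which is why the same image IS1 is displayed over two consecutive field periods.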
  • In this way, the image IS1 generated in the scanning period PS1 from time T1 to time T2 is displayed on the display device 4 by the interlace method as the image of the Pth frame in the period from time T4 to time T6 (see the display device 4 in FIG. 7).
  • As described above, according to the present embodiment, an image obtained by scanning a desired subject using the endoscope 2 can be sequentially output to the display device 4 at timings suitable for the interlace method. Therefore, according to the present embodiment, the image quality of the video displayed on the display device in accordance with the scanning of the subject can be improved as compared with the conventional art.
  • the optical scanning observation system 1 may be configured to include, for example, an image processing unit 25e as shown in FIG. 8 instead of the image processing unit 25d in FIG.
  • FIG. 8 is a diagram illustrating an example of the configuration of the image processing unit according to the first modification of the first embodiment. In the following, for the sake of simplicity, specific descriptions regarding the above-described configuration and operation will be omitted as appropriate.
  • the image processing unit 25e includes an image generation unit 51, a buffer unit 52, a display control unit 73, and an output processing unit 54.
  • The display control unit 73 includes a memory 73m into which the images sequentially output from the buffer unit 52 are written. Note that the memory 73m is not provided with dedicated storage areas, such as the storage areas MA and MB described above, for individually writing two temporally adjacent images.
  • The display control unit 73 writes the image output from the buffer unit 52 into the memory 73m in accordance with the synchronization signal output from the synchronization control unit 25c, and starts reading the image at the timing when the amount of the image written to the memory 73m reaches a predetermined amount.
  • The display control unit 73 performs, for example, a scaling process for enlarging the image read from the memory 73m to a resolution corresponding to HD (high definition) image quality, and outputs the scaled image to the output processing unit 54.
  • FIG. 9 is a diagram for explaining specific operations performed in each unit of the image processing unit according to the first modification example of the first embodiment.
  • In the scanning period PS1 from time T11, which is the timing corresponding to time Ta in FIG. 3, to time T12, which is the timing corresponding to time Tb in FIG. 3, the image generation unit 51, in accordance with the synchronization signal output from the synchronization control unit 25c, generates an image IS2 by mapping the light detection signal output from the detection unit 23 in a raster shape as pixel information, and writes the generated image IS2 into the memory 51m (see the memory 51m in FIG. 9).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 reads the image IS2 written in the memory 51m and outputs it to the buffer unit 52 in the scanning period PN1 from time T12 to time T13, which is the timing corresponding to time Tc in FIG. 3 (see the memory 51m in FIG. 9).
  • the buffer unit 52 performs an operation for writing the image IS2 output from the image generation unit 51 in the memory 52m in the scanning period PN1 from time T12 to time T13 in accordance with the synchronization signal output from the synchronization control unit 25c. (See memory 52m in FIG. 9). Further, the buffer unit 52 reads out the image IS2 written in the memory 52m and outputs it to the display control unit 73 during the period from the time T13 to the time T14 in accordance with the synchronization signal output from the synchronization control unit 25c. (Refer to the memory 52m in FIG. 9).
  • In the present modification, a case will be described as an example in which time T14 is the timing that bisects the period obtained by adding one scanning period PS1 and one scanning period PN1, that is, in which the period from time T13 to time T14 is equal to one interlaced field period PF2.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 73 performs an operation for writing the image IS2 output from the buffer unit 52 into the memory 73m in the period from time T13 to time T14 (see the memory 73m in FIG. 9).
  • The display control unit 73 starts reading the image IS2 at, for example, time T15, which corresponds to the timing at which half of the image IS2 output from the buffer unit 52 has been written into the memory 73m, that is, before the writing of the image IS2 to the memory 73m is completed. In the interlaced one-field period PF2 corresponding to the period from time T15 to time T16, the display control unit 73 completes the reading of the image IS2 written in the memory 73m, subjects the read image IS2 to the scaling process, and outputs it to the output processing unit 54 (see the memory 73m in FIG. 9).
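Starting the readout once half of the image has been written shortens the latency between scanning and display. A simplified sketch of this early-read streaming, under the assumption (guaranteed by the timing design above) that the reader never overtakes the writer; names and the fraction parameter are illustrative:

```python
def stream_with_early_read(rows, start_fraction=0.5):
    """Write rows into a buffer; begin emitting them once
    `start_fraction` of the image has been written (sketch only --
    a real implementation must guarantee reads never overtake writes)."""
    buffer, out = [], []
    start_at = int(len(rows) * start_fraction)
    read_idx = 0
    for i, row in enumerate(rows):
        buffer.append(row)              # write side
        if i + 1 >= start_at:           # read side lags behind the writer
            out.append(buffer[read_idx])
            read_idx += 1
    while read_idx < len(buffer):       # drain what remains after writing ends
        out.append(buffer[read_idx])
        read_idx += 1
    return out
```

All rows come out in order, but the first row is emitted as soon as the write position reaches the start fraction rather than after the whole image is stored.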
  • Further, the display control unit 73 performs the scaling process on the image IS2 read from the memory 73m in the interlaced one-field period PF2 corresponding to the period from time T16 to time T17, and outputs it to the output processing unit 54 (see the memory 73m in FIG. 9).
  • Note that the timing at which the display control unit 73 starts reading the image IS2 is adjusted in advance so that the reading period matches the one-field period PF2.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the output processing unit 54 scans the image IS2 output from the display control unit 73 in an interlaced manner in the one-field period PF2 from time T15 to time T16 to acquire an image for one field, and assigns the acquired image for one field to the odd field and outputs it to the display device 4 (see the output processing unit 54 in FIG. 9). Further, the output processing unit 54 scans the image IS2 output from the display control unit 73 in an interlaced manner in the one-field period PF2 from time T16 to time T17 to acquire an image for one field, and assigns the acquired image for one field to the even field and outputs it to the display device 4 (see the output processing unit 54 in FIG. 9).
  • In this way, the image IS2 generated in the scanning period PS1 from time T11 to time T12 is displayed on the display device 4 by the interlace method as the image of the Pth frame in the period from time T15 to time T17 (see the display device 4 in FIG. 9). Therefore, according to the operation of the image processing unit 25e as described above, it is possible to reduce, for example, the visual discomfort that may occur when observing a desired subject while viewing the video displayed on the display device 4.
  • As described above, also in the present modification, an image obtained by scanning a desired subject using the endoscope 2 can be sequentially output to the display device 4 at timings suitable for the interlace method. Therefore, also in this modification, the image quality of the video displayed on the display device in accordance with the scanning of the subject can be improved as compared with the conventional art.
  • the optical scanning observation system 1 includes, for example, an image processing unit 25f as shown in FIG. 10 instead of the image processing unit 25d in FIG. 6 and the image processing unit 25e in FIG. It may be configured.
  • FIG. 10 is a diagram illustrating an example of the configuration of the image processing unit according to the second modification example of the first embodiment.
  • the image processing unit 25f includes an image generation unit 51, a display control unit 53, and an output processing unit 54. That is, the image processing unit 25f has substantially the same configuration as that obtained by removing the buffer unit 52 from the image processing unit 25d.
  • FIG. 11 is a diagram for explaining specific operations and the like performed in each unit of the image processing unit according to the second modification example of the first embodiment.
  • In the scanning period PS1 from time T21, which is the timing corresponding to time Ta in FIG. 3, to time T22, which is the timing corresponding to time Tb in FIG. 3, the image generation unit 51, in accordance with the synchronization signal output from the synchronization control unit 25c, generates an image IS3 by mapping the light detection signal output from the detection unit 23 in a raster shape as pixel information, and writes the generated image IS3 into the memory 51m (see the memory 51m in FIG. 11).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 reads the image IS3 written in the memory 51m and outputs it to the display control unit 53 in the scanning period PN1 from time T22 to time T23, which is the timing corresponding to time Tc in FIG. 3 (see the memory 51m in FIG. 11).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 53 writes the image IS3 output from the image generation unit 51 into the storage area MA of the memory 53m in the scanning period PN1 from time T22 to time T23 (see the memory 53m / storage area MA in FIG. 11), while reading the image (different from the image IS3) written in the storage area MB of the memory 53m before the scanning period PN1 and outputting it to the output processing unit 54 (see the memory 53m / storage area MB in FIG. 11).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 53 reads the image IS3 written in the storage area MA of the memory 53m and outputs it to the output processing unit 54 in the interlaced one-field period PF3 corresponding to the period from time T23 to time T24 (see the memory 53m / storage area MA in FIG. 11).
  • Note that the above-described time T24 is set in advance as the timing that bisects the period obtained by adding one scanning period PS1 and one scanning period PN1.
  • Further, in the interlaced one-field period PF3 corresponding to the period from time T24 to time T25, which is the timing corresponding to time Tc in FIG. 3, the display control unit 53 reads the image IS3 written in the storage area MA of the memory 53m and outputs it to the output processing unit 54 (see the memory 53m / storage area MA in FIG. 11), while writing the image (different from the image IS3) output from the image generation unit 51 into the storage area MB of the memory 53m (see the memory 53m / storage area MB in FIG. 11).
  • the output processing unit 54 scans the image IS3 output from the display control unit 53 in an interlaced manner in one field period PF3 from time T23 to time T24 in accordance with the synchronization signal output from the synchronization control unit 25c. An image for one field is acquired, and an operation for assigning the acquired image for one field to an odd field and outputting it to the display device 4 is performed (see the output processing unit 54 in FIG. 11). Further, the output processing unit 54 scans the image IS3 output from the display control unit 53 in an interlaced manner in one field period PF3 from time T24 to time T25 in accordance with the synchronization signal output from the synchronization control unit 25c. Thus, an image for one field is acquired, and an operation for assigning the acquired image for one field to an even field and outputting it to the display device 4 is performed (see the output processing unit 54 in FIG. 11).
  • In this way, the image IS3 generated in the scanning period PS1 from time T21 to time T22 is displayed on the display device 4 by the interlace method as the image of the Pth frame in the period from time T23 to time T25 (see the display device 4 in FIG. 11). Therefore, according to the operation of the image processing unit 25f as described above, it is possible to reduce, for example, the visual discomfort that may occur when observing a desired subject while viewing the video displayed on the display device 4.
  • As described above, also in the present modification, an image obtained by scanning a desired subject using the endoscope 2 can be sequentially output to the display device 4 at timings suitable for the interlace method. Therefore, also in this modification, the image quality of the video displayed on the display device in accordance with the scanning of the subject can be improved as compared with the conventional art.
  • (Second embodiment) FIGS. 12 to 15 relate to a second embodiment of the present invention.
  • The scanning control unit 25b is configured to control the driver unit 22 (signal generator 33) so as to repeatedly generate, as the drive signal DC for swinging the emission end of the illumination fiber 12 in the X-axis direction, a signal having, for example, the signal waveform shown in FIG. 12.
  • Further, based on the control information read from the memory 24, the scanning control unit 25b is configured to control the driver unit 22 (signal generator 33) so as to repeatedly generate, as the drive signal DD for swinging the emission end of the illumination fiber 12 in the Y-axis direction, a signal whose phase is shifted by 90 degrees from that of the drive signal DC.
  • FIG. 12 is a diagram illustrating an example of a signal waveform of a drive signal supplied to the actuator unit in the second embodiment.
  • FIG. 13 is a diagram illustrating an example of a spiral scanning path from the center point A to the outermost point B, which is observed when a drive signal corresponding to the signal waveform of FIG. 12 is supplied to the actuator unit.
  • FIG. 14 is a diagram illustrating an example of a spiral scanning path from the outermost point B to the center point A, which is observed when a drive signal corresponding to the signal waveform of FIG. 12 is supplied to the actuator unit.
  • When the drive signals DC and DD are supplied to the actuator unit 15, the illumination light is first irradiated to the position corresponding to the center point A among the irradiation positions of the illumination light on the surface of the subject; the irradiation position of the illumination light on the surface of the subject is then displaced so as to draw the spiral scanning path SP3 outward from the center point A until the illumination light is irradiated to the outermost point B; thereafter, the irradiation position draws the spiral scanning path SP4 inward starting from the outermost point B until the illumination light is again irradiated to the center point A on the surface of the subject.
  • That is, the actuator unit 15 swings the emission end of the illumination fiber 12 in accordance with the drive signals DC and DD supplied from the driver unit 22, whereby the irradiation position of the illumination light emitted to the subject through the emission end can be displaced along the spiral scanning paths SP3 and SP4.
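Two sinusoids of equal frequency offset in phase by 90 degrees, with an amplitude that ramps up and then back down, trace exactly this outward-then-inward spiral. The sketch below illustrates the idea behind the drive waveforms DC and DD; the triangular envelope, the parameter values, and the function name are illustrative assumptions, not taken from the patent:

```python
import math

def spiral_drive(t, period=1.0, max_amp=1.0, turns=20):
    """X/Y drive values at time t (0 <= t < period).

    The amplitude ramps up over the first half of the period (center
    point A -> outermost point B, path SP3) and back down over the
    second half (B -> A, path SP4); the 90-degree phase offset between
    X and Y turns the pair of sinusoids into a spiral.
    """
    half = period / 2
    # triangular amplitude envelope: 0 -> max_amp -> 0
    amp = max_amp * (t / half if t < half else (period - t) / half)
    phase = 2 * math.pi * turns * t / period
    x = amp * math.sin(phase)                  # drive signal DC
    y = amp * math.sin(phase - math.pi / 2)    # drive signal DD, 90 deg behind
    return x, y
```

At t = 0 (and again at t = period) the amplitude is zero, i.e. the beam sits at the center point A; at t = period/2 the amplitude is maximal, i.e. the beam is at the outermost point B.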
  • scanning is performed such that the number of turns of the spiral scanning path SP3 is the same as the number of turns of the spiral scanning path SP4.
  • The synchronization control unit 25c is configured to generate and output a synchronization signal for synchronizing the operation of each unit of the image processing unit 25d so that the period during which the subject is scanned once along the spiral scanning paths SP3 and SP4 and the period during which one image generated by the image generation unit 51 is displayed on the display device 4 as an image for one interlaced frame have the same length.
  • Specifically, the synchronization control unit 25c generates and outputs a synchronization signal for synchronizing the operation of each unit in accordance with, for example, a scanning period PS2, which corresponds to the period during which the subject is scanned along the spiral scanning path SP3 and has the same length as one interlaced field period, and a scanning period PN2, which corresponds to the period during which the subject is scanned along the spiral scanning path SP4 and has the same length as one interlaced field period. That is, according to the operation of the synchronization control unit 25c as described above, the scanning period PS2 starts at the timing corresponding to time Td and ends at the timing corresponding to time Te.
  • the scanning period PN2 starts at a timing corresponding to the time Te, and the scanning period PN2 ends at a timing corresponding to the time Tf.
  • the scanning period PS2 corresponds to a period in which the emission end of the illumination fiber 12 is swung so as to draw a spiral trajectory corresponding to the spiral scanning path SP3.
  • the scanning period PN2 corresponds to a period in which the emission end of the illumination fiber 12 is swung so as to draw a spiral trajectory corresponding to the spiral scanning path SP4.
  • the image processing unit 25d generates an image corresponding to the light detection signal output from the detection unit 23 during the scanning period PS2 in accordance with the synchronization signal output from the synchronization control unit 25c, and displays the generated image on the display device. 4 are sequentially output.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 performs, in a writing period PW, which is the period from the timing at which the scanning period PS2 starts (corresponding to time Td in FIG. 12) to a predetermined timing after the scanning period PS2 has elapsed, an operation for generating an image by mapping the light detection signal output from the detection unit 23 during the scanning period PS2 in a raster shape as pixel information, and an operation for sequentially writing the generated image into the memory 51m.
  • Further, in accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 reads the latest image written in the memory 51m and sequentially outputs it to the buffer unit 52 in a reading period PR, which is the period from the above-described predetermined timing to the timing at which the scanning period PN2 ends (corresponding to time Tf in FIG. 12).
  • Note that the writing period PW and the reading period PR described above may be set, in consideration of, for example, the time required for mapping the pixel information, so as to satisfy the condition that the period obtained by adding the writing period PW and the reading period PR is equal to the period obtained by adding one scanning period PS2 and one scanning period PN2.
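The condition above amounts to requiring PW + PR = PS2 + PN2, with PW extended past PS2 by whatever time the pixel-information mapping needs and PR taking the remainder. A small numeric check; the helper name and the millisecond values are illustrative, not from the patent:

```python
def split_periods(ps2, pn2, mapping_delay):
    """Choose PW and PR so that PW + PR equals one full scan cycle.

    PW covers the scanning period PS2 plus the extra time needed to
    finish mapping the pixel information; PR takes the remainder.
    """
    cycle = ps2 + pn2
    pw = ps2 + mapping_delay
    pr = cycle - pw
    assert pr > 0, "mapping delay must leave time to read the image out"
    return pw, pr

# illustrative values in milliseconds (one field period ~16.7 ms)
pw, pr = split_periods(ps2=16.7, pn2=16.7, mapping_delay=2.0)
# pw ~ 18.7 ms, pr ~ 14.7 ms; together they fill one PS2 + PN2 cycle
```

Extending PW at the expense of PR is what allows the timing α at which reading begins to float inside the PN2 period.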
  • After connecting each part of the optical scanning observation system 1 and turning on the power, the user instructs the controller 25 to scan a desired subject with the endoscope 2 by, for example, turning on a scanning start switch (not shown) of the input device 5.
  • When the light source controller 25a detects that the scanning start switch of the input device 5 has been turned on, it controls the light source unit 21 to repeatedly emit the R light, the G light, and the B light in this order.
  • When the scanning control unit 25b detects that the scanning start switch of the input device 5 has been turned on, it controls the driver unit 22 to repeatedly generate the drive signals DC and DD, thereby causing the desired subject to be scanned alternately along the spiral scanning paths SP3 and SP4.
  • As a result, a desired subject is scanned with the R light, G light, and B light, which are the illumination lights emitted from the light source unit 21; the return light from the desired subject is detected by the detection unit 23, and a light detection signal corresponding to the intensity of the return light is input to the image processing unit 25d.
  • When the synchronization control unit 25c detects that the scanning start switch of the input device 5 has been turned on, it generates and outputs a synchronization signal for synchronizing the operation of each unit of the image processing unit 25d in accordance with the scanning period PS2 and the scanning period PN2.
  • FIG. 15 is a diagram for explaining specific operations performed in each unit of the image processing unit according to the second embodiment.
  • In the present embodiment, a case will be described as an example in which the scanning period PS2 and the scanning period PN2 are alternately switched in this order, and an image of the Qth frame is displayed on the display device 4 by the interlace method.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 performs, in the writing period PW from time T31, which is the timing corresponding to time Td in FIG. 12, to time T32, which corresponds to a predetermined timing after time Te and before time Tf in FIG. 12, an operation for generating the image IS4 by mapping the light detection signal output from the detection unit 23 during the scanning period PS2 in a raster shape as pixel information, and an operation for sequentially writing the generated image IS4 into the memory 51m (see the memory 51m in FIG. 15).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the image generation unit 51 reads the image IS4 written in the memory 51m and outputs it to the buffer unit 52 in the reading period PR from time T32 to time T33, which is the timing corresponding to time Tf in FIG. 12 (see the memory 51m in FIG. 15).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the buffer unit 52 performs an operation for writing the image IS4 output from the image generation unit 51 into the memory 52m in the reading period PR from time T32 to time T33 (see the memory 52m in FIG. 15).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the buffer unit 52 reads the image IS4 written in the memory 52m and outputs it to the display control unit 53 in the interlaced one-field period PF4 corresponding to the period from time T33 to time T34, which is the timing corresponding to time Te in FIG. 12 (see the memory 52m in FIG. 15). That is, in the present embodiment, the scanning period PS2 and the one-field period PF4 have the same length.
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 53 writes the image IS4 output from the buffer unit 52 into the storage area MA of the memory 53m in the one-field period PF4 from time T33 to time T34 (see the memory 53m / storage area MA in FIG. 15), while reading the image (different from the image IS4) written in the storage area MB of the memory 53m before the one-field period PF4 and outputting it to the output processing unit 54 (see the memory 53m / storage area MB in FIG. 15).
  • In accordance with the synchronization signal output from the synchronization control unit 25c, the display control unit 53 reads the image IS4 written in the storage area MA of the memory 53m and outputs it to the output processing unit 54 in the interlaced one-field period PF4 corresponding to the period from time T34 to time T35, which is the timing corresponding to time Tf in FIG. 12 (see the memory 53m / storage area MA in FIG. 15).
  • Further, in the interlaced one-field period PF4 corresponding to the period from time T35 to time T36, which is the timing corresponding to time Te in FIG. 12, the display control unit 53 reads the image IS4 written in the storage area MA of the memory 53m and outputs it to the output processing unit 54 (see the memory 53m / storage area MA in FIG. 15), while writing the image (different from the image IS4) output from the buffer unit 52 into the storage area MB of the memory 53m (see the memory 53m / storage area MB in FIG. 15).
  • the output processing unit 54 scans the image IS4 output from the display control unit 53 in an interlaced manner in one field period PF4 from time T34 to time T35 in accordance with the synchronization signal output from the synchronization control unit 25c. An image for one field is acquired, and an operation for assigning the acquired image for one field to an odd field and outputting it to the display device 4 is performed (see the output processing unit 54 in FIG. 15).
  • the output processing unit 54 scans the image IS4 output from the display control unit 53 in an interlaced manner in one field period PF4 from time T35 to time T36 in accordance with the synchronization signal output from the synchronization control unit 25c. Thus, an image for one field is acquired, and an operation for assigning the acquired image for one field to an even field and outputting it to the display device 4 is performed (see the output processing unit 54 in FIG. 15).
  • In this way, the image IS4 generated in the writing period PW from time T31 to time T32 is displayed on the display device 4 by the interlace method as the image of the Qth frame in the period from time T34 to time T36 (see the display device 4 in FIG. 15).
  • As described above, according to the present embodiment, an image obtained by scanning a desired subject using the endoscope 2 can be sequentially output to the display device 4 at timings suitable for the interlace method. Therefore, according to the present embodiment, the image quality of the video displayed on the display device in accordance with the scanning of the subject can be improved as compared with the conventional art.


Abstract

The present invention provides an optical scanning observation system comprising: an endoscope that scans a subject and receives return light; a scanning control unit that causes the subject to be scanned alternately along first and second scanning paths; an optical detection unit that generates and outputs a signal corresponding to the return light; an image generation unit that generates and outputs an image corresponding to the signal output during the period in which the subject is scanned along the first scanning path; a display control unit that outputs the image twice in succession; an output processing unit that outputs the image output twice in succession to a display device, field by field; and a synchronization control unit that synchronizes the operations of the image generation unit, the display control unit, and the output processing unit so that the period during which the subject is scanned once along each of the first and second scanning paths and the period during which an image is displayed as the image of one frame have the same length.
PCT/JP2015/075090 2015-01-20 2015-09-03 Système d'observation à balayage optique Ceased WO2016117162A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2016527478A JPWO2016117162A1 (ja) 2015-01-20 2015-09-03 光走査型観察システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015008862 2015-01-20
JP2015-008862 2015-01-20

Publications (1)

Publication Number Publication Date
WO2016117162A1 true WO2016117162A1 (fr) 2016-07-28

Family

ID=56416725

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/075090 Ceased WO2016117162A1 (fr) 2015-01-20 2015-09-03 Système d'observation à balayage optique

Country Status (2)

Country Link
JP (1) JPWO2016117162A1 (fr)
WO (1) WO2016117162A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005501279A (ja) * 2001-08-23 2005-01-13 ユニバーシティ・オブ・ワシントン 奥行きを強調した画像の収集
JP2011092345A (ja) * 2009-10-28 2011-05-12 Hoya Corp 内視鏡装置
JP2011125617A (ja) * 2009-12-21 2011-06-30 Hoya Corp 内視鏡装置
WO2012132754A1 (fr) * 2011-03-31 2012-10-04 オリンパスメディカルシステムズ株式会社 Endoscope de balayage
WO2013111604A1 (fr) * 2012-01-26 2013-08-01 オリンパス株式会社 Dispositif d'observation à balayage lumineux
JP5571268B1 (ja) * 2012-09-19 2014-08-13 オリンパスメディカルシステムズ株式会社 走査型内視鏡システム


Also Published As

Publication number Publication date
JPWO2016117162A1 (ja) 2017-04-27


Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2016527478

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15878851

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15878851

Country of ref document: EP

Kind code of ref document: A1