WO2014020943A1 - Endoscope system - Google Patents
Endoscope system
- Publication number
- WO2014020943A1 (PCT/JP2013/060189; JP2013060189W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- illumination light
- light
- irradiation
- endoscope
- irradiation position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00057—Operational features of endoscopes provided with means for testing or calibration
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00172—Optical arrangements with means for scanning
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/07—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements using light-conductive means, e.g. optical fibres
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/06—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
- A61B1/0638—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements providing two or more wavelengths
Definitions
- The present invention relates to an endoscope system, and more particularly to an endoscope system that acquires an image by scanning a subject.
- In a scanning endoscope system, a subject is scanned along a scanning pattern set in advance by swinging the tip of an illumination fiber that guides illumination light emitted from a light source unit, the return light received by a light receiving fiber is separated into its color components, and an image of the subject is generated using the resulting signals.
- Japanese National Publication of International Patent Application No. 2010-515947 discloses a calibration method in which a multicolor image of a multicolor calibration pattern is acquired using a scanning beam device, each color component of the acquired multicolor image is compared with the corresponding color component of a representation of the multicolor calibration pattern, and the scanning beam device is calibrated based on the comparison result.
- In such a scanning endoscope, the irradiation position of the illumination light actually applied to the subject may be shifted from the ideal irradiation position along the predetermined scanning pattern, which may cause distortion in the image generated in response to the illumination light irradiation.
- The present invention has been made in view of the above-described circumstances, and an object thereof is to provide an endoscope system capable of accurately calibrating distortion of an image acquired using a scanning endoscope.
- An endoscope system according to one aspect of the present invention includes: an endoscope provided with a light guide member that guides illumination light emitted from a light source, and a drive unit capable of swinging the light guide member so that the irradiation position of the illumination light applied to a subject through the light guide member draws a trajectory corresponding to a predetermined scanning pattern; a coordinate information acquisition unit that acquires coordinate information from which the irradiation position of the illumination light emitted from the endoscope can be detected; a comparison unit that compares the irradiation position obtained when the illumination light is irradiated along the predetermined scanning pattern with the irradiation position of the illumination light detected based on the coordinate information; a determination unit that determines, based on the comparison result of the comparison unit, whether or not the irradiation range of the illumination light emitted from the endoscope satisfies a predetermined angle of view; and a control unit that performs control for adjusting a drive signal supplied to the drive unit when the determination unit obtains a determination result that the irradiation range of the illumination light does not satisfy the predetermined angle of view, and that performs a process for detecting a deviation amount between the trajectory of the irradiation position along the predetermined scanning pattern and the trajectory drawn by the irradiation position detected based on the coordinate information when the determination unit obtains a determination result that the irradiation range of the illumination light satisfies the predetermined angle of view.
- FIG. 5B is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point YMAX to the point SA when illumination light is irradiated onto the virtual XY plane as in FIG. 2, and FIG. 6 is a flowchart showing an example of processing performed by the endoscope system according to the embodiment of the present invention.
- FIG. 1 is a diagram illustrating a configuration of a main part of an endoscope system according to an embodiment of the present invention.
- An endoscope system 1 includes a scanning endoscope 2 that is inserted into a body cavity of a subject, a main body device 3 to which the endoscope 2 is connected, and a monitor 4 connected to the main body device 3.
- The endoscope 2 includes an elongated, flexible insertion portion 11 that can be inserted into a body cavity of a subject.
- A connector (not shown) or the like for detachably connecting the endoscope 2 to the main body device 3 is provided at the proximal end portion of the insertion portion 11.
- An illumination fiber 12, which functions as a light guide member for guiding the illumination light supplied from the light source unit 21 of the main body device 3 to the objective optical system 14, and a light receiving fiber 13, which receives the return light from the subject and guides it to the detection unit 23 of the main body device 3, are each inserted through the insertion portion 11 from its proximal end portion to its distal end portion.
- The end including the light incident surface of the illumination fiber 12 is disposed in a multiplexer 32 provided inside the main body device 3. Further, the end including the light emission surface of the illumination fiber 12 is disposed near the light incident surface of the lens 14a provided at the distal end portion of the insertion portion 11, without being fixed by a fixing member or the like.
- The end including the light incident surface of the light receiving fiber 13 is fixedly disposed around the light emitting surface of the lens 14b at the distal end surface of the distal end portion of the insertion portion 11. Further, the end including the light emitting surface of the light receiving fiber 13 is disposed in a demultiplexer 36 provided inside the main body device 3.
- The objective optical system 14 includes a lens 14a into which illumination light from the illumination fiber 12 is incident, and a lens 14b that emits the illumination light that has passed through the lens 14a to the subject.
- An actuator 15 that is driven based on a drive signal output from the driver unit 22 of the main body device 3 is attached to a middle portion of the illumination fiber 12 on the distal end side of the insertion portion 11.
- FIG. 2 is a diagram illustrating an example of a virtual XY plane set on the surface of the subject.
- The point SA on the XY plane in FIG. 2 indicates the intersection of the insertion axis of the insertion portion 11 with the plane of the drawing, where the insertion axis is virtually assumed to extend from the front side toward the back side of the page.
- The X-axis direction on the XY plane in FIG. 2 is set as the direction from the left side to the right side of the drawing.
- The Y-axis direction on the XY plane in FIG. 2 is set as the direction from the lower side to the upper side of the drawing.
- The X axis and the Y axis constituting the XY plane of FIG. 2 intersect at the point SA.
- The actuator 15 includes an X-axis actuator (not shown) that swings the end including the light emitting surface of the illumination fiber 12 in the X-axis direction based on a first drive signal output from the driver unit 22 of the main body device 3, and a Y-axis actuator (not shown) that swings the end including the light emitting surface of the illumination fiber 12 in the Y-axis direction based on a second drive signal output from the driver unit 22. Through the operations of the X-axis actuator and the Y-axis actuator, the end including the light emitting surface of the illumination fiber 12 is swung in a spiral around the point SA.
- The endoscope 2 is provided with a memory 16 in which endoscope information including various information related to the endoscope 2 is stored in advance.
- The endoscope information stored in the memory 16 is read by the controller 25 of the main body device 3 when the endoscope 2 and the main body device 3 are connected.
- The main body device 3 includes a light source unit 21, a driver unit 22, a detection unit 23, a memory 24, and a controller 25.
- The light source unit 21 includes a light source 31a, a light source 31b, a light source 31c, and a multiplexer 32.
- The light source 31a includes, for example, a laser light source or the like, and is configured to emit light in a red wavelength band (hereinafter also referred to as R light) to the multiplexer 32 when turned on under the control of the controller 25.
- The light source 31b includes, for example, a laser light source or the like, and is configured to emit light in a green wavelength band (hereinafter also referred to as G light) to the multiplexer 32 when turned on under the control of the controller 25.
- The light source 31c includes, for example, a laser light source or the like, and is configured to emit light in a blue wavelength band (hereinafter also referred to as B light) to the multiplexer 32 when turned on under the control of the controller 25.
- The multiplexer 32 is configured to multiplex the R light emitted from the light source 31a, the G light emitted from the light source 31b, and the B light emitted from the light source 31c, and to supply the multiplexed light to the light incident surface of the illumination fiber 12.
- The driver unit 22 includes a signal generator 33, digital/analog (hereinafter referred to as D/A) converters 34a and 34b, and an amplifier 35.
- The signal generator 33 generates, as a first drive signal for swinging the end including the light emitting surface of the illumination fiber 12 in the X-axis direction, a signal having a predetermined waveform as shown in FIG. 3, for example, and outputs it to the D/A converter 34a.
- FIG. 3 is a diagram illustrating an example of a signal waveform of a first drive signal supplied to an actuator provided in the endoscope.
- Based on the control of the controller 25, the signal generator 33 generates, as a second drive signal for swinging the end including the light emitting surface of the illumination fiber 12 in the Y-axis direction, a signal whose waveform is shifted in phase by 90° from that of the first drive signal, as shown in FIG. 4, for example, and outputs it to the D/A converter 34b.
- FIG. 4 is a diagram illustrating an example of a signal waveform of a second drive signal supplied to an actuator provided in the endoscope.
- The D/A converter 34a is configured to convert the digital first drive signal output from the signal generator 33 into an analog first drive signal and output the analog first drive signal to the amplifier 35.
- The D/A converter 34b is configured to convert the digital second drive signal output from the signal generator 33 into an analog second drive signal and output the analog second drive signal to the amplifier 35.
- The amplifier 35 is configured to amplify the first and second drive signals output from the D/A converters 34a and 34b and output the amplified signals to the actuator 15.
- The amplitude value (signal level) of the first drive signal illustrated in FIG. 3 gradually increases from time T1, at which it is at the minimum value, gradually decreases after reaching the maximum value at time T2, and becomes the minimum value again at time T3.
- The amplitude value (signal level) of the second drive signal illustrated in FIG. 4 gradually increases from time T1, at which it is at the minimum value, gradually decreases after reaching the maximum value near time T2, and becomes the minimum value again at time T3.
- FIG. 5A is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point SA to the point YMAX when illumination light is irradiated onto a virtual XY plane as shown in FIG. 2.
- FIG. 5B is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point YMAX to the point SA when illumination light is irradiated onto a virtual XY plane as shown in FIG. 2.
- At time T1, illumination light is applied to a position corresponding to the point SA on the surface of the subject. As the amplitude values of the first and second drive signals increase from time T1 to time T2, the irradiation coordinates of the illumination light on the surface of the subject follow a first spiral locus outward from the point SA, and at time T2 the illumination light is irradiated to the point YMAX, which is the outermost point of the illumination light irradiation coordinates on the surface of the subject. Then, as the amplitude values of the first and second drive signals decrease from time T2 to time T3, the illumination light irradiation coordinates on the surface of the subject follow a second spiral trajectory inward starting from the point YMAX, and at time T3 the illumination light is irradiated to the point SA on the surface of the subject again.
- Based on the first and second drive signals supplied from the driver unit 22, the actuator 15 can swing the end portion including the light emitting surface of the illumination fiber 12 so that the irradiation position of the illumination light applied to the subject through the objective optical system 14 draws a locus corresponding to the spiral scanning pattern illustrated in FIGS. 5A and 5B.
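For orientation, the relationship between the two drive signals and the resulting spiral locus can be illustrated with a short numerical sketch. This is a minimal model, not the patent's actual waveforms: it assumes sinusoidal X and Y deflections 90° apart whose common amplitude envelope rises from its minimum at time T1 to its maximum at time T2 and falls back to the minimum at time T3, as described for FIGS. 3 to 5B; the function names, the number of turns, and the normalized coordinates are all invented for illustration.

```python
import math

def amplitude_envelope(t, t1, t2, t3, a_min=0.0, a_max=1.0):
    """Amplitude (signal level) rising from its minimum at T1 to its maximum
    at T2, then falling back to the minimum at T3 (cf. FIGS. 3 and 4)."""
    frac = (t - t1) / (t2 - t1) if t <= t2 else (t3 - t) / (t3 - t2)
    return a_min + (a_max - a_min) * frac

def irradiation_coordinates(t, t1, t2, t3, turns=20.0):
    """Toy model of the ideal irradiation coordinates on the XY plane at time t.

    The first (X) and second (Y) drive signals are modeled as sinusoids 90
    degrees out of phase sharing the envelope above, which traces an outward
    spiral from SA to YMAX during T1..T2 and an inward spiral back to SA
    during T2..T3 (cf. FIGS. 5A and 5B)."""
    a = amplitude_envelope(t, t1, t2, t3)
    phase = 2.0 * math.pi * turns * (t - t1) / (t3 - t1)
    x = a * math.sin(phase)   # deflection driven by the first drive signal
    y = a * math.cos(phase)   # deflection driven by the second drive signal (+90 deg)
    return x, y

# Example: sample the locus over one T1..T3 period.
T1, T2, T3 = 0.0, 0.5, 1.0
locus = [irradiation_coordinates(T1 + i * (T3 - T1) / 1000, T1, T2, T3)
         for i in range(1001)]
print(locus[0], locus[500], locus[-1])  # near SA, at the outermost point, back at SA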
- The detection unit 23 includes a demultiplexer 36, detectors 37a, 37b, and 37c, and analog/digital (hereinafter referred to as A/D) converters 38a, 38b, and 38c.
- The demultiplexer 36 includes a dichroic mirror and the like, and is configured to separate the return light emitted from the light emitting surface of the light receiving fiber 13 into its R (red), G (green), and B (blue) color components and to emit the separated components toward the detectors 37a, 37b, and 37c.
- The detector 37a is configured to detect the intensity of the R light output from the demultiplexer 36, generate an analog R signal corresponding to the detected intensity of the R light, and output the analog R signal to the A/D converter 38a.
- The detector 37b is configured to detect the intensity of the G light output from the demultiplexer 36, generate an analog G signal corresponding to the detected intensity of the G light, and output the analog G signal to the A/D converter 38b.
- The detector 37c is configured to detect the intensity of the B light output from the demultiplexer 36, generate an analog B signal corresponding to the detected intensity of the B light, and output the analog B signal to the A/D converter 38c.
- The A/D converter 38a is configured to convert the analog R signal output from the detector 37a into a digital R signal and output it to the controller 25.
- The A/D converter 38b is configured to convert the analog G signal output from the detector 37b into a digital G signal and output it to the controller 25.
- The A/D converter 38c is configured to convert the analog B signal output from the detector 37c into a digital B signal and output it to the controller 25.
- The memory 24 stores in advance a control program for controlling the main body device 3, and also stores image correction information obtained as a result of processing by the controller 25. Details of the image correction information will be described later.
- The memory 24 also stores in advance table data TBD that can specify, for an arbitrary coordinate position on the ideal (spiral) scanning pattern illustrated in FIGS. 5A and 5B, at which time the illumination light is irradiated to that coordinate position within the period from the time at which the illumination light is irradiated to the coordinate position corresponding to the point SA up to time T3.
- In other words, the table data TBD indicates the correspondence relationship between the irradiation position (coordinate position) obtained when the illumination light supplied from the light source unit 21 is irradiated along the ideal (spiral) scanning pattern shown in FIGS. 5A and 5B and the irradiation time (elapsed time).
- The table data TBD is configured as data corresponding to each wavelength band of light (R light, G light, and B light) supplied from the light source unit 21, for example.
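As a rough illustration of the role of the table data TBD, it can be pictured as a per-wavelength lookup from elapsed irradiation time to the ideal coordinate position on the spiral pattern. The sketch below is an assumed layout, not the actual data format stored in the memory 24; `build_tbd`, `ideal_position`, the sampling interval, and the nearest-sample lookup are all invented for illustration, and the spiral model is the same toy model as in the previous sketch.

```python
import math

def _ideal_spiral(t, t1, t2, t3, turns=20.0):
    # Same toy spiral as the previous sketch: amplitude rises T1->T2, falls T2->T3.
    a = (t - t1) / (t2 - t1) if t <= t2 else (t3 - t) / (t3 - t2)
    phase = 2.0 * math.pi * turns * (t - t1) / (t3 - t1)
    return a * math.sin(phase), a * math.cos(phase)

def build_tbd(t1, t2, t3, n_samples=1000):
    """Hypothetical TBD layout: for each wavelength band, a list of
    (elapsed_time, ideal_x, ideal_y) samples along the ideal spiral pattern."""
    tbd = {}
    for band in ("R", "G", "B"):
        entries = []
        for i in range(n_samples + 1):
            t = t1 + i * (t3 - t1) / n_samples
            x, y = _ideal_spiral(t, t1, t2, t3)
            entries.append((t - t1, x, y))  # elapsed time since T1 -> ideal position
        tbd[band] = entries
    return tbd

def ideal_position(tbd, band, elapsed):
    """Nearest-sample lookup of the ideal irradiation position for one band."""
    return min(tbd[band], key=lambda e: abs(e[0] - elapsed))[1:]

# Example: the ideal G-light position roughly halfway through the T1..T3 period.
tbd = build_tbd(0.0, 0.5, 1.0)
print(ideal_position(tbd, "G", 0.5))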
- The controller 25 is configured to read the control program stored in the memory 24 and to control the light source unit 21 and the driver unit 22 based on the read control program.
- The controller 25 operates so as to store the endoscope information output from the memory 16 in the memory 24 when the insertion portion 11 is connected to the main body device 3.
- The controller 25 is configured to generate an image for one frame based on the R signal, the G signal, and the B signal output from the detection unit 23 in the period corresponding to time T1 to time T2. Further, the controller 25 is configured to generate an image for one frame based on the R signal, the G signal, and the B signal output from the detection unit 23 in the period corresponding to time T2 to time T3.
- The controller 25 performs image correction processing based on the image correction information on the image of each frame, and causes the corrected image obtained by the image correction processing to be displayed on the monitor 4 at a predetermined frame rate.
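One way to picture how a frame is assembled from the detection unit's output during the T1 to T2 (or T2 to T3) period is to map each time-stamped intensity sample of a color channel to its ideal coordinate via the table data and place it into a raster. The sketch below is schematic only; the nearest-neighbour placement, the square output raster, and the reuse of the `ideal_position` helper and `tbd` table from the previous sketch are assumptions, not the controller's actual frame-generation algorithm.

```python
def rasterize_channel(samples, tbd, band, size=256):
    """Place time-stamped intensity samples of one color channel into a square
    image using the ideal time-to-position mapping (toy nearest-neighbour model).

    samples: list of (elapsed_time, intensity) pairs from the A/D converter of
    the corresponding detector. Returns a size x size list of lists."""
    image = [[0.0] * size for _ in range(size)]
    for elapsed, intensity in samples:
        x, y = ideal_position(tbd, band, elapsed)  # ideal coords in [-1, 1]
        col = min(size - 1, max(0, round((x + 1.0) / 2.0 * (size - 1))))
        row = min(size - 1, max(0, round((1.0 - (y + 1.0) / 2.0) * (size - 1))))
        image[row][col] = intensity
    return image

# Example: one frame's G channel from dummy samples taken over the T1..T2 period.
g_samples = [(i / 1000.0, 0.5) for i in range(0, 501, 5)]
g_image = rasterize_channel(g_samples, tbd, "G")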
- The controller 25 acquires the image correction information by performing processing, described later, based on the table data TBD stored in the memory 24 and information on the coordinate position output from the light irradiation coordinate detection module 101 (hereinafter also referred to as coordinate information), and stores the acquired image correction information in the memory 24.
- The controller 25 is configured to be able to at least temporarily hold the coordinate information output from the light irradiation coordinate detection module 101.
- The light irradiation coordinate detection module 101, which functions as a coordinate information acquisition unit, includes a position detection element (PSD) or the like, detects the position at which the illumination light emitted through the objective optical system 14 is received, and outputs the detected position as coordinate information.
- The coordinate position of the point SA on the XY plane exemplified in FIGS. 2, 5A, and 5B is set in advance to (0, 0).
- The coordinate information output from the light irradiation coordinate detection module 101 is information indicating a relative coordinate position with respect to the coordinate position (0, 0) of the point SA on the XY plane illustrated in FIGS. 2, 5A, and 5B.
- The controller 25 detects the irradiation position (coordinate position) of the illumination light irradiated in a spiral from the endoscope 2 based on the coordinate information output from the light irradiation coordinate detection module 101 configured as described above.
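For reference, the relative coordinate output of a position detection element can be sketched as below. This assumes a generic duo-lateral two-dimensional PSD with a pair of electrodes on each axis; the formula and parameter names are illustrative and are not tied to the specific device used in the light irradiation coordinate detection module 101.

```python
def psd_relative_position(ix1, ix2, iy1, iy2, lx, ly):
    """Estimate the light spot position on a duo-lateral 2D PSD.

    ix1/ix2 (iy1/iy2) are the photocurrents at the two X (Y) electrodes and
    lx/ly are the active lengths of the detector along X and Y. The result is
    the spot position relative to the detector center, which the module
    reports as a coordinate relative to the point SA at (0, 0)."""
    x = (lx / 2.0) * (ix2 - ix1) / (ix2 + ix1)
    y = (ly / 2.0) * (iy2 - iy1) / (iy2 + iy1)
    return x, y

# Example: a spot slightly right of and above the center of a 10 mm x 10 mm PSD.
print(psd_relative_position(ix1=0.9, ix2=1.1, iy1=0.8, iy2=1.2, lx=10.0, ly=10.0))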
- The surgeon or the like connects the endoscope 2 and the monitor 4 to the main body device 3, arranges the light irradiation coordinate detection module 101 at a position facing the distal end surface of the endoscope 2, and further sets the system so that the coordinate information output from the light irradiation coordinate detection module 101 is input to the controller 25 of the main body device 3.
- At a timing immediately after the endoscope information read from the memory 16 is stored in the memory 24, the controller 25 controls the light source unit 21 so as to switch the light source 31b from off to on while keeping the light sources 31a and 31c off, and controls the driver unit 22 so as to output the first and second drive signals to the actuator 15. Then, under such control of the controller 25, the G light is irradiated onto the surface of the light irradiation coordinate detection module 101, and coordinate information corresponding to the position at which the G light is received is sequentially output from the light irradiation coordinate detection module 101.
- FIG. 6 is a flowchart illustrating an example of processing performed by the endoscope system according to the embodiment of the present invention.
- The controller 25 performs a process of extracting, from the coordinate information output from the light irradiation coordinate detection module 101, coordinate positions corresponding to one or more predetermined irradiation times included in the table data TBD (step S1 in FIG. 6).
- Specifically, before performing the process of step S1 in FIG. 6, the controller 25 monitors the signal waveforms of the first and second drive signals output from the driver unit 22, sets time T1 at the timing when the amplitude value (signal level) of the monitored signal waveform becomes the minimum, and, by referring to the table data TBD based on the set time T1, identifies the time TXMAX corresponding to the point XMAX in FIG. 5A, the time TYMIN corresponding to the point YMIN, the time TXMIN corresponding to the point XMIN, and the time TYMAX corresponding to the point YMAX.
- Then, the controller 25 extracts, from the coordinate information output from the light irradiation coordinate detection module 101, the coordinate position XA indicating the position where the G light is actually received at the time TXMAX, the coordinate position YB indicating the position where the G light is actually received at the time TYMIN, the coordinate position XB indicating the position where the G light is actually received at the time TXMIN, and the coordinate position YA indicating the position where the G light is actually received at the time TYMAX. In this way, the coordinate positions XA, XB, YA, and YB corresponding to the four irradiation positions located on the outermost periphery of the spiral scanning pattern can be extracted from the irradiation positions at which the light irradiation coordinate detection module 101 serving as the subject is actually irradiated with the G light.
- The controller 25 performs a process of comparing each coordinate position extracted in step S1 of FIG. 6 with the coordinate position in the table data TBD corresponding to the predetermined one or more irradiation times (step S2 in FIG. 6), and then, based on the comparison result, performs a process of determining whether or not the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 satisfies a predetermined angle of view intended in the design of the endoscope 2 (step S3 in FIG. 6).
- The predetermined angle of view is assumed to be a value (for example, 90 degrees) that substantially coincides with the illumination light irradiation range obtained when the illumination light that has passed through the objective optical system 14 is irradiated along the ideal scanning pattern shown in FIG. 5A (and FIG. 5B).
- In practice, due to waveform distortion or the like generated during the swinging operation of the X-axis actuator and the Y-axis actuator of the actuator 15, the end including the light exit surface of the illumination fiber 12 may not be swung accurately, so that a situation can occur in which the irradiation position of the illumination light irradiated through the objective optical system 14 deviates from the ideal scanning pattern.
- FIG. 7 is a diagram illustrating an example of a deviation that occurs between an ideal illumination light irradiation range and an actual illumination light irradiation range.
- In the example of FIG. 7, the coordinate positions XA and XB extracted in step S1 of FIG. 6 coincide with the coordinate positions of the points XMAX and XMIN included in the locus of the ideal irradiation positions drawn by the solid line in FIG. 7, whereas the coordinate positions YA and YB extracted in step S1 of FIG. 6 do not coincide with the coordinate positions of the points YMAX and YMIN included in that locus, so that a deviation occurs.
- That is, based on each coordinate position extracted in step S1 of FIG. 6 and the coordinate positions in the table data TBD stored in the memory 24, the controller 25 performs, in step S2 of FIG. 6, a process of comparing the coordinate position XA with the coordinate position of the point XMAX, the coordinate position YB with the coordinate position of the point YMIN, the coordinate position XB with the coordinate position of the point XMIN, and the coordinate position YA with the coordinate position of the point YMAX, whereby a deviation between the ideal G light irradiation position and the actual G light irradiation position can be detected at each of the four irradiation positions located on the outermost periphery of the spiral scanning pattern.
- Further, in step S3 of FIG. 6, the controller 25 performs a process of detecting whether or not the coordinate position XA coincides with the coordinate position of the point XMAX, the coordinate position YB coincides with the coordinate position of the point YMIN, the coordinate position XB coincides with the coordinate position of the point XMIN, and the coordinate position YA coincides with the coordinate position of the point YMAX, whereby it can be detected whether or not the ideal G light irradiation positions coincide with the actual G light irradiation positions at the four irradiation positions located on the outermost periphery of the spiral scanning pattern.
- When the controller 25 obtains a determination result that the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 does not satisfy the predetermined angle of view intended in the design of the endoscope 2, it performs control on the driver unit 22 for adjusting the first and/or second drive signals supplied to the actuator 15 (step S4 in FIG. 6).
- Specifically, based on the processing results of step S2 and step S3 in FIG. 6, when, for example, the Y coordinate value of the coordinate position YA falls short of the Y coordinate value of the coordinate position corresponding to the point YMAX as shown in FIG. 7, the controller 25 controls the driver unit 22 so as to increase the amplitude value (signal level) of the second drive signal supplied to the actuator 15 from the current amplitude value (signal level).
- That is, in step S4 of FIG. 6, control is performed that increases or decreases the amplitude value (signal level) of the drive signal supplied to the actuator 15 from the current amplitude value (signal level).
- In addition, control for changing the phase of at least one of the drive signals so that the phase difference between the first and second drive signals supplied to the actuator 15 becomes 90° may also be performed in step S4 of FIG. 6.
- In step S3 of FIG. 6, the controller 25 determines, for example, that the predetermined angle of view is satisfied when the coordinate position XA coincides with the coordinate position of the point XMAX, the coordinate position YB coincides with the coordinate position of the point YMIN, the coordinate position XB coincides with the coordinate position of the point XMIN, and the coordinate position YA coincides with the coordinate position of the point YMAX. The processing from step S1 to step S4 in FIG. 6 is repeated until a determination result that the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 satisfies the predetermined angle of view intended in the design of the endoscope 2 is obtained.
- When the controller 25 obtains, in step S3 of FIG. 6, a determination result that the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 satisfies the predetermined angle of view intended in the design of the endoscope 2, it performs the processing from step S5 of FIG. 6 onward while maintaining the control that was being performed on the driver unit 22 at the timing when that determination result was obtained.
- The processing from step S1 to step S4 in FIG. 6 is not restricted to the case where the light irradiation coordinate detection module 101 is irradiated with G light; it can be carried out in substantially the same manner when the light irradiation coordinate detection module 101 is irradiated with R light or B light.
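The angle-of-view check and drive-signal adjustment of steps S1 to S4 can be summarized as a simple feedback loop. The sketch below is a schematic reading of the flow described above, with invented names (`Drive`, `measure_outermost`, `tolerance`, `gain`) and a plain proportional adjustment; the actual controller operates on the drive waveforms and the table data TBD rather than on these simplified scalar values.

```python
from dataclasses import dataclass

@dataclass
class Drive:
    amp_x: float       # amplitude of the first (X) drive signal
    amp_y: float       # amplitude of the second (Y) drive signal
    phase_deg: float   # phase difference between the two drive signals

def calibrate_angle_of_view(measure_outermost, drive, ideal,
                            tolerance=0.01, gain=0.5, max_iterations=50):
    """Iterate steps S1-S4: compare the measured outermost irradiation
    positions XA, XB, YA, YB with the ideal points XMAX, XMIN, YMAX, YMIN and
    adjust the drive signal amplitudes (keeping the 90 degree phase
    difference) until the irradiation range satisfies the intended angle of
    view.

    measure_outermost(drive) returns a dict of the measured signed coordinates
    'XA', 'XB', 'YA', 'YB'; ideal holds 'XMAX', 'XMIN', 'YMAX', 'YMIN'."""
    for _ in range(max_iterations):
        m = measure_outermost(drive)                        # steps S1 and S2
        err_x = (ideal["XMAX"] - m["XA"]) + (m["XB"] - ideal["XMIN"])
        err_y = (ideal["YMAX"] - m["YA"]) + (m["YB"] - ideal["YMIN"])
        if abs(err_x) <= tolerance and abs(err_y) <= tolerance:
            return drive                                    # step S3: angle of view satisfied
        drive.amp_x += gain * err_x                         # step S4: adjust the first drive signal
        drive.amp_y += gain * err_y                         # step S4: adjust the second drive signal
        drive.phase_deg = 90.0                              # keep the 90 degree phase difference
    raise RuntimeError("angle-of-view calibration did not converge")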
- The controller 25 performs a process of extracting, from the coordinate information output from the light irradiation coordinate detection module 101, the coordinate positions corresponding to each irradiation position for one frame (step S5 in FIG. 6).
- Based on each coordinate position extracted in step S5 of FIG. 6 and the table data TBD stored in the memory 24, the controller 25 performs a process of acquiring G image correction information used for correcting the G image generated according to the return light received as the subject is irradiated with the G light (step S6 in FIG. 6).
- FIG. 8 is a diagram illustrating an example of a deviation that occurs between the locus of the ideal illumination light irradiation position and the locus of the actual illumination light irradiation position.
- Specifically, the controller 25 calculates, for example, the positional deviation amount GZL between each coordinate position extracted in step S5 of FIG. 6 and the corresponding coordinate position included in the table data TBD.
- The controller 25 then performs, for example, interpolation processing based on each positional deviation amount GZL calculated as described above, whereby it can acquire G image correction information including correction amounts for correcting the positional shift of all the pixels of the G image generated according to the return light received as the subject is irradiated with the G light.
- A process of detecting the deviation amount between the locus of the ideal irradiation positions and the locus of the actual irradiation positions and acquiring image correction information based on the detected deviation amount is not limited to the processes of steps S5 and S6, and other processes may be performed.
- The controller 25 stores the G image correction information acquired in step S6 of FIG. 6 in the memory 24, and then performs image correction processing based on the G image correction information on the G image (step S7 in FIG. 6).
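A simplified version of steps S5 to S7 is sketched below: compute the positional deviation GZL between each measured coordinate and the corresponding ideal coordinate, interpolate those deviations over the whole frame, and resample the image. The inverse-distance weighting and nearest-neighbour resampling are assumptions chosen for brevity; the description only states that interpolation processing is performed, so the helper names and the exact scheme are illustrative.

```python
def deviation_samples(measured, ideal):
    """Positional deviation GZL at each sampled irradiation position.

    measured and ideal are lists of (x, y) pairs in the same order (i.e. for
    the same irradiation times). Returns (x_ideal, y_ideal, dx, dy) samples."""
    return [(xi, yi, xm - xi, ym - yi)
            for (xm, ym), (xi, yi) in zip(measured, ideal)]

def interpolate_shift(samples, x, y, eps=1e-6):
    """Inverse-distance-weighted estimate of the deviation (dx, dy) at (x, y)."""
    wsum = dx = dy = 0.0
    for xi, yi, sdx, sdy in samples:
        w = 1.0 / ((x - xi) ** 2 + (y - yi) ** 2 + eps)
        wsum += w
        dx += w * sdx
        dy += w * sdy
    return dx / wsum, dy / wsum

def correct_image(image, samples, coord_of_pixel, pixel_of_coord):
    """Build a corrected image by reading, for each output pixel, the input
    pixel where that part of the subject was actually irradiated.

    coord_of_pixel(row, col) -> ideal (x, y); pixel_of_coord(x, y) -> (row, col)."""
    h, w = len(image), len(image[0])
    corrected = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            x, y = coord_of_pixel(r, c)
            dx, dy = interpolate_shift(samples, x, y)   # per-pixel correction amount
            sr, sc = pixel_of_coord(x + dx, y + dy)     # where this point was actually hit
            if 0 <= sr < h and 0 <= sc < w:
                corrected[r][c] = image[sr][sc]
    return corrected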
- When the light irradiation coordinate detection module 101 is irradiated with the R light, processing similar to steps S5 through S7 is performed: R image correction information including correction amounts for correcting the positional deviation of all the pixels of the R image generated according to the return light received as the subject is irradiated with the R light is acquired, and image correction processing based on the acquired R image correction information is performed on the R image.
- Likewise, when the light irradiation coordinate detection module 101 is irradiated with the B light, B image correction information including correction amounts for correcting the positional deviation of all the pixels of the B image generated according to the return light received as the subject is irradiated with the B light is acquired, and image correction processing based on the acquired B image correction information is performed on the B image.
- According to the processing described above, it is possible to generate a corrected image in which the distortion of the image caused by the shift between the ideal illumination light irradiation positions and the actual illumination light irradiation positions is sufficiently corrected, and to display it on the monitor 4. As a result, according to the present embodiment, it is possible to accurately calibrate the distortion of an image acquired using a scanning endoscope.
- The series of processes in FIG. 6 is not limited to being performed when the G light is irradiated along the scanning pattern shown in FIG. 5A; it may also be performed, for example, when the G light is irradiated along the scanning pattern shown in FIG. 5B.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Optics & Photonics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Biophysics (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
- Endoscopes (AREA)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2012168533 | 2012-07-30 | | |
| JP2012-168533 | 2012-07-30 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014020943A1 (fr) | 2014-02-06 |
Family
ID=50027640
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2013/060189 (WO2014020943A1, Ceased) | 2012-07-30 | 2013-04-03 | Endoscope system |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2014020943A1 (fr) |
- 2013
  - 2013-04-03: WO PCT/JP2013/060189 (WO2014020943A1), not active, Ceased
Patent Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007526014A (ja) * | 2003-06-23 | 2007-09-13 | Microvision, Inc. | Scanning endoscope |
| JP2008514342A (ja) * | 2004-10-01 | 2008-05-08 | University of Washington | Remapping method for reducing image distortion |
| JP2010515947A (ja) * | 2007-01-10 | 2010-05-13 | University of Washington | Calibration of scanning beam devices |
| JP2010148764A (ja) * | 2008-12-26 | 2010-07-08 | Hoya Corp | Optical scanning endoscope apparatus, optical scanning endoscope, and optical scanning endoscope processor |
| JP2010148769A (ja) * | 2008-12-26 | 2010-07-08 | Hoya Corp | Optical scanning endoscope apparatus, optical scanning endoscope, and optical scanning endoscope processor |
| JP2010158414A (ja) * | 2009-01-08 | 2010-07-22 | Hoya Corp | Optical scanning endoscope processor and optical scanning endoscope apparatus |
| JP2010268972A (ja) * | 2009-05-21 | 2010-12-02 | Hoya Corp | Medical observation system and processor |
| JP2011004920A (ja) * | 2009-06-25 | 2011-01-13 | Hoya Corp | Endoscope apparatus |
| JP2011004929A (ja) * | 2009-06-25 | 2011-01-13 | Hoya Corp | Endoscope apparatus |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016106829A (ja) * | 2014-12-05 | 2016-06-20 | Olympus Corporation | Optical scanning observation system |
| JP2017000379A (ja) * | 2015-06-09 | 2017-01-05 | Olympus Corporation | Scanning endoscope system and calibration method for scanning endoscope |
| WO2017037781A1 (fr) * | 2015-08-28 | 2017-03-09 | Olympus Corporation | Scanning-type observation device |
| JPWO2017037781A1 (ja) * | 2015-08-28 | 2018-06-14 | Olympus Corporation | Scanning observation device |
| WO2018116464A1 (fr) * | 2016-12-22 | 2018-06-28 | Olympus Corporation | Scanning image acquisition device and scanning image acquisition system |
| US10859816B2 | 2016-12-22 | 2020-12-08 | Olympus Corporation | Scanning-type image acquisition device and scanning-type image acquisition system |
| CN110613510A (zh) * | 2018-06-19 | 2019-12-27 | Tsinghua University | Self-projection endoscope device |
| CN110613510B (zh) * | 2018-06-19 | 2020-07-21 | Tsinghua University | Self-projection endoscope device |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5530577B1 (ja) | Scanning endoscope system | |
| JP5490331B1 (ja) | Endoscope system | |
| JP5702023B2 (ja) | Scanning endoscope system and method of operating scanning endoscope system | |
| JP5571268B1 (ja) | Scanning endoscope system | |
| WO2014020943A1 (fr) | Endoscope system | |
| JP5841513B2 (ja) | Scanning endoscope system | |
| JP6265781B2 (ja) | Endoscope system and method of controlling endoscope system | |
| JP5974208B1 (ja) | Optical scanning observation system | |
| US9974432B2 (en) | Scanning endoscope apparatus with drive signal correction | |
| JP6381123B2 (ja) | Optical scanning observation system | |
| US20180289247A1 (en) | Endoscope system | |
| JP5639289B2 (ja) | Scanning endoscope apparatus | |
| JP6437808B2 (ja) | Optical scanning observation system | |
| JP2015033456A (ja) | Endoscope system | |
| JP6599728B2 (ja) | Scanning endoscope apparatus | |
| WO2016017199A1 (fr) | Optical scanning observation system | |
| JP2018201812A (ja) | Scanning endoscope apparatus and image generation method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13825400; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 13825400; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |