
WO2018047465A1 - Endoscope device - Google Patents


Info

Publication number
WO2018047465A1
WO2018047465A1 (PCT/JP2017/025661)
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
unit
optical system
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/025661
Other languages
English (en)
Japanese (ja)
Inventor
英巧 山内
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to JP2018538258A (granted as JP6694964B2)
Publication of WO2018047465A1
Anticipated expiration
Current legal status: Ceased


Classifications

    • A — HUMAN NECESSITIES
    • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 — Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, combined with photographic or television appliances
    • A61B 1/045 — Control thereof
    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 — Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 — Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes

Definitions

  • The present invention relates to an endoscope apparatus provided with two observation optical systems having left and right parallax.
  • Insertion devices that are inserted into a subject have been widely used in the medical field and the industrial field.
  • An endoscope used in the medical field observes a body cavity by inserting an elongated insertion portion into the subject and, if necessary, a treatment instrument is inserted through an insertion channel included in the endoscope.
  • Various treatments can be performed using the treatment tool.
  • Endoscopes used in the industrial field allow inspection of damage, corrosion, and the like at test sites inside a subject such as a jet engine or factory piping by inserting the elongated insertion portion of the endoscope into the subject, so that various treatments, inspections, and so on can be performed.
  • The distal end portion of such an endoscope may be provided with two observation optical systems having parallax with respect to each other, and an endoscope apparatus that measures various spatial characteristics of a subject by the principle of triangulation using the observation images from the two left and right observation optical systems (stereo measurement) is used.
  • Japanese Patent Application Laid-Open No. 2004-187711 discloses an endoscope apparatus capable of stereo measurement that displays an observation image easy for an examiner to observe by reducing the influence of halation occurring in the observation image.
  • The endoscope apparatus disclosed in Japanese Patent Application Laid-Open No. 2004-187711 determines whether or not halation has occurred in an observation image and, if it has, performs image processing. Specifically, this endoscope apparatus masks the portion where halation has occurred by painting it out, cutting it, or displaying character information over it.
  • By eliminating the display portion where halation has occurred from the display screen, the endoscope apparatus of Japanese Patent Application Laid-Open No. 2004-187711 can reduce the annoyance that the examiner feels with respect to the observation image.
  • An object of the present invention is therefore to provide an endoscope apparatus that can suppress a reduction in the visibility of the observation image due to the occurrence of halation.
  • An endoscope apparatus according to one aspect includes: an endoscope insertion portion; a stereo optical system disposed at the distal end of the endoscope insertion portion and comprising a first optical system and a second optical system whose optical axes are arranged in a direction intersecting the parallax direction and which are spaced apart from each other in the parallax direction; an image sensor disposed at the imaging position of the first optical system and the second optical system, which generates an imaging signal from at least one of the two optical images; a video signal processing unit that, based on the imaging signal, generates a video signal relating to one of a first image corresponding to the first optical image and a second image corresponding to the second optical image; a display unit to which the video signal is input; and a display switching unit that detects luminance information of the one image displayed on the display unit and, when the detected luminance information is higher than a predetermined threshold, executes a display switching process for switching the image displayed on the display unit from the one image to the other image corresponding to the other optical image.
  • FIG. 6 is a diagram for explaining an example in which the area of the display image is divided.
  • FIG. 1 is a configuration diagram illustrating a configuration of an endoscope apparatus according to the first embodiment.
  • the endoscope apparatus 1 includes an apparatus main body 2 having a function such as a video processor, and an endoscope 3 connected to the apparatus main body 2.
  • the apparatus main body 2 includes a display unit 4 such as a liquid crystal panel (LCD) on which an endoscopic image, an operation menu, and the like are displayed.
  • the display unit 4 may be provided with a touch panel.
  • The endoscope 3 includes an insertion portion 5, serving as an endoscope insertion portion that is inserted into a subject, an operation portion 6 that is connected to the proximal end of the insertion portion 5, and a universal cord 7 that extends from the operation portion 6.
  • the endoscope 3 can be attached to and detached from the apparatus main body 2 via a universal cord 7.
  • The insertion portion 5 includes, in order from the distal end side, a distal end portion 11, a bending portion 12 that is connected to the proximal end of the distal end portion 11 and can be bent, for example, in the vertical and horizontal directions, and a long flexible portion 13 that is connected to the proximal end of the bending portion 12 and has flexibility.
  • An imaging element 22 such as a CCD sensor or a CMOS sensor is built in the distal end portion 11 of the insertion portion 5.
  • An optical adapter 10 for an endoscope can be attached to the distal end portion 11. Note that there are a plurality of types of optical adapters 10; depending on the type attached to the distal end portion 11 of the endoscope 3, various optical characteristics can be changed, such as the viewing angle (close-up, wide angle, magnification/telephoto) and the observation direction (direct view, side view, perspective view).
  • the operation unit 6 is provided with a bending joystick 6a that bends the bending portion 12 in the vertical and horizontal directions. The user can bend the bending portion 12 in a desired direction by tilting the bending joystick 6a.
  • The operation unit 6 is also provided with buttons for instructing endoscope functions, for example, various operation buttons such as a freeze button, a bending lock button, and a recording instruction button.
  • When the display unit 4 is provided with a touch panel, the user can operate the touch panel to instruct various operations of the endoscope apparatus 1.
  • the display unit 4 of the apparatus main body 2 displays an endoscopic image captured by the imaging element 22 of the imaging unit provided in the distal end portion 11.
  • various devices such as a control unit 30 (see FIG. 2) that performs image processing and various controls and a recording device that records the processed image are provided inside the apparatus main body 2.
  • FIG. 2 is a diagram for explaining the configuration of the optical adapter, the tip portion, and the apparatus main body.
  • The optical adapter 10 includes a stereo optical system 10a configured by two left and right observation optical systems (a first optical system and a second optical system), a mechanical shutter 14, and a coil 15.
  • The stereo optical system 10a includes a left-eye optical system 10L and a right-eye optical system 10R whose optical axes are arranged in a direction intersecting the parallax direction and which are spaced apart from each other in the parallax direction.
  • the coil 15 is connected to the control unit 30 of the apparatus main body 2 via the control line 23 when the optical adapter 10 is attached to the distal end portion 11.
  • the distal end portion 11 of the insertion portion 5 may have the left eye optical system 10L, the right eye optical system 10R, the mechanical shutter 14, and the coil 15.
  • the left eye optical system 10L, the right eye optical system 10R, the mechanical shutter 14 and the coil 15 are arranged at the distal end (the optical adapter 10 or the distal end portion 11) of the insertion portion 5 of the endoscope 3.
  • the distal end portion 11 includes an observation optical system 21 and an image sensor 22.
  • the imaging element 22 is connected to the video signal processing unit 31 of the apparatus main body 2 via the imaging signal line 24.
  • the apparatus main body 2 includes a control unit 30, a video signal processing unit 31, a light source unit 32, and a light source control unit 33.
  • The video signal processing unit 31 includes a luminance calculation unit 34, a halation detection unit 35, an edge detection unit 36, a display control unit 37, a motion detection unit 38, and an object distance calculation unit 39.
  • a light guide 25 is inserted through the apparatus main body 2, the endoscope 3, and the optical adapter 10.
  • the light source unit 32 is, for example, a xenon lamp, an LED, a laser diode, or the like, and is disposed to face the base end surface of the light guide 25.
  • the light source unit 32 is driven under the control of the light source control unit 33 and makes illumination light incident on the base end surface of the light guide 25.
  • the illumination light incident on the proximal end surface of the light guide 25 is emitted from the distal end surface of the optical adapter 10, and the subject S is irradiated with the illumination light.
  • Although the light source unit 32 is provided in the apparatus main body 2 here, it may instead be provided, for example, in the operation unit 6 of the endoscope 3.
  • the left-eye optical system 10L, the right-eye optical system 10R, and the observation optical system 21 of the optical adapter 10 are designed so that return light from the subject S irradiated with illumination light is incident on the image sensor 22.
  • In the present embodiment, one observation optical system 21 is provided at the distal end portion 11; however, a plurality of observation optical systems may be provided there, and the return light from the subject S may be incident on the image sensor 22 through the left eye optical system 10L and the right eye optical system 10R of the optical adapter 10 and the plurality of observation optical systems at the distal end portion 11.
  • The mechanical shutter 14 is alternatively placed at either a first position that blocks the optical path passing through the left eye optical system 10L or a second position that blocks the optical path passing through the right eye optical system 10R.
  • When the mechanical shutter 14 is located at the second position, between the right eye optical system 10R and the image sensor 22, the light from the right eye optical system 10R is blocked and the light from the left eye optical system 10L is incident on the image sensor 22.
  • An operation signal (switching signal) is input to the control unit 30 by operating the operation unit 6 connected to the apparatus main body 2.
  • the position of the mechanical shutter 14 may be switched using the touch panel of the display unit 4 of the apparatus main body 2 or the like.
  • the control unit 30 causes the coil 15 to generate a magnetic field by causing a current to flow through the coil 15 via the control line 23.
  • the mechanical shutter 14 is provided with a magnet (not shown).
  • When the magnetic field is generated in the coil 15, the mechanical shutter 14 is drawn close to the coil 15 by the magnetic force and is positioned at the first position, between the left eye optical system 10L and the image sensor 22. Thereby, the light from the left eye optical system 10L is blocked, and the light from the right eye optical system 10R enters the image sensor 22.
  • When the control unit 30 stops supplying current to the coil 15 via the control line 23 in response to the operation signal, the magnetic field is no longer generated in the coil 15.
  • The mechanical shutter 14 then returns to its original position, the second position between the right eye optical system 10R and the image sensor 22.
  • The coil 15 and the mechanical shutter 14, which is a shielding unit, thus constitute an optical path blocking unit that selectively places the shutter at either the first position blocking the optical path through the left eye optical system 10L or the second position blocking the optical path through the right eye optical system 10R, thereby blocking one of the two optical paths.
  • The imaging element 22 has a light receiving surface arranged at the imaging position of the left eye optical system 10L, the right eye optical system 10R, and the observation optical system 21, and generates an imaging signal from at least one of the optical image formed on the light receiving surface via the left eye optical system 10L and the observation optical system 21 and the optical image formed on the light receiving surface via the right eye optical system 10R and the observation optical system 21.
  • the imaging signal generated by the imaging element 22 is input to the video signal processing unit 31 of the apparatus body 2 via the imaging signal line 24.
  • the video signal processing unit 31 generates a video signal related to one of the images corresponding to the left-eye optical system 10L and the images corresponding to the right-eye optical system 10R based on the input imaging signal.
  • the generated video signal is input to the display unit 4.
  • the display unit 4 displays one of the image corresponding to the left eye optical system 10L and the image corresponding to the right eye optical system 10R.
  • FIG. 3 is a diagram for explaining the relationship between the subject and the observation ranges of the left eye optical system and the right eye optical system.
  • FIG. 4 is a diagram illustrating an example of a display image acquired by the left eye optical system and displayed on the display unit.
  • FIG. 5 is a diagram illustrating an example of a display image acquired by the right eye optical system and displayed on the display unit.
  • Since the left and right observation ranges are condensed by the left eye optical system 10L and the right eye optical system 10R, parallax occurs. Therefore, as shown in FIG. 3, the range acquired by the left eye optical system 10L and displayed on the display unit 4 is the observation range 40a, and the range acquired by the right eye optical system 10R and displayed on the display unit 4 is the observation range 40b. The observation range 40a or 40b is displayed on the display unit 4 as a display image.
  • For example, the display unit 4 displays an image of the observation range 40a of the left eye optical system 10L, as shown in FIG. 4.
  • FIG. 6 is a diagram for explaining an example in which the area of the display image is divided.
  • the imaging signal imaged by the imaging element 22 is input to the luminance calculation unit 34 of the video signal processing unit 31 via the imaging signal line 24.
  • the luminance calculation unit 34 of the video signal processing unit 31 calculates the luminance value (luminance information) of each pixel of the input imaging signal. Then, as shown in FIG. 6, the luminance calculation unit 34 divides the image into, for example, five areas E1 to E5.
  • the upper left of the image is divided into area E1, the upper right of the image is divided into area E2, the lower left of the image is divided into area E3, the lower right of the image is divided into area E4, and the center of the image is divided into area E5.
  • the luminance calculation unit 34 calculates an average luminance value for each of the areas E1 to E5 divided in this way.
  • In the present embodiment, the image is divided into five areas E1 to E5, but the number of divided areas is not limited to five and may be four or less, or six or more.
  • The halation detection unit 35 of the video signal processing unit 31 compares the average luminance value of each of the areas E1 to E5 calculated by the luminance calculation unit 34 with a predetermined threshold value, and when there is an area whose average luminance is higher than the threshold, determines that halation has occurred in that area. On the other hand, when the average luminance values of all the areas E1 to E5 are equal to or less than the threshold, the halation detection unit 35 determines that no halation has occurred. The halation detection unit 35 outputs a detection result indicating whether or not halation has occurred to the control unit 30.
  • In the above example, the image is divided into five areas E1 to E5 and the average luminance value of each area is calculated to detect the occurrence of halation; however, the area need not be divided, and the occurrence of halation may instead be detected from the luminance value of each pixel of the image.
  • Alternatively, the brightness of the portion where halation has occurred may be suppressed by image processing, or the light amount of the light source unit 32 may be suppressed by the light source control unit 33.
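The area-based halation check described above can be sketched as follows. This is an illustrative Python sketch, not code from the patent: the 8-bit threshold value and the exact extent of the centre area E5 are assumptions.

```python
import numpy as np

HALATION_THRESHOLD = 230  # assumed 8-bit luminance threshold

def split_into_areas(img):
    """Split an HxW luminance image into the five areas E1-E5 of FIG. 6:
    four quadrants plus an (assumed half-sized) centre region."""
    h, w = img.shape
    return {
        "E1": img[:h // 2, :w // 2],                      # upper left
        "E2": img[:h // 2, w // 2:],                      # upper right
        "E3": img[h // 2:, :w // 2],                      # lower left
        "E4": img[h // 2:, w // 2:],                      # lower right
        "E5": img[h // 4:3 * h // 4, w // 4:3 * w // 4],  # centre
    }

def detect_halation(img, threshold=HALATION_THRESHOLD):
    """Return the names of areas whose average luminance exceeds the
    threshold, i.e. areas judged to contain halation."""
    return [name for name, area in split_into_areas(img).items()
            if area.mean() > threshold]
```

For a uniformly dark frame no area is flagged; saturating one quadrant flags only that area, because the centre area's average stays below the threshold.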
  • When no halation is detected, the control unit 30 continues the observation without switching the left and right eye images.
  • When halation is detected, the control unit 30 determines whether the halation is likely to be eliminated by switching the left and right eye images.
  • To perform the switch, the control unit 30 supplies current to the coil 15 via the control line 23 or stops supplying the current.
  • The control unit 30 thereby performs a switching process that switches the position of the mechanical shutter 14 and switches the left and right eye images displayed on the display unit 4.
  • For example, when the left eye image is displayed and halation occurs in the left areas E1 and E3, the control unit 30 determines that there is a high probability that the halation will be eliminated if the display is switched to the right eye image acquired by the right eye optical system 10R. That is, when switching from the left eye image to the right eye image, the halation occurring in the left areas E1 and E3 moves to the left end of the screen or disappears from the screen, so the visibility of the observation image improves.
  • When the control unit 30 determines that the halation is unlikely to be eliminated by switching the left and right eye images, it does not switch the images, or it lets the user decide whether to switch them.
  • Conversely, when halation occurs in the right areas E2 and E4 while the left eye image is displayed, the control unit 30 determines that there is little possibility that the halation will be eliminated even if the display is switched to the right eye image acquired by the right eye optical system 10R. That is, when switching from the left eye image to the right eye image, the halation occurring in the right areas E2 and E4 moves toward the center of the screen, so the visibility of the observation image decreases.
  • In this way, the control unit 30 determines, according to the location where the halation has occurred, whether the halation is likely to be eliminated by switching the left and right eye images.
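The location-based decision just described can be expressed as a small rule. This is a hedged sketch, not the patent's implementation; in particular, the handling of the centre area E5 (treated as "switching not effective") is an assumption, since the text discusses only E1 through E4.

```python
def switching_is_effective(displayed_eye, halation_areas):
    """Judge whether switching the displayed left/right image is likely
    to move halation toward the screen edge (effective) rather than
    toward the centre.  displayed_eye is "L" or "R"; halation_areas is
    a list of area names among E1-E5."""
    # When the LEFT image is shown, halation in the left areas E1/E3
    # moves toward the left edge (or off screen) after switching to the
    # right image, while halation in the right areas E2/E4 would move
    # toward the centre.  The rule mirrors for the right image.
    helpful = {"L": {"E1", "E3"}, "R": {"E2", "E4"}}[displayed_eye]
    harmful = {"L": {"E2", "E4"}, "R": {"E1", "E3"}}[displayed_eye]
    areas = set(halation_areas)
    return bool(areas & helpful) and not (areas & harmful) \
        and "E5" not in areas
```

With the left image displayed, halation confined to E1/E3 makes switching effective; halation in E2/E4 (or in both sides at once) does not.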
  • The control unit 30 and the video signal processing unit 31 thus constitute a display switching unit that detects the luminance information of the one image displayed on the display unit 4 and, when the detected luminance information is higher than a predetermined threshold value, executes a display switching process for switching the image displayed on the display unit 4 from the one image to the other image corresponding to the other optical image.
  • FIG. 7 is a diagram illustrating an example of the display unit when a left-eye image is displayed.
  • FIG. 8 is a diagram illustrating an example of a display unit when a right-eye image is displayed.
  • the edge detection unit 36 detects the edges of the left and right eye images acquired by the left eye optical system 10L and the right eye optical system 10R.
  • the display control unit 37 detects the common part of the left and right eye images from the edges of the left and right eye images detected by the edge detection unit 36. Then, the display control unit 37 controls the display position of the left and right eye images so that the common part of the detected left and right eye images is displayed at the same position in the display area of the display unit 4. More preferably, the display control unit 37 controls the display position of the left and right eye images so that the common part of the left and right eye images, that is, the common subject part is displayed at the center of the display area of the display unit 4. As a result, a sense of incongruity when the left and right eye images are switched is suppressed.
  • When displaying the left eye image on the display unit 4, the display control unit 37 performs control to combine a black image 50 on the right side of the left eye image and display the result on the display unit 4, as shown in FIG. 7. Similarly, when displaying the right eye image, the display control unit 37 performs control to combine a black image 51 on the left side of the right eye image and display the result, as shown in FIG. 8.
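The black-image compositing of FIG. 7 and FIG. 8 can be sketched with a simple array operation. This is illustrative only; making the black band the same size as the eye image is an assumption for the sketch.

```python
import numpy as np

def compose_display_frame(eye_img, eye):
    """Place the eye image beside a black band, as in FIG. 7/8:
    the left-eye image gets the black band (image 50) on its right,
    the right-eye image gets the black band (image 51) on its left."""
    black = np.zeros_like(eye_img)
    if eye == "L":
        return np.hstack([eye_img, black])  # black band on the right
    return np.hstack([black, eye_img])      # black band on the left
```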
  • FIG. 9 is a flowchart for explaining an example of the processing flow of the optical path switching determination.
  • the luminance calculation unit 34 calculates the average luminance value of each area of the currently displayed image (step S1).
  • the halation detection unit 35 determines whether or not there is an area where the calculated average luminance value is higher than a predetermined threshold (step S2). When there is no area where the average luminance value is higher than the predetermined threshold (S2: NO), the halation detection unit 35 determines that no halation has occurred, and does not execute the display switching process for switching the display of the left and right eye images. The observation is continued (step S3).
  • When there is an area whose average luminance value is higher than the predetermined threshold (S2: YES), the halation detection unit 35 determines that halation is occurring in that area (step S4).
  • At this time, processing for suppressing the halation may be performed by image processing or light source control; if the halation is thereby suppressed, the display switching process is not executed and the observation is continued. Otherwise, the process of the subsequent step S5 is executed.
  • Next, the control unit 30 determines whether the area is one for which switching the left and right eye images is effective (step S5). If it determines that switching is not effective for the area (S5: NO), the control unit 30 does not execute the display switching process and continues the observation, or lets the user select whether to switch the left and right eye images (step S6). On the other hand, when it determines that switching the left and right eye images is effective for the area (S5: YES), the control unit 30 executes the display switching process that switches the left and right eye images (step S7).
  • FIG. 10 is a flowchart for explaining an example of the flow of display switching processing for switching the left and right eye images.
  • the images of the left and right eyes are acquired (step S11), and the edges of the acquired images of the left and right eyes are detected (step S12).
  • a common part of the left and right eye images is detected from the detected edge (step S13).
  • a corrected image is generated so that the detected common part of the left and right eyes is located at the center of the screen of the display unit 4 (step S14).
  • a corrected image is generated for the left-eye image.
  • a black image is synthesized on the right side of the corrected image for the left eye to generate a display image for the left eye (step S15).
  • the generated display image of the left eye is displayed on the display unit 4 (step S16).
  • In step S17, it is determined whether to switch the image to be displayed from the left eye image to the right eye image. If it is determined to switch from the left-eye image to the right-eye image (S17: YES), a corrected image is generated for the right-eye image.
  • The method of generating the corrected image is the same as for the left eye image: a corrected image of the right eye is generated so that the detected common part of the left and right eyes is located at the center of the screen of the display unit 4 (step S18).
  • a black image is synthesized on the left side of the right-eye correction image to generate a right-eye display image (step S19), and the right-eye display image is displayed on the display unit 4 (step S20).
  • Thereafter, the right-eye display image is continuously displayed.
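The core of steps S12 and S13 (detecting edges and finding the common part of the left and right eye images) can be sketched as a one-dimensional disparity search over edge maps. This is a deliberately simplified stand-in, not the patent's algorithm: the gradient-based edge map and the sum-of-absolute-differences search are assumptions. Once the disparity is known, the corrected images of steps S14 and S18 would be generated by shifting or cropping so the common subject part lands at the screen centre.

```python
import numpy as np

def edge_map(img):
    """Simple horizontal-gradient edge map (stand-in for step S12)."""
    return np.abs(np.diff(img.astype(float), axis=1))

def estimate_disparity(left, right, max_shift=20):
    """Find the horizontal shift that best aligns the edge maps of the
    two eye images (stand-in for detecting the common part, step S13).
    A feature at column x in the left image is assumed to appear at
    column x - d in the right image."""
    el, er = edge_map(left), edge_map(right)
    best_shift, best_cost = 0, np.inf
    for s in range(max_shift + 1):
        # compare left columns [s:] against right columns [:W-s]
        cost = np.abs(el[:, s:] - er[:, :er.shape[1] - s]).mean()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift
```

For a synthetic pair where a bright vertical line sits at x=30 in the left image and x=25 in the right image, the search recovers the 5-pixel disparity.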
  • As described above, the endoscope apparatus 1 divides an image into, for example, five areas E1 to E5 and detects the area where halation occurs.
  • The endoscope apparatus 1 then determines whether switching the left and right eye images has the effect of suppressing the halation in that area and, if so, switches the images. Thereby, the endoscope apparatus 1 can move the halation in a direction away from the center of the display screen of the display unit 4, or erase it from the display screen, so the visibility of the observation image becomes higher.
  • According to the endoscope apparatus of the present embodiment, it is possible to suppress a reduction in the visibility of the observation image due to the occurrence of halation.
  • the endoscope apparatus 1 controls the display positions of the left and right eye images so that the common part of the left and right eye images is displayed at the same position in the display area of the display unit 4. By doing so, the position of the observation image does not shift even when the left and right eye images are switched, and the uncomfortable feeling when the left and right eye images are switched can be suppressed.
  • In the first embodiment, the left and right eye images are switched at the timing when halation occurs, if switching is effective; however, the timing for switching the left and right eye images is not limited to the timing at which halation occurs, and other timings may be used.
  • For example, the left and right eye images may be acquired using a freeze operation by the user as a trigger, and the control unit 30 may perform the display switching process, or the user may select which of the left eye image and the right eye image is the image suitable for the user.
  • Alternatively, when the motion detection unit 38 of the video signal processing unit 31 detects a certain motion, the left and right eye images may be acquired and the same processing performed. That is, the motion detection unit 38 detects the blur amount of the subject based on the imaging signal or the video signal, and when the detected blur amount is larger than a predetermined threshold value, the display switching process is performed by the control unit 30, or the user selects which of the left eye image and the right eye image is the suitable image.
  • Conversely, when the motion detection unit 38 does not detect a certain motion, the left and right eye images may be acquired and the same selection made. That is, the motion detection unit 38 detects the blur amount of the subject based on the imaging signal or the video signal, and when the detected blur amount is smaller than the predetermined threshold value, the display switching process is performed by the control unit 30, or the user selects which of the left eye image and the right eye image is the suitable image.
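A blur-amount trigger of the kind attributed to the motion detection unit 38 can be sketched as a mean absolute inter-frame difference compared against a threshold. Both the difference metric and the threshold value are assumptions for illustration; the patent does not specify how the blur amount is computed.

```python
import numpy as np

BLUR_THRESHOLD = 10.0  # assumed threshold on mean absolute frame difference

def blur_amount(prev_frame, cur_frame):
    """Mean absolute inter-frame difference, used here as a simple
    stand-in for the blur amount detected by the motion detection
    unit 38 from the imaging or video signal."""
    return float(np.abs(cur_frame.astype(float)
                        - prev_frame.astype(float)).mean())

def motion_trigger(prev_frame, cur_frame, threshold=BLUR_THRESHOLD):
    """True when the detected blur amount exceeds the threshold, i.e.
    the condition under which the display switching process (or the
    user-selection prompt) would be triggered."""
    return blur_amount(prev_frame, cur_frame) > threshold
```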
  • Similarly, the left and right eye images may be acquired using a bending lock operation by the user as a trigger, and the control unit 30 may perform the display switching process, or the user may select which of the left eye image and the right eye image is the suitable image.
  • the freeze button of the operation unit 6, the motion detection unit 38, and the bending lock button of the operation unit 6 constitute a display switching determination unit that outputs trigger information for display switching processing.
  • the control unit 30 executes display switching processing based on trigger information from the display switching determination unit.
  • In the above description, the common part of the left and right eye images is detected and arranged at the center of the display screen at the timing of switching the left and right eye images; however, the timing at which the common part is detected and arranged at the center of the display screen is not limited to the timing of switching the left and right eye images, and may be another timing.
  • the control unit 30 executes display switching processing when the type of the optical adapter 10 attached to the distal end portion 11 is changed. That is, when the type of the optical adapter 10 attached to the distal end portion 11 is changed, the control unit 30 detects the common part of the left and right eye images and arranges it at the center of the display screen of the display unit 4. The change of the optical adapter 10 may be detected by the control unit 30 or may be input by the user using the touch panel of the display unit 4.
  • FIG. 11 is a flowchart for explaining an example of display image switching processing when the optical adapter is changed.
  • In FIG. 11, the same step numbers are assigned to the processes that are the same as in the flowchart described earlier, and their description is omitted.
  • First, the control unit 30 determines whether or not the optical adapter 10 has been changed (step S21). When it determines that the optical adapter 10 has not been changed, the process returns to step S21 and the same determination is repeated. On the other hand, when it determines that the optical adapter 10 has been changed, the process proceeds to step S11. The processes after step S11 are the same as those in the flowchart described earlier.
  • the endoscope apparatus 1 can display an optimal image even when the type of the optical adapter 10 attached to the distal end portion 11 is changed.
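The step S21 polling loop can be sketched as below, assuming the attached optical adapter type is available as an identifier string (a hypothetical representation; the patent leaves detection to the control unit 30 or to user input via the touch panel).

```python
def detect_adapter_change(current_adapter_id, last_adapter_id):
    """Step S21: report whether the attached optical adapter changed."""
    return current_adapter_id != last_adapter_id

def run_adapter_check(adapter_ids, on_change):
    """Walk a sequence of observed adapter IDs; invoke the display
    switching process (on_change, standing in for step S11 onward)
    whenever the adapter type changes."""
    last = None
    for adapter_id in adapter_ids:
        if last is not None and detect_adapter_change(adapter_id, last):
            on_change(adapter_id)  # proceed to step S11
        last = adapter_id
```

While the adapter is unchanged the loop simply continues, mirroring the "return to step S21" branch of the flowchart.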
  • The observation range common to the left and right eyes changes depending on the object distance between the subject S and the optical adapter 10 or the distal end portion 11. For example, when the distance between the subject S and the optical adapter 10 or the distal end portion 11 decreases, the observation range common to the left and right eyes decreases; when the distance increases, the common observation range increases.
  • In this way, the observation range changes according to the distance between the subject S and the optical adapter 10 or the distal end portion 11. Therefore, when the object distance changes, the common part of the left and right eye images may be detected and arranged at the center of the display screen.
  • the object distance calculation unit 39 calculates the object distance between the subject S and the optical adapter 10 or the tip 11.
  • The object distance between the subject S and the optical adapter 10 or the distal end portion 11 may be measured by acquiring the left and right eye images from the left eye optical system 10L and the right eye optical system 10R and using the known triangulation principle.
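As a sketch of the known triangulation principle mentioned here: for a rectified stereo pair, the distance is Z = f·B/d, where f is the focal length in pixels, B is the baseline between the left eye optical system and the right eye optical system, and d is the pixel disparity of a matched point. The numeric values below are hypothetical, not taken from the patent.

```python
def object_distance_mm(focal_length_px, baseline_mm, disparity_px):
    """Triangulate the distance to the subject from the pixel disparity
    between matching points in the left and right eye images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_mm / disparity_px
```

Note the inverse relation: as the subject approaches, the disparity grows and the computed distance shrinks, matching the narrowing common observation range described above.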
  • The object distance is measured at timings such as a freeze operation using the operation unit 6 described in the second embodiment, motion detection by the motion detection unit 38, or a bending operation using the operation unit 6.
  • The control unit 30 determines whether or not the object distance calculated by the object distance calculation unit 39 has changed from the previously calculated object distance, and performs the display switching process when the calculated object distance has changed from the previously calculated object distance.
  • FIG. 12 is a flowchart for explaining an example of the display image switching process based on the object distance.
  • In FIG. 12, the same step numbers are assigned to the processes that are the same as in the flowcharts described earlier, and their description is omitted.
  • the object distance calculation unit 39 starts object distance measurement (step S31).
  • the object distance calculation unit 39 acquires the images of the left and right eyes (step S32), and measures the object distance using a known object distance measurement technique (step S33).
  • Next, in step S34, it is determined whether or not the object distance has changed from the previous object distance measurement result. If it is determined that the object distance has not changed from the previous measurement result (S34: NO), the process ends without performing any further processing. On the other hand, if it is determined that the object distance has changed from the previous measurement result (S34: YES), the process proceeds to step S11.
  • The processes after step S11 are the same as those in the flowchart described earlier.
  • the endoscope apparatus 1 can display an optimal image even when the distance (object distance) between the subject S and the optical adapter 10 or the distal end portion 11 is changed.
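The FIG. 12 flow (steps S31 to S34 feeding step S11) can be sketched as a comparison of successive measurements; the change tolerance is an assumed parameter, not specified by the patent.

```python
def object_distance_changed(current_mm, previous_mm, tolerance_mm=1.0):
    """Step S34: decide whether the measured object distance differs from
    the previous measurement by more than an assumed tolerance."""
    if previous_mm is None:  # first measurement: nothing to compare
        return False
    return abs(current_mm - previous_mm) > tolerance_mm

def process_measurements(distances_mm):
    """Steps S31-S34: walk through successive object distance measurements
    and record which ones would trigger the display switching process
    (i.e., proceeding to step S11)."""
    triggers = []
    previous = None
    for d in distances_mm:
        if object_distance_changed(d, previous):
            triggers.append(d)
        previous = d
    return triggers
```

Small measurement jitter within the tolerance produces no switching; a clear approach or retreat of the subject does.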

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)
  • Endoscopes (AREA)

Abstract

An endoscope device (1) is provided that comprises: an endoscope insertion unit; a stereo optical system (10a) comprising a first optical system and a second optical system; an image pickup element (22) that generates an imaging signal based on at least one of a first optical image formed on a light-receiving surface via the first optical system and a second optical image formed on the light-receiving surface via the second optical system; a video signal processing unit (31) that generates a video signal corresponding to one of a first image and a second image; and a display unit (4) that displays the image when the video signal is input. The endoscope device further comprises a display switching unit that detects luminance information of the image displayed on the display unit (4) and, when the detected luminance information exceeds a predetermined threshold value, performs display switching so as to switch the image displayed on the display unit (4) from that image to the other image corresponding to the other optical image.
PCT/JP2017/025661 2016-09-09 2017-07-14 Endoscope device Ceased WO2018047465A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018538258A JP6694964B2 (ja) 2016-09-09 2017-07-14 Endoscope device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016176262 2016-09-09
JP2016-176262 2016-09-09

Publications (1)

Publication Number Publication Date
WO2018047465A1 true WO2018047465A1 (fr) 2018-03-15

Family

ID=61562116

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/025661 Ceased WO2018047465A1 (fr) Endoscope device

Country Status (2)

Country Link
JP (1) JP6694964B2 (fr)
WO (1) WO2018047465A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114269218A (zh) * 2019-08-29 2022-04-01 奥林巴斯株式会社 Image processing method and image processing device
WO2024188160A1 (fr) * 2023-03-13 2024-09-19 武汉迈瑞生物医疗科技有限公司 Imaging and display method for endoscope system, and endoscope system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0662438A (ja) * 1992-08-14 1994-03-04 Olympus Optical Co Ltd Stereoscopic image observation system
JP2004187711A (ja) * 2002-12-06 2004-07-08 Olympus Corp Endoscope apparatus
JP2014028008A (ja) * 2012-07-31 2014-02-13 Olympus Corp Stereo measurement device and stereo measurement method

Also Published As

Publication number Publication date
JPWO2018047465A1 (ja) 2019-03-07
JP6694964B2 (ja) 2020-05-20

Similar Documents

Publication Publication Date Title
US11555997B2 (en) Endoscope with integrated measurement of distance to objects of interest
JP5284731B2 (ja) Stereoscopic image capturing and display system
US11426052B2 (en) Endoscopic system
JP5996319B2 (ja) Stereo measurement device and method of operating stereo measurement device
JP6150955B2 (ja) Stereoscopic endoscope device and video processor
US20170042407A1 (en) Image processing apparatus, image processing method, and program
US20120300032A1 (en) Endoscope
WO2020008672A1 (fr) Système d'endoscope
WO2013179855A1 (fr) Système d'endoscope stéréoscopique
US20120130168A1 (en) Endoscope apparatus
WO2006118057A1 (fr) Dispositif d’affichage d’image
JP4801641B2 (ja) Touch panel and processor of endoscope apparatus
JP6694964B2 (ja) Endoscope device
JPH11318936A (ja) Surgical microscope apparatus
JP2014175965A (ja) Surgical camera
JP2016095458A (ja) Endoscope device
US11224329B2 (en) Medical observation apparatus
JP5792401B2 (ja) Autofocus device
JPWO2016181781A1 (ja) Endoscope device
JP4246510B2 (ja) Stereoscopic endoscope system
JP7375022B2 (ja) Method of operating image processing device, control device, and endoscope system
JP4464103B2 (ja) Image display system
CN116322560A (zh) Medical image processing device and medical observation system
JP2018007837A (ja) Image processing device
JP2017038933A (ja) Image acquisition device for stereo measurement and method of operating image acquisition device for stereo measurement

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018538258

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17848410

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17848410

Country of ref document: EP

Kind code of ref document: A1