
WO2018105396A1 - Endoscope device - Google Patents


Info

Publication number
WO2018105396A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical system
light
region
incident
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/041963
Other languages
English (en)
Japanese (ja)
Inventor
高橋 進
Current Assignee
Olympus Corp
Original Assignee
Olympus Corp
Priority date
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of WO2018105396A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor, combined with photographic or television appliances
    • A61B 1/045 Control thereof
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes

Definitions

  • the present invention relates to an endoscope apparatus, and more particularly to an endoscope apparatus having an observation optical system capable of switching the incident state of light to an image sensor.
  • endoscope apparatuses have been widely used in the industrial field and the medical field.
  • light from a subject incident through an observation window is incident on a light receiving surface of an imaging element through an observation optical system, and a subject image is projected onto the imaging surface of the imaging element.
  • the image sensor photoelectrically converts the subject image projected on the imaging surface and outputs it as an image signal.
  • An endoscopic image is generated from the imaging signal.
  • Some endoscopes can change the state of the observation optical system.
  • Japanese Patent Application Laid-Open No. 2010-128354 discloses an endoscope apparatus that has two optical systems having parallax and is capable of so-called stereo measurement or stereo observation.
  • the endoscope apparatus includes an optical path switching unit, and the operation of the optical path switching unit is controlled so that light from two optical paths is alternately projected onto a common imaging surface of one imaging element.
  • the two subject images projected onto the common image pickup surface of the single image sensor may differ from each other.
  • the areas in which the influence of optical characteristics appears differ between the two subject images, because the light reaches the imaging surface through different left and right optical paths.
  • the influence due to the optical characteristics appearing in the two endoscopic images of the subject generated by the left and right optical paths is at least one of aberration, vignetting, shading, color unevenness, ghost, flare, and the like.
  • an object of the present invention is to provide an endoscope apparatus that eliminates the influence of optical characteristics or reduces the appearance of the influence in two endoscopic images generated by switching the state of an optical system.
  • An endoscope apparatus includes an insertion unit; an imaging element that receives incident light entering through an observation window provided at a distal end of the insertion unit; an optical system that guides the light from the observation window to the imaging element and in which the incident state of the light on the imaging element can be switched between a first incident state and a second incident state different from the first incident state; and a region extracting unit that extracts a first region from a first light receiving region when the incident state of the light is the first incident state, and extracts a second region from a second light receiving region when the incident state of the light is the second incident state.
  • FIG. 1 is a configuration diagram showing the configuration of an endoscope apparatus according to a first embodiment of the present invention. FIG. 2 is a diagram for explaining the configuration of the distal end portion of the insertion portion and the apparatus main body.
  • FIG. 1 is a configuration diagram illustrating a configuration of an endoscope apparatus according to the first embodiment.
  • the endoscope apparatus 1 includes an apparatus main body 2 having a function such as a video processor, and an endoscope 3 connected to the apparatus main body 2.
  • the apparatus main body 2 includes a display unit 4 such as a liquid crystal panel (LCD) on which an endoscopic image, an operation menu, and the like are displayed.
  • the display unit 4 may be provided with a touch panel.
  • the endoscope 3 includes an insertion portion 5 as an endoscope insertion portion that is inserted into a subject, an operation portion 6 that is connected to the proximal end of the insertion portion 5, and a universal cord 7 that extends from the operation portion 6.
  • the endoscope 3 can be attached to and detached from the apparatus main body 2 via a universal cord 7.
  • the insertion portion 5 includes a distal end portion 11, a bending portion 12, and a long flexible portion 13 in order from the distal end side.
  • the bending portion 12 is connected to the proximal end of the distal end portion 11, and is configured to be able to bend in the vertical and horizontal directions, for example.
  • the flexible portion 13 is connected to the base end of the bending portion 12 and has flexibility.
  • An image pickup element 22 (see FIG. 2) such as a CMOS image sensor is built in the distal end portion 11 of the insertion portion 5.
  • the image sensor 22 receives incident light that has entered the observation window 11 a (see FIG. 2) provided at the distal end portion 11 of the insertion portion 5.
  • the operation unit 6 is provided with a bending joystick 6a that bends the bending portion 12 in the vertical and horizontal directions. The user can bend the bending portion 12 in a desired direction by tilting the bending joystick 6a.
  • the operation unit 6 is provided with buttons for instructing an endoscope function, for example, various operation buttons such as a freeze button, a bending lock button, and a recording instruction button.
  • the operation unit 6 is further provided with an image switching button for acquiring either a right eye image or a left eye image, which will be described later.
  • when the display unit 4 is provided with a touch panel, the user may instruct various operations of the endoscope apparatus 1 by operating the touch panel.
  • the display unit 4 of the apparatus main body 2 displays an endoscopic image captured by the imaging element 22 (see FIG. 2) of the imaging unit provided in the distal end portion 11. Inside the apparatus main body 2 are also provided various devices such as a control unit 31 (see FIG. 2) that performs image processing and various controls, and a recording device that records processed images in a memory (not shown).
  • FIG. 2 is a view for explaining the configuration of the distal end portion of the insertion portion and the apparatus main body.
  • the configurations of the distal end portion 11 and the apparatus main body 2 will be described with reference to FIG. 2.
  • the distal end portion 11 includes a stereo optical system 10a composed of left and right observation optical systems (a first optical system and a second optical system), a mechanical shutter 14, and a coil 15.
  • the stereo optical system 10a is configured to include a left-eye optical system 10L and a right-eye optical system 10R that have an optical axis arranged in a direction orthogonal to the parallax direction and are spaced apart from each other in the parallax direction.
  • the coil 15 is connected to the control unit 31 of the apparatus main body 2 via the control line 23 when the optical adapter 10 is attached to the distal end portion 11.
  • the left eye optical system 10L, the right eye optical system 10R, the mechanical shutter 14, and the coil 15 may be provided in an optical adapter 10 that can be attached to the distal end portion 11. That is, as shown by the dotted line in FIG. 1, when the endoscope apparatus 1 is of a type to which a stereo measurement optical adapter 10 can be attached to and detached from the distal end portion 11 of the insertion portion 5, the optical adapter 10 is provided with the left eye optical system 10L, the right eye optical system 10R, the mechanical shutter 14, and the coil 15.
  • FIG. 3 is a configuration diagram of the mechanical shutter 14.
  • the mechanical shutter 14 includes a circular diaphragm member 14a having two openings and a shielding member 14b.
  • the two openings 14a1 and 14a2 of the diaphragm member 14a constitute the diaphragms of two optical paths of the right eye optical system 10R and the left eye optical system 10L, respectively.
  • the mechanical shutter 14 is disposed in the distal end portion 11 so that the two circular openings 14a1 and 14a2 are disposed at the respective positions of the two optical paths.
  • the shielding member 14b has an arm 14b1 that can rotate about the axis of the shaft member 14c fixed to the diaphragm member 14a, and a circular light shielding plate 14b2 formed at the tip of the arm 14b1.
  • the shielding member 14b is provided with a magnet 14m.
  • the shielding member 14b can move between a first position covering the opening 14a1 disposed in the optical path of the right eye optical system 10R and a second position covering the opening 14a2 disposed in the optical path of the left eye optical system 10L. As will be described later, the shielding member 14b having the magnet 14m can be positioned at either the first position or the second position by the magnetic field generated in the coil 15.
  • the configuration of the mechanical shutter 14 may be a configuration other than the configuration shown in FIG.
  • FIG. 4 is a configuration diagram of another example of the mechanical shutter.
  • the mechanical shutter 14A has two circular aperture members 14Aa and 14Ab.
  • the two diaphragm members 14Aa and 14Ab have circular openings 14Aa1 and 14Ab1, respectively.
  • the two aperture members 14Aa and 14Ab have shielding members 14Ad and 14Ae, respectively.
  • the shielding member 14Ad includes an arm 14Ad1 that can rotate about the axis of the shaft member 14c1 fixed to the diaphragm member 14Aa, and a circular light shielding plate 14Ad2 formed at the tip of the arm 14Ad1.
  • the shielding member 14Ae has an arm 14Ae1 that can rotate about the axis of the shaft member 14c2 fixed to the diaphragm member 14Ab, and a circular light shielding plate 14Ae2 formed at the tip of the arm 14Ae1.
  • each shielding member 14Ad, 14Ae is provided with a magnet (not shown).
  • the opening 14Aa1 of the diaphragm member 14Aa is disposed in the optical path of the right eye optical system 10R, and the opening 14Ab1 of the diaphragm member 14Ab is disposed in the optical path of the left eye optical system 10L.
  • a coil 15 is provided for each of the shielding members 14Ad and 14Ae, and the two coils 15 are driven so that the two shielding members 14Ad and 14Ae cover the openings 14Aa1 and 14Ab1 alternately.
  • the mechanical shutter 14 may thus be a mechanical shutter having the configuration shown in FIG. 4.
  • the left eye optical system 10L, the right eye optical system 10R, the mechanical shutter 14, and the coil 15 are disposed in the distal end portion 11 of the insertion portion 5 of the endoscope 3.
  • the distal end portion 11 includes an imaging optical system 21 and an image sensor 22.
  • the imaging element 22 is connected to the video signal processing unit 32 of the apparatus main body 2 via the imaging signal line 24.
  • the apparatus main body 2 includes a control unit 31, a video signal processing unit 32, a light source unit 33, a light source control unit 34, and an effective pixel region control unit 35.
  • the control unit 31 controls the entire operation of the endoscope apparatus 1 and also controls the driving of the coil 15, the operation of the video signal processing unit 32, and the driving of the effective pixel region control unit 35.
  • the control unit 31 outputs a coil drive signal DS for driving the coil 15 and an effective pixel region change instruction signal CS to the effective pixel region control unit 35 in response to an operation of the image switching button of the operation unit 6. Output.
  • the video signal processing unit 32 receives the imaging signal from the imaging element 22, performs various image processing, generates an endoscope image signal, and outputs the endoscope image signal to the display unit 4.
  • the video signal processing unit 32 may store the generated endoscope image signal in a memory (not shown).
  • a light guide 25 is inserted through the apparatus main body 2 and the endoscope 3 up to the distal end portion 11.
  • the light source unit 33 is, for example, a xenon lamp, an LED, a laser diode, or the like, and is disposed to face the base end surface of the light guide 25.
  • the light source unit 33 is driven by the control of the light source control unit 34, and makes the illumination light enter the base end surface of the light guide 25.
  • the illumination light incident on the base end surface of the light guide 25 is emitted from an illumination window (not shown) on the distal end surface of the optical adapter 10, and the subject S is irradiated with the illumination light.
  • although the light source unit 33 is provided in the apparatus main body 2 here, it may instead be provided, for example, in the operation unit 6 of the endoscope 3.
  • the effective pixel area control unit 35 outputs a drive signal for driving the image sensor 22, and outputs, to the image sensor 22 via the signal line 24a, area information AI that defines the area on the imaging surface 22a of the image sensor 22 to be output as an imaging signal.
  • the right eye optical system 10R, the left eye optical system 10L, and the imaging optical system 21 at the distal end portion 11 guide return light from the subject S irradiated with the illumination light, entering through the observation window 11a provided at the distal end portion 11, to the image sensor 22.
  • the light shielding plate 14b2 of the mechanical shutter 14 is placed alternatively at a first position that blocks the optical path passing through the right eye optical system 10R or at a second position that blocks the optical path passing through the left eye optical system 10L. FIG. 2 shows the state in which the light shielding plate 14b2 of the mechanical shutter 14 is located between the right eye optical system 10R and the image pickup device 22, shielding the light from the right eye optical system 10R so that the light from the left eye optical system 10L is incident on the image sensor 22.
  • the stereo optical system 10a, the mechanical shutter 14, and the imaging optical system 21 constitute an observation optical system that receives light from the observation window 11a and emits it to the image sensor 22, and in which the incident state (here, the optical path) of the light can be switched between a first incident state and a second incident state different from the first incident state.
  • the observation optical system has first and second optical systems having different optical axes.
  • the first incident state is a state where incident light that has entered the observation window 11a passes through the first optical system.
  • the second incident state is a state where incident light incident on the observation window 11a passes through the second optical system.
  • the endoscope apparatus 1 is a stereo measurement or stereo observation endoscope apparatus using two optical systems, a first optical system and a second optical system.
  • the user operates the operation unit 6 connected to the apparatus main body 2 when switching the position of the mechanical shutter 14.
  • the operation unit 6 has a right eye image acquisition operation unit 6b1 for positioning the mechanical shutter 14 in the optical path of the left eye optical system 10L in order to acquire a right eye image through the right eye optical system 10R, and a left eye image acquisition operation unit 6b2 for positioning the mechanical shutter 14 in the optical path of the right eye optical system 10R in order to acquire a left eye image through the left eye optical system 10L.
  • the user operates any one of the right eye image acquisition operation unit 6b1 and the left eye image acquisition operation unit 6b2.
  • when the right eye image acquisition operation unit 6b1 is operated, a left eye blocking instruction signal is output from the operation unit 6 to the control unit 31, and when the left eye image acquisition operation unit 6b2 is operated, a right eye blocking instruction signal is output from the operation unit 6 to the control unit 31. The left eye blocking instruction signal and the right eye blocking instruction signal are input to the control unit 31 as the switching signal SL. Note that the position of the mechanical shutter 14 may also be switched using, for example, the touch panel of the display unit 4 of the apparatus main body 2.
  • the control unit 31 supplies the coil drive signal DS to the control line 23 according to the switching signal SL, that is, according to the left eye blocking instruction signal or the right eye blocking instruction signal.
  • the coil drive signal DS supplied to the control line 23 is thereby supplied to the coil 15.
  • a magnetic field corresponding to the switching signal SL is generated by the coil 15.
  • the position of the light shielding plate 14b2 of the mechanical shutter 14 is changed by the electromagnetic action of the direction of the magnetic field generated in the coil 15 and the magnet 14m of the mechanical shutter 14.
  • when the right eye image acquisition operation unit 6b1 is operated, the operation unit 6 outputs a switching signal SL that is a left eye blocking instruction signal to the control unit 31.
  • the control unit 31 supplies the coil 15 with the coil driving signal DS that is the left eye blocking driving signal.
  • a current in the first direction flows through the coil 15, the shutter 14 is driven so as to shield the optical path of the left eye optical system 10L, and the imaging device 22 receives light from the optical path of the right eye optical system 10R, A right eye image is acquired.
  • when the left eye image acquisition operation unit 6b2 is operated, the operation unit 6 outputs a switching signal SL that is a right eye blocking instruction signal to the control unit 31.
  • when the control unit 31 receives the switching signal SL that is the right eye blocking instruction signal, it supplies the coil 15 with the coil driving signal DS that is the right eye blocking driving signal.
  • a current in the second direction opposite to the first direction flows through the coil 15, the shutter 14 is driven so as to shield the optical path of the right eye optical system 10R, the image pickup device 22 receives light from the optical path of the left eye optical system 10L, and a left eye image is acquired.
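The switching sequence described above (image switching operation → switching signal SL → coil drive signal DS → shutter position) can be sketched as a small control model. This is only an illustration of the logic in the text; the names `Eye`, `coil_drive_signal`, and `switch_shutter` are hypothetical, and the sign convention (current in the first direction shields the left-eye path) follows the description.

```python
from enum import Enum

class Eye(Enum):
    RIGHT = "right"  # acquire a right eye image: the left-eye path must be shielded
    LEFT = "left"    # acquire a left eye image: the right-eye path must be shielded

def coil_drive_signal(requested: Eye) -> int:
    """Map the switching signal SL to a coil drive direction.

    +1 models current in the first direction (shields the left-eye optical
    path); -1 models current in the opposite, second direction (shields the
    right-eye optical path).
    """
    return 1 if requested is Eye.RIGHT else -1

def switch_shutter(requested: Eye) -> str:
    """Return which optical path ends up blocked for the requested image."""
    current = coil_drive_signal(requested)
    return "left path blocked" if current > 0 else "right path blocked"
```

With this model, requesting a right eye image drives the coil so the shutter covers the left-eye opening, and vice versa, matching the alternating behavior of the mechanical shutter 14.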
  • FIG. 2 shows a state in which the shielding member 14b is driven so that the light shielding plate of the shutter 14 shields the light of the right eye optical system 10R and allows the light of the left eye optical system 10L to pass through.
  • in this state, the shielding plate 14b2 is located at the first position between the right eye optical system 10R and the image sensor 22. Thereby, the light from the right eye optical system 10R is shielded, and the light from the left eye optical system 10L enters the image sensor 22.
  • the mechanical shutter 14 alternatively places the shielding plate 14b2 at the first position that blocks the optical path passing through the right eye optical system 10R or at the second position that blocks the optical path passing through the left eye optical system 10L, and thus constitutes an optical path blocking unit that blocks either one of the two optical paths.
  • the imaging element 22 has an imaging surface 22a, a light receiving surface disposed at the imaging position of the right eye optical system 10R together with the imaging optical system 21 and of the left eye optical system 10L together with the imaging optical system 21, and generates an imaging signal of one of the optical image formed on the light receiving surface via the right eye optical system 10R and the imaging optical system 21 and the optical image formed on the light receiving surface via the left eye optical system 10L and the imaging optical system 21.
  • the imaging signal generated by the imaging element 22 is input to the video signal processing unit 32 of the apparatus body 2 via the imaging signal line 24.
  • the video signal processing unit 32 generates a video signal of the left eye image and the right eye image based on the input imaging signal.
  • the generated video signal is input to the display unit 4.
  • the display unit 4 displays one of the left eye image and the right eye image as an endoscopic image.
  • FIGS. 5 and 6 are diagrams for explaining the relationship between the two optical systems at the distal end and the effective pixel region on the imaging surface of the imaging device.
  • a cover glass 41 is provided on the distal end surface of the distal end portion 11, and the right and left objective optical systems, that is, the right eye optical system 10R and the left eye optical system 10L, are arranged and fixed in the distal end portion 11 behind the cover glass 41.
  • behind them, the mechanical shutter 14 and the imaging lens portions 42 and 43, each including a plurality of lenses, are arranged and fixed in the distal end portion 11 in order from the distal end side toward the proximal end side.
  • further behind, the imaging element 22, which has a protective glass portion 44 made of a plurality of glass plates fixed to its front surface, is arranged and fixed. In FIGS. 5 and 6, the effective pixel area CA is fixedly set at the center of the imaging surface 22a of the imaging element 22.
  • the portion indicated by the alternate long and short dash line is the distal end surface of the distal end portion 11.
  • a cover glass 41 a indicated by a two-dot chain line is provided on the distal end side of the distal end portion 11.
  • FIG. 5 shows a plurality of optical paths of light passing through the right eye optical system 10R.
  • FIG. 5 shows that, when the subject is viewed from the distal end portion 11, the light beam LR1 at the right end, the light beam LR2 slightly right of center, the light beam LR3 at the center, the light beam LR4 slightly left of center, and the light beam LR5 at the left end are condensed and focused on areas A1, A2, A3, A4, and A5 on the imaging surface 22a of the imaging element 22, respectively.
  • the light beam LR5 at the left end of the light incident on the right eye optical system 10R passes through the outer edge portion OE1 of the imaging optical system 21, where vignetting or the like is caused by a lens fastening member or the like, and is therefore affected by optical characteristics such as aberration in the area AA1 on one side of the imaging surface 22a of the imaging element 22 (the hatched upper area in FIG. 5).
  • the influence of the optical characteristics is at least one of aberration, vignetting, shading, color unevenness, ghost, flare, and the like; one or more of these influences appear in the area AA1 on one side of the imaging surface 22a. Therefore, in the case of FIG. 5, a part of the effective pixel area CA of the imaging surface 22a is also affected by the optical characteristics.
  • FIG. 6 shows a plurality of optical paths of light passing through the left eye optical system 10L.
  • FIG. 6 shows that the light beams LL1 at the left end through LL5 at the right end are condensed and imaged in regions A6, A7, A8, A9, and A10 on the imaging surface 22a of the imaging element 22, respectively.
  • the light beam LL5 at the right end passes through the outer edge portion OE2 of the imaging optical system 21, where vignetting or the like occurs due to a lens retaining member or the like, and is therefore affected by optical characteristics such as aberration in the region AA2 on one side of the imaging surface 22a of the imaging element 22 (the hatched lower region in FIG. 6).
  • the influence of the optical characteristics is any one of aberration, vignetting, shading, color unevenness, ghost, flare and the like.
  • one or more influences appear in the area AA2 on one side of the imaging surface 22a.
  • the area AA1 affected by the optical characteristics in FIG. 5 and the area AA2 affected by the optical characteristics in FIG. 6 are opposite to each other with respect to the center of the imaging surface 22a. Therefore, also in FIG. 6, part of the effective pixel area CA on the imaging surface 22a is also affected by the optical characteristics.
  • in the present embodiment, the effective pixel area NCA, which is the data readout area, is set so as to exclude the areas AA1 and AA2 affected by the optical characteristics described above. The effective pixel area NCA is therefore shifted within the imaging surface 22a depending on whether the acquired image is a right eye image or a left eye image. Specifically, when light incident on the right eye optical system 10R is projected onto the imaging surface 22a, an area excluding the area AA1 affected by the optical characteristics is selected as the effective pixel area NCA. When light incident on the left eye optical system 10L is projected onto the imaging surface 22a, an area excluding the area AA2 affected by the optical characteristics is selected as the effective pixel area NCA.
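The selection rule just described can be illustrated in a few lines: exclude AA1 for a right eye frame and AA2 for a left eye frame, on opposite sides of the imaging surface. The sensor size, the row ranges standing in for AA1 and AA2, and the function name `effective_rows` are all invented for illustration; only the selection logic mirrors the text.

```python
# Hypothetical sensor geometry in (rows, cols); values are illustrative only.
SENSOR_H, SENSOR_W = 480, 640
AA1_ROWS = range(0, 80)      # degraded band for the right-eye path (one side, cf. FIG. 5)
AA2_ROWS = range(400, 480)   # degraded band for the left-eye path (opposite side, cf. FIG. 6)

def effective_rows(eye: str) -> range:
    """Pick the readout rows (effective pixel area NCA) that exclude the
    area affected by optical characteristics for the current optical path."""
    if eye == "right":
        return range(AA1_ROWS.stop, SENSOR_H)  # skip AA1
    if eye == "left":
        return range(0, AA2_ROWS.start)        # skip AA2
    raise ValueError(f"unknown eye: {eye}")
```

The two readout windows have the same size but sit on opposite sides of the sensor, which is exactly the offset relationship the text describes for NCA1 and NCA2.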
  • FIG. 7 is a diagram illustrating the position of the effective pixel area NCA1 when light incident on the right eye optical system 10R is projected onto the imaging surface 22a.
  • FIG. 7 shows the positions of the imaging surface 22a and the effective pixel area NCA1 when the imaging device 22 is viewed from the subject side.
  • the position of the effective pixel area NCA1 as the data reading area is set so as not to include the area AA1 affected by the optical characteristics.
  • the effective pixel area NCA1 is shifted from the center to the right side.
  • FIG. 8 is a diagram showing the position of the effective pixel area NCA2 when light incident on the left-eye optical system 10L is projected onto the imaging surface 22a.
  • FIG. 8 shows the positions of the imaging surface 22a and the effective pixel area NCA2 when the imaging element 22 is viewed from the subject side.
  • the position of the effective pixel area NCA2 as the data reading area is set so as not to include the area AA2 affected by the optical characteristics.
  • the effective pixel area NCA2 is shifted from the center to the left side.
  • the effective pixel area NCA1 is a rectangular area whose start point in the vertical and horizontal directions x and y of the pixel area of the imaging surface 22a is defined by sp1 (x11, y11) and whose end point is defined by ep1 (x12, y12).
  • the effective pixel area NCA2 is a rectangular area whose start point in the vertical and horizontal directions x and y of the pixel area of the imaging surface 22a is defined by sp2 (x21, y21) and whose end point is defined by ep2 (x22, y22).
  • the effective pixel area control unit 35 has area information AI, and outputs area information AI indicating the range of the effective pixel area NCA1 or NCA2 to the image sensor 22 in response to the change instruction signal CS from the control unit 31.
  • the area information AI is a region designation signal including, in the case of the effective pixel area NCA1, the information (x11, y11) of the start point sp1 and the information (x12, y12) of the end point ep1, and in the case of the effective pixel area NCA2, the information (x21, y21) of the start point sp2 and the information (x22, y22) of the end point ep2.
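Because the area information AI is simply a rectangle given by a start point and an end point, applying it amounts to cropping the frame read from the imaging surface. A minimal sketch using NumPy indexing; all coordinate values are invented for illustration, and since the patent does not specify whether the end point is inclusive, end-exclusive indexing is assumed here.

```python
import numpy as np

def extract_region(frame: np.ndarray, sp: tuple, ep: tuple) -> np.ndarray:
    """Crop the rectangle designated by area information AI.

    sp = (x_start, y_start), ep = (x_end, y_end); end points are treated
    as exclusive for simplicity.
    """
    (x1, y1), (x2, y2) = sp, ep
    return frame[y1:y2, x1:x2]

# Example: a 480x640 imaging surface with NCA1 shifted toward one side in
# the parallax direction and NCA2 toward the other (coordinates hypothetical).
frame = np.zeros((480, 640), dtype=np.uint8)
nca1 = extract_region(frame, sp=(160, 0), ep=(640, 480))  # right-eye readout
nca2 = extract_region(frame, sp=(0, 0), ep=(480, 480))    # left-eye readout
```

Both crops have identical dimensions, so the right eye and left eye endoscopic images end up the same size even though they are read from offset positions on the sensor.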
  • the control unit 31 and the effective pixel region control unit 35 constitute a region extracting unit that extracts the effective pixel region NCA1 as the first region from the light receiving region when the incident state of the light is the first incident state, and extracts the effective pixel region NCA2 as the second region from the light receiving region when the incident state of the light is the second incident state.
  • when light incident on the right eye optical system 10R is projected onto the imaging surface 22a, the effective pixel area NCA1 is located in an area offset to the left with respect to the parallax direction center of the imaging surface 22a. Further, when light incident on the left eye optical system 10L is projected onto the imaging surface 22a, the effective pixel area NCA2 is located in an area offset to the right with respect to the parallax direction center of the imaging surface 22a. That is, the effective pixel area NCA1 and the effective pixel area NCA2 are at positions offset in opposite directions.
  • the imaging element 22 outputs an imaging signal of an area on the imaging surface 22a designated by the received area information AI. That is, the effective pixel area control unit 35 outputs area information AI that is an area designation signal that designates a pixel area output from the image sensor 22, and the image sensor 22 outputs the first area or the second area based on the area designation signal. The first region or the second region is extracted by outputting the imaging signal of the region.
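Not part of the publication, but as a rough sketch of how a region designation by start point and end point could be applied to crop an imaging signal (the coordinates and names below are hypothetical):

```python
def extract_region(frame, start, end):
    """Return the sub-image designated by a start point (x, y) and an
    end point (x, y), mimicking the area information AI (sp, ep)."""
    (x1, y1), (x2, y2) = start, end
    # Both end points are included in the designated rectangle.
    return [row[x1:x2 + 1] for row in frame[y1:y2 + 1]]

# A toy 6x8 "imaging surface": each pixel value encodes its column index.
frame = [[x for x in range(8)] for _ in range(6)]

# Right-eye area offset left, left-eye area offset right
# (hypothetical coordinates standing in for NCA1 / NCA2).
nca1 = extract_region(frame, start=(0, 1), end=(3, 4))
nca2 = extract_region(frame, start=(4, 1), end=(7, 4))
```

The two calls only differ in the designated rectangle, which is the essence of switching the output region per incident state.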
  • FIGS. 9 and 10 are diagrams showing the subject image projected on the imaging surface 22a of the imaging element 22 and the position of the effective pixel region.
  • FIG. 11 is a diagram illustrating a right eye image displayed on the display unit 4.
  • FIG. 12 is a diagram showing a left eye image displayed on the display unit 4.
  • FIG. 9 is a diagram illustrating the position of the subject image and the effective pixel area NCA1 when the right eye image is acquired.
  • FIG. 10 is a diagram illustrating the position of the subject image and the effective pixel area NCA2 when the left eye image is acquired.
  • the light that has passed through the right eye optical system 10R forms an image on the image pickup surface 22a.
  • the right-eye image formed by the light passing through the right-eye optical system 10R suffers image quality degradation in the area AA1 (shown by hatching) on one side of the imaging surface 22a, but the effective pixel area NCA1 is set apart from the area AA1 by shifting it in the first direction from the center of the imaging surface 22a.
  • the first direction is a direction parallel to the parallax direction. Therefore, even when the image of the effective pixel area NCA1 is displayed on the display unit 4, the endoscopic image of the right eye image displayed on the display unit 4 has little or no deterioration in image quality.
  • the light that has passed through the left eye optical system 10L forms an image on the imaging surface 22a.
  • the left-eye image formed by the light that has passed through the left-eye optical system 10L suffers image quality degradation in the area AA2 (shown by hatching) on one side of the imaging surface 22a, but the effective pixel area NCA2 is set apart from the area AA2 by shifting it in the second direction from the center of the imaging surface 22a.
  • the second direction is a direction parallel to the parallax direction and is opposite to the first direction. For this reason, even when the image of the effective pixel area NCA2 is displayed on the display unit 4, the endoscopic image of the left eye image displayed on the display unit 4 has little or no deterioration in image quality.
  • an image part with degraded image quality is not displayed in the endoscopic images displayed by the respective optical systems.
  • as described above, in the two endoscope images generated by switching the state of the optical system, an endoscope apparatus that eliminates the influence of the optical characteristics, or reduces the appearance of that influence, can be provided.
  • the effective pixel region is changed corresponding to the optical system used in the two optical systems provided in the distal end portion 11 of the insertion portion 5 or the optical adapter 10 attached to the distal end portion 11.
  • the second embodiment relates to an endoscope apparatus that switches an effective pixel area according to the type of adapter when there are a plurality of types of stereo measurement adapters attached to the distal end portion 11 of the insertion portion 5.
  • the endoscope apparatus according to the present embodiment has substantially the same configuration as the endoscope apparatus 1 according to the first embodiment. Therefore, in the endoscope apparatus according to the present embodiment, the same reference numerals are attached to the same components as those of the endoscope apparatus 1 of the first embodiment, description thereof is omitted, and only different configurations are described.
  • FIG. 13 is a diagram for explaining the configuration of the optical adapter 10, the distal end portion 11 of the insertion portion 5, and the apparatus main body 2 according to the present embodiment.
  • the optical adapter 10 can be attached to and detached from the distal end portion 11.
  • the optical adapter 10 is a stereo measurement adapter, and there are a plurality of types. The user selects the optical adapter 10 according to the inspection purpose, and attaches it to the distal end portion 11 for use.
  • the observation optical system of the present embodiment includes an optical system provided in the optical adapter 10 attached to the distal end portion 11 of the insertion portion 5.
  • the optical adapter 10 includes a stereo optical system 10a configured by left and right observation optical systems (a first optical system and a second optical system), a mechanical shutter 14, a coil 15, and an identification information (hereinafter referred to as ID information) holding unit 16.
  • the ID information holding unit 16 is a nonvolatile memory that stores ID information indicating the type of the optical adapter 10.
  • the distal end portion 11 of the insertion portion 5 of this embodiment differs from the distal end portion 11 of the first embodiment in that the optical adapter 10 can be attached to it, and in that each optical adapter 10 is provided with an ID information holding unit 16 that holds its own ID information.
  • a signal line 26 that connects the ID information holding unit 16 and the control unit 31 is also inserted into the insertion unit 5.
  • the control unit 31 can read the ID information stored in the ID information holding unit 16.
  • the effective pixel area control unit 35A is connected to the memory 35a and can read out information stored in the memory 35a.
  • the memory 35a is a nonvolatile memory such as a flash memory.
  • An effective pixel area setting table is stored in the memory 35a.
  • FIG. 14 is a diagram showing the configuration of the effective pixel area setting table TBL.
  • the effective pixel area setting table TBL stores right-eye image effective pixel area information A and left-eye image effective pixel area information B according to the ID information of the optical adapter 10.
  • the effective pixel area information A is information on the data reading area for the right eye image
  • the effective pixel area information B is information on the data reading area for the left eye image.
  • the effective pixel area information A and B each include position information of the start point sp and the end point ep of a rectangular area; the start point sp and the end point ep each include position information in the horizontal (x) and vertical (y) directions of the pixel area of the imaging surface 22a.
  • the position information of the start point sp and the end point ep is address information in the pixel area of the imaging surface 22a, and is area designation information that designates the pixel area output from the imaging element 22.
  • for example, when the optical adapter 10 has ID information “001”, the effective pixel area information A for the right eye image is “A1” and the effective pixel area information B for the left eye image is “B1”.
  • “A1” and “B1” include information of the start point sp and the end point ep.
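As an illustration only (the IDs and coordinates below are invented, not taken from the publication), such an ID-keyed setting table can be modeled as a simple lookup:

```python
# A sketch of the effective pixel area setting table TBL, keyed by the
# adapter's ID information; each entry holds (start point sp, end point ep).
TBL = {
    "001": {"A": ((0, 0), (319, 479)),     # right-eye area "A1"
            "B": ((320, 0), (639, 479))},  # left-eye area  "B1"
    "002": {"A": ((10, 8), (329, 487)),
            "B": ((330, 8), (649, 487))},
}

def area_information(adapter_id, eye):
    """Return (sp, ep) for the given adapter ID and eye
    ('A' = right-eye image, 'B' = left-eye image)."""
    return TBL[adapter_id][eye]

sp, ep = area_information("001", "A")
```

Reading the table by ID and forwarding (sp, ep) to the sensor corresponds to the role of the effective pixel area control unit 35A described above.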
  • the control unit 31 reads the ID information stored in the ID information holding unit 16 of the optical adapter 10 and outputs it to the effective pixel region control unit 35A.
  • the effective pixel area control unit 35A refers to the effective pixel area setting table TBL based on the ID information from the control unit 31, and the effective pixel area information A for the right eye image and the left eye image according to the ID information.
  • the effective pixel area information B is read out.
  • the effective pixel region control unit 35A outputs the information on the start point sp and the end point ep included in the read effective pixel region information A and effective pixel region information B to the image sensor 22 as area information AI, a region designation signal that designates the pixel region the image sensor 22 outputs.
  • the effective pixel area control unit 35A changes the effective pixel area for the right eye image and the effective pixel area for the left eye image based on the ID information of the optical adapter 10.
  • as a result, the effective pixel region for the right eye image and the effective pixel region for the left eye image are changed according to the type of the optical adapter 10, so that, depending on the optical characteristics of the optical adapter 10, the influence of those characteristics is eliminated or made less noticeable.
  • This embodiment also produces the same effect as the first embodiment.
  • in the first and second embodiments, the effective pixel region in the image sensor is changed in accordance with the switching of the state of the optical system; in the present embodiment, the region to be cut out from the image signal is changed in accordance with the switching of the state of the optical system so as to eliminate the influence of the optical characteristics or reduce the appearance of the influence.
  • the endoscope apparatus 1B of the present embodiment has substantially the same configuration as the endoscope apparatus 1 of the first embodiment. Therefore, in the endoscope apparatus 1B, the same reference numerals are attached to the same components as those of the first embodiment, description thereof is omitted, and only different configurations are described.
  • FIG. 15 is a diagram for explaining the configuration of the distal end portion 11 of the insertion portion 5 and the apparatus main body 2 according to the present embodiment.
  • the apparatus body 2 of the present embodiment has an image sensor driving unit 35B.
  • the image sensor 22 of the present embodiment may be a CMOS image sensor or a CCD image sensor.
  • the image sensor driving unit 35B outputs a drive signal to the image sensor 22 so as to output image signals of all the pixels on the imaging surface 22a. Therefore, as will be described later, the pixel region corresponding to the imaging signal output from the imaging element 22 is unchanged regardless of the coil driving signal DS for driving the coil 15.
  • the video signal processing unit 32 has a cutout region control unit 32a.
  • the control unit 31 supplies the coil drive information DS1 regarding the coil drive signal DS for driving the coil 15 to the cutout region control unit 32a.
  • FIG. 16 is a block diagram of the cutout region control unit 32a.
  • the cutout area control unit 32 a includes a first video signal processing unit 51 and a second video signal processing unit 52.
  • the first video signal processing unit 51 receives imaging signals of all pixel regions of the imaging element 22.
  • the first video signal processing unit 51 generates a first video signal IS1 from the input imaging signal and outputs the first video signal IS1 to the second video signal processing unit 52.
  • the first video signal IS1 is a video signal for all pixel regions of the image sensor 22. Therefore, as shown in FIG. 16, the image G1 of the first video signal IS1 is an image of the entire pixel region.
  • the second video signal processor 52 receives the first video signal IS1 and the coil drive information DS1.
  • the second video signal processing unit 52 cuts out an area corresponding to the coil drive information DS1 from the image G1, generates a second video signal IS2 that is a video signal of the cutout area EA, and outputs it to the display unit 4.
  • the cutout area EA is changed according to the coil drive information DS1.
  • the second video signal IS2 is a video signal of the cutout area EA. Therefore, at the time of acquiring the right eye image, as shown in FIG. 16, the second video signal IS2 is a signal of the image G3 in the cutout area EA cut out from the entire pixel area. An image G4 corresponding to the image G3 is displayed on the display unit 4.
  • the cutout region control unit 32a constitutes an area extraction unit that extracts the right-eye image region or the left-eye image region by cutting out the effective pixel region NCA1 or the effective pixel region NCA2 from the image G1 generated from the imaging signal output from the imaging element 22.
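A minimal sketch of this two-stage idea, assuming the coil drive information simply selects a right-eye or left-eye half of the full-pixel image (a simplification; the actual offsets and signal names are hypothetical):

```python
def second_stage_cutout(full_image, coil_drive_info):
    """Cut out the area EA from the full-pixel image G1 according to the
    coil drive information (here reduced to 'right' / 'left')."""
    width = len(full_image[0])
    half = width // 2
    if coil_drive_info == "right":   # right-eye image: left-offset area
        x1, x2 = 0, half
    else:                            # left-eye image: right-offset area
        x1, x2 = half, width
    return [row[x1:x2] for row in full_image]

# First stage: a full-pixel image G1 (pixel value = column index).
g1 = [[c for c in range(8)] for _ in range(4)]

# Second stage: the cutout changes with the drive information only;
# the sensor readout itself is unchanged.
g3_right = second_stage_cutout(g1, "right")
g3_left = second_stage_cutout(g1, "left")
```

The point of the design is that the sensor always outputs all pixels, and only this software cutout step varies.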
  • This embodiment also produces the same effect as the first embodiment.
  • in the second embodiment, the effective pixel area in the image sensor is changed according to the type of the optical adapter and the state switching of the optical system; in the present embodiment, the area to be cut out from the image signal is changed in accordance with the type of the optical adapter and the switching of the state of the optical system so as to eliminate the influence of the optical characteristics or reduce the appearance of the influence.
  • the endoscope apparatus 1C of the present embodiment has substantially the same configuration as the endoscope apparatuses 1A and 1B of the second and third embodiments. Therefore, in the endoscope apparatus 1C, constituent elements that are substantially the same as those of the endoscope apparatuses 1A and 1B are denoted by the same reference numerals, description thereof is omitted, and only different configurations are described.
  • FIG. 17 is a diagram for explaining the configuration of the distal end portion 11 of the insertion portion 5 and the apparatus main body 2 according to the present embodiment.
  • the observation optical system of the present embodiment includes an optical system provided in the optical adapter 10 attached to the distal end portion 11 of the insertion portion 5.
  • the apparatus body 2 of the present embodiment has an image sensor driving unit 35B.
  • the image sensor 22 of the present embodiment may be a CMOS image sensor or a CCD image sensor.
  • the image sensor driving unit 35B outputs a drive signal to the image sensor 22 so as to output image signals of all the pixels on the imaging surface 22a. Therefore, as will be described later, the pixel region corresponding to the imaging signal output from the imaging element 22 is unchanged regardless of the coil driving signal DS for driving the coil 15.
  • the video signal processing unit 32 includes a cutout region control unit 32a1.
  • the control unit 31 supplies the ID information of the optical adapter 10 attached to the distal end portion 11 and the coil drive information DS1 about the coil drive signal DS for driving the coil 15 to the cutout region control unit 32a1.
  • the cutout region control unit 32a1 is the same as the cutout region control unit 32a of the third embodiment and, as shown in FIG. 16, includes a first video signal processing unit 51 and a second video signal processing unit 52; in FIG. 16, the ID information supplied to the second video signal processing unit 52 is indicated by a one-dot chain line.
  • the first video signal processing unit 51 receives imaging signals of all pixel regions of the imaging element 22.
  • the first video signal processing unit 51 generates a first video signal IS1 from the input imaging signal and outputs the first video signal IS1 to the second video signal processing unit 52.
  • the first video signal IS1 is a video signal for all pixel regions of the image sensor 22. Therefore, as shown in FIG. 16, the image G1 of the first video signal IS1 is an image of the entire pixel region.
  • the second video signal processing unit 52 receives the first video signal IS1, the ID information, and the coil drive information DS1.
  • the second video signal processing unit 52 cuts out an area corresponding to the ID information and the coil drive information DS1 from the image G1, generates a second video signal IS2 that is a video signal of the cutout area EA, and outputs it to the display unit 4.
  • the cutout area EA is an area corresponding to the type of the optical adapter 10. Further, in the cutout area EA, the effective pixel area NCA1 when light incident on the right eye optical system 10R is projected on the imaging surface 22a, or light incident on the left eye optical system 10L is projected on the imaging surface 22a. This is the effective pixel area NCA2 when being performed.
  • the cutout area EA is changed according to the ID information and the coil drive information DS1.
  • the second video signal IS2 is a video signal of the cutout area EA. Therefore, as shown in FIG. 16, the second video signal IS2 is a signal of the image G3 of the cutout area EA cut out from the effective pixel area CA. An image G4 corresponding to the image G3 is displayed on the display unit 4.
  • the cutout region control unit 32a1 configures a region extraction unit that changes the right eye image region and the left eye image region based on the ID information of the optical adapter 10.
  • This embodiment also produces the same effect as the second embodiment.
  • in the first to fourth embodiments, the effective pixel region is shifted when the first optical system and the second optical system of the two optical systems having parallax are switched; in the present embodiment, the effective pixel region is changed when the aperture of a monocular optical system is switched.
  • the endoscope apparatus 1D of the present embodiment has substantially the same configuration as the endoscope apparatus 1B of the third embodiment. Therefore, in the endoscope apparatus 1D, the same reference numerals are attached to the same components as those of the third embodiment, description thereof is omitted, and only different configurations are described.
  • FIG. 18 is a diagram for explaining the configuration of the distal end portion 11 of the insertion portion 5 and the apparatus main body 2 according to the present embodiment.
  • FIGS. 19 and 20 are diagrams for explaining the relationship between the optical system at the tip and the effective pixel region on the imaging surface of the imaging device.
  • the front end portion 11 is provided with a monocular optical system 61, that is, an optical system in which the objective optical system has one optical axis, and an aperture mechanism 62 for changing the aperture in the monocular optical system 61, here a mechanism for selectively switching between two types of apertures.
  • the monocular optical system 61 is an observation optical system including an objective optical system 61a and an imaging optical system 61b.
  • the aperture mechanism 62 is provided in the objective optical system 61a.
  • FIG. 21 is a diagram showing the configuration of the diaphragm mechanism 62.
  • the aperture mechanism 62 includes a glass member 71 having a first aperture 71a and a shielding member 72 having a second aperture 72a and an arm 72b.
  • the glass member 71 is disposed on the optical axis of the monocular optical system 61.
  • the shielding member 72 is rotatable around the axis of a shaft member 73 fixed with respect to the distal end portion 11, and can take a first position, in which the second diaphragm 72a is disposed on the optical axis of the monocular optical system 61, or a second position, in which the second diaphragm 72a is off the optical axis of the monocular optical system 61.
  • the shielding member 72 is driven to be located at one of two positions by a drive actuator (not shown) such as an electromagnet.
  • FIGS. 19 and 20 show a plurality of optical paths of light passing through the monocular optical system 61.
  • the aperture AP1 of the first aperture 71a is larger than the aperture AP2 of the second aperture 72a; that is, the aperture value of the first aperture 71a is smaller than the aperture value of the second aperture 72a. Therefore, when the second diaphragm 72a is at the first position, arranged on the optical axis of the monocular optical system 61, the light from the subject is narrowed by the aperture AP2 and enters the image sensor 22, as shown in FIG. 19. When the second diaphragm 72a is at the second position, off the optical axis of the monocular optical system 61, the light from the subject is narrowed by the aperture AP1 and enters the image sensor 22, as shown in FIG. 20.
  • the monocular optical system 61 is an optical system that receives light from the observation window 11a and emits the light to the image pickup device 22.
  • the monocular optical system 61 constitutes an observation optical system in which the incident state of light on the image pickup device 22 (here, the width of the light beam) can be switched between a first incident state and a second incident state different from the first incident state.
  • the video signal processing unit 32 includes a cutout region control unit 32b.
  • when receiving the aperture designation signal SLa from the operation unit 6, the control unit 31 outputs the aperture instruction signal AS to the aperture mechanism 62.
  • the user can change the aperture value of the monocular optical system 61 by operating a predetermined operation button of the operation unit 6. When the operation button is operated, the shielding member 72 is driven and placed at either the first position, in which the second diaphragm 72a is disposed on the optical axis of the monocular optical system 61, or the second position, in which the second diaphragm 72a is off that optical axis.
  • the diaphragm mechanism 62 has a configuration capable of switching between a first diaphragm and a second diaphragm having an opening smaller than the opening of the first diaphragm.
  • using an electromagnetic force from an electromagnet or the like, for example, the diaphragm mechanism 62 can place in the optical path of the monocular optical system either the first diaphragm alone, or the first and second diaphragms together in a state in which the second diaphragm is effective.
  • when the aperture value of the aperture mechanism 62 becomes smaller (that is, the aperture becomes larger), the outer edge of the formed subject image becomes darker than in the subject image obtained when the aperture value is larger (that is, the aperture is smaller).
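The inverse relation between aperture value and opening size that this description relies on can be checked with the usual f-number definition N = f / D (the focal length and diameters below are arbitrary example values, not from the publication):

```python
def f_number(focal_length_mm, aperture_diameter_mm):
    """f-number N = f / D: a larger opening D gives a smaller aperture value."""
    return focal_length_mm / aperture_diameter_mm

n_small_opening = f_number(4.0, 0.5)  # small opening (like AP2): larger N
n_large_opening = f_number(4.0, 1.0)  # larger opening (like AP1): smaller N
```

So the state with the larger opening is the one with the smaller aperture value, which is the state whose image edge darkening motivates the smaller effective pixel area below.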
  • FIG. 19 is a diagram for explaining the relationship between the optical system at the tip and the effective pixel area on the imaging surface of the imaging element when the aperture value is large (that is, the aperture is small).
  • FIG. 20 is a diagram for explaining the relationship between the optical system at the tip and the effective pixel area on the imaging surface of the imaging device when the aperture value is small (that is, the aperture is large).
  • the image sensor 22 of the present embodiment may be a CMOS image sensor or a CCD image sensor.
  • the image sensor driving unit 35B outputs a drive signal to the image sensor 22 so as to output image signals of all the pixels on the imaging surface 22a.
  • the video signal processing unit 32 includes a cutout region control unit 32b.
  • the control unit 31 supplies aperture information AS1 for the aperture instruction signal AS to the cutout region control unit 32b.
  • when the second diaphragm 72a is at the first position, arranged on the optical axis of the monocular optical system 61, the cutout area control unit 32b cuts out an image in the range of a predetermined effective pixel area CA1 on the imaging surface 22a, as shown in FIG. 19.
  • when the second diaphragm 72a is at the second position, off the optical axis of the monocular optical system 61, the cutout region control unit 32b cuts out an image in the range of a predetermined effective pixel region CA2 on the imaging surface 22a, as shown in FIG. 20.
  • the effective pixel area CA2 is similar in shape to the effective pixel area CA1.
  • the cutout area control unit 32b cuts out an image in the range of the effective pixel area CA2 having an area smaller than the effective pixel area CA1.
  • that is, an area in which there is no difference in brightness, or in which the difference is not noticeable, is set as the effective pixel area CA1 in accordance with the aperture AP2 of the second diaphragm 72a.
  • the cutout region control unit 32b constitutes an area extraction unit that extracts the effective pixel region CA1 from the light receiving region when the light incident state is the first incident state and extracts the effective pixel region CA2 from the light receiving region when the light incident state is the second incident state.
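One way to obtain an effective pixel area CA2 that is similar in shape to CA1 but smaller, as described above, is to shrink the CA1 rectangle about its center by a common scale factor (a sketch under assumed coordinates; the scale value is hypothetical):

```python
def scaled_effective_area(ca1_start, ca1_end, scale):
    """Return a rectangle similar in shape to CA1, shrunk about its
    center by `scale` (0 < scale <= 1), as (start point, end point)."""
    (x1, y1), (x2, y2) = ca1_start, ca1_end
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
    half_w = (x2 - x1) / 2.0 * scale  # same factor on both axes keeps
    half_h = (y2 - y1) / 2.0 * scale  # the aspect ratio (similar shape)
    return ((round(cx - half_w), round(cy - half_h)),
            (round(cx + half_w), round(cy + half_h)))

ca2 = scaled_effective_area((0, 0), (640, 480), scale=0.8)
```

Because both axes use the same factor, the darkened outer edge is excluded while the cropped image keeps the aspect ratio of CA1.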
  • the incident state of light is changed by the diaphragm of the diaphragm mechanism 62 provided in the monocular optical system 61.
  • in the present embodiment, the cutout region control unit 32b of the video signal processing unit 32 extracts the effective pixel region CA1 or CA2 from the imaging signal from the imaging device 22; however, as in the first and second embodiments, a region information AI signal may be included in the drive signal to the imaging device so as to change the region of the imaging signal output by the imaging element 22. In that case, the imaging element 22 outputs an imaging signal of the effective pixel area CA1 or CA2, and the video signal processing unit 32 does not cut out an image.
  • the shape of the effective pixel areas CA1 and CA2 described above is a rectangle, but it may be a circle, an ellipse, or the like.
  • as described above, an endoscope apparatus that eliminates the influence of the optical characteristics, or reduces the appearance of that influence, can be provided.
  • each of the endoscope apparatuses of the first to fourth embodiments described above has two optical systems for stereo measurement or stereo observation; however, the first to fourth embodiments can also be applied to an endoscope apparatus capable of two or more observations with different angles of view (for example, narrow-angle observation and wide-angle observation), an endoscope apparatus capable of near-point observation and far-point observation, an endoscope apparatus capable of direct view and side view, or an endoscope apparatus capable of direct view and perspective view.
  • furthermore, the endoscope apparatus according to the fifth embodiment changes the aperture value of the monocular optical system, but the fifth embodiment described above can also be applied to an endoscope apparatus that can take two states for near-point observation and far-point observation.
  • in the embodiments described above, the position or size of the effective pixel region is changed so as to eliminate the influence of aberrations in the peripheral region of the image; however, when halation, a ghost, or the like makes part of the image unnecessary, an area excluding that part may be set as the effective pixel area.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Astronomy & Astrophysics (AREA)
  • General Physics & Mathematics (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

An endoscope apparatus (1) comprises: an insertion unit (5); an image pickup element (22) that receives light entering through an observation window provided at a distal end portion of the insertion unit; a stereo optical system (10a); and an effective pixel area control unit (35) serving as an area extraction unit. The stereo optical system (10a) is an optical system into which light from an observation window (11a) enters and from which the light is emitted toward the image pickup element (22), and constitutes an observation optical system capable of switching the state in which light is incident on the image pickup element (22) between a first incident state and a second incident state different from the first. The effective pixel area control unit (35) extracts a first area from a first light receiving area when the light incident state is the first incident state, and extracts a second area from a second light receiving area when the light incident state is the second incident state.
PCT/JP2017/041963 2016-12-06 2017-11-22 Dispositif d'endoscope Ceased WO2018105396A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016237006 2016-12-06
JP2016-237006 2016-12-06

Publications (1)

Publication Number Publication Date
WO2018105396A1 true WO2018105396A1 (fr) 2018-06-14

Family

ID=62490961

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/041963 Ceased WO2018105396A1 (fr) 2016-12-06 2017-11-22 Dispositif d'endoscope

Country Status (1)

Country Link
WO (1) WO2018105396A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004033487A (ja) * 2002-07-03 2004-02-05 Olympus Corp 内視鏡装置
JP2011055361A (ja) * 2009-09-03 2011-03-17 Canon Inc 画像処理装置及び撮像装置
WO2012081618A1 (fr) * 2010-12-14 2012-06-21 オリンパスメディカルシステムズ株式会社 Dispositif d'imagerie
JP2014228851A (ja) * 2013-05-27 2014-12-08 オリンパス株式会社 内視鏡装置、画像取得方法および画像取得プログラム
JP2015188262A (ja) * 2013-04-18 2015-10-29 オリンパス株式会社 撮像素子、撮像装置および内視鏡システム
JP2016095458A (ja) * 2014-11-17 2016-05-26 オリンパス株式会社 内視鏡装置


Similar Documents

Publication Publication Date Title
US12114831B2 (en) Endoscope incorporating multiple image sensors for increased resolution
JP5730339B2 (ja) 立体内視鏡装置
US9030543B2 (en) Endoscope system
JP3257640B2 (ja) 立体視内視鏡装置
US5278642A (en) Color imaging system
US10838189B2 (en) Operating microscope having an image sensor and a display, and method for operating an operating microscope
JPWO2018051680A1 (ja) 内視鏡システム
WO2018051679A1 (fr) Dispositif d'aide à la mesure, système d'endoscope, processeur pour système d'endoscope, et procédé d'aide à la mesure
JP2008043742A (ja) 電子内視鏡システム
US12059132B2 (en) Endoscopic camera head incorporating multiple image sensors and related system
JP2001083400A (ja) 撮像光学系
US11042021B2 (en) Image pickup apparatus and endoscope apparatus
JP7012549B2 (ja) 内視鏡装置、内視鏡装置の制御方法、内視鏡装置の制御プログラム、および記録媒体
WO2015156176A1 (fr) Système d'endoscope
JPWO2016194179A1 (ja) 撮像装置、内視鏡装置及び撮像方法
WO2018105396A1 (fr) Dispositif d'endoscope
JP5985117B1 (ja) 撮像装置、画像処理装置、撮像装置の作動方法
US20250090000A1 (en) Endoscope apparatus, processor for endoscope image, and method for generating endoscope image
JP3670717B2 (ja) 内視鏡システム
JP2009028462A (ja) 電子内視鏡装置
TWI667494B (zh) Microscope device
JP2016142777A (ja) 撮像装置
WO2022064659A1 (fr) Appareil de mise au point automatique, système d'endoscope et procédé de commande
JP2007233295A (ja) 光学ファインダ装置および電子カメラ
WO2017064746A1 (fr) Dispositif d'imagerie, dispositif endoscopique et procédé d'imagerie

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17879392

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17879392

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP