WO2015072427A1 - Endoscope imaging device - Google Patents
Endoscope imaging device
- Publication number
- WO2015072427A1 (PCT/JP2014/079736; JP2014079736W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- optical system
- image sensor
- center
- objective optical
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/04—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
- A61B1/05—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
- A61B1/051—Details of CCD assembly
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00064—Constructional details of the endoscope body
- A61B1/00071—Insertion part of the endoscope body
- A61B1/0008—Insertion part of the endoscope body characterised by distal tip features
- A61B1/00096—Optical elements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00163—Optical arrangements
- A61B1/00193—Optical arrangements adapted for stereoscopic vision
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
- G02B23/2407—Optical details
- G02B23/2415—Stereoscopic endoscopes
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
- G03B17/17—Bodies with reflectors arranged in beam forming the photographic image, e.g. for reducing dimensions of camera
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/005—Photographing internal surfaces, e.g. of pipe
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/60—Noise processing, e.g. detecting, correcting, reducing or removing noise
- H04N25/61—Noise processing, e.g. detecting, correcting, reducing or removing noise the noise originating only from the lens unit, e.g. flare, shading, vignetting or "cos4"
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
Definitions
- the present invention relates to an imaging apparatus, and more particularly to an endoscope imaging apparatus applied to an endoscope capable of observing a subject stereoscopically.
- In such an apparatus, an image is acquired by a single image sensor for two objective optical systems in order to image a subject stereoscopically.
- Light from the subject is incident obliquely on the image sensor.
- Regarding countermeasures against shading caused by oblique incidence of the objective optical system, Patent Document 1, for example, discloses shifting the positions of the microlenses on the image sensor relative to the photosensitive portions, from the center of the image sensor toward its periphery, in accordance with the oblique-incidence characteristics of the objective optical system.
- In Patent Document 2, it is pointed out that color reproducibility and resolution deteriorate because of color mixing caused by crosstalk between pixels of the image sensor.
- To suppress crosstalk due to obliquely incident light, it likewise discloses shifting the microlenses from the center positions of the photodiodes, as in Patent Document 1.
- Patent Document 3 discloses that shading can be reduced and crosstalk suppressed at the same time by shifting the positions of the microlenses from the center of the image sensor in accordance with the oblique-incidence characteristics of each objective optical system.
- Patent Document 1: JP-A-5-346556; Patent Document 2: JP 2010-56345 A; Patent Document 3: Japanese Patent No. 4054094
- However, since the image sensor of Patent Document 3 determines the microlens positions on the assumption of an imaging apparatus having two objective optical systems, it cannot be applied to an imaging apparatus having one objective optical system. Conversely, an image sensor adapted to an imaging apparatus having one objective optical system cannot be applied directly to an imaging apparatus having two objective optical systems. For this reason, an image sensor has to be manufactured for each type of imaging apparatus, which increases the manufacturing cost of the image sensor.
- If two images arranged side by side are formed, using telecentric objective optical systems, on an image sensor in which the shift between the microlenses and the photosensitive portions has been eliminated to match a telecentric optical system, shading does not occur. However, when such telecentric optical systems are used, the objective optical systems become large, and it becomes difficult to arrange them side by side.
- The present invention has been made in view of the above circumstances, and an object thereof is to provide an endoscope imaging apparatus that can acquire a plurality of images arranged side by side while suppressing shading and crosstalk, that does not require an image sensor to be manufactured for each imaging apparatus, and that can be reduced in size.
- One aspect of the present invention provides an endoscope imaging apparatus comprising: an objective optical system that collects light from a subject and forms two images side by side; and an image sensor having a plurality of microlenses arranged on the incident side of the light from the objective optical system, with a pixel assigned to each microlens, wherein the center of the photosensitive portion of each pixel in the image sensor is displaced with respect to the optical axis of the microlens so that the displacement gradually increases in the peripheral direction from the central portion of the image sensor toward its periphery, the exit pupil position of the objective optical system is closer to the object side than the imaging position of the objective optical system, and the following conditional expression is satisfied:
- 0.5 ≤ (θH - θD)/α ≤ 3 … (1)
- Here, α is the angle between the principal ray at the maximum horizontal image height and the optical axis; θH is the principal-ray correction amount (angle) of the microlens at the maximum horizontal image height (H) from the center of the image sensor; θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor; and the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry.
- In this aspect, the center of the photosensitive portion of each pixel in the image sensor is displaced with respect to the optical axis of the microlens so that the displacement gradually increases in the peripheral direction from the central portion of the image sensor toward its periphery, the exit pupil position of the objective optical system is closer to the object side than the imaging position of the objective optical system, and conditional expression (1) is satisfied.
- Preferably, the endoscope imaging apparatus further satisfies the following conditional expression:
- 0 < K × tan(θc) ≤ P/2 … (2)
- Here, θc is the principal-ray correction amount (angle) of the microlens corresponding to the position of the image height (C) in the horizontal direction, and the position of the image height (C) is the intersection of the optical axis of the objective optical system and the image sensor. P is the pixel pitch of the image sensor, and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element).
- More preferably, the following conditional expressions are also satisfied:
- P/2 ≥ K × (tan(θH) - tan(α)) … (3)
- P/2 ≥ K × (tan(α) - tan(θD)) … (4)
- Here, θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor, and the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry. P is the pixel pitch of the image sensor, and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element).
- The objective optical system may include two objective lenses arranged side by side, which form optical images of the subject on the image sensor, and a reflecting member disposed between the objective lenses and the image sensor for displacing the optical axis of the objective optical system.
- Because an obliquely incident (non-telecentric) optical system can be used, the beam diameter in the prism can be reduced and the imaging apparatus can be miniaturized. Images from the two optical systems, with shading suppressed, can be observed at the same time, and stereoscopic observation is possible by displaying the two images on a 3D monitor as left and right images.
- Alternatively, the objective optical system may include an optical-path dividing unit, and the optical images of the subject divided by the optical-path dividing unit may be formed on the image sensor as optical images with different focus positions.
- Brief description of the drawings: FIG. 1 is a cross-sectional view showing the overall configuration of an imaging apparatus according to Embodiment 1 of the present invention; an explanatory diagram of the image sensor applied to the imaging apparatus according to Embodiment 1; a cross-sectional view showing the overall configuration of the imaging system according to Embodiment 2; and a diagram showing the orientation of the subject in the images formed in the first and second regions of the image sensor.
- the endoscope imaging apparatus includes two objective optical systems 2 and an imaging element 4.
- the two objective optical systems 2 are arranged in parallel, and each collects light from the subject and makes the collected light incident on the light receiving surface of the image sensor.
- the imaging device 4 includes a photosensitive unit 12 that receives light from the objective optical system 2, and includes a plurality of microlenses 10 and color filters 11 arranged on the incident side of the photosensitive unit 12. A pixel is assigned to each microlens 10.
- the center of the photosensitive portion 12 of each pixel in the image pickup device 4 is displaced with respect to the optical axis of the microlens 10 so as to gradually increase in the peripheral direction from the central portion of the image pickup device 4 to the peripheral portion.
- the exit pupil position of the optical system 2 is positioned closer to the object side than the imaging position of the objective optical system 2.
- Consider an image sensor matched to an optical system that is telecentric on the image side, that is, an image sensor in which there is no displacement between the center position of each microlens and the center of the photosensitive portion (hereinafter, this displacement is referred to as the "microlens correction amount").
- If images are captured with such an image sensor, using left and right stereoscopic imaging optical systems arranged side by side, each of which emits its principal rays at an angle α from the exit pupil, the difference between the principal-ray angle of the objective optical system and the microlens correction amount becomes α at one horizontal edge and -α at the other, that is, 2α across the horizontal direction, which is a large difference (see FIG. 6).
- If the emission angle at the maximum horizontal image height H of the objective optical system is α and the microlens correction amount of the image sensor at that position is θH, the difference between the correction amount and the emission angle can be expressed as α - θH; on the opposite side, it can be expressed as -α - θD. The difference between the two is therefore (α - θH) - (-α - θD) = 2α - (θH - θD) … (A)
- If the difference between the principal-ray correction amount of the microlens and the principal-ray emission angle of the objective optical system is not uniform within each imaging range, the amount of light leaking into adjacent pixels changes, and shading occurs.
- When θH > θD, the difference between the microlens correction amount and the principal-ray angle from the exit pupil of the objective optical system becomes smaller than 2α, which is a better condition than expression (A) above, so shading can be suppressed. It is therefore desirable that θH be finite, that is, that the microlens correction amount be finite.
- Next, the conditions for suppressing crosstalk at the center of the screen while suppressing it over the screen as a whole will be considered.
- The signs of the principal-ray correction angle of the microlens and of the oblique incident-ray angle of the optical system are defined to be negative in the direction from the central axis of the image sensor toward the maximum horizontal image height.
- In this case, the principal-ray correction angle of the microlens and the oblique incident-ray angle of the optical system increase in the negative direction, as shown in the graphs of FIGS. 5A and 5B.
- In these graphs, the green line is the difference between the two angles (oblique incidence angle minus principal-ray correction amount).
- The left-hand graph shows the case in which the principal-ray correction angle (blue graph) is taken as a parameter and balanced so that this difference is uniform over the entire screen.
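As a numerical illustration of this balancing, the following sketch models both the chief-ray emission angle of one objective optical system and the microlens correction angle as linear functions of image height and prints the difference corresponding to the green line. All slopes and positions in it are assumptions chosen to show a balanced case; they are not values taken from the embodiments.

```python
import numpy as np

# Illustrative sketch of the balancing idea in FIGS. 5A/5B.
# All numbers below are assumptions chosen to show a balanced case.

def correction_angle(xs):
    # microlens chief-ray correction angle [deg] versus normalized sensor
    # coordinate xs (0 = sensor center, 1 = maximum horizontal image height);
    # negative toward the maximum image height, per the sign convention above
    return -16.0 * xs

def incidence_angle(xs):
    # oblique incidence (chief-ray emission) angle [deg] of one objective
    # optical system whose optical axis meets the sensor at xs = 0.5 and
    # whose chief-ray angle spans +/-8 deg over its imaging region
    return -16.0 * (xs - 0.5)

xs = np.linspace(0.0, 1.0, 11)                      # region of one of the two images
diff = incidence_angle(xs) - correction_angle(xs)   # the "green line" quantity
print(np.round(diff, 2))                            # constant (8 deg) -> uniform
print("spread across the region:", round(diff.max() - diff.min(), 3))
```

A uniform difference across the region means the amount of light leaking into adjacent pixels does not vary with image height, which is the condition under which shading is suppressed.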
- Accordingly, the endoscope imaging apparatus is configured to satisfy the following conditional expression (1).
- If the upper limit of this conditional expression is exceeded or the lower limit is not reached, shading increases.
- 0.5 ≤ (θH - θD)/α ≤ 3 … (1)
- Here, α is the angle formed between the principal ray at the maximum horizontal image height and the optical axis,
- θH is the principal-ray correction amount (angle) of the microlens corresponding to the position of the maximum horizontal image height (H) from the center of the image sensor,
- θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor, and the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry.
- The endoscope imaging apparatus is preferably also configured to satisfy the following conditional expression: 0 < K × tan(θc) ≤ P/2 … (2)
- Here, θc is the principal-ray correction amount (angle) of the microlens corresponding to the position of the image height (C) in the horizontal direction,
- and the position of the image height (C) is the intersection of the optical axis of the objective optical system and the image sensor.
- P is the pixel pitch of the image sensor
- K is the distance from the top of the microlens surface of the image sensor to the pixel (photosensitive element).
- If K × tan(θc) were negative, the microlens correction amount would correct light incident from the opposite direction, with the optical axis as the axis of symmetry.
- This would mean that the exit pupil position is on the side opposite to the object side with respect to the image plane.
- In that case, the optical system itself becomes large, and the imaging apparatus itself becomes large.
- When K × tan(θc) becomes 0, the difference between the principal-ray angle of the objective optical system and the microlens correction amount becomes α at one horizontal edge and -α at the other, that is, 2α across the horizontal direction, as described above (see FIG. 6). Therefore, it is desirable that K × tan(θc) be greater than 0. If K × tan(θc) exceeds P/2, the principal ray enters the adjacent pixel and color shading occurs.
- In addition, the endoscope imaging apparatus is preferably configured to satisfy the following conditional expressions: P/2 ≥ K × (tan(θH) - tan(α)) … (3) and P/2 ≥ K × (tan(α) - tan(θD)) … (4), where θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor, and the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry.
- P is the pixel pitch of the image sensor,
- and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element). If K × tan(θH) - K × tan(α) exceeds P/2, the principal ray emitted from the objective optical system 2 at the angle α enters the adjacent pixel and color shading occurs. Likewise, if K × tan(α) - K × tan(θD) exceeds P/2, the principal ray emitted from the objective optical system 2 at the angle -α enters the adjacent pixel and color shading occurs.
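The interplay of conditional expressions (1) to (4) can be checked with a small helper such as the one below. The parameter values passed at the end (angles in degrees, pitch and microlens-to-pixel distance in micrometres) are purely illustrative assumptions, not values stated for the embodiments.

```python
import math

def check_conditions(alpha, theta_h, theta_d, theta_c, P, K):
    """Evaluate conditional expressions (1)-(4).

    alpha, theta_h, theta_d, theta_c : angles in degrees
    P : pixel pitch, K : distance from the microlens vertex to the
    photosensitive element (same length unit as P).
    """
    t = lambda deg: math.tan(math.radians(deg))
    c1 = 0.5 <= (theta_h - theta_d) / alpha <= 3.0   # expression (1)
    c2 = 0.0 < K * t(theta_c) <= P / 2.0             # expression (2)
    c3 = P / 2.0 >= K * (t(theta_h) - t(alpha))      # expression (3)
    c4 = P / 2.0 >= K * (t(alpha) - t(theta_d))      # expression (4)
    return c1, c2, c3, c4

# Assumed, illustrative parameters (degrees and micrometres):
print(check_conditions(alpha=8.0, theta_h=20.0, theta_d=5.0,
                       theta_c=12.0, P=1.5, K=2.0))   # (True, True, True, True)
```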
- FIG. 7 is a cross-sectional view illustrating the overall configuration of the endoscope imaging apparatus according to the present embodiment.
- The endoscope imaging apparatus includes two objective optical systems 2 arranged side by side with a spacing between them, two parallelogram prisms 3 disposed downstream of the objective optical systems 2, one image sensor 4 disposed downstream of the parallelogram prisms 3, and diaphragms 5a and 5b.
- Each of the two objective optical systems 2 includes, in order from the object side, a first group 6 having a negative refractive power and a second group 7 having a positive refractive power.
- the light beam condensed by the objective optical system 2 is expanded after being reduced in diameter by the first group 6, is condensed again by the second group 7, and forms an image at the focal position.
- the focal position of the second group 7 is made to coincide with an imaging surface 4a of the imaging element 4 described later.
- the exit pupil position of the objective optical system 2 is designed to be closer to the object side than the image sensor 4.
- the maximum chief ray emission angle in the horizontal image height direction H of the objective optical system 2 is 8 degrees (see FIG. 5A).
- the parallelogram prism 3 includes a first surface 3a, a second surface 3b, a third surface 3c, and a fourth surface 3d.
- The first surface 3a is disposed perpendicular to the optical axis (incident optical axis) A of the objective optical system 2 so that the light emitted from the second group 7 of the objective optical system 2 is incident on it, and the second surface 3b is disposed at an angle of 45° with respect to the optical axis A of the objective optical system 2 so as to deflect the light that has entered the prism through the first surface 3a.
- the third surface 3c is disposed in parallel with the second surface 3b, and the fourth surface 3d is disposed in parallel with the first surface 3a.
- Light incident on the inside of the parallelogram prism 3 from the first surface 3a along the incident optical axis A is deflected twice on the second surface 3b and the third surface 3c, and then exits parallel to the incident optical axis A.
- the light is emitted from the fourth surface 3d toward the rear imaging element 4 along the light emission axis B.
- By means of the two parallelogram prisms 3, the optical images condensed by the two objective optical systems 2 and formed on the imaging surface 4a can be brought close to each other, so the size of the imaging surface 4a of the image sensor 4, which acquires the two optical images simultaneously, can be reduced.
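The lateral displacement of the optical axis produced by a parallelogram prism can be illustrated with a short two-dimensional ray trace. In the sketch below, the spacing of the two reflecting surfaces is an arbitrary assumed value (3 units), chosen only to show that the exit axis B stays parallel to the incident axis A and is offset by exactly that spacing; it is not a dimension taken from the embodiment.

```python
import numpy as np

def reflect(d, n):
    """Reflect direction d off a plane mirror with unit normal n."""
    return d - 2.0 * np.dot(d, n) * n

# 2D ray trace (x = lateral direction, z = along the incident optical axis A).
# Surfaces 3b and 3c are parallel planes inclined at 45 deg; their spacing
# below is an assumed, illustrative value.
n45 = np.array([1.0, -1.0]) / np.sqrt(2.0)    # common unit normal of 3b and 3c
c_3b, c_3c = 0.0, -3.0                        # plane offsets: z = x + c

p = np.array([0.0, -10.0])                    # ray start on the incident axis (x = 0)
d = np.array([0.0, 1.0])                      # propagating along +z

for c in (c_3b, c_3c):
    t = (c + p[0] - p[1]) / (d[1] - d[0])     # intersection with plane z = x + c
    p = p + t * d
    d = reflect(d, n45)

print("exit direction:", np.round(d, 6))                  # parallel to the incident axis
print("lateral offset of exit axis B:", round(p[0], 6))   # equals c_3b - c_3c = 3.0
```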
- The image sensor 4 is, for example, a CCD, and as shown in FIG. 8, the two optical images condensed by the respective objective optical systems 2 are formed in two effective light-receiving regions arranged on the imaging surface 4a. As shown in FIG. 2, a microlens 10 and a color filter 11 are arranged for each pixel on the object side of the photosensitive portion (for example, a photoelectric conversion element) 12 of the image sensor 4.
- the amount of displacement between the center position of the microlens 10 and the center of the photosensitive portion 12 increases as the distance from the center position Q of the image sensor 4 increases.
- When expressed as the angle between the optical axis of the objective optical system 2 and a line connecting the center position of the microlens 10 and the center of the photosensitive portion 12 (hereinafter, the "microlens correction amount"), the displacement is 20 degrees at the maximum horizontal image-height position.
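For a rough sense of scale, the physical microlens shift corresponding to a given correction angle is approximately K·tan(θ). The value of K in the sketch below is an assumed figure used only for illustration; it is not specified in the embodiment.

```python
import math

K = 2.0           # assumed microlens-vertex-to-pixel distance [um]
theta_max = 20.0  # microlens correction amount at the maximum horizontal image height [deg]
shift = K * math.tan(math.radians(theta_max))
print(f"lateral microlens shift at maximum image height: {shift:.2f} um")  # ~0.73 um
```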
- a difference between the correction amount of the microlens and the chief ray emission angle of the objective optical system 2 is as shown in FIG. 5A.
- the difference between the correction amount of the microlens and the chief ray emission angle of the objective optical system 2 becomes uniform on the imaging surfaces 4b and 4c, and luminance / color shading does not occur. Therefore, although one image sensor is used, stereoscopic observation can be performed by forming an image with left and right parallaxes, and observation can be performed with good image quality without shading.
- If the exit pupil position of the objective optical system is matched to the microlens correction amount of the image sensor 4 (as in an imaging apparatus of the kind shown in FIG. 2), shading does not occur. Therefore, the image sensor can be applied as-is even to an imaging apparatus in which all the pixels of the image sensor 4, as shown in FIG. 2, form a single image. Since it can also be applied to an imaging apparatus that forms two images side by side, as in the present embodiment, there is no need to manufacture image sensors with microlens correction amounts changed for each imaging apparatus, and the manufacturing cost of the image sensor does not increase.
- the prism 3 can be made small, and stereoscopic observation is possible even with a small image pickup device.
- FIGS. 9A and 9B are explanatory diagrams showing the configuration of the imaging apparatus system according to Embodiment 2 of the present invention.
- FIG. 9A is a diagram schematically showing the overall configuration,
- and FIG. 9B is a diagram showing the orientation of the subject in the images formed in the first and second regions of the image sensor.
- The imaging apparatus system includes an objective lens 21, a depolarizing plate 22, an image sensor 23, a polarization beam splitter 24, a wave plate 25, a first reflecting member 26, a second reflecting member 27, and an image processing unit 28.
- the imaging device system is connected to the image display device 20 and can display an image acquired by the imaging device system on the image display device 20.
- the objective lens 21 has a function of forming an image of a light beam from an object, and the exit pupil position is configured to be closer to the object side than the image sensor 23.
- The depolarizing plate 22 is disposed between the objective lens 21 and the polarization beam splitter 24.
- The image sensor 23 is a rolling-shutter CMOS sensor and is disposed in the vicinity of the imaging position of the objective lens 21.
- In the image sensor 23, the center of the photosensitive portion of each pixel is displaced with respect to the optical axis of the corresponding microlens (not shown) so that the displacement gradually increases in the peripheral direction from the central portion of the image sensor 23 toward its periphery.
- Furthermore, the exit pupil position of the objective lens 21 is set to be located closer to the object side than the image sensor 23.
- The polarization beam splitter 24 is disposed on the optical path between the objective lens 21 and the image sensor 23, above the first region 23a of the image sensor 23, and splits the light beam from the objective lens 21 at the polarization beam splitter surface 24a
- into two light beams, a reflected beam and a transmitted beam.
- In this case, the polarization beam splitter 24 reflects the linearly polarized light of the S-polarized component and transmits the linearly polarized light of the P-polarized component.
- the wave plate 25 is composed of a ⁇ / 4 plate and is configured to be rotatable around the optical axis.
- The first reflecting member 26 is a mirror, and reflects back the light beam that has been reflected by the polarization beam splitter surface 24a and transmitted through the wave plate 25.
- The second reflecting member 27 is a prism, and reflects, at its total reflection surface 27a, the light transmitted through the polarization beam splitter 24.
- The reflection surface of the prism may instead be formed by applying a mirror coating to the total reflection surface 27a.
- the light beam reflected by the first reflecting member 26 via the wave plate 25 and the polarization beam splitter 24 is imaged on the first region 23a of the imaging element 23, and the second The light beam reflected by the reflecting member 27 is imaged on a second region 23 b different from the first region 23 a in the image sensor 23.
- the image processing unit 28 is connected to the image sensor 23 and is provided in a central processing unit (not shown).
- The image processing unit 28 includes a first image processing unit 28a, a second image processing unit 28b, a third image processing unit 28c, a fourth image processing unit 28d, and a fifth image processing unit 28e.
- The first image processing unit 28a is configured to correct the orientation (rotation) of the image of the first region 23a and the image of the second region 23b.
- The orientations of the images formed in the first region 23a and the second region 23b are, for example, as shown in FIG. 9B when the letter "F" is observed as the subject. That is, the image formed in the first region 23a is rotated 90 degrees clockwise about the center point of the first region 23a and is mirror-inverted about the vertical axis in FIG. 9B passing through that center point, while the image formed in the second region 23b is rotated 90 degrees clockwise about the center point of the second region 23b.
- Therefore, via the first image processing unit 28a, the images of the first region 23a and the second region 23b are corrected as follows:
- the image formed in each region is rotated 90 degrees counterclockwise about the center point of that region, and, for the image of the first region 23a,
- the mirror inversion is corrected by additionally rotating the image 180 degrees about the vertical axis in FIG. 9B passing through the center point of the first region 23a.
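A minimal sketch of this orientation correction, using array rotations and flips, is shown below. It assumes the images are available as 2-D arrays and that the rotation/mirror sequence is exactly the one described above; it is not the actual implementation of the first image processing unit 28a.

```python
import numpy as np

def correct_region_b(img):
    # second region 23b: rotate 90 deg counterclockwise about the region center
    return np.rot90(img, k=1)

def correct_region_a(img):
    # first region 23a: same 90 deg counterclockwise rotation, followed by a
    # 180 deg rotation about the vertical axis (a left-right mirror)
    return np.fliplr(np.rot90(img, k=1))

# toy "F"-like test pattern to visualize the orientation handling
f = np.array([[1, 1, 1],
              [1, 0, 0],
              [1, 1, 0],
              [1, 0, 0]])
print(correct_region_a(f))
print(correct_region_b(f))
```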
- the third image processing unit 28c is configured to be able to adjust the white balance of the image of the first area 23a and the image of the second area 23b.
- the fourth image processing unit 28d is configured to be able to move (select) the center positions of the image of the first area 23a and the image of the second area 23b.
- the fifth image processing unit 28e is configured to be able to adjust each display range (magnification) of the image of the first area 23a and the image of the second area 23b.
- The second image processing unit 28b corresponds to the image selection unit of the present invention; it compares the image of the first region 23a with the image of the second region 23b and selects the image of the in-focus region as the display image. Specifically, as shown in FIG. 11A for example, the second image processing unit 28b includes high-pass filters 28b1a and 28b1b connected to the respective regions 23a and 23b, a comparator 28b2 connected to the high-pass filters 28b1a and 28b1b,
- and a switch 28b3 connected to the comparator 28b2 and to the respective regions 23a and 23b. The high-pass filters 28b1a and 28b1b extract the high-frequency components from the images of the first region 23a and the second region 23b,
- the comparator 28b2 compares the extracted high-frequency components, and the switch 28b3 selects the image of the region with the larger high-frequency content.
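The selection principle of the second image processing unit 28b can be sketched as follows. The sketch assumes two equally sized gray-scale images with different focus positions, uses a simple second-difference high-pass filter, and selects, line by line, the image with the larger high-frequency energy; the actual unit operates on the sensor signal and may use different filters.

```python
import numpy as np

def high_frequency_energy(line):
    # crude high-pass filter: second difference along the line
    hp = np.diff(line.astype(float), n=2)
    return float(np.sum(hp * hp))

def select_display_image(img_a, img_b):
    """Per-line selection between two equally sized images with different
    focus positions: for each line, keep the one with more high-frequency
    content (i.e., the better-focused one)."""
    assert img_a.shape == img_b.shape
    out = np.empty_like(img_a)
    for y in range(img_a.shape[0]):
        if high_frequency_energy(img_a[y]) >= high_frequency_energy(img_b[y]):
            out[y] = img_a[y]
        else:
            out[y] = img_b[y]
    return out

# toy example: a sharp random pattern versus a blurred copy of it
rng = np.random.default_rng(0)
sharp = (rng.random((4, 64)) > 0.5).astype(float) * 255
blurred = np.apply_along_axis(
    lambda r: np.convolve(r, np.ones(7) / 7, mode="same"), 1, sharp)
combined = select_display_image(sharp, blurred)
print(np.array_equal(combined, sharp))   # True: the sharper lines were selected
```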
- Alternatively, the second image processing unit 28b may include a defocus filter 28b4 connected to only one region 23a,
- a comparator 28b2 connected to the defocus filter 28b4 and to the other region 23b,
- and a switch 28b3 connected to the one region 23a and to the comparator 28b2.
- In this configuration, the comparator 28b2 compares the image signal of the one region 23a, defocused by the defocus filter 28b4, with the image signal of the other region 23b, which is not defocused,
- and the switch 28b3 may be configured to select, for each portion, either the image of the other region 23b or the image of the one region 23a according to whether the two signals match.
- the image display device 20 has a display area for displaying the image selected by the second image processing unit 28b.
- the image display device 20 may have a display area for displaying an image formed in each of the first and second areas 23a and 23b.
- the light beam from the objective lens 21 passes through the depolarization plate 22 and enters the polarization beam splitter 24 in a state where the polarization direction is eliminated.
- the light incident on the polarization beam splitter 24 is separated by the polarization beam splitter surface 24a into a linearly polarized S-polarized component and a P-polarized component.
- the linearly-polarized light beam of the S-polarized component reflected by the polarization beam splitter surface 24 a passes through the ⁇ / 4 plate 25, the polarization state is converted to circularly polarized light, and is reflected by the mirror 26.
- the light beam reflected by the mirror 26 passes through the ⁇ / 4 plate 25 again, the polarization state is converted from circularly polarized light to linearly polarized light of the P-polarized component, enters the polarizing beam splitter 24 again, and passes through the polarizing beam splitter surface 24a.
- an image is formed on the first region 23 a of the CMOS sensor 23.
- Meanwhile, the linearly polarized light beam of the P-polarized component that was transmitted through the polarization beam splitter surface 24a when the light entered the polarization beam splitter 24 via the objective lens 21 and the depolarizing plate 22 is reflected by the total reflection surface 27a of the prism 27,
- and an image is formed on the second region 23b of the CMOS sensor 23.
- The CMOS sensor 23 uses the rolling-shutter method as described above, and reads out the image line by line in the direction indicated by the arrows in FIG. 9B.
- the second image processing unit 28b compares the images formed on the first area 23a and the second area 23b, which are read out line by line, and selects the focused image as the display image.
- the images for each line selected by the second image processing unit 28 b are combined and displayed on the image display device 20.
- In this embodiment, the polarization direction of the light beam reflected by the polarization beam splitter 24 is rotated by 90 degrees through the wave plate 25, so
- the brightness of the two images formed in the first region 23a and the second region 23b can be kept substantially the same.
- That is, since a λ/4 plate 25 is used as the wave plate and the light beam reflected by the polarization beam splitter 24 is folded back by the first reflecting member 26, the polarization direction of the light beam is rotated by 90 degrees, and the light can be imaged on the image sensor 23 efficiently, with almost no loss of brightness.
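That the double pass through the λ/4 plate rotates the polarization direction by 90 degrees can be checked with a short Jones-calculus sketch, assuming an ideal quarter-wave plate with its fast axis at 45 degrees to the incident linear polarization and an ideal mirror at normal incidence; this is a textbook model rather than a description of the actual components.

```python
import numpy as np

# Jones matrix of a quarter-wave plate with its fast axis at 45 degrees
# (global phase factors omitted)
Q = 0.5 * np.array([[1 + 1j, 1 - 1j],
                    [1 - 1j, 1 + 1j]])

# Ideal mirror at normal incidence, modeled in the fixed transverse basis
# (only a global sign, which does not affect the polarization state)
M = -np.eye(2)

s_in = np.array([1.0, 0.0])      # linearly polarized light from the splitter
s_out = Q @ M @ Q @ s_in         # splitter -> QWP -> mirror -> QWP -> splitter

# Normalized intensities along the original (x) and orthogonal (y) directions:
print(np.round(np.abs(s_out) ** 2, 6))   # [0. 1.] -> polarization rotated by 90 deg
```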
- the second image processing unit 28b as the image selection unit compares the images formed in the first region 23a and the second region 23b, and the images are in focus. Since the image of the region is selected as the display image, an image with a deep focal depth is obtained in a continuous range, and an observed image with a deep depth of field is observed over a wide continuous range via the image display device 20. be able to.
- The imaging apparatus system includes the first image processing unit 28a and can adjust the orientation (rotation) of the two images formed in the first region 23a and the second region 23b, so the two images can be displayed in the same orientation. Since the first image processing unit 28a can also correct the rotation of the two images caused by manufacturing errors of the prism or the like, there is no need to provide a mechanical prism adjustment mechanism or the like for correcting the orientation of the images. For this reason, the size of the entire imaging apparatus can be reduced, and the manufacturing cost can be reduced. In addition, since the third image processing unit 28c is provided, when the two images formed in the first region 23a and the second region 23b differ in color because of coating manufacturing errors of the optical system, the white balance of each image can be adjusted.
- Furthermore, since the fourth image processing unit 28d and the fifth image processing unit 28e are provided, even when the center positions and magnifications of the two images formed in the first region 23a and the second region 23b do not match because of assembly errors or manufacturing errors of the prism 27 or the polarization beam splitter 24,
- the position shift and magnification shift between the two images can be corrected by adjusting the center positions and the display ranges.
- The apparatus may also include an imaging processing unit (not shown) that can adjust the electronic-shutter speed separately for the images formed in the first region 23a and the second region 23b; in that case, the brightness of the two images formed in the first region 23a and the second region 23b can be adjusted by adjusting the electronic-shutter speed.
- Since the λ/4 plate 25 is configured to be rotatable about the optical axis, the polarization state of the light beam can be adjusted by rotating the λ/4 plate 25, and the amount of light transmitted through the polarization beam splitter 24 and incident on the first region 23a can thereby be adjusted. For this reason, a difference in brightness between the two images formed in the first region 23a and the second region 23b caused by manufacturing errors of the optical system can easily be adjusted.
- Since the exit pupil position of the objective lens 21 is set on the object side with respect to the image sensor, the objective lens 21 becomes smaller and the ray height of the light incident on the prism also becomes smaller. Therefore, the prism 27 and the polarization beam splitter 24 become smaller, and a compact imaging apparatus is possible.
- In addition, the exit pupil position of the objective lens 21 is set on the object side with respect to the image sensor, and in the CMOS sensor 23 the displacement of the center of the photosensitive portion of each pixel with respect to the optical axis of the microlens gradually increases in the peripheral direction from the central portion to the peripheral portion of the CMOS sensor. For this reason, an imaging apparatus with less shading and a wider depth of field is possible.
- If the brightness of the two images formed in the first region 23a and the second region 23b were to differ, then when the image selection unit (second image processing unit 28b) compares the two images, selects the in-focus image for each line, and combines the selected lines into a whole image, the brightness of the combined image would differ from line to line, which could hinder observation.
- In this embodiment, since the depolarizing plate 22 is provided, even if the light beam passing through the objective lens 21 contains a polarization component biased in a particular direction,
- the polarization direction of the light beam can be randomized by passing through the depolarizing plate 22, and the brightness of the two images formed in the first region 23a and the second region 23b can be made substantially the same.
- Therefore, when the image selection unit compares the two images formed in the first region 23a and the second region 23b, selects the in-focus image for each line, and combines the selected lines into a whole image, a whole image with uniform brightness from line to line is obtained.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- General Health & Medical Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- General Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Veterinary Medicine (AREA)
- Medical Informatics (AREA)
- Pathology (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Astronomy & Astrophysics (AREA)
- Endoscopes (AREA)
- Instruments For Viewing The Inside Of Hollow Bodies (AREA)
- Lenses (AREA)
Abstract
Description
As a countermeasure against shading caused by oblique incidence of the objective optical system, Patent Document 1, for example, discloses shifting the positions of the microlenses on the image sensor relative to the photosensitive portions, from the center of the image sensor toward its periphery, in accordance with the oblique-incidence characteristics of the objective optical system.
Incidentally, if two images arranged side by side are formed, using telecentric objective optical systems, on an image sensor in which the shift between the microlenses and the photosensitive portions has been eliminated to match a telecentric optical system, shading does not occur. However, when such telecentric optical systems are used, the objective optical systems become large, and it becomes difficult to arrange them side by side.
One aspect of the present invention provides an endoscope imaging apparatus comprising: an objective optical system that collects light from a subject and forms two images side by side; and an image sensor having a plurality of microlenses arranged on the incident side of the light from the objective optical system, with a pixel assigned to each microlens, wherein the center of the photosensitive portion of each pixel in the image sensor is displaced with respect to the optical axis of the microlens so that the displacement gradually increases in the peripheral direction from the central portion of the image sensor toward its periphery, the exit pupil position of the objective optical system is closer to the object side than the imaging position of the objective optical system, and the following conditional expression is satisfied.
0.5 ≤ (θH - θD)/α ≤ 3 … (1)
Here, α is the angle between the principal ray at the maximum horizontal image height and the optical axis; θH is the principal-ray correction amount (angle) of the microlens at the maximum horizontal image height (H) from the center of the image sensor; θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor; and the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry.
0 < K × tan(θc) ≤ P/2 … (2)
Here, θc is the principal-ray correction amount (angle) of the microlens corresponding to the position of the image height (C) in the horizontal direction, and the position of the image height (C) is the intersection of the optical axis of the objective optical system and the image sensor. P is the pixel pitch of the image sensor, and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element).
P/2 ≥ K × (tan(θH) - tan(α)) … (3)
P/2 ≥ K × (tan(α) - tan(θD)) … (4)
Here, α is the angle between the principal ray at the maximum horizontal image height and the optical axis; θH is the principal-ray correction amount (angle) of the microlens corresponding to the position of the maximum horizontal image height (H) from the center of the image sensor; θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor; the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry; P is the pixel pitch of the image sensor; and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element).
As shown in FIG. 1, the endoscope imaging apparatus includes two objective optical systems 2 and an image sensor 4.
If the emission angle at the maximum horizontal image height H of the objective optical system is α and the microlens correction amount of the image sensor at that position is θH, the difference between the correction amount and the emission angle can be expressed as
α - θH.
On the opposite side, it can be expressed as
-α - θD.
Therefore, the difference between the microlens correction amount that can suppress shading and the principal-ray angle from the exit pupil of the objective optical system is
(α - θH) - (-α - θD) = 2α - (θH - θD) … (A)
Accordingly, (θH - θD)/α = 2 is the desirable condition under which shading does not occur.
As shown in FIG. 4, the signs of the principal-ray correction angle of the microlens and of the oblique incident-ray angle of the optical system are defined to be negative in the direction from the central axis of the image sensor toward the maximum horizontal image height.
In this case, the principal-ray correction angle of the microlens and the oblique incident-ray angle of the optical system increase in the negative direction, as shown in the graphs of FIGS. 5A and 5B.
Here, α is the angle between the principal ray at the maximum horizontal image height and the optical axis; θH is the principal-ray correction amount (angle) of the microlens corresponding to the position of the maximum horizontal image height (H) from the center of the image sensor; θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor; and the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry.
0 < K × tan(θc) ≤ P/2 … (2)
Here, θc is the principal-ray correction amount (angle) of the microlens corresponding to the position of the image height (C) in the horizontal direction, and the position of the image height (C) is the intersection of the optical axis of the objective optical system and the image sensor. P is the pixel pitch of the image sensor, and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element).
P/2 ≥ K × (tan(θH) - tan(α)) … (3)
P/2 ≥ K × (tan(α) - tan(θD)) … (4)
Here, α is the angle between the principal ray at the maximum horizontal image height and the optical axis; θH is the principal-ray correction amount (angle) of the microlens corresponding to the position of the maximum horizontal image height (H) from the center of the image sensor; θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor; the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry; P is the pixel pitch of the image sensor; and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element). If K × tan(θH) - K × tan(α) exceeds P/2, the principal ray emitted from the objective optical system 2 at the angle α enters the adjacent pixel, and color shading occurs. Similarly, if K × tan(α) - K × tan(θD) exceeds P/2, the principal ray emitted from the objective optical system 2 at the angle -α enters the adjacent pixel, and color shading occurs.
An endoscope imaging apparatus according to Embodiment 1 of the present invention will be described below with reference to the drawings.
FIG. 7 is a cross-sectional view showing the overall configuration of the endoscope imaging apparatus according to this embodiment. As shown in FIG. 7, the endoscope imaging apparatus includes two objective optical systems 2 arranged side by side with a spacing between them, two parallelogram prisms 3 disposed downstream of the objective optical systems 2, one image sensor 4 disposed downstream of the parallelogram prisms 3, and diaphragms 5a and 5b.
As shown in FIG. 2, a microlens 10 and a color filter 11 are arranged for each pixel on the object side of the photosensitive portion (for example, a photoelectric conversion element) 12 of the image sensor 4.
FIGS. 9A and 9B are explanatory diagrams showing the configuration of an imaging apparatus system according to Embodiment 2 of the present invention: FIG. 9A schematically shows the overall configuration, and FIG. 9B shows the orientation of the subject in the images formed in the first and second regions of the image sensor.
The depolarizing plate 22 is disposed between the objective lens 21 and the polarization beam splitter 24. The image sensor 23 is a rolling-shutter CMOS sensor and is disposed in the vicinity of the imaging position of the objective lens 21. In the image sensor 23, the center of the photosensitive portion of each pixel is displaced with respect to the optical axis of the corresponding microlens (not shown) so that the displacement gradually increases in the peripheral direction from the central portion of the image sensor 23 toward its periphery. Furthermore, the exit pupil position of the objective lens 21 is set to be located closer to the object side than the image sensor 23.
The first reflecting member 26 is a mirror, and reflects back the light beam that has been reflected by the polarization beam splitter surface 24a and transmitted through the wave plate 25.
The second reflecting member 27 is a prism, and reflects, at its total reflection surface 27a, the light transmitted through the polarization beam splitter 24. The reflection surface of the prism may instead be formed by applying a mirror coating to the total reflection surface 27a.
The first image processing unit 28a is configured to correct the orientation (rotation) of the image of the first region 23a and the image of the second region 23b.
The fourth image processing unit 28d is configured to be able to move (select) the center position of each of the image of the first region 23a and the image of the second region 23b.
The fifth image processing unit 28e is configured to be able to adjust the display range (magnification) of each of the image of the first region 23a and the image of the second region 23b.
Specifically, as shown for example in FIG. 11A, the second image processing unit 28b includes high-pass filters 28b1a and 28b1b connected to the respective regions 23a and 23b, a comparator 28b2 connected to the high-pass filters 28b1a and 28b1b, and a switch 28b3 connected to the comparator 28b2 and to the respective regions 23a and 23b; the high-pass filters 28b1a and 28b1b extract the high-frequency components from the images of the first region 23a and the second region 23b, the comparator 28b2 compares the extracted high-frequency components, and the switch 28b3 selects the image of the region with the larger high-frequency content.
As described above, the CMOS sensor 23 uses a rolling-shutter method and reads out the image line by line in the direction indicated by the arrows in FIG. 9B.
The line-by-line images selected by the second image processing unit 28b are combined and displayed on the image display device 20.
According to the imaging apparatus system of this embodiment, the second image processing unit 28b serving as the image selection unit compares the images formed in the first region 23a and the second region 23b and selects the image of the in-focus region as the display image, so an image with a deep depth of focus is obtained over a continuous range, and an observation image with a deep depth of field can be observed over a wide continuous range via the image display device 20.
Since the third image processing unit 28c is provided, when a color difference arises between the two images formed in the first region 23a and the second region 23b because of coating manufacturing errors of the optical system, the white balance of each of the two images can be adjusted.
4 Image sensor
10 Microlens
11 Color filter
12 Photosensitive portion
Claims (5)
- An endoscope imaging apparatus comprising:
an objective optical system that collects light from a subject and forms two images side by side; and
an image sensor having a plurality of microlenses arranged on the incident side of the light from the objective optical system, with a pixel assigned to each microlens,
wherein the center of the photosensitive portion of each pixel in the image sensor is displaced with respect to the optical axis of the microlens so that the displacement gradually increases in the peripheral direction from the central portion of the image sensor toward its periphery,
the exit pupil position of the objective optical system is closer to the object side than the imaging position of the objective optical system,
and the following conditional expression is satisfied:
0.5 ≤ (θH - θD)/α ≤ 3 … (1)
where α is the angle between the principal ray at the maximum horizontal image height and the optical axis, θH is the principal-ray correction amount (angle) of the microlens at the maximum horizontal image height (H) from the center of the image sensor, θD is the principal-ray correction amount (angle) of the microlens at the distance D from the center of the image sensor, and the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry.
- The endoscope imaging apparatus according to claim 1, satisfying the following conditional expression:
0 < K × tan(θc) ≤ P/2 … (2)
where θc is the principal-ray correction amount (angle) of the microlens corresponding to the position of the image height (C) in the horizontal direction, the position of the image height (C) is the intersection of the optical axis of the objective optical system and the image sensor, P is the pixel pitch of the image sensor, and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element).
- The endoscope imaging apparatus according to claim 2, satisfying the following conditional expressions:
P/2 ≥ K × (tan(θH) - tan(α)) … (3)
P/2 ≥ K × (tan(α) - tan(θD)) … (4)
where α is the angle between the principal ray at the maximum horizontal image height and the optical axis, θH is the principal-ray correction amount (angle) of the microlens corresponding to the position of the maximum horizontal image height (H) from the center of the image sensor, θD is the principal-ray correction amount (angle) of the microlens corresponding to the position of the horizontal image height (D) from the center of the image sensor, the position of the image height (D) is the intersection of the image sensor with a line drawn toward the center of the image sensor at α degrees, with the optical axis of the objective optical system as the axis of symmetry, P is the pixel pitch of the image sensor, and K is the distance from the vertex of the microlens surface of the image sensor to the pixel (photosensitive element).
- The endoscope imaging apparatus according to any one of claims 1 to 3, wherein the objective optical system comprises:
two objective lenses arranged side by side, which form optical images of the subject on the image sensor; and
a reflecting member disposed between the objective lenses and the image sensor for displacing the optical axis of the objective optical system.
- The endoscope imaging apparatus according to any one of claims 1 to 3, wherein the objective optical system comprises an optical-path dividing means,
and the optical images of the subject divided by the optical-path dividing means are formed on the image sensor as optical images with respectively different focus positions.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015529726A JP5829365B2 (ja) | 2013-11-14 | 2014-11-10 | Endoscope imaging device |
| EP14861990.1A EP3070507B1 (en) | 2013-11-14 | 2014-11-10 | Endoscope imaging device |
| CN201480045544.8A CN105474068B (zh) | 2013-11-14 | 2014-11-10 | 内窥镜用摄像装置 |
| US14/993,649 US20160120397A1 (en) | 2013-11-14 | 2016-01-12 | Endoscope image-acquisition device |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2013235948 | 2013-11-14 | ||
| JP2013-235948 | 2013-11-14 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US14/993,649 Continuation US20160120397A1 (en) | 2013-11-14 | 2016-01-12 | Endoscope image-acquisition device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015072427A1 true WO2015072427A1 (ja) | 2015-05-21 |
Family
ID=53057361
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2014/079736 Ceased WO2015072427A1 (ja) | 2014-11-10 | 2015-05-21 | Endoscope imaging device |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20160120397A1 (ja) |
| EP (1) | EP3070507B1 (ja) |
| JP (1) | JP5829365B2 (ja) |
| CN (1) | CN105474068B (ja) |
| WO (1) | WO2015072427A1 (ja) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017018483A (ja) * | 2015-07-14 | 2017-01-26 | オリンパス株式会社 | Medical probe |
| CN106999024A (zh) * | 2015-09-01 | 2017-08-01 | 奥林巴斯株式会社 | Imaging system, processing device, processing method, and processing program |
| JPWO2021131921A1 (ja) * | 2019-12-27 | 2021-07-01 |
Families Citing this family (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2637234B2 (ja) | 1989-06-02 | 1997-08-06 | ドーピー建設工業株式会社 | Hydraulic device for moving a large traveling shoring |
| JP2927731B2 (ja) | 1996-06-05 | 1999-07-28 | 今井重機建設株式会社 | PC girder transfer device |
| US20190051039A1 (en) * | 2016-02-26 | 2019-02-14 | Sony Corporation | Image processing apparatus, image processing method, program, and surgical system |
| US10983325B2 (en) * | 2016-12-12 | 2021-04-20 | Molecular Devices, Llc | Trans-illumination imaging with an array of light sources |
| US10598918B2 (en) * | 2017-06-28 | 2020-03-24 | Karl Storz Imaging, Inc. | Endoscope lens arrangement for chief ray angle control at sensor |
| CN111565270B (zh) * | 2019-02-13 | 2024-05-03 | 株式会社理光 | Imaging device and imaging optical system |
| JP7265376B2 (ja) * | 2019-03-04 | 2023-04-26 | 株式会社タムロン | Observation imaging device |
| US11333829B2 (en) * | 2019-11-22 | 2022-05-17 | Karl Storz Imaging, Inc. | Medical imaging device with split image on common image sensor |
| TWI782409B (zh) * | 2020-03-09 | 2022-11-01 | 陳階曉 | 內視鏡影像校正系統及其方法 |
| CN111487759B (zh) * | 2020-06-04 | 2025-05-02 | 深圳开立生物医疗科技股份有限公司 | Multi-aperture imaging system, endoscope, and endoscope system |
| US11762174B2 (en) * | 2020-09-24 | 2023-09-19 | Apple Inc. | Optical system including lenses and prism for telephoto cameras |
| JP7757150B2 (ja) * | 2021-11-18 | 2025-10-21 | キヤノン株式会社 | Lens device and imaging device |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0454094B2 (ja) | 1983-10-07 | 1992-08-28 | Bridgestone Corp | |
| JPH05346556A (ja) | 1992-06-12 | 1993-12-27 | Victor Co Of Japan Ltd | Solid-state image sensor |
| JPH10239594A (ja) * | 1996-12-27 | 1998-09-11 | Olympus Optical Co Ltd | Electronic endoscope |
| JP2010056345A (ja) | 2008-08-28 | 2010-03-11 | Brookman Technology Inc | Amplified solid-state imaging device |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6518640B2 (en) * | 1999-12-02 | 2003-02-11 | Nikon Corporation | Solid-state image sensor, production method of the same, and digital camera |
| WO2008065955A1 (en) * | 2006-11-28 | 2008-06-05 | Olympus Corporation | Endoscope device |
| JP4307497B2 (ja) * | 2007-05-14 | 2009-08-05 | シャープ株式会社 | Solid-state image sensor, solid-state imaging device, and electronic information apparatus |
| US20090112061A1 (en) * | 2007-10-25 | 2009-04-30 | Dhs Company Ltd. | Endoscope capable of varying field of vision |
| JP2010062438A (ja) * | 2008-09-05 | 2010-03-18 | Toshiba Corp | Solid-state imaging device and design method thereof |
-
2014
- 2014-11-10 WO PCT/JP2014/079736 patent/WO2015072427A1/ja not_active Ceased
- 2014-11-10 EP EP14861990.1A patent/EP3070507B1/en not_active Not-in-force
- 2014-11-10 CN CN201480045544.8A patent/CN105474068B/zh active Active
- 2014-11-10 JP JP2015529726A patent/JP5829365B2/ja active Active
-
2016
- 2016-01-12 US US14/993,649 patent/US20160120397A1/en not_active Abandoned
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0454094B2 (ja) | 1983-10-07 | 1992-08-28 | Bridgestone Corp | |
| JPH05346556A (ja) | 1992-06-12 | 1993-12-27 | Victor Co Of Japan Ltd | Solid-state image sensor |
| JPH10239594A (ja) * | 1996-12-27 | 1998-09-11 | Olympus Optical Co Ltd | Electronic endoscope |
| JP2010056345A (ja) | 2008-08-28 | 2010-03-11 | Brookman Technology Inc | Amplified solid-state imaging device |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3070507A4 |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017018483A (ja) * | 2015-07-14 | 2017-01-26 | オリンパス株式会社 | Medical probe |
| CN106999024A (zh) * | 2015-09-01 | 2017-08-01 | 奥林巴斯株式会社 | Imaging system, processing device, processing method, and processing program |
| JPWO2021131921A1 (ja) * | 2019-12-27 | 2021-07-01 | ||
| WO2021131921A1 (ja) * | 2019-12-27 | 2021-07-01 | 国立大学法人浜松医科大学 | Rigid scope device |
| US12329359B2 (en) | 2019-12-27 | 2025-06-17 | National University Corporation Hamamatsu University School Of Medicine | Rigid scope device |
| JP7714171B2 (ja) | 2019-12-27 | 2025-07-29 | 国立大学法人浜松医科大学 | Rigid scope device |
Also Published As
| Publication number | Publication date |
|---|---|
| EP3070507B1 (en) | 2019-05-15 |
| EP3070507A1 (en) | 2016-09-21 |
| EP3070507A4 (en) | 2017-07-05 |
| US20160120397A1 (en) | 2016-05-05 |
| JP5829365B2 (ja) | 2015-12-09 |
| CN105474068A (zh) | 2016-04-06 |
| CN105474068B (zh) | 2018-06-15 |
| JPWO2015072427A1 (ja) | 2017-03-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5829365B2 (ja) | Endoscope imaging device | |
| CN103597405B (zh) | Imaging device and imaging device system | |
| JP4961993B2 (ja) | Image sensor, focus detection device, and imaging device | |
| JP5831105B2 (ja) | Imaging device and imaging method | |
| US9030543B2 (en) | Endoscope system | |
| JP5619294B2 (ja) | Imaging device and focusing parameter value calculation method | |
| JPH08265804A (ja) | Imaging device | |
| JPWO2008032820A1 (ja) | Image sensor and imaging device | |
| JP2012230341A (ja) | Imaging device and imaging method | |
| JP4983271B2 (ja) | Imaging device | |
| JP2013512470A (ja) | Optical imaging device | |
| JP6712362B2 (ja) | Image processing device, imaging device, image processing method, and program | |
| JP4858179B2 (ja) | Focus detection device and imaging device | |
| JP5251323B2 (ja) | Imaging device | |
| WO2017073292A1 (ja) | Endoscope imaging unit | |
| JP2011215545A (ja) | Parallax image acquisition device | |
| JP2017219791A (ja) | Control device, imaging device, control method, program, and storage medium | |
| JP4546781B2 (ja) | Imaging device and color misregistration correction program | |
| JP2015106773A (ja) | Imaging device having an array optical system | |
| JP2020046482A (ja) | Imaging device | |
| JP5907668B2 (ja) | Imaging device and image sensor | |
| JP2011182041A (ja) | Imaging device | |
| JP5589799B2 (ja) | Imaging device | |
| JP2012128301A (ja) | Focus adjustment method, focus adjustment program, and imaging device | |
| JP5691440B2 (ja) | Imaging device | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 201480045544.8 Country of ref document: CN |
|
| ENP | Entry into the national phase |
Ref document number: 2015529726 Country of ref document: JP Kind code of ref document: A |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14861990 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| REEP | Request for entry into the european phase |
Ref document number: 2014861990 Country of ref document: EP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2014861990 Country of ref document: EP |