WO2017064746A1 - Imaging device, endoscopic device and imaging method - Google Patents
Imaging device, endoscopic device and imaging method Download PDFInfo
- Publication number
- WO2017064746A1 WO2017064746A1 PCT/JP2015/078871 JP2015078871W WO2017064746A1 WO 2017064746 A1 WO2017064746 A1 WO 2017064746A1 JP 2015078871 W JP2015078871 W JP 2015078871W WO 2017064746 A1 WO2017064746 A1 WO 2017064746A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- color
- wavelength band
- imaging
- mask
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Images
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B23/00—Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
- G02B23/24—Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
- G03B35/12—Stereoscopic photography by simultaneous recording involving recording of different viewpoint images in different colours on a colour film
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/18—Stereoscopic photography by simultaneous viewing
- G03B35/26—Stereoscopic photography by simultaneous viewing using polarised or coloured light separating different viewpoint images
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B9/00—Exposure-making shutters; Diaphragms
- G03B9/08—Shutters
- G03B9/10—Blade or disc rotating or pivoting about axis normal to its plane
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/555—Constructional details for picking-up images in sites, inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/75—Circuitry for compensating brightness variation in the scene by influencing optical camera components
Definitions
- the present invention relates to an imaging device, an endoscope device, an imaging method, and the like.
- Various methods for optically 3D measuring the surface shape of an object have been proposed.
- a method of projecting active pattern illumination onto an object and performing stereo three-dimensional distance measurement by binocular stereoscopic vision is proposed. is there.
- stereo measurement matching processing is performed using a characteristic part (for example, an edge) of an image, and a distance to an object is obtained.
- a characteristic part for example, an edge
- a characteristic part is intentionally added to the subject, and distance measurement can be accurately performed even in a part without a characteristic part (for example, a flat part).
- Patent Documents 1 and 2 disclose a technique for switching left and right imaging light paths with a mechanical shutter in time and acquiring left and right images in a time-sharing manner.
- Patent Document 2 discloses a technique of inserting an RG filter in the left half of a single imaging optical path and a GB filter in the right half, and separating the left and right images from the R image and the B image of the captured image. Yes.
- the RG filter and the GB filter are retracted from the imaging optical path, and an observation image is acquired.
- inspection may be performed by capturing an observation image that is not stereo shooting, and stereo measurement may be performed at a portion to be examined in detail.
- switching between flat illumination and active pattern illumination can be considered, but the illumination mechanism becomes complicated.
- convenience can be improved when real-time stereo measurement can be performed while performing normal observation, but it is preferable that illumination switching is not accompanied at that time. .
- an imaging apparatus an endoscope apparatus, an imaging method, and the like that can capture a normal observation image while performing illumination with active pattern illumination.
- One embodiment of the present invention includes an image including a first color image, a second color image having a longer wavelength side than the first color, and a third color image having a longer wavelength side than the second color.
- An imaging unit that captures an image and can capture the first color image and the third color image as a stereo image, and is included in the wavelength band of the first color and included in the wavelength band of the second color
- a pattern illumination having a given light amount distribution in a second wavelength band that is included in the first wavelength band that is not included in the wavelength band of the third color and that is not included in the wavelength band of the second color.
- an illumination unit includes an illumination.
- a pattern illumination having a given light amount distribution in a first wavelength band included in the first color wavelength band and a second wavelength band included in the third color wavelength band is a subject. Is irradiated. Since the first wavelength band and the second wavelength band are not included in the wavelength band of the second color, the image of the second color is an image that is not affected by a given light amount distribution due to pattern illumination. This makes it possible to capture a normal observation image while performing illumination with active pattern illumination.
- Another aspect of the present invention relates to an endoscope apparatus that includes the imaging apparatus described above.
- a first color image, a second color image having a longer wavelength side than the first color, and a third color image having a longer wavelength side than the second color when the image of the first color and the image of the third color can be captured as a stereo image, the first color that is included in the wavelength band of the first color and not included in the wavelength band of the second color.
- the present invention relates to an imaging method for imaging a captured image including an image of one color, an image of the second color, and an image of the third color.
- An example of pattern illumination The example of the waveform of the light quantity of pattern illumination, the reflection coefficient distribution of a to-be-photographed object, and a captured image.
- Explanatory drawing of a correction process Explanatory drawing of a correction process.
- Second configuration example of illumination unit The structural example of an imaging part.
- the structural example of an imaging part The structural example of an imaging part.
- the 1st detailed structural example of a fixed mask and a movable mask The 1st detailed structural example of a fixed mask and a movable mask.
- the 2nd detailed structural example of a fixed mask and a movable mask The 2nd detailed structural example of a fixed mask and a movable mask.
- an industrial endoscope apparatus will be described below as an application example of the present invention.
- the present invention is not limited to application to an industrial endoscope apparatus, and a stereo shooting method (with an imaging system having parallax).
- a method of detecting the phase difference between the two images and acquiring subject distance information), a three-dimensional measuring device that measures a three-dimensional shape, and an imaging device having a three-dimensional measuring function for example, a medical endoscope device, Microscopes, industrial cameras, robot vision functions, etc. are applicable.
- Pattern illumination In order to obtain a high-quality observation image, it is generally premised that the subject is illuminated uniformly. However, when performing stereo measurement, distance information to the subject cannot be obtained unless there is a feature that easily obtains a phase difference. Therefore, a method of irradiating the subject with pattern illumination that is intentionally characterized is used.
- an observation mode for capturing an observation image using white light and a measurement mode for performing stereo measurement by the color phase difference method are switched and used.
- conventional pattern illumination it is necessary to have a function of switching the illumination itself so that uniform illumination is performed in the observation mode and pattern illumination is performed in the measurement mode.
- a method is used in which pattern illumination is always performed in the observation mode or in the measurement mode, and the pattern in the observation mode is erased by a spectral filter or image processing. Thereby, the illumination switching function can be made unnecessary.
- this method will be described.
- FIG. 1 schematically shows an example of pattern illumination.
- FIG. 1 shows a plan view of the pattern illumination PL (for example, a plan view when projected onto a plane perpendicular to the optical axis of the imaging system) and an example of the light quantity characteristic in the AA section.
- x is a position (coordinate) in a direction perpendicular to the optical axis.
- a small circle pattern DT whose brightness changes is an illumination pattern regularly arranged.
- the outside of the small circle pattern DT is illumination with flat brightness, and the inside of the small circle pattern DT is illumination darker than that.
- a waveform obtained by multiplying the amount of light of pattern illumination PL and the reflection coefficient distribution of the subject is obtained as an imaging waveform (sensor output) as shown in FIG.
- the sensor output in FIG. 2 represents a pixel value or a luminance value (or brightness of image formation on the sensor surface) at a pixel corresponding to the position x.
- a solid line indicates a sensor output when pattern illumination is performed, and a dotted line indicates a sensor output when flat illumination is performed.
- FIG. 3 shows a first spectral characteristic example of pattern illumination and a first spectral characteristic example of a pupil in the observation mode and the stereo measurement mode.
- Dotted lines with B, G, R, and IR symbols represent spectral characteristics of the color filter of the image sensor.
- B, G, and R are spectral characteristics of the blue, green, and red filters of the image sensor, respectively.
- IR is an infrared sensitivity characteristic and passes through any of the blue, green, and red filters.
- the wavelength bands are set to five â Pv1, Pb, Pv2, Pr, Pir â in association with the spectral characteristics of the color filter of the imaging sensor.
- the illumination light in the bands â Pv1, Pv2, Pir â is non-pattern light (flat light) with a uniform light amount distribution
- the illumination light in the bands â Pb, Pr â is pattern light. Light having a spectral characteristic obtained by combining these becomes illumination light to the subject.
- the bands Pv1 and Pb are set to bands that pass through the blue filter of the image sensor but do not pass through the green filter, the band Pv is set to a band that passes through the green filter, and the bands Pr and Pir are red.
- a band that passes through the filter but does not pass through the green filter is set.
- the band Pir includes an infrared sensitivity characteristic IR. Note that the band Pir may not include the infrared sensitivity characteristic IR.
- the bands Pb and Pr are set to wavelength bands that do not interfere with the spectral characteristic G of the green filter of the image sensor.
- the bands Pb and Pr are set to narrow bands such as several nm to several tens of nm, for example.
- band Pb it is desirable to select a wavelength range in which only the blue pixel (spectral characteristic B) of the image sensor can be acquired and the sensitivity is relatively good.
- band Pr it is desirable to select a wavelength range in which only the red pixel (spectral characteristic R) of the imaging sensor can be acquired and the sensitivity is relatively good. That is, since the left pupil image and the right pupil image must be separated by different color pixels of the image sensor, â Pb, Pr â may be selected in a wavelength region that does not have a mutual light receiving sensitivity characteristic.
- the spectral components â Pv1, Pv2, Pir â of the non-pattern illumination light are components of a normal observation image
- the red pixel, the green pixel, and the blue pixel spectral characteristics R, It is desirable that G and B) can cover as many wavelength ranges as possible.
- a standard laser light source with wavelengths of 450 nm and 660 nm is used for â Pb, Pr â so as to be a narrow-band light source, and the spectral components â Pv1, Pv2, Pir â may cover many wavelength components. This is one way.
- the relationship between the spectral characteristics FL, FC, FR of the pupil and the spectral characteristics of the illumination light is set as in the following formula (1).
- the spectral characteristic FC corresponds to the wavelength band â Pv1, Pv2, Pir â , and an observation image by flat illumination is obtained.
- shooting is performed with the left pupil of the spectral characteristic FL and the right pupil of the spectral characteristic FR.
- the spectral characteristics FL and FR correspond to the wavelength bands Pb and Pr, and a stereo image by active pattern illumination is obtained.
- the observation mode is a mode in which monocular imaging is performed with the central aperture 23 (spectral characteristic FC) of the fixed mask 20 as shown in FIGS. 11, 13, and 15, and the stereo measurement mode is shown in FIGS. 12, 14, and 16. In this manner, stereo imaging is performed with the left pupil aperture 21 (spectral characteristic FL) and right pupil aperture 22 (spectral characteristic FR) of the fixed mask 20 as described above. Details of these observation modes and stereo measurement modes will be described later.
- FIG. 4 shows the relationship between the first spectral characteristic example and the captured images in the observation mode and the stereo measurement mode.
- the dotted line virtually indicates the sensor output when flat illumination is performed.
- the active pattern illumination of the present embodiment is applied to the imaging unit of FIGS. 11 to 16 as an example.
- the configuration of the imaging unit is not limited to FIGS. Any observation image can be used as long as a stereo image by spectral characteristics FL and FR can be taken.
- the illumination light is the same regardless of the mode, and the active pattern illumination is projected onto the subject in both the observation mode and the stereo measurement mode.
- the reflected light by the illumination light having the spectral components â Pv1, Pv2, Pir â of the non-pattern light passes through the pupil center optical path (spectral characteristic FC), and a captured image â Vr, Vg, Vb â is obtained.
- Vr is a red image obtained by a pixel having a red filter of the image sensor
- Vg is a green image obtained by a pixel having a green filter of the image sensor, and obtained by a pixel having a blue filter of the image sensor It is a blue image.
- the band of the active pattern is removed, an observation image that is not affected by the active pattern is obtained.
- the reflected light by the illumination light having the spectral component â Pb, Pr â of the pattern light passes through the left and right pupil optical paths, and a captured image â Mr, Mb â is obtained.
- Mr is an image by the left pupil optical path (spectral characteristic FL)
- Mb is an image by the right pupil optical path (spectral characteristic FR).
- the captured image â Mr, Mb â can be easily matched and phase difference detection can be easily performed.
- the imaging device captures the captured images (Vb, Vg, Vr) including the first color image, the second color image, and the third color image.
- An imaging unit capable of capturing a first color image and a third color image as a stereo image (Mb, Mr), and pattern illumination having a given light amount distribution in the first wavelength band Pb and the second wavelength band Pr And an illumination unit that irradiates the subject.
- the second color (green) image is an image on the longer wavelength side than the first color (blue)
- the third color (red) image is an image on the longer wavelength side than the second color.
- the first wavelength band Pb is a band that is included in the first color wavelength band (spectral characteristic B band) and not included in the second color wavelength band (spectral characteristic G band).
- the second wavelength band Pr is a band that is included in the third color wavelength band (spectral characteristic R band) and not included in the second color wavelength band (spectral characteristic G band).
- the first wavelength band Pb and the second wavelength band Pr are not included in the wavelength band of the second color (green), at least the pattern of the given light quantity distribution is present in the second color image. Is not reflected. Thereby, it is possible to switch between stereo measurement and normal observation without switching between pattern illumination and flat illumination.
- the stereo image is composed of a first color (blue) image and a third color (red) image, and the first color image includes a pattern in the first wavelength band Pb, and the third color image. Shows a pattern in the second wavelength band Pr.
- a stereo image intentionally characterized by a pattern is acquired, and high-precision stereo measurement can be performed by performing matching processing thereof.
- an observation image is captured through a spectral filter (spectral characteristic FC) that does not pass the first wavelength band Pb and the second wavelength band Pr, so that an observation image in which a pattern due to a given light amount distribution is not captured is obtained. Is obtained.
- spectral characteristic FC spectral characteristic
- an observation image of the entire band of white light is photographed, and the first color (blue) and third color (red) images in which the pattern is captured are not captured.
- an observation image from which the pattern has been removed is obtained.
- a pattern with a given light quantity distribution is not provided in the wavelength band of the second color, it is possible to obtain an observation image as in flat illumination while performing pattern illumination.
- the first wavelength band is the wavelength band Pb in FIG. 3 and the second wavelength band is the wavelength band Pr in FIG. 3, but the present invention is not limited to this.
- the first wavelength band may be the wavelength band Pb1 in FIG. 5
- the second wavelength band may be the wavelength band Pr2 in FIG.
- a given light amount distribution (a light amount distribution of a given shape) is a distribution having a light / dark (light amount) boundary or a distribution having an edge portion where the light amount changes abruptly.
- a plurality of parts are provided in the irradiation area of the pattern illumination.
- this given light quantity distribution is given only to specific wavelength bands Pb and Pr.
- small circle patterns DT darker than the surroundings are regularly arranged in the wavelength bands Pb and Pr, but the given light quantity distribution is not limited to this.
- the pattern DT may not be a small circle, the arrangement of the pattern DT may not be regular, and the inside of the pattern DT may be brighter than the outside.
- the imaging unit switches between a stereo mode for capturing a stereo image and a non-stereo mode for capturing a captured image with a single eye.
- the imaging unit in the imaging unit described later with reference to FIGS. 11 to 16, in the non-stereo mode (observation mode), monocular imaging is performed with the central aperture 23 (spectral characteristic FC) of the fixed mask 20, and in the stereo mode (stereo measurement mode), the image is fixed. Stereo imaging is performed with the left pupil aperture 21 (spectral characteristics FL) and right pupil aperture 22 (spectral characteristics FR) of the mask 20.
- the imaging unit to which the pattern illumination of the present embodiment can be applied is not limited to this, and any imaging unit that captures the first color image and the third color image as a stereo image in the stereo mode may be used.
- the monocular in the non-stereo mode uses the first wavelength band Pb and the second wavelength band Pr out of the wavelength bands (white light wavelength bands) of the first color, the second color, and the third color. Except the wavelength bands â Pv1, Pv2, Pir â .
- the wavelength band â Pv1, Pv2, Pir â through which the monocular passes is a pattern based on the given light quantity distribution. Is not attached.
- an observation image such as flat illumination can be taken despite pattern illumination in the non-stereo mode.
- the wavelength bands â Pv1, Pv2, Pir â excluding them are substantially the wavelength band of white light. Therefore, it is possible to obtain an image that is not inferior to a captured image obtained by illumination with white light.
- the imaging apparatus may include a phase difference detection unit 330 and an image output unit (color image generation unit 320).
- the phase difference detection unit 330 detects a phase difference between the first color image and the third color image captured in the stereo mode.
- the image output unit outputs an image for observation based on the captured image (first color to third color image) captured in the non-stereo mode.
- the phase difference detection unit 330 can detect the phase difference with high accuracy.
- the pattern is not captured in the captured image captured in the non-stereo mode even though pattern illumination is performed, an image for observation can be output by the image output unit.
- the wavelength band of the first color wavelength band (spectral characteristic B) excluding the first wavelength band Pb, the second color wavelength band (spectral characteristic G), and the third color wavelength band. It is a flat light amount distribution in the wavelength band excluding the second wavelength band Pr in (spectral characteristic R).
- the illumination light has a flat light amount distribution other than the first wavelength band Pb and the second wavelength band Pr, at least the second color image is an image by flat illumination.
- the flat light quantity distribution means that the light quantity distribution is constant (substantially constant) in the photographing region (field of view) photographed by the imaging unit.
- the light amount distribution on a surface having a constant distance from the imaging unit is constant.
- the light amount distribution need not be completely constant, and there may be a gradual light amount change without an abrupt light amount change (edge portion) such as pattern illumination.
- the first color is blue
- the second color is green
- the third color is red
- the first color image, the second color image, and the third color image correspond to a blue image, a green image, and a red image, respectively, and these images have an image pickup element having a primary color filter. It is not limited to what was imaged (for example, image sensor of primary color Bayer arrangement).
- a complementary color image may be captured by an imaging element having a complementary color filter, and a blue image, a green image, and a red image may be acquired from the complementary color image by conversion processing.
- Second spectral characteristic example In the first spectral characteristic example, the case where the imaging bands of the observation mode and the stereo measurement mode are separated has been described. In the second spectral characteristic example, a case where the imaging bands of the observation mode and the measurement mode are not separated will be described. If you can always use illumination that combines non-patterned light and patterned light without separating the wavelength band, regardless of the mode, you can use the wavelength band effectively, and the illumination switching function is unnecessary, ensuring imaging sensitivity. Covering the color information of the subject is advantageous in terms of simplifying the illumination mechanism. However, if the wavelength bands are not separated, the captured image in the observation mode is also affected by the pattern light. Therefore, in order to obtain a high-quality and faithful observation image, it is necessary to remove or reduce the influence of the pattern light.
- FIG. 5 shows a second spectral characteristic example of pattern illumination and a second spectral characteristic example of the pupil in the observation mode and the stereo measurement mode.
- the wavelength band is divided into five parts â Pb1, Pb2, Pr1, Pr2, Pir â in association with the spectral characteristics of the color filter of the image sensor.
- the illumination light in the bands â Pb2, Pr1, Pir â is non-pattern light (flat light) with a uniform light quantity distribution, and the illumination light in the bands â Pb1, Pr2 â is pattern light. Light having a spectral characteristic obtained by combining these becomes illumination light to the subject.
- Pb1 is set in a band that passes through the blue filter but does not pass through the green filter
- Pb2 is set in a band that passes through both the blue filter and the green filter
- Pr1 is set in a band that passes through both the red filter and the green filter
- Pr2 is set in a band that passes through the red filter but does not pass through the green filter
- Pir is a band corresponding to the infrared sensitivity characteristic IR, and is a band that passes through all the red, green, and blue filters.
- the wavelength bands Pb1 and Pr2 are set to wavelength bands that do not interfere with the spectral characteristics G of the green filter of the image sensor.
- an image is taken with the pupil of the spectral characteristic FC, which includes the wavelength band â Pb1, Pb2, Pr1, Pr2, Pir â .
- FC the spectral characteristic
- no filter is provided in the central aperture 23 of the fixed mask 20, and the central pupil optical path allows light in the entire band to pass.
- the relationship between the red image Vr, the green image Vg, and the blue image Vb constituting the color image captured in the observation mode and the wavelength band covered by them is expressed by the following equation (2).
- the spectral characteristic FL corresponds to the wavelength band Pb1
- the spectral characteristic FR corresponds to the wavelength band Pr2. That is, the relationship between the red image Mr and the blue image Mb captured in the stereo measurement mode and the wavelength band covered by them is expressed by the following equation (3).
- FIG. 6 shows the relationship between the second spectral characteristic example and the captured images in the observation mode and the stereo measurement mode.
- the dotted line virtually indicates the sensor output when flat illumination is performed.
- the reflected light by the illumination light having the spectral components â Pb1, Pb2, Pr1, Pr2, Pir â passes through the pupil center optical path (spectral characteristic FC), and a captured image â Vr, Vg, Vb â is obtained.
- the reflected light by the illumination light having the spectral components â Pb1, Pr2 â of the pattern light passes through the left and right pupil optical paths, and a captured image â Mr, Mb â is obtained.
- the observation images Vb and Vr are affected by the pattern illumination, so that the images are intentionally changed in brightness.
- the observation image Vg is composed of the wavelength range â Pb2, Pr1, Pir â that is uniform illumination, no intentional brightness change occurs, and the image profile reflects only the reflection coefficient of the subject.
- the observation images Vr and Vb are profiles affected by the pattern, but assuming a profile (dotted line in FIG. 6) obtained when the illumination is uniform. Although the average brightness is different, the similarity (similarity) with the observed image Vg is partially high. This is because the observation image â Vr, Vg â is an image in which the spectral characteristics of the imaging sensor overlap in the band Pb2, and the observation image â Vb, Vg â is an image in which the spectral characteristics of the imaging sensor overlap in the band Pr1. This is because there is a considerable correlation between them.
- the observation image Vg not affected by the pattern is used, the observation images Vr and Vb can be corrected, and the influence of the pattern can be removed or reduced. That is, it is possible to restore an observation image as if it was taken with flat illumination. This correction process will be described below.
- FIG 7 and 8 are explanatory diagrams of the correction process.
- the correction of the blue image Vb will be described as an example, but the correction of the red image Vr can be similarly performed.
- the correlation value between the waveform of the blue image Vb and the green image Vg in the section of the width d centering on the arbitrary position XL on the sensor surface of the image sensor is calculated. This is performed for all positions x on the sensor surface (that is, all pixels of the captured image). For example, when ZNCC (Zero-mean Normalized Cross-Correlation) is used, the correlation value becomes 1 when the similarity is maximum, and the correlation value approaches 0 as the similarity becomes lower.
- the correlation value is compared with the threshold Th, and when the correlation value is equal to or greater than the threshold Th, the flag value is set to â1â, and when the correlation value is smaller than the threshold Th, the flag value is set to â0â. That is, the flag value is obtained by binarizing the similarity with / without similarity. In the correction process, a pixel with a flag value â1â is determined as a valid pixel, and a pixel with a flag value â0â is determined as an invalid pixel.
- the correlation value is not limited to the case of obtaining all the pixels of the captured image.
- the correlation value may be obtained for pixels in a predetermined region, or may be obtained for pixels thinned at a predetermined interval.
- the determination of the flag value is not limited to the above. For example, when using a correlation calculation in which the correlation value decreases as the similarity increases, the flag value is set to â1â when the correlation value is equal to or less than the threshold Th The flag value may be set to â0â when the correlation value is larger than the threshold Th.
- the fitting process for example, when the effective pixel range of the processing section is e1 and e2, the level of the green image Vg is changed, and the total absolute value of the difference between the green image Vg and the blue image Vb at each level is obtained.
- a method of superimposing such that the sum is minimized can be considered.
- a method is conceivable in which the gain of the green image Vg is changed, the sum of the absolute values of the differences between the green image Vg and the blue image Vb at each gain is obtained, and the sum is made so that the sum is minimized.
- the pixel value Vg (XL) at the position XL of the green image Vg after the fitting process is set as a correction value Vb â² (XL).
- the series of correction processes described above is performed at all positions x on the sensor surface (that is, all pixels of the captured image) to generate a corrected blue image Vb â².
- a corrected red image Vr â² is generated.
- a high-quality observation image can be generated even with illumination light including pattern light.
- the monocular in the non-stereo mode passes the wavelength band including the wavelength bands of the first color (blue), the second color (green), and the third color (red).
- the first color image and the third color image in which the pattern by the pattern illumination is captured, and the second color image in which the pattern by the pattern illumination is not captured are captured. Then, by using these three color images, it is possible to erase (reduce) the pattern by pattern illumination, and an observation image can be obtained.
- the imaging apparatus may include a phase difference detection unit 330 and an image output unit (color image generation unit 320).
- the phase difference detection unit 330 detects a phase difference between the first color image and the third color image captured in the stereo mode.
- the image output unit outputs an image for observation based on the captured image (first color to third color image) captured in the non-stereo mode.
- the image output unit corrects changes in the pixel values of the first color image and the third color image due to a given light amount distribution based on the second color image.
- the pattern by the pattern illumination is not reflected in the second color image. Accordingly, it is possible to correct the pixel values of the first color image and the third color image in which the pattern by the pattern illumination is captured using the second color image as a reference. In other words, it is considered that the profiles of the first to third color images are almost similar in a normal image (for example, an image of a subject that is normally captured by an industrial endoscope or an image captured of the natural world). . Therefore, the influence of pattern illumination can be corrected by correcting the profiles of the first color image and the third color image so as to be similar to the profile of the second color image in which no pattern is captured.
- Illumination Unit An illumination unit that performs active pattern illumination according to this embodiment will be described below.
- a light source part is not necessarily limited to the apparatus using a light guide member from a light source part to an output end.
- FIG. 9 shows a first configuration example of the illumination unit. 9 includes a white light source 401, a light guide member 402 (light guide member for non-pattern light), an illumination lens 403, a red laser light source 404, a blue laser light source 405, dichroic prisms 406 and 407 (mirrors in a broad sense). ), A light guide member 408 (light guide member for pattern light), a mask pattern 409, and a projection lens 410.
- a white light source 401 includes a white light source 401, a light guide member 402 (light guide member for non-pattern light), an illumination lens 403, a red laser light source 404, a blue laser light source 405, dichroic prisms 406 and 407 (mirrors in a broad sense).
- a light guide member 408 light guide member for pattern light
- a mask pattern 409 includes a projection lens 410.
- the illumination unit has two light guide members 402 and 408.
- One light guide member 402 guides the white light from the white light source 401 to the illumination lens 403 provided at the distal end of the scope unit.
- the guided white light is irradiated to the subject through the illumination lens 403.
- the other light guide member 408 guides blue laser light and red laser light to the projection lens 410 provided at the distal end of the scope portion. That is, the blue laser light from the blue laser light source 405 and the red laser light from the red laser light source 404 are incident on the light guide member 408 by optical path synthesis by the dichroic prisms 406 and 407.
- the laser light guided by the light guide member 408 passes through the mask pattern 409, and the added pattern is projected onto the subject 5 by the projection lens 410.
- the non-pattern light and the pattern light are combined on the surface of the subject 5.
- FIG. 10 shows a second configuration example of the illumination unit. 10 includes a white light source 451, a polarizing element 452, a blue laser light source 453, a red laser light source 454, dichroic prisms 455 and 456, a mask pattern 457, a polarizing element 458, a prism 459 (synthesis prism), and a light guide member 460. Projection lens 461.
- the illumination unit has one light guide member 460.
- White light from the white light source 451 is converted into, for example, P-polarized light (polarized light perpendicular to S-polarized light) by the polarizing element 452, and the polarized white light enters the prism 459.
- the optical paths of the blue laser light from the blue laser light source 453 and the red laser light from the red laser light source 454 are synthesized by the dichroic prisms 455 and 456.
- the optical path synthesized laser light passes through the mask pattern 409, whereby a pattern is added.
- the laser light to which the pattern is added is converted into, for example, S-polarized light (polarized light parallel to the reflecting surface of the prism 459) by the polarizing element 452, and the polarized laser light enters the prism 459.
- White light that is non-patterned light and laser light that is patterned light are incident on the light guide member 460 by optical path synthesis by the prism 459.
- the light guide member 460 guides incident light to the projection lens 461 provided at the distal end portion of the scope portion.
- the guided non-pattern light and pattern light are projected onto the subject 5 by the projection lens 410.
- pattern illumination is generated in advance before the light guide member 460 is incident, and is combined with non-patterned light and then incident on the light guide member 460.
- the light guide member 460 is an optical fiber bundle (image guide) that can transmit the pattern as it is, the pattern of the generated pattern light is transmitted conveniently.
- the light guide member 460 is a simple optical fiber bundle (light guide)
- the fiber arrangement is different between the incident end and the output end of the light guide member 460, so that the generated pattern light pattern is transmitted as it is.
- the pattern is a random pattern, the patterns at the incident end and the outgoing end may be appropriately changed.
- Imaging Unit Hereinafter, a configuration example of an imaging unit capable of switching between the observation mode and the stereo measurement mode will be described.
- a scope In an inspection with an endoscopic device, for example, a scope is inserted into the inspection object and a normal image is taken to check for abnormalities. Measure the three-dimensional shape and examine whether further inspection is necessary. Thus, a normal observation image is taken with white light.
- a method of achieving both such shooting with white light and stereo measurement for example, performing stereo shooting with white light can be considered.
- the image sensor is divided into left and right parts, and the left image and the right image need to be imaged in the respective regions.
- there is a color phase difference method As a method for forming the left image and the right image in the same area of the image sensor, there is a color phase difference method. However, since the captured image becomes a color shift image, it cannot be used as an observation image.
- Patent Document 2 As a technique for performing stereo measurement in a non-time-division manner using a color phase difference, for example, there is Patent Document 2 described above.
- Patent Document 2 applies stereo measurement to autofocus, and it is considered that high-speed switching with an observation image is not assumed.
- it is considered disadvantageous in terms of high-speed switching.
- Patent Document 2 has a problem in that it is difficult to increase the distance measurement accuracy because it is difficult to increase the distance between the pupils because the single optical path is divided into right and left in the middle.
- the diaphragm is small (F value is large). Therefore, the small diaphragm diameter is divided into right and left, and the distance between the pupils tends to be close.
- 11 and 12 show a configuration example of the imaging unit of the present embodiment that can solve the above-described problems.
- 11 and 12 are cross-sectional views (on a plane including the optical axis) of the image pickup unit, and a light amount of an image formed on the image pickup element (or a pixel value of an image picked up by the image pickup element).
- the relationship of the position x is shown.
- the position x is a position (coordinates) in a direction perpendicular to the optical axis of the imaging optical system, for example, a pixel position of the image sensor. Actually, it is a two-dimensional coordinate system, but here, a two-dimensional one-dimensional coordinate system in the parallax direction will be described.
- the imaging unit includes an imaging optical system 10, a movable mask 30 (first mask), a fixed mask 20 (second mask), an imaging element 40 (imaging sensor, image sensor), and an illumination unit 60 (illumination device).
- the imaging optical system 10 is a monocular optical system, and includes, for example, one or a plurality of lenses.
- the image pickup device 40 has RGB color filters of the Bayer array will be described as an example.
- the present invention is not limited to this, and may include, for example, a complementary color filter.
- the reflected light from the subject 5 is imaged on the surface of the image sensor 40 by the imaging optical system 10.
- the fixed mask 20 divides the pupil center and the left and right pupils, and the movable mask 30 switches between the image formation by the pupil center and the image formation by the left and right pupils. These are imaged in the same area of the image sensor 40.
- d is the distance between the center line IC1 of the left pupil (the left eye aperture of the fixed mask 20) and the center line IC2 of the right pupil (the right eye aperture of the fixed mask 20), and the baseline length in stereo measurement It becomes.
- the straight line AXC is the optical axis of the imaging optical system 10.
- the center lines IC1 and IC2 are provided at an equal distance from the optical axis AXC of the single-lens imaging optical system 10, for example.
- the center lines IC1 and IC2 and the optical axis AXC are preferably in the same plane, but are not necessarily in the same plane.
- the fixed mask 20 and the movable mask 30 are provided at the pupil position of the imaging optical system 10, for example. Alternatively, it may be provided on the imaging side with respect to the imaging optical system 10.
- the fixed mask 20 is fixed with respect to the imaging optical system 10, and the movable mask 30 is configured such that the position can be switched in a plane perpendicular to the optical axis AXC.
- the movable mask 30 has an observation mode (first mode, non-stereo mode, monocular mode) which is the first state shown in FIG. 11 and a stereo measurement mode (second mode) which is the second state shown in FIG. , Stereo mode), which can be switched at high speed.
- the fixed mask 20 includes a plate-shaped light shielding portion (light shielding member) provided with three apertures (left eye aperture, right eye aperture, and central aperture), and a short wavelength ( Blue) spectral filter and a long wavelength (red) spectral filter provided in the right eye aperture.
- the portions other than the aperture hole are covered with a light shielding portion so that light does not pass through.
- the central aperture may be, for example, a through hole, or some spectral filter (for example, a broadband spectral filter that transmits at least white light) may be provided.
- the movable mask 30 includes a plate-shaped light shielding portion (light shielding member) provided with three aperture holes. In each mode, the movable mask 30 is configured in such a size that the light blocking portion covers the central aperture hole or the left and right eye aperture holes among the three aperture holes of the fixed mask 20.
- the aperture is provided at a position overlapping the central aperture of the fixed mask 20 in the observation mode and at a position overlapping the left eye aperture and the right eye aperture in the stereo measurement mode.
- the movable mask 30 is also referred to as a left eye aperture, a right eye aperture, and a center aperture.
- 11 and 12 illustrate the case where the movable mask 30 is provided on the imaging side with respect to the fixed mask 20, the movable mask 30 may be provided on the objective side with respect to the fixed mask 20.
- the illumination unit 60 is preferably provided so that the tip (illumination exit end) is positioned symmetrically with respect to the left pupil and the right pupil.
- the negative of the illumination unit 60 is not necessarily limited to the left pupil and the right pupil. May not be symmetrical. 11 and 12, the tip of the illumination unit 60 is disposed in front of the imaging optical system 10.
- the present invention is not limited to this.
- the illumination unit 60 and the imaging optical system 10 at the tip of the imaging unit. May be arranged side by side.
- each diaphragm hole of the movable mask 30 is not provided with a spectral filter (is an open hole), and allows the entire band to pass.
- FIG. 11 shows the state of the observation mode, where the optical path at the center of the pupil is opened through the central aperture of the fixed mask 20 and the central aperture of the movable mask, and the optical path of the left and right pupils is blocked by the movable mask 30. (Light shielding).
- the image formed on the image sensor 40 is a formed image IC formed only by the pupil center, and a normal (monocular white light) captured image is obtained.
- FIG. 12 shows a state of the stereo measurement mode, in which the left eye diaphragm hole of the fixed mask 20 and the left eye diaphragm hole of the movable mask 30 are overlapped, and the right eye diaphragm hole of the fixed mask 20 and the movable mask 30 are overlapped.
- the right eye aperture hole is overlapped.
- the optical path at the center of the pupil is blocked (shielded) by the movable mask 30. That is, in the optical path on the left pupil side, the imaging light is filtered by the short wavelength (blue) spectral filter FL (first filter), and an image IL based on the short wavelength component is formed on the image sensor 40. In the optical path on the right pupil side, the imaging light is filtered by a long wavelength (red) spectral filter FR (second filter), and an image IR based on the long wavelength component is formed on the same image sensor 40.
- red red
- the image IL obtained from the blue pixels of the image sensor 40 is a short wavelength image
- the image IR obtained from the red pixels of the image sensor 40 is a long wavelength image
- the images IL and IR from the two optical paths are obtained.
- the imaging unit of the present embodiment can take the observation mode that is the first state and the stereo measurement mode that is the second state, and can switch between these states at high speed. Thereby, for example, 3D measurement can be performed in real time while performing non-stereo normal observation.
- FIGS. 13 and 14 show a first detailed configuration example of the fixed mask 20 and the movable mask 30.
- 13 and 14 are cross-sectional views of the imaging optical system 10, the fixed mask 20, and the movable mask 30, and a view of the fixed mask 20 and the movable mask 30 in the optical axis direction (a rear view viewed from the imaging side). ).
- a diaphragm hole 21 having a short wavelength filter FL is opened in the optical path of the left pupil of the fixed mask 20, and a diaphragm hole 22 having a long wavelength spectral filter FR is formed in the optical path of the right pupil.
- the optical path is provided with an aperture (through hole) aperture 23.
- the aperture 23 may be provided with a spectral filter FC that allows the bands Pv1, Pv2, and Pir of FIG. 3 to pass therethrough.
- the aperture holes 21 and 22 are opened in the light shielding portion 24 (light shielding member), and are, for example, holes having a size corresponding to the depth of field necessary for the imaging system (for example, circular holes, the size is a diameter).
- the centers of the aperture holes 21, 22, and 23 coincide with (including substantially coincident with) the center lines IC1 and IC2 and the optical axis AXC, respectively.
- the light shielding unit 24 is provided so as to close the housing containing the imaging optical system 10 when viewed from the front (or the back), and is provided, for example, perpendicular to the optical axis AXC. It is a plate-like member.
- the movable mask 30 has aperture holes 31, 32, 33 in an open state (through hole) and a light shielding portion 34 (light shielding member) in which the aperture holes 31, 32, 33 are opened.
- the aperture holes 31, 32, and 33 are holes that are slightly larger than the aperture holes 21, 22, and 23 of the fixed mask 20, for example. Alternatively, it may be a hole having a size corresponding to the depth of field necessary for the imaging system (for example, a circular hole having a diameter).
- the center of the aperture 33 (for example, the center of a circle) coincides with (including substantially coincides with) the optical axis AXC in the observation mode.
- the light shielding unit 34 is connected to a rotation shaft 35 perpendicular to the optical axis AXC, and is a plate-like member provided perpendicular to the optical axis AXC, for example.
- the shape of the light shielding part 34 is, for example, a fan shape (the fan base is connected to the shaft 35), but is not limited thereto, and may be any shape that can realize the states of FIGS.
- the movable mask 30 is configured to rotate about a rotation axis 35 by a predetermined angle in a direction perpendicular to the optical axis AXC.
- rotational movement can be realized by a piezo element or a motor.
- the movable mask 30 is rotated and tilted to the right eye by a predetermined angle, the pupil center optical path (aperture hole 23) of the fixed mask 20 is opened, and the left and right pupil optical paths (aperture holes 21, 22) is in a light shielding state.
- the movable mask 30 rotates and tilts to the left eye side by a predetermined angle, the pupil center optical path (aperture hole 23) of the fixed mask 20 enters a light-shielded state, and the left and right pupil optical paths (aperture hole 21). 22) is in an open state.
- the aperture 21 having the spectral filter FL By exposing the aperture 21 having the spectral filter FL, the left pupil passes only the short wavelength component, and by exposing the aperture 22 having the spectral filter FR, the right pupil passes only the long wavelength component.
- the movable mask 30 may be moved by a sliding operation to create two states.
- the rotation operation or the slide operation can be realized by, for example, a magnet mechanism or a piezoelectric mechanism, and an appropriate one may be selected in consideration of high speed and durability.
- the imaging device includes the imaging element 40, the imaging optical system 10, the fixed mask 20, and the movable mask 30.
- the imaging optical system 10 forms an image of the subject 5 on the image sensor 40.
- the fixed mask 20 includes first to third apertures (diaphragm holes 21, 22, and 23) that divide the pupil of the imaging optical system 10, and a first wavelength band (Pb in FIG. 3 or Pb1 in FIG. 5).
- a first filter FL that passes therethrough and a second filter FR that passes the second wavelength band (Pr in FIG. 3 or Pr2 in FIG. 5) are included.
- the movable mask 30 includes a light shielding portion 34 and fourth to sixth openings (throttle holes 31, 32) provided in the light shielding portion 34 corresponding to the first to third openings (throttle holes 21, 22, 23). 33) and is movable with respect to the imaging optical system 10.
- the first filter FL is provided in the first opening (throttle hole 21).
- the second filter FR is provided in the second opening (throttle hole 22).
- the third opening (aperture hole 23) is provided on the optical axis AXC of the imaging optical system 10.
- the movable mask 30 since there is one movable mask 30 that is a movable part, it is possible to realize high-speed switching, simplification of the driving mechanism, and suppression of failures and errors in mode switching.
- the movable mask 30 has a simple configuration in which openings (diaphragm holes 31, 32, 33) are provided in the light-shielding portion 34, and troubles such as filter removal due to switching vibration can be suppressed.
- the fixed mask 20 is provided with three openings (diaphragm holes 21, 22, and 23), and one of them is provided on the optical axis AXC, so that the observation image becomes a pupil center image. .
- the vignetting of the light beam is reduced and an observation image having a wide viewing angle can be acquired.
- high quality (for example, less distortion) imaging can be obtained.
- the center (s / 2 position) of the phase difference (s in FIG. 12) in stereo measurement coincides with the light beam passing through the center of the pupil. That is, in the present embodiment, the same pixel in the observation image and the distance map corresponds to the same position on the subject 5.
- the observation image has a parallax on the left and is not the center of the pupil, so that the pixel on the subject image 5 and the distance map corresponding to the same position on the subject 5 corresponds.
- the present embodiment is more advantageous when, for example, displaying an observation image and three-dimensional information in an overlapping manner.
- the first aperture corresponds to the left pupil
- the second aperture corresponds to the right pupil
- the third aperture is at the pupil center.
- the first opening may correspond to the right pupil
- the second opening may correspond to the left pupil.
- the aperture is referred to as a diaphragm aperture.
- the aperture does not necessarily have a function as a diaphragm (a function of limiting the cross-sectional area of the light beam passing through the pupil).
- the apertures 23 and 33 overlap in the observation mode, but when the aperture 23 is smaller, the aperture 23 has a function of aperture, and when the aperture 33 is smaller, the aperture 33 is the aperture. It has the function of.
- the pupil is for separating (or defining) the imaging optical path by the imaging optical system 10.
- the optical path is a path from the light that forms an image on the image sensor 40 to the image sensor 40 after entering from the objective side of the optical system. That is, the optical paths that pass through the imaging optical system 10 and the apertures 21 and 22 of the fixed mask 20 (the apertures 31 and 32 of the movable mask 30 in the stereo measurement mode) are the first and second optical paths.
- the optical path that passes through the imaging optical system 10 and the aperture 23 of the fixed mask 20 (or the aperture 33 of the movable mask 30 in the observation mode) is the third optical path.
- the mask is a member or component that shields light incident on the mask and allows part of the light to pass.
- the light shielding portions 24 and 34 shield light
- the aperture holes 21, 22, 23, 31, 32, and 33 emit light (full band or partial band). Let it pass.
- the imaging apparatus includes a movable mask control unit 340 (FIG. 18) that controls the movable mask 30.
- the movable mask control unit 340 When the movable mask control unit 340 is viewed in the optical axis AXC direction in the non-stereo mode (observation mode), the light shielding unit 34 overlaps the first and second openings (diaphragm holes 21 and 22) and the sixth opening.
- the movable mask 30 is set in a first state (first position) where the (aperture hole 33) overlaps with the third opening (aperture hole 23).
- the fourth and fifth apertures become the first and second apertures (diaphragm apertures 21 and 22).
- the movable mask 30 is set in a second state (second position) where the light shielding portion 34 overlaps with the third opening (aperture hole 23).
- FIGS. 15 and 16 show a second detailed configuration example of the fixed mask 20 and the movable mask 30.
- 15 and 16 are cross-sectional views of the imaging optical system 10, the fixed mask 20, and the movable mask 30, and a view of the fixed mask 20 and the movable mask 30 in the optical axis direction (a rear view viewed from the imaging side). ).
- the movable mask 30 includes a light shielding part 34 and aperture holes 31 and 32 provided in the light shielding part 34.
- the aperture holes 31 and 32 are in an open state (through hole), and are arranged on the same circle around the rotation shaft 35.
- the aperture hole 31 has a shape extending in the circumferential direction of the same circle, and overlaps the aperture hole 23 of the fixed mask 20 in the observation mode and also overlaps the aperture hole 21 of the fixed mask 20 in the stereo measurement mode. ing.
- the fixed mask 20 includes a light shielding part 24 and three aperture holes 21, 22, 23 provided in the light shielding part 24.
- the aperture holes 21 and 22 are provided with spectral filters FL and FR.
- the aperture hole 23 may be in an open state (through hole), or a spectral filter FC that allows the bands Pv1, Pv2, and Pir of FIG. 3 to pass therethrough may be provided.
- the aperture holes 21, 22, and 23 are arranged on the same circle around the rotation shaft 35.
- the aperture hole 23 at the center of the pupil of the fixed mask 20 is opened by the aperture hole 31 of the movable mask 30, and the aperture holes 21 and 22 of the left and right pupils of the fixed mask 20 are shielded by the light shielding portion 34 of the movable mask 30.
- a white light image is captured by a single eye.
- the left and right pupil apertures 21 and 22 of the fixed mask 20 are opened by the aperture holes 31 and 32 of the movable mask 30, and the aperture hole 23 at the center of the pupil of the fixed mask 20 is the light shielding portion 34 of the movable mask 30.
- a parallax image (red image, blue image) by the color phase difference method is captured.
- the imaging device includes the imaging element 40, the imaging optical system 10, the fixed mask 20, and the movable mask 30.
- the imaging optical system 10 forms an image of the subject 5 on the image sensor 40.
- the fixed mask 20 includes first to third apertures (diaphragm holes 21, 22, and 23) that divide the pupil of the imaging optical system 10, and a first wavelength band (Pb in FIG. 3 or Pb1 in FIG. 5).
- a first filter FL that passes therethrough and a second filter FR that passes the second wavelength band (Pr in FIG. 3 or Pr2 in FIG. 5) are included.
- the movable mask 30 includes a light shielding part 34, a fourth opening (aperture hole 31) provided in the light shielding part 34 corresponding to the first and third openings (throttle holes 21 and 23), and a second opening.
- a fifth aperture (diaphragm hole 32) provided in the light shielding portion 34 corresponding to the (diaphragm hole 22) is provided, and is movable with respect to the imaging optical system 10.
- the first filter FL is provided in the first opening (throttle hole 21).
- the second filter FR is provided in the second opening (throttle hole 22).
- the third opening (aperture hole 23) is provided on the optical axis AXC of the imaging optical system 10.
- the imaging apparatus includes a movable mask control unit 340 that controls the movable mask 30.
- the movable mask control unit 340 When the movable mask control unit 340 is viewed in the optical axis AXC direction in the non-stereo mode (observation mode), the light shielding unit 34 overlaps the first and second openings (diaphragm holes 21 and 22) and the fourth opening.
- the movable mask 30 is set in a first state where the (aperture hole 31) overlaps the third opening (aperture hole 23).
- the fourth and fifth apertures become the first and second apertures (diaphragm apertures 21 and 22).
- the movable mask 30 is set in a second state in which the light shielding portion 34 overlaps with the third opening (aperture hole 23).
- a coordinate system X, Y, and Z of the three-dimensional space is defined as follows. That is, the X axis and the Y axis orthogonal to the X axis are set along the imaging sensor surface, and the Z axis is set in a direction toward the subject in a direction orthogonal to the imaging sensor surface and parallel to the optical axis AXC. The Z axis intersects the X axis and Y axis at the zero point. Note that the Y-axis is omitted here for convenience.
- the distance between the imaging lens 10 and the imaging sensor surface is b, and the distance from the imaging lens 10 to the arbitrary point Q (x, z) of the subject 5 is z.
- the distance between the pupil center lines IC1 and IC2 and the Z axis is the same, and each is d / 2. That is, the baseline length in stereo measurement is d.
- the X coordinate of the corresponding point at which the arbitrary point Q (x, y) of the subject 5 is imaged on the imaging sensor surface by the imaging lens 10 is XL, and the arbitrary point Q (x, y) of the subject 5 is the imaging lens 10.
- the X coordinate of the corresponding point imaged on the imaging sensor surface is defined as XR.
- the following equation (4) can be obtained by using a similarity relationship between a plurality of partial right-angled triangles formed in the triangle surrounded by the arbitrary point Q (x, z) and the coordinates XL and XR.
- D and b are known set values, and the unknowns XL and XR are obtained as follows. That is, the position XL on the imaging sensor surface is considered as a reference (the pixel position of the left image is regarded as XL), and the position XR corresponding to the position XL is detected by matching processing (correlation calculation). By calculating the distance z for each position XL, the shape of the subject can be measured. If the matching is not good, the distance z may not be obtained, but may be obtained by interpolation from the distance z of surrounding pixels, for example.
- FIG. 18 shows a configuration example of the endoscope device (imaging device in a broad sense) of the present embodiment.
- the endoscope apparatus includes a scope unit 100 (imaging unit) and a main body unit 200 (control device).
- the scope unit 100 includes an imaging optical system 10, a fixed mask 20, a movable mask 30, an image sensor 40, a drive unit 50, and an illumination unit 60.
- the main body unit 200 includes a processing unit 210, a monitor display unit 220, and an imaging processing unit 230.
- the processing unit 210 includes a light source drive control unit 305, an image selection unit 310 (image frame selection unit), a color image generation unit 320 (image output unit), a phase difference detection unit 330, and a movable mask control unit 340 (movable mask drive control unit). ), A movable mask position detector 350, a distance information calculator 360, and a three-dimensional information generator 370.
- the main body unit 200 may include an operation unit that operates the main body unit 200, an interface unit that is connected to an external device, and the like as components (not illustrated).
- the scope unit 100 may include, for example, an operation unit that operates the scope unit 100, a treatment instrument, and the like as components not shown.
- an industrial and medical so-called video scope an endoscope apparatus incorporating an image sensor
- the present invention can be applied to both a flexible mirror in which the scope unit 100 is configured to be bendable and a rigid mirror in which the scope unit 100 is configured in a stick shape.
- the main body 200 and the imaging unit 110 are configured as portable devices that can be carried, and are used for manufacturing inspection and maintenance inspection of industrial products, maintenance inspection of buildings and piping, and the like.
- the driving unit 50 drives the movable mask 30 based on a control signal from the movable mask control unit 340, and switches between the first state (observation mode) and the second state (stereo measurement mode).
- the drive unit 50 is configured by an actuator using a piezoelectric element or a magnet mechanism.
- the imaging processing unit 230 performs imaging processing on the signal from the imaging element 40 and outputs a captured image (for example, a Bayer image). For example, correlated double sampling processing, gain control processing, A / D conversion processing, gamma correction, color correction, noise reduction, and the like are performed.
- the imaging processing unit 230 may be configured by, for example, a discrete IC such as an ASIC, or may be incorporated in the imaging device 40 (sensor chip) or the processing unit 210.
- the monitor display unit 220 displays an image captured by the scope unit 100, 3D shape information of the subject 5, and the like.
- the monitor display unit 220 includes a liquid crystal display, an EL (Electro-Luminescence) display, or the like.
- the illumination unit 60 irradiates the subject 5 with the combined light of the non-pattern light and the pattern light described above.
- the light source drive control unit 305 optimally controls each light amount of the non-pattern light and the pattern light based on a signal from the imaging processing unit 230 (so-called dimming control). For example, the brightness of the captured image is obtained, and the amount of light is controlled so that the brightness is within a predetermined range.
- the movable mask control unit 340 controls the driving unit 50 to switch the position of the movable mask 30.
- the movable mask control unit 340 sets the movable mask 30 to the observation mode, the reflected light from the subject 5 is imaged on the image sensor 40 via the pupil center optical path.
- the imaging processing unit 230 reads the pixel value of the image formed on the imaging element 40, performs A / D conversion or the like, and outputs the image data to the image selection unit 310.
- the image selection unit 310 detects that the movable mask 30 is in the observation mode based on the control signal from the movable mask control unit 340, and selects â Vr, Vg, Vb â from the captured image to generate a color image.
- the color image generation unit 320 performs demosaicing processing (processing for generating an RGB image from a Bayer image) and various types of image processing, and outputs a three-plate RGB primary color image to the monitor display unit 220.
- the monitor display unit 220 displays the color image.
- the movable mask control unit 340 sets the movable mask 30 to the stereo measurement mode, the reflected light from the subject 5 is simultaneously imaged on the image sensor 40 via the left pupil optical path and the right pupil optical path.
- the imaging processing unit 230 reads the pixel value of the image formed on the imaging element 40, performs A / D conversion or the like, and outputs the image data to the image selection unit 310.
- the image selection unit 310 detects that the movable mask 30 is in the stereo measurement mode based on a control signal from the movable mask control unit 340, selects â Mr, Mb â from the captured image, and detects the phase difference. To 330.
- the phase difference detection unit 330 performs matching processing on the two separated images Mr and Mb, and detects a phase difference (phase shift) for each pixel. Further, the phase difference detection unit 330 determines whether or not the phase difference detection is reliable. If it is determined that the phase difference detection is not reliable, an error flag is output for each pixel.
- a matching evaluation method for obtaining a shift amount (phase difference) between two similar waveforms is based on a normalized cross-correlation calculation method represented by ZNCC (Zero-meanCCNormalized Cross-Correlation), and a sum of absolute values of mutual differences.
- ZNCC Zero-meanCCNormalized Cross-Correlation
- SAD Sud of Absolute Difference
- phase shift can be detected by using Vr and Mr, which are parallax images, although they are time-divisionally affected by subject blur and imaging system blur.
- Vr and Mr which are parallax images, although they are time-divisionally affected by subject blur and imaging system blur.
- the phase difference detection unit 330 outputs the detected phase difference information and error flag to the distance information calculation unit 360.
- the distance information calculation unit 360 calculates the distance information of the subject 5 (for example, the distance z in FIG. 17) for each pixel, and outputs the distance information to the three-dimensional information generation unit 370.
- the pixel on which the error flag is set may be regarded as a flat portion (region having a small edge component) of the subject 5 and may be interpolated from distance information of surrounding pixels, for example.
- the three-dimensional information generation unit 370 generates three-dimensional information from the distance information (or the distance information and the RGB image from the color image generation unit 320).
- the three-dimensional information generation unit 370 generates the generated three-dimensional image, the three-dimensional data, or a display image in which these and the observation image are superimposed as necessary, and outputs the generated image to the monitor display unit 220.
- the monitor display unit 220 displays the three-dimensional information.
- the movable mask position detector 350 detects whether the movable mask 30 is in the observation mode position or the stereo measurement mode position using the image â Mr, Mb â obtained in the stereo measurement mode. If it is determined that the state of the movable mask 30 does not match the mode, a position error flag is output to the movable mask control unit 340.
- the movable mask control unit 340 receives the position error flag and corrects the movable mask 30 to a correct state (a state corresponding to image selection). For example, when it is determined that there is no color shift in the image â Mr, Mb â even though the movable mask control unit 340 outputs a control signal for the stereo measurement mode, the actual movable mask 30 is positioned in the observation mode. It has become. In this case, correction is performed to match the position of the control signal and the movable mask 30. If the correct state is not obtained even if the correction operation is performed, it is determined that some failure has occurred, and the entire function is stopped.
- the movable mask 30 is composed of a mechanical mechanism, it is conceivable that a malfunction occurs in the switching operation. According to the present embodiment, since it is possible to detect whether the switching position is the observation mode or the measurement mode, it is possible to cope with a problem of the switching operation.
- Detecting or judging whether the movable mask 30 is in the observation mode position or the stereo measurement mode position is performed as follows, for example. That is, after matching the levels (average level, etc.) in the judgment areas of the images Mr and Mb, judgment based on the sum of absolute difference values of the images Mr and Mb (first method) and the correlation between the images Mr and Mb The position error is determined by determination based on the number (second method) or the like.
- the absolute value of the difference value of the pixel value is obtained for each pixel, and it is integrated in all pixels or a partial pixel group. If the result exceeds a predetermined threshold, it is determined as an image in the stereo measurement mode, and if the result is less than the predetermined threshold, it is determined as an image in the observation mode.
- the image Mr and the image Mb are basically images that have undergone color misregistration, so that the fact that a predetermined amount of difference value is obtained is used.
- the correlation coefficient in the predetermined range between the image Mr and the image Mb is calculated, and when the result is equal to or smaller than the predetermined threshold, it is determined as an image in the stereo measurement mode, and the result exceeds the predetermined threshold.
- the image is an observation mode.
- the image Mr and the image Mb are basically images that have undergone color misregistration, so the correlation coefficient is small, whereas in the observation mode, the image Mr and the image Mb are almost the same image, so the correlation coefficient is Take advantage of big things.
- the endoscope apparatus, the imaging apparatus, and the like of the present embodiment may include a processor and a memory.
- the processor here may be, for example, a CPU (Central Processing Unit). However, the processor is not limited to the CPU, and various processors such as a GPU (GraphicsGProcessing Unit) or a DSP (Digital Signal Processor) can be used.
- the processor may be an ASIC hardware circuit.
- the memory stores instructions that can be read by a computer. When the instructions are executed by the processor, each unit (for example, each unit of the processing unit 210) of the endoscope apparatus, the imaging apparatus, and the like according to the present embodiment. Etc.) will be realized.
- the memory here may be a semiconductor memory such as SRAM or DRAM, or a register or a hard disk.
- the instruction here may be an instruction of an instruction set constituting the program, or an instruction for instructing an operation to the hardware circuit of the processor.
- FIG. 19 shows a sequence (operation timing chart) for switching between the observation mode and the stereo measurement mode in moving image shooting.
- switching of the state of the movable mask 30, imaging timing, and selection of a captured image are interlocked.
- the mask state in the observation mode and the mask state in the stereo measurement mode are alternately repeated.
- imaging is performed once in each mask state.
- an image that is exposed and imaged by the image sensor 40 when in the mask state of the observation mode is selected as an observation image.
- an image that is exposed and imaged by the image sensor 40 when in the mask state of the stereo measurement mode is selected as a measurement image.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Astronomy & Astrophysics (AREA)
- Optics & Photonics (AREA)
- Endoscopes (AREA)
Abstract
Description
ãæ¬çºæã¯ãæ®åè£ çœ®ãå èŠé¡è£ 眮åã³æ®åæ¹æ³çã«é¢ããã The present invention relates to an imaging device, an endoscope device, an imaging method, and the like.
ãç©äœã®è¡šé¢åœ¢ç¶ãå åŠçã«ïŒïŒ€èšæž¬ããææ³ã¯çš®ã ææ¡ãããŠããããã®ææ³ã®ïŒã€ãšããŠãã¢ã¯ãã£ããã¿ãŒã³ç §æãç©äœã«æåœ±ããŠãäž¡çŒç«äœèŠã«ããã¹ãã¬ãªïŒæ¬¡å è·é¢èšæž¬ãè¡ãææ³ããããã¹ãã¬ãªèšæž¬ã§ã¯ãç»åã®ç¹åŸŽéšåïŒäŸãã°ãšããžïŒãçšããŠãããã³ã°åŠçãè¡ããç©äœãŸã§ã®è·é¢ãæ±ãããã¢ã¯ãã£ããã¿ãŒã³ãæåœ±ããããšã§ãæå³çã«è¢«åäœã«ç¹åŸŽéšåãä»äžããç¹åŸŽéšåã®ç¡ãéšåïŒäŸãã°å¹³åŠéšïŒã«ãããŠãæ£ç¢ºã«è·é¢èšæž¬ãå¯èœã«ããã Various methods for optically 3D measuring the surface shape of an object have been proposed. As one of the methods, a method of projecting active pattern illumination onto an object and performing stereo three-dimensional distance measurement by binocular stereoscopic vision is proposed. is there. In stereo measurement, matching processing is performed using a characteristic part (for example, an edge) of an image, and a distance to an object is obtained. By projecting the active pattern, a characteristic part is intentionally added to the subject, and distance measurement can be accurately performed even in a part without a characteristic part (for example, a flat part).
ãã¹ãã¬ãªèšæž¬ã®åŸæ¥æè¡ãšããŠã¯ãäŸãã°ç¹èš±æç®ïŒãïŒã«é瀺ãããæè¡ããããç¹èš±æç®ïŒã«ã¯ãå·Šå³ã®çµåå
è·¯ãã¡ã«ãã«ã«ã·ã£ãã¿ãŒã«ããæéçã«åãæ¿ããå·Šå³ã®ç»åãæåå²ã«ååŸããææ³ãé瀺ãããŠãããç¹èš±æç®ïŒã«ã¯ãåäžã®çµåå
è·¯ã®å·Šååã«ïŒ²ïŒ§ãã£ã«ã¿ãæ¿å
¥ããå³ååã«ïŒ§ïŒ¢ãã£ã«ã¿ãæ¿å
¥ããæ®åç»åã®ïŒ²ç»åãšïŒ¢ç»åã«ããå·Šå³ç»åãåé¢ããææ³ãé瀺ãããŠããããŸãç¹èš±æç®ïŒã§ã¯ãé垞芳å¯ã®å Žåã«ãããŠïŒ²ïŒ§ãã£ã«ã¿åã³ïŒ§ïŒ¢ãã£ã«ã¿ãçµåå
è·¯ããéé¿ããã芳å¯ç»åãååŸããã
As a conventional technique of stereo measurement, for example, there are techniques disclosed in
ãäžèšã®ãããªã¢ã¯ãã£ããã¿ãŒã³ç §æãçšããå Žåãæ®åœ±ãããç»åã«ã¢ã¯ãã£ããã¿ãŒã³ãåãããããã©ããç §æã«ããéåžžã®ç»åãåŸãããªããšãã課é¡ããããäŸãã°ãå èŠé¡è£ 眮ãçšããéç Žå£æ€æ»çã«ãããŠãã¹ãã¬ãªæ®åœ±ã§ã¯ãªã芳å¯ç»åã®æ®åœ±ã«ããæ€æ»ãè¡ãã詳现ã«èª¿ã¹ããéšåã§ã¹ãã¬ãªèšæž¬ãè¡ãããšãããããã®ãããªå ŽåãäŸãã°ãã©ããç §æãšã¢ã¯ãã£ããã¿ãŒã³ç §æãåãæ¿ããããšãèããããããç §ææ©æ§ãç ©éã«ãªã£ãŠããŸããæãã¯ãã¹ãã¬ãªæ®åœ±ãšé垞芳å¯ã亀äºã«åãæ¿ããããšã«ãã£ãŠãé垞芳å¯ãè¡ããªãããªã¢ã«ã¿ã€ã ãªã¹ãã¬ãªèšæž¬ãè¡ãããšãã§ããå Žåãå©äŸ¿æ§ãåäžãããããã®éã«ç §æã®åãæ¿ãã䌎ããªãæ¹ãæãŸããã When the active pattern illumination as described above is used, there is a problem that a normal image cannot be obtained by flat illumination because the active pattern is reflected in the captured image. For example, in a non-destructive inspection using an endoscope apparatus, inspection may be performed by capturing an observation image that is not stereo shooting, and stereo measurement may be performed at a portion to be examined in detail. In such a case, for example, switching between flat illumination and active pattern illumination can be considered, but the illumination mechanism becomes complicated. Alternatively, by alternately switching between stereo shooting and normal observation, convenience can be improved when real-time stereo measurement can be performed while performing normal observation, but it is preferable that illumination switching is not accompanied at that time. .
ãæ¬çºæã®å¹Ÿã€ãã®æ æ§ã«ããã°ãã¢ã¯ãã£ããã¿ãŒã³ç §æã«ããç §æãè¡ããªããéåžžã®èгå¯ç»åãæ®åœ±ããããšãå¯èœãªæ®åè£ çœ®ãå èŠé¡è£ 眮åã³æ®åæ¹æ³çãæäŸã§ããã According to some aspects of the present invention, it is possible to provide an imaging apparatus, an endoscope apparatus, an imaging method, and the like that can capture a normal observation image while performing illumination with active pattern illumination.
ãæ¬çºæã®äžæ æ§ã¯ã第ïŒè²ã®ç»åãšãåèšç¬¬ïŒè²ãããé·æ³¢é·åŽã®ç¬¬ïŒè²ã®ç»åãšãåèšç¬¬ïŒè²ãããé·æ³¢é·åŽã®ç¬¬ïŒè²ã®ç»åãšãå«ãæ®åç»åãæ®åããåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãã¹ãã¬ãªç»åãšããŠæ®åå¯èœãªæ®åéšãšãåèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åãåã³åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åã«ãããŠæäžã®å éååžãæãããã¿ãŒã³ç §æã被åäœã«ç §å°ããç §æéšãšããå«ãæ®åè£ çœ®ã«é¢ä¿ããã One embodiment of the present invention includes an image including a first color image, a second color image having a longer wavelength side than the first color, and a third color image having a longer wavelength side than the second color. An imaging unit that captures an image and can capture the first color image and the third color image as a stereo image, and is included in the wavelength band of the first color and included in the wavelength band of the second color A pattern illumination having a given light amount distribution in a second wavelength band that is included in the first wavelength band that is not included in the wavelength band of the third color and that is not included in the wavelength band of the second color. And an illumination unit.
ãæ¬çºæã®äžæ æ§ã«ããã°ã第ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãã第ïŒã®æ³¢é·åž¯åãšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãã第ïŒã®æ³¢é·åž¯åã«ãããŠæäžã®å éååžãæãããã¿ãŒã³ç §æã被åäœã«ç §å°ãããã第ïŒã®æ³¢é·åž¯åãšç¬¬ïŒã®æ³¢é·åž¯åã¯ç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªãã®ã§ã第ïŒè²ã®ç»åã¯ããã¿ãŒã³ç §æã«ããæäžã®å éååžã®åœ±é¿ãåããªãç»åãšãªããããã«ãããã¢ã¯ãã£ããã¿ãŒã³ç §æã«ããç §æãè¡ããªããéåžžã®èгå¯ç»åãæ®åœ±ããããšãå¯èœã«ãªãã According to one aspect of the present invention, a pattern illumination having a given light amount distribution in a first wavelength band included in the first color wavelength band and a second wavelength band included in the third color wavelength band is a subject. Is irradiated. Since the first wavelength band and the second wavelength band are not included in the wavelength band of the second color, the image of the second color is an image that is not affected by a given light amount distribution due to pattern illumination. This makes it possible to capture a normal observation image while performing illumination with active pattern illumination.
ããŸãæ¬çºæã®ä»ã®æ æ§ã¯ãäžèšã«èšèŒãããæ®åè£ çœ®ãå«ãå èŠé¡è£ 眮ã«é¢ä¿ããã Further, another aspect of the present invention relates to an endoscope apparatus that includes the imaging apparatus described above.
ããŸãæ¬çºæã®æŽã«ä»ã®æ æ§ã¯ã第ïŒè²ã®ç»åãšãåèšç¬¬ïŒè²ãããé·æ³¢é·åŽã®ç¬¬ïŒè²ã®ç»åãšãåèšç¬¬ïŒè²ãããé·æ³¢é·åŽã®ç¬¬ïŒè²ã®ç»åã®ãã¡ãåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãã¹ãã¬ãªç»åãšããŠæ®åå¯èœã§ããå Žåã«ãåèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åãåã³åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åã«ãããŠæäžã®å éååžãæãããã¿ãŒã³ç §æã被åäœã«ç §å°ããåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãšãå«ãæ®åç»åãæ®åããæ®åæ¹æ³ã«é¢ä¿ããã According to still another aspect of the present invention, there is provided a first color image, a second color image having a longer wavelength side than the first color, and a third color image having a longer wavelength side than the second color. Among these, when the image of the first color and the image of the third color can be captured as a stereo image, the first color that is included in the wavelength band of the first color and not included in the wavelength band of the second color. A pattern illumination having a given light amount distribution in a second wavelength band that is included in the third wavelength band and not included in the second color wavelength band; and The present invention relates to an imaging method for imaging a captured image including an image of one color, an image of the second color, and an image of the third color.
ã以äžãæ¬å®æœåœ¢æ ã«ã€ããŠèª¬æããããªãã以äžã«èª¬æããæ¬å®æœåœ¢æ ã¯ãè«æ±ã®ç¯å²ã«èšèŒãããæ¬çºæã®å 容ãäžåœã«éå®ãããã®ã§ã¯ãªãããŸãæ¬å®æœåœ¢æ ã§èª¬æãããæ§æã®å šãŠããæ¬çºæã®å¿ é æ§æèŠä»¶ã§ãããšã¯éããªãã Hereinafter, this embodiment will be described. In addition, this embodiment demonstrated below does not unduly limit the content of this invention described in the claim. In addition, all the configurations described in the present embodiment are not necessarily essential configuration requirements of the present invention.
ãäŸãã°ä»¥äžã§ã¯å·¥æ¥çšã®å èŠé¡è£ çœ®ãæ¬çºæã®é©çšäŸãšããŠèª¬æããããæ¬çºæã¯å·¥æ¥çšã®å èŠé¡è£ 眮ãžã®é©çšã«éå®ããããã¹ãã¬ãªæ®åœ±æ¹åŒïŒèŠå·®ããã£ãæ®åç³»ã§åŸãïŒç»åã®äœçžå·®ãæ€åºããŠè¢«åäœã®è·é¢æ å ±ãååŸããæ¹æ³ïŒã«ããïŒæ¬¡å 圢ç¶ãèšæž¬ããïŒæ¬¡å èšæž¬è£ 眮ãïŒæ¬¡å èšæž¬æ©èœãæããæ®åè£ çœ®ïŒäŸãã°å»ççšã®å èŠé¡è£ 眮ãé¡åŸ®é¡ãå·¥æ¥çšã«ã¡ã©ãããããã®èŠèŠæ©èœãªã©ïŒã§ããã°é©çšã§ããã For example, an industrial endoscope apparatus will be described below as an application example of the present invention. However, the present invention is not limited to application to an industrial endoscope apparatus, and a stereo shooting method (with an imaging system having parallax). A method of detecting the phase difference between the two images and acquiring subject distance information), a three-dimensional measuring device that measures a three-dimensional shape, and an imaging device having a three-dimensional measuring function (for example, a medical endoscope device, Microscopes, industrial cameras, robot vision functions, etc. are applicable.
ãïŒïŒãã¿ãŒã³ç
§æ
ãé«å質ãªèгå¯ç»åãåŸãããã«ãäžè¬çã«ã¯è¢«åäœãžã®ç
§æã¯ã ã©ã®ãªãäžæ§ç
§æãåæã§ãããããããªãããã¹ãã¬ãªæ¹åŒã®èšæž¬ãè¡ãå Žåã«ã¯ãäœçžå·®ãåŸãããããç¹åŸŽéšãååšããªããšè¢«åäœãŸã§ã®è·é¢æ
å ±ãæ±ããããšãã§ããªãããã®ãããæå³çã«ç¹åŸŽä»ããè¡ããã¿ãŒã³ç
§æã被åäœã«ç
§å°ããæ¹æ³ãçšããã
1. Pattern illumination In order to obtain a high-quality observation image, it is generally premised that the subject is illuminated uniformly. However, when performing stereo measurement, distance information to the subject cannot be obtained unless there is a feature that easily obtains a phase difference. Therefore, a method of irradiating the subject with pattern illumination that is intentionally characterized is used.
ãæ¬å®æœåœ¢æ ã§ã¯ãçœè²å ã«ãã芳å¯ç»åãæ®åœ±ãã芳å¯ã¢ãŒããšã«ã©ãŒäœçžå·®æ³ã§ã¹ãã¬ãªèšæž¬ãè¡ãèšæž¬ã¢ãŒããšãåãæ¿ããŠäœ¿çšãããåŸæ¥ã®ãã¿ãŒã³ç §æãçšããå Žåã芳å¯ã¢ãŒãæã¯äžæ§ç §æãè¡ããèšæž¬ã¢ãŒãæã¯ãã¿ãŒã³ç §æãè¡ããããªç §æèªäœã®åãæ¿ãæ©èœãå¿ èŠãšãªãã In the present embodiment, an observation mode for capturing an observation image using white light and a measurement mode for performing stereo measurement by the color phase difference method are switched and used. When conventional pattern illumination is used, it is necessary to have a function of switching the illumination itself so that uniform illumination is performed in the observation mode and pattern illumination is performed in the measurement mode.
ãããã§æ¬å®æœåœ¢æ ã§ã¯ã芳å¯ã¢ãŒãæã§ãã£ãŠãèšæž¬ã¢ãŒãã§ãã£ãŠãåžžæãã¿ãŒã³ç §æãè¡ãªããåå ãã£ã«ã¿åã¯ç»ååŠçã«ãã芳å¯ã¢ãŒãæã®ãã¿ãŒã³ãæ¶å»ããæ¹æ³ãçšãããããã«ãããç §æã®åãæ¿ãæ©èœãäžèŠã«ã§ããã以äžããã®ææ³ã«ã€ããŠèª¬æããã Therefore, in the present embodiment, a method is used in which pattern illumination is always performed in the observation mode or in the measurement mode, and the pattern in the observation mode is erased by a spectral filter or image processing. Thereby, the illumination switching function can be made unnecessary. Hereinafter, this method will be described.
ãå³ïŒã¯ããã¿ãŒã³ç §æã®äŸãæš¡åŒçã«ç€ºãããã®ã§ãããå³ïŒã«ã¯ããã¿ãŒã³ç §æïŒ°ïŒ¬ã®å¹³é¢èŠå³ïŒäŸãã°ãæ®åç³»ã®å 軞ã«åçŽãªå¹³é¢ã«æåœ±ãããšãã®å¹³é¢èŠå³ïŒãšããã®ïŒ¡ïŒ¡æé¢ã§ã®å éç¹æ§ã®äŸãšã瀺ããïœã¯å 軞ã«åçŽãªæ¹åã§ã®äœçœ®ïŒåº§æšïŒã§ããã FIG. 1 schematically shows an example of pattern illumination. FIG. 1 shows a plan view of the pattern illumination PL (for example, a plan view when projected onto a plane perpendicular to the optical axis of the imaging system) and an example of the light quantity characteristic in the AA section. x is a position (coordinate) in a direction perpendicular to the optical axis.
ãå³ïŒã«ç€ºãããã«ãæãããå€åããå°åãã¿ãŒã³ïŒ€ïŒŽãèŠåçã«é 眮ãããç §æãã¿ãŒã³ãšãªã£ãŠãããå°åãã¿ãŒã³ïŒ€ïŒŽã®å€éšã¯ãã©ãããªæããã®ç §æã«ãªã£ãŠãããå°åãã¿ãŒã³ïŒ€ïŒŽã®å éšã¯ãããããæãç §æã«ãªã£ãŠããã As shown in FIG. 1, a small circle pattern DT whose brightness changes is an illumination pattern regularly arranged. The outside of the small circle pattern DT is illumination with flat brightness, and the inside of the small circle pattern DT is illumination darker than that.
ããã®ãããªãã¿ãŒã³ç §æïŒ°ïŒ¬ã«ãã被åäœç»åãæããå Žåãå³ïŒã«ç€ºãããã«ãã¿ãŒã³ç §æïŒ°ïŒ¬ã®å éãšè¢«åäœã®åå°ä¿æ°ååžãä¹ç®ãããçµæã®æ³¢åœ¢ãæ®å波圢ïŒã»ã³ãµåºåïŒãšããŠåŸããããå³ïŒã®ã»ã³ãµåºåã¯ãäœçœ®ïœã«å¯Ÿå¿ããç»çŽ ã§ã®ç»çŽ å€åã¯èŒåºŠå€ïŒåã¯ã»ã³ãµé¢ã§ã®çµåã®æããïŒã衚ããå®ç·ã¯ãã¿ãŒã³ç §æãããå Žåã®ã»ã³ãµåºåã瀺ããç¹ç·ã¯ãã©ããç §æãããå Žåã®ã»ã³ãµåºåã瀺ãã When a subject image is captured by such pattern illumination PL, a waveform obtained by multiplying the amount of light of pattern illumination PL and the reflection coefficient distribution of the subject is obtained as an imaging waveform (sensor output) as shown in FIG. The sensor output in FIG. 2 represents a pixel value or a luminance value (or brightness of image formation on the sensor surface) at a pixel corresponding to the position x. A solid line indicates a sensor output when pattern illumination is performed, and a dotted line indicates a sensor output when flat illumination is performed.
ãïŒïŒç¬¬ïŒåå
ç¹æ§äŸ
ãå³ïŒã«ããã¿ãŒã³ç
§æã®ç¬¬ïŒåå
ç¹æ§äŸãšã芳å¯ã¢ãŒãåã³ã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã®ç³ã®ç¬¬ïŒåå
ç¹æ§äŸã瀺ãããããã®ç¬Šå·ãä»ããç¹ç·ã¯ãæ®åã»ã³ãµã®ã«ã©ãŒãã£ã«ã¿ã®åå
ç¹æ§ã衚ããããã¯ãããããæ®åã»ã³ãµã®éè²ãç·è²ãèµ€è²ãã£ã«ã¿ã®åå
ç¹æ§ã§ãããã¯ãèµ€å€ã®æåºŠç¹æ§ã§ãããéè²ãç·è²ãèµ€è²ãã£ã«ã¿ã®ããããééããã
2. First Spectral Characteristics Example FIG. 3 shows a first spectral characteristic example of pattern illumination and a first spectral characteristic example of a pupil in the observation mode and the stereo measurement mode. Dotted lines with B, G, R, and IR symbols represent spectral characteristics of the color filter of the image sensor. B, G, and R are spectral characteristics of the blue, green, and red filters of the image sensor, respectively. IR is an infrared sensitivity characteristic and passes through any of the blue, green, and red filters.
ãå³ïŒã®äžå³ã«ç€ºãããã«ããã¿ãŒã³ç §æã®ç¬¬ïŒåå ç¹æ§äŸã§ã¯ãæ®åã»ã³ãµã®ã«ã©ãŒãã£ã«ã¿ã®åå ç¹æ§ãšé¢é£ä»ããŠãæ³¢é·åž¯åãïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒ°ïœïŒïŒïŒ°ïœïŒïŒ°ïœïœïœã®ïŒã€ã«åå²ããã垯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã®ç §æå ã¯ãå éååžãäžæ§ãªéãã¿ãŒã³å ïŒãã©ããå ïŒã§ããã垯åïœïŒ°ïœïŒïŒ°ïœïœã®ç §æå ã¯ãã¿ãŒã³å ã§ãããããããåæããåå ç¹æ§ããã€å ã被åäœãžã®ç §æå ãšãªãã As shown in the upper diagram of FIG. 3, in the first spectral characteristic example of the pattern illumination, the wavelength bands are set to five {Pv1, Pb, Pv2, Pr, Pir} in association with the spectral characteristics of the color filter of the imaging sensor. To divide. The illumination light in the bands {Pv1, Pv2, Pir} is non-pattern light (flat light) with a uniform light amount distribution, and the illumination light in the bands {Pb, Pr} is pattern light. Light having a spectral characteristic obtained by combining these becomes illumination light to the subject.
ã垯åïœïŒãïœã¯ãæ®åã»ã³ãµã®éè²ãã£ã«ã¿ãééãããç·è²ãã£ã«ã¿ãééããªã垯åã«èšå®ãããŠããã垯åïœã¯ãç·è²ãã£ã«ã¿ãééãã垯åã«èšå®ãããŠããã垯åïœãïœïœã¯ãèµ€è²ãã£ã«ã¿ãééãããç·è²ãã£ã«ã¿ãééããªã垯åã«èšå®ãããŠããã垯åïœïœã¯ãèµ€å€ã®æåºŠç¹æ§ïŒ©ïŒ²ãå«ãããªãã垯åïœïœãèµ€å€ã®æåºŠç¹æ§ïŒ©ïŒ²ãå«ãŸãªããŠãããããã®ããã«ã垯åïœãïœã¯ãæ®åã»ã³ãµã®ç·è²ãã£ã«ã¿ã®åå ç¹æ§ïŒ§ãšã¯å¹²æžããªãæ³¢é·åž¯åã«èšå®ãããããŸãã垯åïœãïœã¯ãäŸãã°æ°ïœïœïœæ°ïŒïŒïœïœçã®ç垯åã«èšå®ãããã The bands Pv1 and Pb are set to bands that pass through the blue filter of the image sensor but do not pass through the green filter, the band Pv is set to a band that passes through the green filter, and the bands Pr and Pir are red. A band that passes through the filter but does not pass through the green filter is set. The band Pir includes an infrared sensitivity characteristic IR. Note that the band Pir may not include the infrared sensitivity characteristic IR. Thus, the bands Pb and Pr are set to wavelength bands that do not interfere with the spectral characteristic G of the green filter of the image sensor. The bands Pb and Pr are set to narrow bands such as several nm to several tens of nm, for example.
ãããå ·äœçã«ã¯ã垯åïœã¯æ®åã»ã³ãµã®éè²ç»çŽ ïŒåå ç¹æ§ïŒ¢ïŒã®ã¿ãååŸå¯èœã§ãã£ãŠæ¯èŒçæåºŠãè¯å¥œãªæ³¢é·åãéžæããããšãæãŸããããŸã垯åïœã¯æ®åã»ã³ãµã®èµ€è²ç»çŽ ïŒåå ç¹æ§ïŒ²ïŒã®ã¿ãååŸå¯èœã§ãã£ãŠæ¯èŒçæåºŠãè¯å¥œãªæ³¢é·åãéžæããã®ãæãŸãããã€ãŸããå·Šç³ç»åãšå³ç³ç»åãæ®åã»ã³ãµã®ç°ãªãè²ç»çŽ ã«ããåé¢ãããªããã°ãªããªãã®ã§ãçžäºã«åå æåºŠç¹æ§ããªãæ³¢é·åã«ãŠïœïŒ°ïœïŒïŒ°ïœïœãéžã¹ã°ãããéãã¿ãŒã³ç §æå ã®åå æåïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã¯éåžžã®èгå¯ç»åã®æåãšãªãã®ã§ãæåºŠããã³ã«ã©ãŒå質ã確ä¿ããããã«æ®åã»ã³ãµã®èµ€è²ç»çŽ ãç·è²ç»çŽ ãéè²ç»çŽ ïŒåå ç¹æ§ïŒ²ããïŒããªãã¹ãå€ãã®æ³¢é·åãã«ããŒã§ããããã«ããããšãæãŸãããäŸãã°ãïœïŒ°ïœïŒïŒ°ïœïœã¯ç垯åå æºãšãªãããã«æ³¢é·ïŒïŒïŒïœïœïŒïŒïŒïŒïœïœã®æšæºçãªã¬ãŒã¶ãŒå æºãçšããåå æåïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœãå€ãã®æ³¢é·æåãã«ããŒããããã«ããããšãäžã€ã®æ¹æ³ã§ããã More specifically, for the band Pb, it is desirable to select a wavelength range in which only the blue pixel (spectral characteristic B) of the image sensor can be acquired and the sensitivity is relatively good. For the band Pr, it is desirable to select a wavelength range in which only the red pixel (spectral characteristic R) of the imaging sensor can be acquired and the sensitivity is relatively good. That is, since the left pupil image and the right pupil image must be separated by different color pixels of the image sensor, {Pb, Pr} may be selected in a wavelength region that does not have a mutual light receiving sensitivity characteristic. Since the spectral components {Pv1, Pv2, Pir} of the non-pattern illumination light are components of a normal observation image, the red pixel, the green pixel, and the blue pixel (spectral characteristics R, It is desirable that G and B) can cover as many wavelength ranges as possible. For example, a standard laser light source with wavelengths of 450 nm and 660 nm is used for {Pb, Pr} so as to be a narrow-band light source, and the spectral components {Pv1, Pv2, Pir} may cover many wavelength components. This is one way.
ãå³ïŒã®äžå³ãäžå³ã«ç€ºãããã«ãç³ã®åå
ç¹æ§ïŒŠïŒ¬ãããšç
§æå
ã®åå
ç¹æ§ãšã®é¢ä¿ãäžåŒïŒïŒïŒã®ããã«èšå®ããã
ã芳å¯ã¢ãŒãã§ã¯ãåå ç¹æ§ïŒŠïŒ£ã®ç³ã§æ®åœ±ãããåå ç¹æ§ïŒŠïŒ£ã¯æ³¢é·åž¯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã«å¯Ÿå¿ããŠããããã©ããç §æã«ãã芳å¯ç»åãåŸããããäžæ¹ãã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ãåå ç¹æ§ïŒŠïŒ¬ã®å·Šç³ãšåå ç¹æ§ïŒŠïŒ²ã®å³ç³ã§æ®åœ±ãããåå ç¹æ§ïŒŠïŒ¬ãã¯ãæ³¢é·åž¯åïœãïœã«å¯Ÿå¿ããŠãããã¢ã¯ãã£ããã¿ãŒã³ç §æã«ããã¹ãã¬ãªç»åãåŸãããã In the observation mode, images are taken with the pupil of the spectral characteristic FC. The spectral characteristic FC corresponds to the wavelength band {Pv1, Pv2, Pir}, and an observation image by flat illumination is obtained. On the other hand, in the stereo measurement mode, shooting is performed with the left pupil of the spectral characteristic FL and the right pupil of the spectral characteristic FR. The spectral characteristics FL and FR correspond to the wavelength bands Pb and Pr, and a stereo image by active pattern illumination is obtained.
ã芳å¯ã¢ãŒãã¯ãå³ïŒïŒãå³ïŒïŒãå³ïŒïŒã®ããã«åºå®ãã¹ã¯ïŒïŒã®äžå¿çµãåïŒïŒïŒåå
ç¹æ§ïŒŠïŒ£ïŒã§åçŒæ®åœ±ããã¢ãŒãã§ãããã¹ãã¬ãªèšæž¬ã¢ãŒãã¯ãå³ïŒïŒãå³ïŒïŒãå³ïŒïŒã®ããã«åºå®ãã¹ã¯ïŒïŒã®å·Šç³çµãåïŒïŒïŒåå
ç¹æ§ïŒŠïŒ¬ïŒãšå³ç³çµãåïŒïŒïŒåå
ç¹æ§ïŒŠïŒ²ïŒã§ã¹ãã¬ãªæ®åœ±ããã¢ãŒãã§ããããããã®èгå¯ã¢ãŒããã¹ãã¬ãªèšæž¬ã¢ãŒãã®è©³çްã¯åŸè¿°ããã
The observation mode is a mode in which monocular imaging is performed with the central aperture 23 (spectral characteristic FC) of the fixed
ãå³ïŒã«ã第ïŒåå ç¹æ§äŸãšã芳å¯ã¢ãŒãåã³ã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããæ®åç»åãšã®é¢ä¿ã瀺ããå³ïŒã®å³å³ã®æ³¢åœ¢ã«ãããŠãç¹ç·ã¯ãã©ããç §æãããå Žåã®ã»ã³ãµåºåãä»®æ³çã«ç€ºããŠããã FIG. 4 shows the relationship between the first spectral characteristic example and the captured images in the observation mode and the stereo measurement mode. In the waveform in the right diagram of FIG. 4, the dotted line virtually indicates the sensor output when flat illumination is performed.
ããªãã以äžã§ã¯ãå³ïŒïŒïœå³ïŒïŒã®æ®åéšã«æ¬å®æœåœ¢æ ã®ã¢ã¯ãã£ããã¿ãŒã³ç §æãé©çšããå ŽåãäŸã«èª¬æããããæ®åéšã®æ§æã¯å³ïŒïŒïœå³ïŒïŒã«éå®ããããåå ç¹æ§ïŒŠïŒ£ã«ãã芳å¯ç»åãšåå ç¹æ§ïŒŠïŒ¬ãã«ããã¹ãã¬ãªç»åãæ®åœ±ã§ãããã®ã§ããã°ããã In the following, the case where the active pattern illumination of the present embodiment is applied to the imaging unit of FIGS. 11 to 16 will be described as an example. However, the configuration of the imaging unit is not limited to FIGS. Any observation image can be used as long as a stereo image by spectral characteristics FL and FR can be taken.
ãç §æå ã¯ã¢ãŒãã«é¢ãããåäžã§ããã芳å¯ã¢ãŒãã§ãã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã被åäœã«ã¯ã¢ã¯ãã£ããã¿ãŒã³ç §æãæåœ±ãããç¶æ ã§ããã The illumination light is the same regardless of the mode, and the active pattern illumination is projected onto the subject in both the observation mode and the stereo measurement mode.
ã芳å¯ã¢ãŒãã§ã¯ãéãã¿ãŒã³å ã®åå æåïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœããã£ãç §æå ã«ããåå°å ãç³äžå¿å è·¯ïŒåå ç¹æ§ïŒŠïŒ£ïŒãééãæ®åç»åïœïŒ¶ïœïŒïŒ¶ïœïŒïŒ¶ïœïœãåŸããããïœã¯æ®åã»ã³ãµã®èµ€è²ãã£ã«ã¿ãæããç»çŽ ã§åŸãããèµ€è²ç»åã§ãããïœã¯æ®åã»ã³ãµã®ç·è²ãã£ã«ã¿ãæããç»çŽ ã§åŸãããç·è²ç»åã§ãããæ®åã»ã³ãµã®éè²ãã£ã«ã¿ãæããç»çŽ ã§åŸãããéè²ç»åã§ããã芳å¯ã¢ãŒãã§ã¯ãã¢ã¯ãã£ããã¿ãŒã³ã®åž¯åãé€ãããŠããã®ã§ãã¢ã¯ãã£ããã¿ãŒã³ã®åœ±é¿ãåããªã芳å¯ç»åãåŸãããã In the observation mode, the reflected light by the illumination light having the spectral components {Pv1, Pv2, Pir} of the non-pattern light passes through the pupil center optical path (spectral characteristic FC), and a captured image {Vr, Vg, Vb} is obtained. Vr is a red image obtained by a pixel having a red filter of the image sensor, Vg is a green image obtained by a pixel having a green filter of the image sensor, and obtained by a pixel having a blue filter of the image sensor It is a blue image. In the observation mode, since the band of the active pattern is removed, an observation image that is not affected by the active pattern is obtained.
ãã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ããã¿ãŒã³å ã®åå æåïœïŒ°ïœïŒïŒ°ïœïœããã£ãç §æå ã«ããåå°å ãå·Šå³ç³å è·¯ãééãæ®åç»åïœïŒïœïŒïŒïœïœãåŸããããïŒïœã¯å·Šç³å è·¯ïŒåå ç¹æ§ïŒŠïŒ¬ïŒã«ããç»åã§ãããïŒïœã¯å³ç³å è·¯ïŒåå ç¹æ§ïŒŠïŒ²ïŒã«ããç»åã§ãããã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ãå¹³æ»ãªè¢«åäœé¢ã«ãããŠãæå³çã«ãã¿ãŒã³ã圢æãããã®ã§ãæ®åç»åïœïŒïœïŒïŒïœïœã®ãããã³ã°ãåããããäœçžå·®æ€åºã容æã«ãªãã In the stereo measurement mode, the reflected light by the illumination light having the spectral component {Pb, Pr} of the pattern light passes through the left and right pupil optical paths, and a captured image {Mr, Mb} is obtained. Mr is an image by the left pupil optical path (spectral characteristic FL), and Mb is an image by the right pupil optical path (spectral characteristic FR). In the stereo measurement mode, since a pattern is intentionally formed even on a smooth subject surface, the captured image {Mr, Mb} can be easily matched and phase difference detection can be easily performed.
ã以äžã®å®æœåœ¢æ ã«ããã°ãæ®åè£ çœ®ïŒå èŠé¡è£ 眮ïŒã¯ã第ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åãå«ãæ®åç»åïŒïŒ¶ïœãïœãïœïŒãæ®åœ±ãããšå ±ã«ç¬¬ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åãã¹ãã¬ãªç»åïŒïŒïœãïŒïœïŒãšããŠæ®åå¯èœãªæ®åéšãšã第ïŒã®æ³¢é·åž¯åïœåã³ç¬¬ïŒã®æ³¢é·åž¯åïœã«ãããŠæäžã®å éååžãæãããã¿ãŒã³ç §æã被åäœã«ç §å°ããç §æéšãšããå«ãã第ïŒè²ïŒç·è²ïŒã®ç»åã¯ã第ïŒè²ïŒéè²ïŒãããé·æ³¢é·åŽã®ç»åã§ããã第ïŒè²ïŒèµ€è²ïŒã®ç»åã¯ã第ïŒè²ãããé·æ³¢é·åŽã®ç»åã§ããã第ïŒã®æ³¢é·åž¯åïœã¯ã第ïŒè²ã®æ³¢é·åž¯åïŒåå ç¹æ§ïŒ¢ã®åž¯åïŒã«å«ãŸãããšå ±ã«ç¬¬ïŒè²ã®æ³¢é·åž¯åïŒåå ç¹æ§ïŒ§ã®åž¯åïŒã«å«ãŸããªã垯åã§ããã第ïŒã®æ³¢é·åž¯åïœã¯ã第ïŒè²ã®æ³¢é·åž¯åïŒåå ç¹æ§ïŒ²ã®åž¯åïŒã«å«ãŸãããšå ±ã«ç¬¬ïŒè²ã®æ³¢é·åž¯åïŒåå ç¹æ§ïŒ§ã®åž¯åïŒã«å«ãŸããªã垯åã§ããã According to the above embodiment, the imaging device (endoscope device) captures the captured images (Vb, Vg, Vr) including the first color image, the second color image, and the third color image. An imaging unit capable of capturing a first color image and a third color image as a stereo image (Mb, Mr), and pattern illumination having a given light amount distribution in the first wavelength band Pb and the second wavelength band Pr And an illumination unit that irradiates the subject. The second color (green) image is an image on the longer wavelength side than the first color (blue), and the third color (red) image is an image on the longer wavelength side than the second color. The first wavelength band Pb is a band that is included in the first color wavelength band (spectral characteristic B band) and not included in the second color wavelength band (spectral characteristic G band). The second wavelength band Pr is a band that is included in the third color wavelength band (spectral characteristic R band) and not included in the second color wavelength band (spectral characteristic G band).
ããã®ããã«ããã°ã第ïŒã®æ³¢é·åž¯åïœãšç¬¬ïŒã®æ³¢é·åž¯åïœã¯ç¬¬ïŒè²ïŒç·è²ïŒã®æ³¢é·åž¯åã«å«ãŸããªãã®ã§ãå°ãªããšã第ïŒè²ã®ç»åã«ã¯æäžã®å éååžã«ãããã¿ãŒã³ãåããªããããã«ããããã¿ãŒã³ç §æãšãã©ããç §æãåãæ¿ããããšãªãã¹ãã¬ãªèšæž¬ãšé垞芳å¯ãåãæ¿ããããšãå¯èœãšãªãã In this way, since the first wavelength band Pb and the second wavelength band Pr are not included in the wavelength band of the second color (green), at least the pattern of the given light quantity distribution is present in the second color image. Is not reflected. Thereby, it is possible to switch between stereo measurement and normal observation without switching between pattern illumination and flat illumination.
ãå³ã¡ãã¹ãã¬ãªç»åã¯ç¬¬ïŒè²ïŒéè²ïŒã®ç»åãšç¬¬ïŒè²ïŒèµ€è²ïŒã®ç»åã§æ§æããããã®ç¬¬ïŒè²ã®ç»åã«ã¯ç¬¬ïŒã®æ³¢é·åž¯åïœã«ããããã¿ãŒã³ãåãã第ïŒè²ã®ç»åã«ã¯ç¬¬ïŒã®æ³¢é·åž¯åïœã«ããããã¿ãŒã³ãåããããã«ããããã¿ãŒã³ã«ãã£ãŠæå³çã«ç¹åŸŽãä»ãããã¹ãã¬ãªç»åãååŸããããããã®ãããã³ã°åŠçãè¡ãããšã§é«ç²ŸåºŠãªã¹ãã¬ãªèšæž¬ãå¯èœãšãªãããŸããäžè¿°ã®ããã«ç¬¬ïŒã®æ³¢é·åž¯åïœãšç¬¬ïŒã®æ³¢é·åž¯åïœãééãããªãåå ãã£ã«ã¿ïŒåå ç¹æ§ïŒŠïŒ£ïŒãéããŠèгå¯ç»åãæ®åœ±ããããšã§ãæäžã®å éååžã«ãããã¿ãŒã³ãåããªã芳å¯ç»åãåŸããããæãã¯å³ïŒïœå³ïŒã§åŸè¿°ããããã«ãçœè²å ã®å šåž¯åã®èгå¯ç»åãæ®åœ±ãããã¿ãŒã³ãåã£ã第ïŒè²ïŒéè²ïŒã第ïŒè²ïŒèµ€è²ïŒã®ç»åãããã¿ãŒã³ãåããªã第ïŒè²ïŒç·è²ïŒã®ç»åãçšããŠè£æ£ããããšã§ããã¿ãŒã³ãé€å»ããã芳å¯ç»åãåŸãããããã®ããã«ã第ïŒè²ã®æ³¢é·åž¯åã«ãããŠæäžã®å éååžã«ãããã¿ãŒã³ãä»ãããŠããªãããšã§ããã¿ãŒã³ç §æãè¡ã£ããŸãŸãã©ããç §ææã®ãããªèгå¯ç»åãåŸãããšãå¯èœãšãªãã That is, the stereo image is composed of a first color (blue) image and a third color (red) image, and the first color image includes a pattern in the first wavelength band Pb, and the third color image. Shows a pattern in the second wavelength band Pr. Thereby, a stereo image intentionally characterized by a pattern is acquired, and high-precision stereo measurement can be performed by performing matching processing thereof. In addition, as described above, an observation image is captured through a spectral filter (spectral characteristic FC) that does not pass the first wavelength band Pb and the second wavelength band Pr, so that an observation image in which a pattern due to a given light amount distribution is not captured is obtained. Is obtained. Alternatively, as will be described later with reference to FIGS. 5 to 8, an observation image of the entire band of white light is photographed, and the first color (blue) and third color (red) images in which the pattern is captured are not captured. By correcting using two-color (green) images, an observation image from which the pattern has been removed is obtained. As described above, since a pattern with a given light quantity distribution is not provided in the wavelength band of the second color, it is possible to obtain an observation image as in flat illumination while performing pattern illumination.
ããªããããã§ã¯ç¬¬ïŒã®æ³¢é·åž¯åãå³ïŒã®æ³¢é·åž¯åïœã§ããã第ïŒã®æ³¢é·åž¯åãå³ïŒã®æ³¢é·åž¯åïœã§ãããšããããããã«éå®ãããªããäŸãã°ç¬¬ïŒã®æ³¢é·åž¯åãå³ïŒã®æ³¢é·åž¯åïœïŒã§ããã第ïŒã®æ³¢é·åž¯åãå³ïŒã®æ³¢é·åž¯åïœïŒã§ãã£ãŠãããã Note that, here, the first wavelength band is the wavelength band Pb in FIG. 3 and the second wavelength band is the wavelength band Pr in FIG. 3, but the present invention is not limited to this. For example, the first wavelength band may be the wavelength band Pb1 in FIG. 5, and the second wavelength band may be the wavelength band Pr2 in FIG.
ãããã§æäžã®å éååžïŒæäžã®åœ¢ç¶ã®å éååžïŒã¯ãææïŒå éã®å€§å°ïŒã®å¢çãæããååžãæãã¯ãå éãæ¥æ¿ã«å€åãããšããžéšãæããååžã§ããããã®å¢çããšããžéšã¯ãã¿ãŒã³ç §æã®ç §å°é åã«è€æ°èšããããŠããããã®æäžã®å éååžãæãããã¿ãŒã³ç §æã§ç §æãã被åäœãæ®åœ±ããããšã§ãæäžã®å éååžã®å¢çããšããžéšã®åœ¢ç¶ãé çœ®ãæ®åç»åã«åããããã«ãã£ãŠã¹ãã¬ãªèšæž¬ã«å¿ èŠãªç¹åŸŽãæ®åç»åã«äžããããšãã§ãããæ¬å®æœåœ¢æ ã§ã¯ããã®æäžã®å éååžã¯ç¹å®ã®æ³¢é·åž¯åïœãïœã«ã®ã¿ä»äžãããŠããããªããå³ïŒã§ã¯æ³¢é·åž¯åïœãïœã«ãããŠåšå²ãããæãå°åã®ãã¿ãŒã³ïŒ€ïŒŽãèŠåçã«é 眮ããŠããããæäžã®å éååžã¯ããã«éå®ãããªããäŸãã°ãã¿ãŒã³ïŒ€ïŒŽã¯å°åã§ãªããŠããããããã¿ãŒã³ïŒ€ïŒŽã®é 眮ã«ã¯èŠåæ§ããªããŠããããããã¿ãŒã³ïŒ€ïŒŽã®å éšã¯å€éšãããæããç §æã«ãªã£ãŠããŠãããã Here, a given light amount distribution (a light amount distribution of a given shape) is a distribution having a light / dark (light amount) boundary or a distribution having an edge portion where the light amount changes abruptly. A plurality of parts are provided in the irradiation area of the pattern illumination. By photographing a subject illuminated with pattern illumination having a given light intensity distribution, the boundaries of the given light intensity distribution and the shape and arrangement of edges are reflected in the captured image, thereby capturing the features required for stereo measurement. Can be given to images. In the present embodiment, this given light quantity distribution is given only to specific wavelength bands Pb and Pr. In FIG. 1, small circle patterns DT darker than the surroundings are regularly arranged in the wavelength bands Pb and Pr, but the given light quantity distribution is not limited to this. For example, the pattern DT may not be a small circle, the arrangement of the pattern DT may not be regular, and the inside of the pattern DT may be brighter than the outside.
ããŸãæ¬å®æœåœ¢æ ã§ã¯ãæ®åéšã¯ãã¹ãã¬ãªç»åãæ®åããã¹ãã¬ãªã¢ãŒããšãåçŒã«ããæ®åç»åãæ®åããéã¹ãã¬ãªã¢ãŒããšãåãæ¿ããã In this embodiment, the imaging unit switches between a stereo mode for capturing a stereo image and a non-stereo mode for capturing a captured image with a single eye.
ãäŸãã°å³ïŒïŒïœå³ïŒïŒã§åŸè¿°ããæ®åéšã§ã¯ãéã¹ãã¬ãªã¢ãŒãïŒèгå¯ã¢ãŒãïŒã§ã¯ãåºå®ãã¹ã¯ïŒïŒã®äžå¿çµãåïŒïŒïŒåå
ç¹æ§ïŒŠïŒ£ïŒã§åçŒæ®åœ±ããã¹ãã¬ãªã¢ãŒãïŒã¹ãã¬ãªèšæž¬ã¢ãŒãïŒã§ã¯ãåºå®ãã¹ã¯ïŒïŒã®å·Šç³çµãåïŒïŒïŒåå
ç¹æ§ïŒŠïŒ¬ïŒãšå³ç³çµãåïŒïŒïŒåå
ç¹æ§ïŒŠïŒ²ïŒã§ã¹ãã¬ãªæ®åœ±ããããªããæ¬å®æœåœ¢æ
ã®ãã¿ãŒã³ç
§æãé©çšã§ããæ®åéšã¯ããã«éå®ããããã¹ãã¬ãªã¢ãŒãã«ãããŠç¬¬ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åãã¹ãã¬ãªç»åãšããŠæ®åããæ®åéšã§ããã°ããã
For example, in the imaging unit described later with reference to FIGS. 11 to 16, in the non-stereo mode (observation mode), monocular imaging is performed with the central aperture 23 (spectral characteristic FC) of the fixed
ããã®ããã«ã第ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åãã¹ãã¬ãªç»åãšããŠæ®åœ±ããã¹ãã¬ãªã¢ãŒããšã第ïŒè²ïœç¬¬ïŒè²ã®ç»åãæ®åç»åãšããŠåçŒã«ããæ®åããéã¹ãã¬ãªã¢ãŒããåãæ¿ããããšã§ãã¹ãã¬ãªèšæž¬ãšèгå¯ç»åã®æ®åœ±ãåãæ¿ããããšãã§ããããŸãããã®å Žåã«ãããŠäžè¿°ãããã¿ãŒã³ç §æãçšããããšã«ãã£ãŠãåžžæãã¿ãŒã³ç §æãè¡ããªããã¹ãã¬ãªèšæž¬ãšèгå¯ç»åã®æ®åœ±ãåãæ¿ããããšãå¯èœãšãªãã In this way, by switching between a stereo mode in which the first color image and the third color image are captured as a stereo image and a non-stereo mode in which the first to third color images are captured with a monocular as a captured image, It is possible to switch between stereo measurement and observation image capturing. In this case, by using the above-described pattern illumination, it is possible to switch between stereo measurement and observation image capturing while always performing pattern illumination.
ããŸãæ¬å®æœåœ¢æ ã§ã¯ãéã¹ãã¬ãªã¢ãŒãã«ãããåçŒã¯ã第ïŒè²ãšç¬¬ïŒè²ãšç¬¬ïŒè²ã®æ³¢é·åž¯åïŒçœè²å ã®æ³¢é·åž¯åïŒã®ãã¡ç¬¬ïŒã®æ³¢é·åž¯åïœåã³ç¬¬ïŒã®æ³¢é·åž¯åïœãé€ãæ³¢é·åž¯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœãééãããã In this embodiment, the monocular in the non-stereo mode uses the first wavelength band Pb and the second wavelength band Pr out of the wavelength bands (white light wavelength bands) of the first color, the second color, and the third color. Except the wavelength bands {Pv1, Pv2, Pir}.
ãæäžã®å éååžã«ãããã¿ãŒã³ã¯ç¬¬ïŒã®æ³¢é·åž¯åïœåã³ç¬¬ïŒã®æ³¢é·åž¯åïœã«ä»ãããŠããã®ã§ãåçŒãééãããæ³¢é·åž¯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã«ã¯æäžã®å éååžã«ãããã¿ãŒã³ãä»ãããŠããªããããã«ãããéã¹ãã¬ãªã¢ãŒãã«ãããŠãã¿ãŒã³ç §æãè¡ã£ãŠããã«ãé¢ããããã©ããç §æã®ãããªèгå¯ç»åãæ®åœ±ããããšãã§ãããç¹ã«ãäžè¿°ã®ããã«ç¬¬ïŒã®æ³¢é·åž¯åïœåã³ç¬¬ïŒã®æ³¢é·åž¯åïœãç垯åã§ããå Žåã«ã¯ããããé€ããæ³¢é·åž¯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã¯ã»ãŒçœè²å ã®æ³¢é·åž¯åãšãªãã®ã§ãçœè²å ã®ç §æã«ããæ®åç»åãšéè²ãªãç»åãåŸãããšãå¯èœãšãªãã Since the pattern based on the given light quantity distribution is attached to the first wavelength band Pb and the second wavelength band Pr, the wavelength band {Pv1, Pv2, Pir} through which the monocular passes is a pattern based on the given light quantity distribution. Is not attached. As a result, an observation image such as flat illumination can be taken despite pattern illumination in the non-stereo mode. In particular, when the first wavelength band Pb and the second wavelength band Pr are narrow bands as described above, the wavelength bands {Pv1, Pv2, Pir} excluding them are substantially the wavelength band of white light. Therefore, it is possible to obtain an image that is not inferior to a captured image obtained by illumination with white light.
ããŸãæ¬å®æœåœ¢æ
ã§ã¯ãå³ïŒïŒçã§åŸè¿°ããããã«ãæ®åè£
眮ãäœçžå·®æ€åºéšïŒïŒïŒãšç»ååºåéšïŒã«ã©ãŒç»åçæéšïŒïŒïŒïŒãå«ãã§ããããäœçžå·®æ€åºéšïŒïŒïŒã¯ãã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åããã第ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åãšã®éã®äœçžå·®ãæ€åºãããç»ååºåéšã¯ãéã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åãããæ®åç»åïŒç¬¬ïŒè²ïœç¬¬ïŒè²ã®ç»åïŒã«åºã¥ããŠèгå¯çšã®ç»åãåºåããã
In this embodiment, as will be described later with reference to FIG. 18 and the like, the imaging apparatus may include a phase
ãäžè¿°ã®ããã«ç¬¬ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åã«ã¯ããã¿ãŒã³ç
§æã«ãããã¿ãŒã³ãåã£ãŠãããããäœçžå·®æ€åºéšïŒïŒïŒã«ãã£ãŠé«ç²ŸåºŠãªäœçžå·®æ€åºãè¡ãããšãå¯èœã§ããããŸããéã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åãããæ®åç»åã«ã¯ããã¿ãŒã³ç
§æãè¡ã£ãŠããã«ãé¢ããããã¿ãŒã³ãåã£ãŠããªããããç»ååºåéšã«ãã£ãŠèгå¯çšã®ç»åãåºåã§ããã
As described above, since the pattern of pattern illumination is reflected in the first color image and the third color image, the phase
ããŸãæ¬å®æœåœ¢æ ã§ã¯ã第ïŒè²ã®æ³¢é·åž¯åïŒåå ç¹æ§ïŒ¢ïŒã®ãã¡ç¬¬ïŒã®æ³¢é·åž¯åïœãé€ãæ³¢é·åž¯åãšã第ïŒè²ã®æ³¢é·åž¯åïŒåå ç¹æ§ïŒ§ïŒãšã第ïŒè²ã®æ³¢é·åž¯åïŒåå ç¹æ§ïŒ²ïŒã®ãã¡ç¬¬ïŒã®æ³¢é·åž¯åïœãé€ãæ³¢é·åž¯åãšã«ãããŠãã©ãããªå éååžã§ããã In this embodiment, the wavelength band of the first color wavelength band (spectral characteristic B) excluding the first wavelength band Pb, the second color wavelength band (spectral characteristic G), and the third color wavelength band. It is a flat light amount distribution in the wavelength band excluding the second wavelength band Pr in (spectral characteristic R).
ããã®ããã«ããã°ã第ïŒã®æ³¢é·åž¯åïœãšç¬¬ïŒã®æ³¢é·åž¯åïœä»¥å€ã§ã¯ãã©ãããªå éååžã®ç §æå ãšãªãã®ã§ãå°ãªããšã第ïŒè²ã®ç»åã¯ãã©ããç §æã«ããç»åãšãªããããã«ãããäžè¿°ããããã«ãã¿ãŒã³ç §æãåžžæè¡ããªããã¹ãã¬ãªèšæž¬ãšèгå¯ç»åã®æ®åœ±ãåãæ¿ããããšãå¯èœãšãªãã In this way, since the illumination light has a flat light amount distribution other than the first wavelength band Pb and the second wavelength band Pr, at least the second color image is an image by flat illumination. As a result, as described above, it is possible to switch between stereo measurement and observation image capturing while always performing pattern illumination.
ãããã§ããã©ãããªå éååžïŒãã©ããç §æïŒãšã¯ãæ®åéšã«ããæ®åœ±ãããæ®åœ±é åïŒèŠéïŒã«ãããŠå éååžãäžå®ïŒç¥äžå®ïŒãšããããšã§ãããå ·äœçã«ã¯ãæ®åéšããã®è·é¢ãäžå®ã®é¢ïŒå 軞ã«åçŽãªé¢ïŒã§ã®å éååžãäžå®ãšããããšã§ããããªããå®å šã«å éååžãäžå®ã§ããå¿ èŠã¯ãªããäŸãã°ãã¿ãŒã³ç §æã®ãããªæ¥æ¿ãªå éå€åïŒãšããžéšïŒããªãç·©ãããªå éå€åããã£ãŠããããæãã¯ãæ®åœ±é åã®åšèŸºéšã«ãããŠäžå€®éšãããå éãç·©ããã«äžéãããããªãéåžžã®ç §æã§èãããããããªå éååžããã£ãŠãããã Here, the flat light quantity distribution (flat illumination) means that the light quantity distribution is constant (substantially constant) in the photographing region (field of view) photographed by the imaging unit. Specifically, the light amount distribution on a surface having a constant distance from the imaging unit (a surface perpendicular to the optical axis) is constant. Note that the light amount distribution need not be completely constant, and there may be a gradual light amount change without an abrupt light amount change (edge portion) such as pattern illumination. Alternatively, there may be a light amount distribution that can be considered in normal illumination, in which the light amount falls more slowly in the peripheral part of the imaging region than in the central part.
ããŸãæ¬å®æœåœ¢æ ã§ã¯ã第ïŒè²ã¯éè²ã§ããã第ïŒè²ã¯ç·è²ã§ããã第ïŒè²ã¯èµ€è²ã§ããã In this embodiment, the first color is blue, the second color is green, and the third color is red.
ããªãã第ïŒè²ã®ç»åã第ïŒè²ã®ç»åã第ïŒè²ã®ç»åã¯ãããããéè²ã®ç»åãç·è²ã®ç»åãèµ€è²ã®ç»åã«å¯Ÿå¿ãããããããã®ç»åã¯åè²ã®ã«ã©ãŒãã£ã«ã¿ãæããæ®åçŽ åïŒäŸãã°åè²ãã€ã€é åã®æ®åçŽ åïŒã«ãã£ãŠæ®åããããã®ã«éå®ãããªããäŸãã°è£è²ã®ã«ã©ãŒãã£ã«ã¿ãæããæ®åçŽ åã«ããè£è²ã®ç»åãæ®åããããã®è£è²ã®ç»åãã倿åŠçã«ããéè²ã®ç»åãç·è²ã®ç»åãèµ€è²ã®ç»åãååŸãããŠãããã Note that the first color image, the second color image, and the third color image correspond to a blue image, a green image, and a red image, respectively, and these images have an image pickup element having a primary color filter. It is not limited to what was imaged (for example, image sensor of primary color Bayer arrangement). For example, a complementary color image may be captured by an imaging element having a complementary color filter, and a blue image, a green image, and a red image may be acquired from the complementary color image by conversion processing.
ãïŒïŒç¬¬ïŒåå
ç¹æ§äŸ
ã第ïŒåå
ç¹æ§äŸã§ã¯ã芳å¯ã¢ãŒããšã¹ãã¬ãªèšæž¬ã¢ãŒãã®æ®å垯åãåé¢ãããå Žåã説æããã第ïŒåå
ç¹æ§äŸã§ã¯ã芳å¯ã¢ãŒããšèšæž¬ã¢ãŒãã®æ®å垯åãåé¢ãããŠããªãå Žåã«ã€ããŠè¿°ã¹ããæ³¢é·åž¯åãåé¢ããéãã¿ãŒã³å
ãšãã¿ãŒã³å
ãåæãããç
§æãã¢ãŒãã«é¢ãããåžžæå©çšããããšãã§ããã°ãæ³¢é·åž¯åãæå¹ã«å©çšã§ããç
§æã®åæ¿ãæ©èœãäžèŠãªã®ã§ãæ®åæåºŠã®ç¢ºä¿ã被åäœã®è²æ
å ±ãã«ããŒããããšãç
§ææ©æ§ã®ç°¡ååã®ç¹ã§æå©ã§ãããããããªããæ³¢é·åž¯åãåé¢ãããŠããªãå Žåã¯ã芳å¯ã¢ãŒãã®æ®åç»åããã¿ãŒã³å
ã®åœ±é¿ãåããã®ã§ãé«å質ã§å¿ å®ãªèгå¯ç»åãåŸãã«ã¯ãã¿ãŒã³å
ã®åœ±é¿ãé€å»ãŸãã¯äœæžããå¿
èŠãããã
3. Second spectral characteristic example In the first spectral characteristic example, the case where the imaging bands of the observation mode and the stereo measurement mode are separated has been described. In the second spectral characteristic example, a case where the imaging bands of the observation mode and the measurement mode are not separated will be described. If you can always use illumination that combines non-patterned light and patterned light without separating the wavelength band, regardless of the mode, you can use the wavelength band effectively, and the illumination switching function is unnecessary, ensuring imaging sensitivity. Covering the color information of the subject is advantageous in terms of simplifying the illumination mechanism. However, if the wavelength bands are not separated, the captured image in the observation mode is also affected by the pattern light. Therefore, in order to obtain a high-quality and faithful observation image, it is necessary to remove or reduce the influence of the pattern light.
ãå³ïŒã«ããã¿ãŒã³ç §æã®ç¬¬ïŒåå ç¹æ§äŸãšã芳å¯ã¢ãŒãåã³ã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã®ç³ã®ç¬¬ïŒåå ç¹æ§äŸã瀺ãã FIG. 5 shows a second spectral characteristic example of pattern illumination and a second spectral characteristic example of the pupil in the observation mode and the stereo measurement mode.
ããã¿ãŒã³ç §æã®åå ç¹æ§ã§ã¯ãæ®åã»ã³ãµã®ã«ã©ãŒãã£ã«ã¿ã®åå ç¹æ§ãšé¢é£ä»ããŠãæ³¢é·åž¯åãïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã®ïŒã€ã«åå²ããã垯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã®ç §æå ã¯ãå éååžãäžæ§ãªéãã¿ãŒã³å ïŒãã©ããå ïŒã§ããã垯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïœã®ç §æå ã¯ãã¿ãŒã³å ã§ãããããããåæããåå ç¹æ§ããã€å ã被åäœãžã®ç §æå ãšãªãã In the spectral characteristics of the pattern illumination, the wavelength band is divided into five parts {Pb1, Pb2, Pr1, Pr2, Pir} in association with the spectral characteristics of the color filter of the image sensor. The illumination light in the bands {Pb2, Pr1, Pir} is non-pattern light (flat light) with a uniform light quantity distribution, and the illumination light in the bands {Pb1, Pr2} is pattern light. Light having a spectral characteristic obtained by combining these becomes illumination light to the subject.
ãïœïŒã¯ãéè²ãã£ã«ã¿ãééãããç·è²ãã£ã«ã¿ãééããªã垯åã«èšå®ãããŠãããïœïŒã¯ãéè²ãã£ã«ã¿ãšç·è²ãã£ã«ã¿ãå ±ã«ééãã垯åã«èšå®ãããŠãããïœïŒã¯ãèµ€è²ãã£ã«ã¿ãšç·è²ãã£ã«ã¿ãå ±ã«ééãã垯åã«èšå®ãããŠãããïœïŒã¯ãèµ€è²ãã£ã«ã¿ãééãããç·è²ãã£ã«ã¿ãééããªã垯åã«èšå®ãããŠãããïœïœã¯ãèµ€å€ã®æåºŠç¹æ§ïŒ©ïŒ²ã«å¯Ÿå¿ãã垯åã§ãããèµ€è²ãç·è²ãéè²ã®å šãŠã®ãã£ã«ã¿ãééãã垯åã§ããããã®ããã«ãæ³¢é·åž¯åïœïŒãïœïŒã¯ãæ®åã»ã³ãµã®ç·è²ãã£ã«ã¿ã®åå ç¹æ§ïŒ§ãšã¯å¹²æžããªãæ³¢é·åž¯åã«èšå®ãããã Pb1 is set in a band that passes through the blue filter but does not pass through the green filter, and Pb2 is set in a band that passes through both the blue filter and the green filter. Pr1 is set in a band that passes through both the red filter and the green filter, and Pr2 is set in a band that passes through the red filter but does not pass through the green filter. Pir is a band corresponding to the infrared sensitivity characteristic IR, and is a band that passes through all the red, green, and blue filters. Thus, the wavelength bands Pb1 and Pr2 are set to wavelength bands that do not interfere with the spectral characteristics G of the green filter of the image sensor.
ã芳å¯ã¢ãŒãã§ã¯ãåå
ç¹æ§ïŒŠïŒ£ã®ç³ã§æ®åœ±ãããããã®åå
ç¹æ§ïŒŠïŒ£ã¯æ³¢é·åž¯åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœãå«ããäŸãã°ãåºå®ãã¹ã¯ïŒïŒã®äžå¿çµãåïŒïŒã«ã¯ãã£ã«ã¿ãèšãããããäžå¿ç³å
è·¯ãå
šåž¯åã®å
ãééãããã芳å¯ã¢ãŒãã«ãããŠæ®åãããã«ã©ãŒç»åãæ§æããèµ€è²ç»åïœãç·è²ç»åïœãéè²ç»åïœãšãããããã«ããŒããæ³¢é·åž¯åã®é¢ä¿ã¯äžåŒïŒïŒïŒã®ããã«ãªãã
ãã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ãåå
ç¹æ§ïŒŠïŒ¬ã®å·Šç³ãšåå
ç¹æ§ïŒŠïŒ²ã®å³ç³ã§æ®åœ±ãããåå
ç¹æ§ïŒŠïŒ¬ã¯æ³¢é·åž¯åïœïŒã«å¯Ÿå¿ããŠãããåå
ç¹æ§ïŒŠïŒ²ã¯æ³¢é·åž¯åïœïŒã«å¯Ÿå¿ããŠãããå³ã¡ãã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããŠæ®åãããèµ€è²ç»åïŒïœãéè²ç»åïŒïœãšãããããã«ããŒããæ³¢é·åž¯åã®é¢ä¿ã¯äžåŒïŒïŒïŒã®ããã«ãªãã
ãå³ïŒã«ã第ïŒåå ç¹æ§äŸãšã芳å¯ã¢ãŒãåã³ã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããæ®åç»åãšã®é¢ä¿ã瀺ããå³ïŒã®å³å³ã®æ³¢åœ¢ã«ãããŠãç¹ç·ã¯ãã©ããç §æãããå Žåã®ã»ã³ãµåºåãä»®æ³çã«ç€ºããŠããã FIG. 6 shows the relationship between the second spectral characteristic example and the captured images in the observation mode and the stereo measurement mode. In the waveform in the right diagram of FIG. 6, the dotted line virtually indicates the sensor output when flat illumination is performed.
ã芳å¯ã¢ãŒãã§ã¯ãåå æåïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœããã£ãç §æå ã«ããåå°å ãç³äžå¿å è·¯ïŒåå ç¹æ§ïŒŠïŒ£ïŒãééãæ®åç»åïœïŒ¶ïœïŒïŒ¶ïœïŒïŒ¶ïœïœãåŸããããã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ããã¿ãŒã³å ã®åå æåïœïŒ°ïœïŒïŒïŒ°ïœïŒïœããã£ãç §æå ã«ããåå°å ãå·Šå³ç³å è·¯ãééãæ®åç»åïœïŒïœïŒïŒïœïœãåŸãããã In the observation mode, the reflected light by the illumination light having the spectral components {Pb1, Pb2, Pr1, Pr2, Pir} passes through the pupil center optical path (spectral characteristic FC), and a captured image {Vr, Vg, Vb} is obtained. In the stereo measurement mode, the reflected light by the illumination light having the spectral components {Pb1, Pr2} of the pattern light passes through the left and right pupil optical paths, and a captured image {Mr, Mb} is obtained.
ã芳å¯ã¢ãŒãã§ã¯ã芳å¯ç»åïœãïœã¯ãã¿ãŒã³ç §æã®åœ±é¿ãåããã®ã§ãæå³çãªæããå€åãçããç»åãšãªããäžæ¹ã芳å¯ç»åïœã¯äžæ§ç §æã§ããæ³¢é·åïœïŒ°ïœïŒïŒïŒ°ïœïŒïŒïŒ°ïœïœïœã§æ§æãããã®ã§æå³çãªæããå€åã¯çºçããã被åäœã®åå°ä¿æ°ã®ã¿ãåæ ãããç»åãããã¡ã€ã«ã«ãªãã In the observation mode, the observation images Vb and Vr are affected by the pattern illumination, so that the images are intentionally changed in brightness. On the other hand, since the observation image Vg is composed of the wavelength range {Pb2, Pr1, Pir} that is uniform illumination, no intentional brightness change occurs, and the image profile reflects only the reflection coefficient of the subject.
ãïŒïŒè£æ£åŠç
ã第ïŒåå
ç¹æ§äŸãçšããå Žåã芳å¯ç»åïœãïœã¯ãã¿ãŒã³ã®åœ±é¿ãåãããããã¡ã€ã«ãšãªãããä»®ã«äžæ§ç
§æã ãšãããšãã«åŸããããããã¡ã€ã«ïŒå³ïŒã®ç¹ç·ïŒãæ³å®ãããšãå¹³åçæããã¯ç°ãªããã®ã®éšåçã«èгå¯ç»åïœãšã®é¡äŒŒæ§ïŒçžäŒŒæ§ïŒã¯é«ãããªããªãã°ã芳å¯ç»åïœïŒ¶ïœïŒïŒ¶ïœïœã¯åž¯åïœïŒã«ãããŠæ®åã»ã³ãµã®åå
ç¹æ§ãéè€ããŠããç»åã§ããã芳å¯ç»åïœïŒ¶ïœïŒïŒ¶ïœïœã¯åž¯åïœïŒã«ãããŠæ®åã»ã³ãµã®åå
ç¹æ§ãéè€ããŠããç»åã§ããã®ã§ãå°ãªãããçžäºã«çžé¢æ§ããã€ããã§ãããåŸã£ãŠãã¿ãŒã³ã®åœ±é¿ãåããªã芳å¯ç»åïœãçšããã°èгå¯ç»åïœãïœã®è£æ£ãå¯èœãšãªãããã¿ãŒã³ã®åœ±é¿ãé€å»ãŸãã¯äœæžã§ãããå³ã¡ããã©ããç
§æã§æ®åœ±ãããã®ãããªèгå¯ç»åã埩å
å¯èœã§ããããã®è£æ£åŠçã«ã€ããŠä»¥äžã«èª¬æããã
4). Correction Processing When the second spectral characteristic example is used, the observation images Vr and Vb are profiles affected by the pattern, but assuming a profile (dotted line in FIG. 6) obtained when the illumination is uniform. Although the average brightness is different, the similarity (similarity) with the observed image Vg is partially high. This is because the observation image {Vr, Vg} is an image in which the spectral characteristics of the imaging sensor overlap in the band Pb2, and the observation image {Vb, Vg} is an image in which the spectral characteristics of the imaging sensor overlap in the band Pr1. This is because there is a considerable correlation between them. Therefore, if the observation image Vg not affected by the pattern is used, the observation images Vr and Vb can be corrected, and the influence of the pattern can be removed or reduced. That is, it is possible to restore an observation image as if it was taken with flat illumination. This correction process will be described below.
ãå³ïŒãå³ïŒã«ãè£æ£åŠçã®èª¬æå³ã瀺ãããªã以äžã§ã¯éè²ç»åïœã®è£æ£ãäŸã«ãšã£ãŠèª¬æããããèµ€è²ç»åïœã®è£æ£ãåæ§ã«è¡ãããšãã§ããã 7 and 8 are explanatory diagrams of the correction process. In the following description, the correction of the blue image Vb will be described as an example, but the correction of the red image Vr can be similarly performed.
ãå³ïŒã«ç€ºãããã«ãæ®åã»ã³ãµã®ã»ã³ãµé¢ã«ãããä»»æã®äœçœ®ïŒžïŒ¬ãäžå¿ãšããå¹ ïœã®åºéã§ã®ãéè²ç»åïœãšç·è²ç»åïœã®æ³¢åœ¢ã®éã®çžé¢å€ãèšç®ããããããã»ã³ãµé¢ã«ãããå šãŠã®äœçœ®ïœïŒå³ã¡ãæ®åç»åã®å šãŠã®ç»çŽ ïŒã«å¯ŸããŠè¡ããäŸãã°ïŒºïŒ®ïŒ£ïŒ£ïŒZero-mean Normalized Cross-CorrelationïŒãçšããå Žåãé¡äŒŒåºŠãæå€§ã®å Žåã«çžé¢å€ãïŒãšãªããé¡äŒŒåºŠãäœãã»ã©çžé¢å€ãïŒã«è¿ã¥ãã As shown in FIG. 7, the correlation value between the waveform of the blue image Vb and the green image Vg in the section of the width d centering on the arbitrary position XL on the sensor surface of the image sensor is calculated. This is performed for all positions x on the sensor surface (that is, all pixels of the captured image). For example, when ZNCC (Zero-mean Normalized Cross-Correlation) is used, the correlation value becomes 1 when the similarity is maximum, and the correlation value approaches 0 as the similarity becomes lower.
ãçžé¢å€ãšéŸå€ïŒŽïœãæ¯èŒããçžé¢å€ãéŸå€ïŒŽïœä»¥äžã§ããå Žåã«ã¯ãã©ã°å€ãâïŒâãšããçžé¢å€ãéŸå€ïŒŽïœããå°ããå Žåã«ã¯ãã©ã°å€ãâïŒâãšãããå³ã¡ããã©ã°å€ã¯é¡äŒŒåºŠãæïŒç¡ã«ïŒå€åãããã®ã§ãããè£æ£åŠçã§ã¯ããã©ã°å€âïŒâã®ç»çŽ ãæå¹ç»çŽ ãšå€æãããã©ã°å€âïŒâã®ç»çŽ ãç¡å¹ç»çŽ ãšå€æããã The correlation value is compared with the threshold Th, and when the correlation value is equal to or greater than the threshold Th, the flag value is set to â1â, and when the correlation value is smaller than the threshold Th, the flag value is set to â0â. That is, the flag value is obtained by binarizing the similarity with / without similarity. In the correction process, a pixel with a flag value â1â is determined as a valid pixel, and a pixel with a flag value â0â is determined as an invalid pixel.
ããªããçžé¢å€ã¯æ®åç»åã®å šãŠã®ç»çŽ ã«ã€ããŠæ±ããå Žåã«éãããäŸãã°æå®ã®é åã®ç»çŽ ã«ã€ããŠæ±ããŠãããããæå®ééã§éåŒããç»çŽ ã«ã€ããŠæ±ããŠãããããŸãããã©ã°å€ã®å€å®ã¯äžèšã«éãããäŸãã°é¡äŒŒåºŠãé«ãã»ã©çžé¢å€ãå°ãããªãçžé¢æŒç®ãçšããå Žåã«ã¯ãçžé¢å€ãéŸå€ïŒŽïœä»¥äžã§ããå Žåã«ãã©ã°å€ãâïŒâãšããçžé¢å€ãéŸå€ïŒŽïœãã倧ããå Žåã«ãã©ã°å€ãâïŒâãšããŠãããã Note that the correlation value is not limited to the case of obtaining all the pixels of the captured image. For example, the correlation value may be obtained for pixels in a predetermined region, or may be obtained for pixels thinned at a predetermined interval. In addition, the determination of the flag value is not limited to the above. For example, when using a correlation calculation in which the correlation value decreases as the similarity increases, the flag value is set to â1â when the correlation value is equal to or less than the threshold Th The flag value may be set to â0â when the correlation value is larger than the threshold Th.
ãæ¬¡ã«å³ïŒã«ç€ºãããã«ãéè²ç»åïœã®äœçœ®ïŒžïŒ¬ã®ãã©ã°å€ãâïŒâã®å Žåã¯ãäœãããéè²ç»åïœã®äœçœ®ïŒžïŒ¬ã®ç»çŽ å€ïŒ¶ïœïŒïŒžïŒ¬ïŒãããã®ãŸãŸè£æ£å€ïŒ¶ïœâïŒïŒžïŒ¬ïŒãšãããéè²ç»åïœã®äœçœ®ïŒžïŒ¬ã®ãã©ã°å€ãïŒã®å Žåã¯ãéè²ç»åïœåã³ç·è²ç»åïœã®äœçœ®ïŒžïŒ¬ãäžå¿ãšããå¹ ïœã®åºéã«çç®ãããã®åºéã«ãããŠãã©ã°å€ãâïŒâã®ç»çŽ ã®ã¿ã䜿ã£ãŠéè²ç»åïœãšç·è²ç»åïœãšã®åºéãã£ããã£ã³ã°åŠçãè¡ããå³ïŒã®äŸã§ã¯ãå¹ ïœã®çç®åºéã®æå¹ç»çŽ ç¯å²ãïœ ïŒãïœ ïŒãšè¡šèšããŠããããã©ã°å€ãâïŒâã®ç¡å¹ç»çŽ ç¯å²ã¯åºéãã£ããã£ã³ã°åŠçã«ã¯äœ¿ããªãã Next, as shown in FIG. 8, when the flag value of the position XL of the blue image Vb is â1â, nothing is done and the pixel value Vb (XL) of the position XL of the blue image Vb is directly used as the correction value Vb â² ( XL). When the flag value at the position XL of the blue image Vb is 0, pay attention to the section of the width w centering on the position XL of the blue image Vb and the green image Vg, and only the pixel having the flag value â1â in the section is selected. The section fitting process between the blue image Vb and the green image Vg is performed. In the example of FIG. 8, the effective pixel ranges of the target section with the width w are denoted as e1 and e2. The invalid pixel range with the flag value â0â is not used for the interval fitting process.
ããã£ããã£ã³ã°åŠçãšããŠã¯ãäŸãã°ãåŠçåºéã®æå¹ç»çŽ ç¯å²ãïœ ïŒãïœ ïŒã«ãããŠãç·è²ç»åïœã®ã¬ãã«ãå€åãããåã¬ãã«ã§ã®ç·è²ç»åïœãšéè²ç»åïœã®å·®åã®çµ¶å¯Ÿå€ã®åèšãæ±ãããã®åèšãæå°ãšãªãããã«éãåãããæ¹æ³ãèãããããæãã¯ãç·è²ç»åïœã®ã²ã€ã³ãå€åãããåã²ã€ã³ã§ã®ç·è²ç»åïœãšéè²ç»åïœã®å·®åã®çµ¶å¯Ÿå€ã®åèšãæ±ãããã®åèšãæå°ãšãªãããã«éãåãããæ¹æ³ãèããããã As the fitting process, for example, when the effective pixel range of the processing section is e1 and e2, the level of the green image Vg is changed, and the total absolute value of the difference between the green image Vg and the blue image Vb at each level is obtained. A method of superimposing such that the sum is minimized can be considered. Alternatively, a method is conceivable in which the gain of the green image Vg is changed, the sum of the absolute values of the differences between the green image Vg and the blue image Vb at each gain is obtained, and the sum is made so that the sum is minimized.
ããã£ããã£ã³ã°åŠçåŸã®ç·è²ç»åïœã®äœçœ®ïŒžïŒ¬ã®ç»çŽ å€ïŒ¶ïœïŒïŒžïŒ¬ïŒãè£æ£å€ïŒ¶ïœâïŒïŒžïŒ¬ïŒãšãããäžèšäžé£ã®è£æ£åŠçãã»ã³ãµé¢ã«ãããå šãŠã®äœçœ®ïœïŒå³ã¡ãæ®åç»åã®å šãŠã®ç»çŽ ïŒã«ãããŠè¡ããè£æ£ãããéè²ç»åïœâãçæãããåæ§ã«ããŠè£æ£ãããèµ€è²ç»åïœâãçæããããããã®è£æ£ãããç»åãšãæ¢ã«æ®åãããŠããç·è²ç»åïœãšãåãããŠã衚瀺çšã®èгå¯ç»åãšããŠã®ïŒ²ïŒ§ïŒ¢ç»åãåæ§æããã The pixel value Vg (XL) at the position XL of the green image Vg after the fitting process is set as a correction value Vb â² (XL). The series of correction processes described above is performed at all positions x on the sensor surface (that is, all pixels of the captured image) to generate a corrected blue image Vb â². Similarly, a corrected red image Vr â² is generated. These corrected images and the already captured green image Vg are combined to reconstruct an RGB image as an observation image for display.
ã以äžã®ãããªè£æ£åŠçãè¡ãããšã§ããã¿ãŒã³å ãå«ãç §æå ã§ãã£ãŠãé«åäœãªèгå¯ç»åãçæããããšãã§ããã By performing the correction process as described above, a high-quality observation image can be generated even with illumination light including pattern light.
ã以äžã®å®æœåœ¢æ ã«ããã°ãéã¹ãã¬ãªã¢ãŒãã«ãããåçŒã¯ã第ïŒè²ïŒéè²ïŒãšç¬¬ïŒè²ïŒç·è²ïŒãšç¬¬ïŒè²ïŒèµ€è²ïŒã®æ³¢é·åž¯åãå«ãæ³¢é·åž¯åãééãããã According to the embodiment described above, the monocular in the non-stereo mode passes the wavelength band including the wavelength bands of the first color (blue), the second color (green), and the third color (red).
ããã®ããã«ããã°ãéã¹ãã¬ãªã¢ãŒãã«ãããŠããã¿ãŒã³ç §æã«ãããã¿ãŒã³ãåã£ã第ïŒè²ã®ç»ååã³ç¬¬ïŒè²ã®ç»åãšããã¿ãŒã³ç §æã«ãããã¿ãŒã³ãåããªã第ïŒè²ã®ç»åãæ®åãããããããŠããããïŒè²ã®ç»åãçšããããšã§ãã¿ãŒã³ç §æã«ãããã¿ãŒã³ãæ¶å»ïŒäœæžïŒããããšãå¯èœãšãªãã芳å¯ç»åãåŸãããšãå¯èœãšãªãã In this way, in the non-stereo mode, the first color image and the third color image in which the pattern by the pattern illumination is captured, and the second color image in which the pattern by the pattern illumination is not captured are captured. Then, by using these three color images, it is possible to erase (reduce) the pattern by pattern illumination, and an observation image can be obtained.
ããŸãæ¬å®æœåœ¢æ
ã§ã¯ãå³ïŒïŒçã§åŸè¿°ããããã«ãæ®åè£
眮ãäœçžå·®æ€åºéšïŒïŒïŒãšç»ååºåéšïŒã«ã©ãŒç»åçæéšïŒïŒïŒïŒãå«ãã§ããããäœçžå·®æ€åºéšïŒïŒïŒã¯ãã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åããã第ïŒè²ã®ç»åãšç¬¬ïŒè²ã®ç»åãšã®éã®äœçžå·®ãæ€åºãããç»ååºåéšã¯ãéã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åãããæ®åç»åïŒç¬¬ïŒè²ïœç¬¬ïŒè²ã®ç»åïŒã«åºã¥ããŠèгå¯çšã®ç»åãåºåããããŸãç»ååºåéšã¯ãæäžã®å
éååžã«ãã第ïŒè²ã®ç»ååã³ç¬¬ïŒè²ã®ç»åã®ç»çŽ å€ã®å€åãã第ïŒè²ã®ç»åã«åºã¥ããŠè£æ£ããã
In this embodiment, as will be described later with reference to FIG. 18 and the like, the imaging apparatus may include a phase
ãæ¬å®æœåœ¢æ ã®ãã¿ãŒã³ç §æãçšããå Žåã第ïŒè²ã®ç»åã«ã¯ãã¿ãŒã³ç §æã«ãããã¿ãŒã³ãåããªããããã«ããããã®ç¬¬ïŒè²ã®ç»åãåºæºãšããŠããã¿ãŒã³ç §æã«ãããã¿ãŒã³ãåã£ã第ïŒè²ã®ç»ååã³ç¬¬ïŒè²ã®ç»åã®ç»çŽ å€ãè£æ£ããããšãå¯èœãšãªããå³ã¡ãéåžžã®ç»åïŒäŸãã°å·¥æ¥çšå èŠé¡ã§éåžžæ®åœ±ãã被åäœã®ç»åããèªç¶çãæ®åœ±ããç»åçïŒã§ã¯ç¬¬ïŒè²ïœç¬¬ïŒè²ã®ç»åã®ãããã¡ã€ã«ã¯ãã»ãŒçžäŒŒããŠãããšèããããããã®ããããã¿ãŒã³ãåã£ãŠããªã第ïŒè²ã®ç»åã®ãããã¡ã€ã«ã«çžäŒŒãããããã«ç¬¬ïŒè²ã®ç»ååã³ç¬¬ïŒè²ã®ç»åã®ãããã¡ã€ã«ãè£æ£ããããšã«ãã£ãŠããã¿ãŒã³ç §æã®åœ±é¿ãè£æ£ã§ããã When the pattern illumination of this embodiment is used, the pattern by the pattern illumination is not reflected in the second color image. Accordingly, it is possible to correct the pixel values of the first color image and the third color image in which the pattern by the pattern illumination is captured using the second color image as a reference. In other words, it is considered that the profiles of the first to third color images are almost similar in a normal image (for example, an image of a subject that is normally captured by an industrial endoscope or an image captured of the natural world). . Therefore, the influence of pattern illumination can be corrected by correcting the profiles of the first color image and the third color image so as to be similar to the profile of the second color image in which no pattern is captured.
ãïŒïŒç
§æéš
ã以äžãæ¬å®æœåœ¢æ
ã®ã¢ã¯ãã£ããã¿ãŒã³ç
§æãè¡ãç
§æéšã«ã€ããŠèª¬æããããªããç
§æå
ã®åºå°éšãå
æºéšããé¢ããå
èŠé¡ã¹ã³ãŒãã®ãããªè£
眮ãäŸã«ãšã£ãŠèª¬æããããå
æºéšã¯ãå¿
ãããå
æºéšããåºå°ç«¯ãŸã§ãå°å
éšæãçšããè£
眮ã«éå®ãããªãã
5). Illumination Unit An illumination unit that performs active pattern illumination according to this embodiment will be described below. In addition, although demonstrated taking the case of apparatuses, such as an endoscope scope in which the emission part of illumination light was separated from the light source part, a light source part is not necessarily limited to the apparatus using a light guide member from a light source part to an output end.
ãå³ïŒã«ãç
§æéšã®ç¬¬ïŒæ§æäŸã瀺ããå³ïŒã®ç
§æéšã¯ãçœè²å
æºïŒïŒïŒãå°å
éšæïŒïŒïŒïŒéãã¿ãŒã³å
çšå°å
éšæïŒãç
§æçšã¬ã³ãºïŒïŒïŒãèµ€è²ã¬ãŒã¶ãŒå
æºïŒïŒïŒãéè²ã¬ãŒã¶ãŒå
æºïŒïŒïŒããã€ã¯ãã€ãã¯ããªãºã ïŒïŒïŒãïŒïŒïŒïŒåºçŸ©ã«ã¯ãã©ãŒïŒãå°å
éšæïŒïŒïŒïŒãã¿ãŒã³å
çšå°å
éšæïŒããã¹ã¯ãã¿ãŒã³ïŒïŒïŒãæåœ±ã¬ã³ãºïŒïŒïŒãå«ãã
FIG. 9 shows a first configuration example of the illumination unit. 9 includes a
ããã®ç¬¬ïŒæ§æäŸã§ã¯ãç
§æéšãïŒç³»çµ±ã®å°å
éšæïŒïŒïŒãïŒïŒïŒãæããŠãããäžæ¹ã®å°å
éšæïŒïŒïŒã¯ãçœè²å
æºïŒïŒïŒããã®çœè²å
ããã¹ã³ãŒãéšã®å
端éšã«èšããããç
§æçšã¬ã³ãºïŒïŒïŒãŸã§å°å
ããããã®å°å
ãããçœè²å
ã¯ãç
§æçšã¬ã³ãºïŒïŒïŒãä»ããŠè¢«åäœã«ç
§å°ãããã仿¹ã®å°å
éšæïŒïŒïŒã¯ãéè²ã¬ãŒã¶ãŒå
ãšèµ€è²ã¬ãŒã¶ãŒå
ããã¹ã³ãŒãéšã®å
端éšã«èšããããæåœ±ã¬ã³ãºïŒïŒïŒãŸã§å°å
ãããå³ã¡ãéè²ã¬ãŒã¶ãŒå
æºïŒïŒïŒããã®éè²ã¬ãŒã¶ãŒå
ãšèµ€è²ã¬ãŒã¶ãŒå
æºïŒïŒïŒããã®èµ€è²ã¬ãŒã¶ãŒå
ãããã€ã¯ãã€ãã¯ããªãºã ïŒïŒïŒãïŒïŒïŒã«ããå
è·¯åæã«ãããå°å
éšæïŒïŒïŒã«å
¥å°ãããå°å
éšæïŒïŒïŒã«ããå°å
ãããã¬ãŒã¶ãŒå
ã¯ãã¹ã¯ãã¿ãŒã³ïŒïŒïŒãééããããã«ããä»å ããããã¿ãŒã³ãæåœ±ã¬ã³ãºïŒïŒïŒã«ãã被åäœïŒã«æåœ±ãããã
In this first configuration example, the illumination unit has two
ããã®ç¬¬ïŒæ§æäŸã§ã¯ãéãã¿ãŒã³å
ãšãã¿ãŒã³å
ã¯è¢«åäœïŒã®è¡šé¢ã«ãããŠåæãããç
§æã«ãªãã
In this first configuration example, the non-pattern light and the pattern light are combined on the surface of the
ãå³ïŒïŒã«ãç
§æéšã®ç¬¬ïŒæ§æäŸã瀺ããå³ïŒïŒã®ç
§æéšã¯ãçœè²å
æºïŒïŒïŒãåå
çŽ åïŒïŒïŒãéè²ã¬ãŒã¶ãŒå
æºïŒïŒïŒãèµ€è²ã¬ãŒã¶ãŒå
æºïŒïŒïŒããã€ã¯ãã€ãã¯ããªãºã ïŒïŒïŒãïŒïŒïŒããã¹ã¯ãã¿ãŒã³ïŒïŒïŒãåå
çŽ åïŒïŒïŒãããªãºã ïŒïŒïŒïŒåæããªãºã ïŒãå°å
éšæïŒïŒïŒãæåœ±ã¬ã³ãºïŒïŒïŒãå«ãã
FIG. 10 shows a second configuration example of the illumination unit. 10 includes a
ããã®ç¬¬ïŒæ§æäŸã§ã¯ãç
§æéšãïŒç³»çµ±ã®å°å
éšæïŒïŒïŒãæããŠãããçœè²å
æºïŒïŒïŒããã®çœè²å
ã¯ãåå
çŽ åïŒïŒïŒã«ããäŸãã°ïŒ°åå
ïŒïŒ³åå
ã«åçŽãªåå
ïŒã«ããããã®åå
ãããçœè²å
ãããªãºã ïŒïŒïŒã«å
¥å°ãããéè²ã¬ãŒã¶ãŒå
æºïŒïŒïŒããã®éè²ã¬ãŒã¶ãŒå
ãšèµ€è²ã¬ãŒã¶ãŒå
æºïŒïŒïŒããã®èµ€è²ã¬ãŒã¶ãŒå
ã¯ããã€ã¯ãã€ãã¯ããªãºã ïŒïŒïŒãïŒïŒïŒã«ããå
è·¯åæããããå
è·¯åæãããã¬ãŒã¶ãŒå
ã¯ãã¹ã¯ãã¿ãŒã³ïŒïŒïŒãééããããã«ãããã¿ãŒã³ãä»å ãããããã®ãã¿ãŒã³ãä»å ãããã¬ãŒã¶ãŒå
ã¯åå
çŽ åïŒïŒïŒã«ããäŸãã°ïŒ³åå
ïŒããªãºã ïŒïŒïŒã®åå°é¢ã«å¹³è¡ãªåå
ïŒã«ããããã®åå
ãããã¬ãŒã¶ãŒå
ãããªãºã ïŒïŒïŒã«å
¥å°ãããéãã¿ãŒã³å
ã§ããçœè²å
ãšãã¿ãŒã³å
ã§ããã¬ãŒã¶ãŒå
ããããªãºã ïŒïŒïŒã«ããå
è·¯åæã«ããå°å
éšæïŒïŒïŒã«å
¥å°ãããå°å
éšæïŒïŒïŒã¯å
¥å°å
ããã¹ã³ãŒãéšã®å
端éšã«èšããããæåœ±ã¬ã³ãºïŒïŒïŒãŸã§å°å
ãããå°å
ãããéãã¿ãŒã³å
ãšãã¿ãŒã³å
ãæåœ±ã¬ã³ãºïŒïŒïŒã«ãã被åäœïŒã«æåœ±ãããã
In the second configuration example, the illumination unit has one
ããã®ç¬¬ïŒæ§æäŸã§ã¯ãå°å
éšæïŒïŒïŒã®å
¥å°åã«äºããã¿ãŒã³ç
§æãçæããéãã¿ãŒã³å
ãšåæãããäžã§å°å
éšæïŒïŒïŒã«å
¥å°ããããå°å
éšæïŒïŒïŒããã¿ãŒã³ããã®ãŸãŸäŒéã§ããå
ãã¡ã€ããŒãã³ãã«ïŒã€ã¡ãŒãžã¬ã€ãïŒã§ããã°ãçæãããã¿ãŒã³å
ã®ãã¿ãŒã³ãäŒéãããŠå¥œéœåã§ãããå°å
éšæïŒïŒïŒãåãªãå
ãã¡ã€ããŒãã³ãã«ïŒã©ã€ãã¬ã€ãïŒã®ãšãã¯ãå°å
éšæïŒïŒïŒã®å
¥å°ç«¯ãšåºå°ç«¯ã«ãŠãã¡ã€ããŒã®é
åãç°ãªã£ãŠããå¯èœæ§ãé«ãã®ã§ãçæãããã¿ãŒã³å
ã®ãã¿ãŒã³ããã®ãŸãŸäŒéã§ããªããããããªãããã¿ãŒã³ãã©ã³ãã ãã¿ãŒã³ã§ããã°è¯ãå Žåãå
¥å°ç«¯ãšåºå°ç«¯ã®ãã¿ãŒã³ãé©åºŠã«å€åããŠããŠãæ§ããªãã
In this second configuration example, pattern illumination is generated in advance before the
ãïŒïŒæ®åéš
ã以äžã芳å¯ã¢ãŒããšã¹ãã¬ãªèšæž¬ã¢ãŒããåãæ¿ãå¯èœãªæ®åéšã®æ§æäŸã«ã€ããŠèª¬æããã
6). Imaging Unit Hereinafter, a configuration example of an imaging unit capable of switching between the observation mode and the stereo measurement mode will be described.
ãå èŠé¡è£ 眮ã§ã®æ€æ»ã§ã¯ãäŸãã°æ€æ»å¯Ÿè±¡ã«ã¹ã³ãŒããæ¿å ¥ããŠéåžžã®ç»åãæ®åœ±ããªããç°åžžããªãããã§ãã¯ããŠãããå·ãªã©ã®è©³çްã«èгå¯ãããéšåãèŠã€ãã£ããšãã«ããã®éšåã®ïŒæ¬¡å 圢ç¶ãèšæž¬ããŠæŽãªãæ€æ»ãå¿ èŠããæ€èšããããã®ããã«ãéåžžã®èгå¯ç»åã¯çœè²å ã§æ®åœ±ãè¡ãããã®ãããªçœè²å ã§ã®æ®åœ±ãšã¹ãã¬ãªèšæž¬ãäž¡ç«ããæ¹æ³ãšããŠãäŸãã°çœè²å ã§ã¹ãã¬ãªæ®åœ±ãè¡ãããšãèãããããããããªãããã¹ãã¬ãªæ®åœ±ã§çœè²å ãçšããå Žåãã€ã¡ãŒãžã»ã³ãµãå·Šå³ã«åå²ããŠãããããã®é åã«å·Šç»åãšå³ç»åãçµåãããå¿ èŠããããããç»åãäœè§£åã«ãªããã€ã¡ãŒãžã»ã³ãµã®åäžé åã«å·Šç»åãšå³ç»åãçµåããææ³ãšããŠã¯ãã«ã©ãŒäœçžå·®æ³ãããããæ®åœ±ãããç»åã¯è²ããç»åã«ãªããã芳å¯ç»åãšããŠçšããããšãã§ããªãã In an inspection with an endoscopic device, for example, a scope is inserted into the inspection object and a normal image is taken to check for abnormalities. Measure the three-dimensional shape and examine whether further inspection is necessary. Thus, a normal observation image is taken with white light. As a method of achieving both such shooting with white light and stereo measurement, for example, performing stereo shooting with white light can be considered. However, when white light is used in stereo shooting, the image sensor is divided into left and right parts, and the left image and the right image need to be imaged in the respective regions. As a method for forming the left image and the right image in the same area of the image sensor, there is a color phase difference method. However, since the captured image becomes a color shift image, it cannot be used as an observation image.
ãäžèšã®ããšãããçœè²å ã§ã€ã¡ãŒãžã»ã³ãµã®åäžé åã§å·Šç»åãšå³ç»åãåãããã«ã¯ãæåå²åãæ¿ãïŒäŸãã°ç¹èš±æç®ïŒïŒãå¿ èŠãšãªããããããªãããæ®åç³»ãšè¢«åäœãçžå¯Ÿçã«åããå Žåã«ã¯å·Šç»åãšå³ç»åã®éã«åããã¬ããããããäžè§æž¬å®ãäžæ£ç¢ºã«ãªã£ãŠããŸããç¹ã«å èŠé¡ã®ããã«ã«ã¡ã©ã被åäœã«å¯ŸããŠåºå®ã§ããªãå Žåã«ã¯ãåããã¬ãçºçããããã From the above, in order to capture the left image and the right image in the same area of the image sensor with white light, time division switching (for example, Patent Document 1) is required. However, when the imaging system and the subject move relative to each other, there is a motion blur between the left image and the right image, so that the triangle measurement becomes inaccurate. In particular, motion blur is likely to occur when the camera cannot be fixed to the subject as in an endoscope.
ãã«ã©ãŒäœçžå·®ã§éæåå²ã«ã¹ãã¬ãªèšæž¬ãè¡ãææ³ãšããŠãäŸãã°äžè¿°ããç¹èš±æç®ïŒããããããããªãããç¹èš±æç®ïŒã¯ãªãŒããã©ãŒã«ã¹ã«ã¹ãã¬ãªèšæž¬ãé©çšãããã®ã§ããã芳å¯ç»åãšã®é«éãªåãæ¿ããæ³å®ããŠããªããšèãããããäžè¿°ããããã«ãå¯åéšã§ãããã£ã«ã¿ãïŒã€ããããé«éãªåãæ¿ããšããç¹ã§ã¯äžå©ãšèããããã
As a technique for performing stereo measurement in a non-time-division manner using a color phase difference, for example, there is
ããŸããç¹èš±æç®ïŒã®æ§æã§ã¯åäžå
è·¯ãçãäžã§å·Šå³ã«åããã ããªã®ã§ç³éã®è·é¢ãé¢ãããšãé£ãããè·é¢æž¬å®ã®ç²ŸåºŠãåºãã«ãããšããåé¡ããããå
èŠé¡è£
眮ã§ã¯ãã³ãã©ãŒã«ã¹ãå¿
èŠã§ããããçµããå°ããïŒïŒŠå€ã倧ããïŒã®ã§ããã®å°ããªçµãåŸãå·Šå³ã«åããããšã«ãªããç³éã®è·é¢ãè¿ããªããããã
Further, the configuration of
ããŸããã¹ãã¬ãªã«ãããå·Šå³ã®æåå²åãæ¿ããå«ããŠãæåå²ã®åãæ¿ãã§ã¯ã·ã£ãã¿ãåå
ãã£ã«ã¿ãæ©æ¢°çã«åããïŒåãæ¿ããïŒå¿
èŠããããæ©æ¢°çãªåãã§ã¯ãã¹ãæ
éãçºçãããããã·ã£ãã¿ãåå
ãã£ã«ã¿ãåãæ¿ãã®ãããã®ç¶æ
ïŒäœçœ®ïŒã«ããããæ€åºãããšã©ãŒãããã°ä¿®åŸ©ããå¿
èŠããããšããåé¡ãããããã®ãããªæ€åºæ©èœãå®çŸããå Žåããšã©ãŒã®çš®é¡ãå°ãªãæ¹ãæ€åºã修埩ã容æã§ãããäŸãã°ç¹èš±æç®ïŒã®æ§æã§ã¯ãïŒã€ã®åå
ãã£ã«ã¿ã®ãã¡äž¡æ¹ãç³ã«æ¿å
¥ãããªãã£ãå Žåãäžæ¹ã ãç³ã«æ¿å
¥ãããå Žåãªã©ãè€æ°ã®çš®é¡ã®ãšã©ãŒãçºçãåŸãã®ã§ãæ€åºã修埩ã確å®ã«è¡ãããšãé£ãããªãã
In addition, it is necessary to mechanically move (switch) the shutter and the spectral filter for time division switching, including left and right time division switching in stereo. Since mechanical movements cause mistakes and failures, there is a problem that it is necessary to detect which state (position) of the shutter and the spectral filter is switched, and to repair any errors. When realizing such a detection function, detection and repair are easier when there are fewer types of errors. For example, in the configuration of
ãå³ïŒïŒãå³ïŒïŒã«ãäžèšã®ãããªèª²é¡ã解決ã§ããæ¬å®æœåœ¢æ ã®æ®åéšã®æ§æäŸã瀺ããå³ïŒïŒãå³ïŒïŒã«ã¯ãæ®åéšã暪ããèŠãïŒå 軞ãå«ãå¹³é¢ã§ã®ïŒæé¢å³ãšãæ®åçŽ åäžã®çµåã®å éïŒåã¯æ®åçŽ åã«æ®åãããç»åã®ç»çŽ å€ïŒãšäœçœ®ïœã®é¢ä¿ãšãã瀺ããäœçœ®ïœã¯ãçµåå åŠç³»ã®å 軞ã«åçŽãªæ¹åã«ãããäœçœ®ïŒåº§æšïŒã§ãããäŸãã°æ®åçŽ åã®ç»çŽ äœçœ®ã§ãããå®éã«ã¯ïŒæ¬¡å ã®åº§æšç³»ã§ããããããã§ã¯ïŒæ¬¡å ã®ãã¡èŠå·®æ¹åã®ïŒæ¬¡å ã®åº§æšç³»ã§èª¬æããã 11 and 12 show a configuration example of the imaging unit of the present embodiment that can solve the above-described problems. 11 and 12 are cross-sectional views (on a plane including the optical axis) of the image pickup unit, and a light amount of an image formed on the image pickup element (or a pixel value of an image picked up by the image pickup element). The relationship of the position x is shown. The position x is a position (coordinates) in a direction perpendicular to the optical axis of the imaging optical system, for example, a pixel position of the image sensor. Actually, it is a two-dimensional coordinate system, but here, a two-dimensional one-dimensional coordinate system in the parallax direction will be described.
ãæ®åéšã¯ãçµåå
åŠç³»ïŒïŒãå¯åãã¹ã¯ïŒïŒïŒç¬¬ïŒã®ãã¹ã¯ïŒãåºå®ãã¹ã¯ïŒïŒïŒç¬¬ïŒã®ãã¹ã¯ïŒãæ®åçŽ åïŒïŒïŒæ®åã»ã³ãµãã€ã¡ãŒãžã»ã³ãµïŒãç
§æéšïŒïŒïŒç
§æè£
眮ïŒãå«ããçµåå
åŠç³»ïŒïŒã¯ãåçŒã®å
åŠç³»ã§ãããäŸãã°ïŒåã¯è€æ°ã®ã¬ã³ãºã§æ§æããããããã§ã¯æ®åçŽ åïŒïŒãã®ãã€ã€é
åã®ã«ã©ãŒãã£ã«ã¿ãæããå ŽåãäŸã«èª¬æããããããã«éå®ããããäŸãã°è£è²ãã£ã«ã¿çãæããŠãããã
The imaging unit includes an imaging
ãå³ïŒïŒãå³ïŒïŒã«ç€ºãããã«ã被åäœïŒããã®åå°å
ãçµåå
åŠç³»ïŒïŒã«ããæ®åçŽ åïŒïŒã®é¢äžã«çµåãããããã®ãšããåºå®ãã¹ã¯ïŒïŒã«ããç³äžå¿ãšå·Šå³ç³ã«åå²ãããå¯åãã¹ã¯ïŒïŒã«ãããç³äžå¿ã«ããçµåãšãå·Šå³ç³ã«ããçµåãåãæ¿ããããããããã¯æ®åçŽ åïŒïŒã®åäžé åã«çµåãããïœã¯ãå·Šç³ïŒåºå®ãã¹ã¯ïŒïŒã®å·ŠçŒçµãåïŒã®äžå¿ç·ïŒ©ïŒ£ïŒãšå³ç³ïŒåºå®ãã¹ã¯ïŒïŒã®å³çŒçµãåïŒã®äžå¿ç·ïŒ©ïŒ£ïŒãšã®éã®è·é¢ã§ãããã¹ãã¬ãªèšæž¬ã«ãããŠã¯åºç·é·ãšãªãããªããçŽç·ïŒ¡ïŒžïŒ£ã¯ãçµåå
åŠç³»ïŒïŒã®å
軞ã§ãããäžå¿ç·ïŒ©ïŒ£ïŒãïŒã¯ãäŸãã°äžçŒã®çµåå
åŠç³»ïŒïŒã®å
軞ããçè·é¢ã«èšãããããäžå¿ç·ïŒ©ïŒ£ïŒãïŒãšå
軞ã¯åäžå¹³é¢å
ã§ããããšãæãŸããããå¿
ãããåäžå¹³é¢å
ã§ãªããšãããã
As shown in FIGS. 11 and 12, the reflected light from the
ãåºå®ãã¹ã¯ïŒïŒãå¯åãã¹ã¯ïŒïŒã¯ãäŸãã°çµåå
åŠç³»ïŒïŒã®ç³äœçœ®ã«èšãããããæãã¯çµåå
åŠç³»ïŒïŒãããçµååŽã«èšããããŠããããåºå®ãã¹ã¯ïŒïŒã¯çµåå
åŠç³»ïŒïŒã«å¯ŸããŠåºå®ãããŠãããå¯åãã¹ã¯ïŒïŒã¯å
軞ã«åçŽãªå¹³é¢å
ã§äœçœ®ãåãæ¿ããããæ§æãšãªã£ãŠãããå¯åãã¹ã¯ïŒïŒã¯ãå³ïŒïŒã«ç€ºã第ïŒã®ç¶æ
ã§ãã芳å¯ã¢ãŒãïŒç¬¬ïŒã®ã¢ãŒããéã¹ãã¬ãªã¢ãŒããåçŒã¢ãŒãïŒãšãå³ïŒïŒã«ç€ºã第ïŒã®ç¶æ
ã§ããã¹ãã¬ãªèšæž¬ã¢ãŒãïŒç¬¬ïŒã®ã¢ãŒããã¹ãã¬ãªã¢ãŒãïŒã®ïŒã€ã®ã¢ãŒãããšãããšãã§ããããããé«éã«åãæ¿ããããããã«ãªã£ãŠããã
The fixed
ãåºå®ãã¹ã¯ïŒïŒã¯ãïŒã€ã®çµãåïŒå·ŠçŒçµãåãå³çŒçµãåãäžå¿çµãåïŒãèšããããæ¿ç¶ã®é®å
éšïŒé®å
éšæïŒãšãå·ŠçŒçµãåã«èšããããçæ³¢é·ïŒéè²ïŒåå
ãã£ã«ã¿ãšãå³çŒçµãåã«èšããããé·æ³¢é·ïŒèµ€è²ïŒåå
ãã£ã«ã¿ãšããå«ããçµãå以å€ã®éšåã¯é®å
éšã§èŠãããŠãããå
ãééããªãããã«ãªã£ãŠãããäžå¿çµãåã¯ãäŸãã°è²«é穎ã§ãã£ãŠãããããæãã¯äœããã®åå
ãã£ã«ã¿ïŒäŸãã°ãå°ãªããšãçœè²å
ãééããåºåž¯åã®åå
ãã£ã«ã¿ïŒãèšããããŠãããã
The fixed
ãå¯åãã¹ã¯ïŒïŒã¯ãïŒã€ã®çµãåãèšããããæ¿ç¶ã®é®å
éšïŒé®å
éšæïŒãå«ããåã¢ãŒãã«ãããŠãåºå®ãã¹ã¯ïŒïŒã®ïŒã€ã®çµãåã®ãã¡äžå¿çµãååã¯å·Šå³çŒçµãåãé®å
éšãèŠãããããªå€§ããã«å¯åãã¹ã¯ïŒïŒãæ§æãããŠãããçµãåã¯ã芳å¯ã¢ãŒãã«ãããŠåºå®ãã¹ã¯ïŒïŒã®äžå¿çµãåã«éãªãäœçœ®ãšãã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããŠå·ŠçŒçµãåãšå³çŒçµãåã«éãªãäœçœ®ãšãã«èšããããã以äžã§ã¯ã䟿å®çã«ãå¯åãã¹ã¯ïŒïŒã«ãããŠãå·ŠçŒçµãåãå³çŒçµãåãäžå¿çµãåãšåŒã¶ãå³ïŒïŒãå³ïŒïŒã§ã¯å¯åãã¹ã¯ïŒïŒãåºå®ãã¹ã¯ïŒïŒãããçµååŽã«èšããããå Žåãå³ç€ºããŠããããå¯åãã¹ã¯ïŒïŒãåºå®ãã¹ã¯ïŒïŒããã察ç©åŽã«èšããããŠãããã
The
ãç
§æéšïŒïŒã¯ããã®å
端éšïŒç
§æã®å°åºç«¯ïŒãå·Šç³ãšå³ç³ã«å¯ŸããŠå¯Ÿç§°ãªäœçœ®ãšãªãããã«ãèšããããããšãæãŸããããå¿
ãããç
§æéšïŒïŒã®å
è² ãå·Šç³ãšå³ç³ã«å¯ŸããŠå¯Ÿç§°ã§ãªããŠããããå³ïŒïŒãå³ïŒïŒã§ã¯ç
§æéšïŒïŒã®å
端éšãçµåå
åŠç³»ïŒïŒãããåã«é
眮ãããŠããããããã«éå®ããããäŸãã°æ®åéšã®å
端éšã«ãããŠç
§æéšïŒïŒãšçµåå
åŠç³»ïŒïŒã暪䞊ã³ã«é
眮ãããŠãããã
The
ã以äžãåºå®ãã¹ã¯ïŒïŒã®å·ŠçŒçµãåãå³çŒçµãåãäžå¿çµãåã®åå
ç¹æ§ããããšè¡šèšããããŸããåãããããããããã«ãåçµãåã«èšããããåå
ãã£ã«ã¿ã«ã€ããŠããåã笊å·ïŒŠïŒ¬ããã§è¡šèšãããã®ãšããããªããå¯åãã¹ã¯ïŒïŒã®åçµãåã«ã¯åå
ãã£ã«ã¿ãèšããããŠãããïŒéæŸåã§ããïŒãå
šåž¯åãééãããã
Hereinafter, the spectral characteristics of the left eye aperture, the right eye aperture, and the center aperture of the fixed
ãå³ïŒïŒã¯èгå¯ã¢ãŒãã®ç¶æ
ã瀺ããŠãããç³äžå¿ã®å
è·¯ã¯åºå®ãã¹ã¯ïŒïŒã®äžå¿çµãåãšå¯åãã¹ã¯ã®äžå¿çµãåãä»ããŠéæŸãããç¶æ
ãšãªããå·Šå³ç³ã®å
è·¯ã¯å¯åãã¹ã¯ïŒïŒã«ãã鮿ïŒé®å
ïŒãããç¶æ
ã«ãªã£ãŠããããã®å Žåãæ®åçŽ åïŒïŒã«çµåãããç»åã¯ç³äžå¿ã®ã¿ã«ããçµåç»åãšãªããéåžžã®ïŒåçŒã«ããçœè²å
ã®ïŒæ®åç»åãåŸãããã
FIG. 11 shows the state of the observation mode, where the optical path at the center of the pupil is opened through the central aperture of the fixed
ãäžæ¹ãå³ïŒïŒã¯ã¹ãã¬ãªèšæž¬ã¢ãŒãã®ç¶æ
ã瀺ããŠãããåºå®ãã¹ã¯ïŒïŒã®å·ŠçŒçµãåãšå¯åãã¹ã¯ïŒïŒã®å·ŠçŒçµãåãéãªã£ãç¶æ
ãšãªããåºå®ãã¹ã¯ïŒïŒã®å³çŒçµãåãšå¯åãã¹ã¯ïŒïŒã®å³çŒçµãåãéãªã£ãç¶æ
ãšãªã£ãŠãããç³äžå¿ã®å
è·¯ã¯å¯åãã¹ã¯ïŒïŒã«ãã鮿ïŒé®å
ïŒãããç¶æ
ã«ãªã£ãŠãããå³ã¡ãå·Šç³åŽã®å
è·¯ã¯ãçµåå
ãçæ³¢é·ïŒéè²ïŒåå
ãã£ã«ã¿ïŒŠïŒ¬ïŒç¬¬ïŒã®ãã£ã«ã¿ïŒã«ãããã£ã«ã¿ãªã³ã°ãããã®çæ³¢é·æåã«ããç»åïŒ©ïŒ¬ãæ®åçŽ åïŒïŒã«çµåãããå³ç³åŽã®å
è·¯ã¯ãçµåå
ãé·æ³¢é·ïŒèµ€è²ïŒåå
ãã£ã«ã¿ïŒŠïŒ²ïŒç¬¬ïŒã®ãã£ã«ã¿ïŒã«ãããã£ã«ã¿ãªã³ã°ãããã®é·æ³¢é·æåã«ããç»åãåäžã®æ®åçŽ åïŒïŒã«çµåããã
On the other hand, FIG. 12 shows a state of the stereo measurement mode, in which the left eye diaphragm hole of the fixed
ããããã£ãŠã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ãæ®åçŽ åïŒïŒã®éè²ç»çŽ ã«ããåŸãããç»åã¯çæ³¢é·ç»åãšãªããæ®åçŽ åïŒïŒã®èµ€è²ç»çŽ ã«ããåŸãããç»åã¯é·æ³¢é·ç»åãšãªããïŒã€ã®å
è·¯ããã®ç»åããåé¢ååŸããããšãã§ãããã€ãŸãã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ãäœçžå·®ããã£ãå·ŠçŒç»åãšå³çŒç»åãåæã«äžã€ç¬ç«ããŠåŸãããšãã§ããäœçžå·®ç»åã«ããã¹ãã¬ãªèšæž¬ãå¯èœãšãªãããŸãæ¬å®æœåœ¢æ
ã®æ®åéšã¯ã第ïŒã®ç¶æ
ã§ãã芳å¯ã¢ãŒããšç¬¬ïŒã®ç¶æ
ã§ããã¹ãã¬ãªèšæž¬ã¢ãŒãããšãããšãå¯èœã§ããããããã®ç¶æ
ãé«éã«åãæ¿ããããšãå¯èœã§ãããããã«ãã£ãŠãäŸãã°éã¹ãã¬ãªã®é垞芳å¯ãè¡ããªããããªã¢ã«ã¿ã€ã ã«ïŒïŒ€èšæž¬ãè¡ãããšãå¯èœãšãªãã
Therefore, in the stereo measurement mode, the image IL obtained from the blue pixels of the
ãïŒïŒåºå®ãã¹ã¯ãå¯åãã¹ã¯ã®ç¬¬ïŒã®è©³çŽ°ãªæ§æäŸ
ãå³ïŒïŒãå³ïŒïŒã«åºå®ãã¹ã¯ïŒïŒãå¯åãã¹ã¯ïŒïŒã®ç¬¬ïŒã®è©³çŽ°ãªæ§æäŸã瀺ããå³ïŒïŒãå³ïŒïŒã«ã¯ãçµåå
åŠç³»ïŒïŒãšåºå®ãã¹ã¯ïŒïŒãšå¯åãã¹ã¯ïŒïŒã®æé¢å³ãšãåºå®ãã¹ã¯ïŒïŒãšå¯åãã¹ã¯ïŒïŒãå
軞æ¹åã«èŠãå³ïŒçµååŽããèŠãèé¢å³ïŒãšãã瀺ãã
7). First Detailed Configuration Example of Fixed Mask and Movable Mask FIGS. 13 and 14 show a first detailed configuration example of the fixed
ãåºå®ãã¹ã¯ïŒïŒã®å·Šç³ã®å
è·¯ã«ã¯ãçæ³¢é·ãã£ã«ã¿ïŒŠïŒ¬ãæããçµãåïŒïŒãéããŠãããå³ç³ã®å
è·¯ã«ã¯ãé·æ³¢é·åå
ãã£ã«ã¿ïŒŠïŒ²ãæããçµãåïŒïŒãæ§æãããŠãããç³äžå¿ã®å
è·¯ã«ã¯éæŸç¶æ
ïŒã¹ã«ãŒããŒã«ïŒã®çµãåïŒïŒãèšããããŠããããªããçµãåïŒïŒã«ã¯å³ïŒã®åž¯åïœïŒãïœïŒãïœïœãééãããåå
ãã£ã«ã¿ïŒŠïŒ£ãèšããããŠããããçµãåïŒïŒãïŒïŒã¯é®å
éšïŒïŒïŒé®å
éšæïŒã«éããããŠãããäŸãã°æ®åç³»ã«å¿
èŠãªè¢«åçæ·±åºŠã«å¯Ÿå¿ãããµã€ãºã®åïŒäŸãã°å圢ç¶ã®åã§ããµã€ãºã¯çŽåŸïŒã§ãããçµãåïŒïŒãïŒïŒãïŒïŒã®äžå¿ïŒäŸãã°åã®äžå¿ïŒã¯ãããããäžå¿ç·ïŒ©ïŒ£ïŒãïŒãå
軞ã«äžèŽïŒç¥äžèŽãå«ãïŒããŠãããé®å
éšïŒïŒã¯ãçµåå
åŠç³»ïŒïŒãåããããçäœãæ£é¢ïŒåã¯èé¢ïŒããèŠããšãã«çäœãå¡ãããã«èšããããŠãããäŸãã°å
軞ã«å¯ŸããŠåçŽã«èšããããæ¿ç¶éšæã§ããã
A
ãå¯åãã¹ã¯ïŒïŒã¯ãéæŸç¶æ
ïŒã¹ã«ãŒããŒã«ïŒã®çµãåïŒïŒãïŒïŒãïŒïŒãšããã®çµãåïŒïŒãïŒïŒãïŒïŒãéããããé®å
éšïŒïŒïŒé®å
éšæïŒãšããæãããçµãåïŒïŒãïŒïŒãïŒïŒã¯ãäŸãã°åºå®ãã¹ã¯ïŒïŒã®çµãåïŒïŒãïŒïŒãïŒïŒãããå°ã倧ãããµã€ãºã®åã§ãããæãã¯ãæ®åç³»ã«å¿
èŠãªè¢«åçæ·±åºŠã«å¯Ÿå¿ãããµã€ãºã®åïŒäŸãã°å圢ç¶ã®åã§ããµã€ãºã¯çŽåŸïŒã§ãã£ãŠããããçµãåïŒïŒã®äžå¿ïŒäŸãã°åã®äžå¿ïŒã¯ã芳å¯ã¢ãŒãã«ãããŠå
軞ã«äžèŽïŒç¥äžèŽãå«ãïŒããŠãããé®å
éšïŒïŒã¯ãå
軞ã«å¯ŸããŠåçŽãªå転軞ïŒïŒã«æ¥ç¶ãããŠãããäŸãã°å
軞ã«å¯ŸããŠåçŽã«èšããããæ¿ç¶éšæã§ãããé®å
éšïŒïŒã®åœ¢ç¶ã¯ãäŸãã°æåïŒæã®æ ¹å
ã軞ïŒïŒã«æ¥ç¶ãããïŒã§ããããããã«éå®ããããå³ïŒïŒåã³å³ïŒïŒã®ç¶æ
ãå®çŸã§ãã圢ç¶ã§ããã°ããã
The
ãå¯åãã¹ã¯ïŒïŒã¯ãå転軞ïŒïŒãäžå¿ãšããŠå
軞ã«åçŽãªæ¹åã«æå®ã®è§åºŠã ãå転ããæ§æãšãªã£ãŠãããäŸãã°ããšãŸçŽ åãã¢ãŒã¿ãŒçã«ãã£ãŠå転éåãå®çŸã§ãããå³ïŒïŒã®èгå¯ã¢ãŒãã«ãããŠã¯ãå¯åãã¹ã¯ïŒïŒã¯å³çŒåŽã«æå®ã®è§åºŠã ãå転ããŠåŸããåºå®ãã¹ã¯ïŒïŒã®ç³äžå¿å
è·¯ïŒçµãåïŒïŒïŒã¯éæŸç¶æ
ãšãªããå·Šå³ç³å
è·¯ïŒçµãåïŒïŒãïŒïŒïŒã¯é®å
ç¶æ
ãšãªããå³ïŒïŒã®ã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããŠã¯ãå¯åãã¹ã¯ïŒïŒã¯å·ŠçŒåŽã«æå®ã®è§åºŠã ãå転ããŠåŸããåºå®ãã¹ã¯ïŒïŒã®ç³äžå¿å
è·¯ïŒçµãåïŒïŒïŒã¯é®å
ç¶æ
ãšãªããå·Šå³ç³å
è·¯ïŒçµãåïŒïŒãïŒïŒïŒã¯éæŸç¶æ
ãšãªããåå
ãã£ã«ã¿ïŒŠïŒ¬ãæããçµãåïŒïŒãé²åãããããšã«ããå·Šç³ã¯çæ³¢é·æåã®ã¿ãééãããåå
ãã£ã«ã¿ïŒŠïŒ²ãæããçµãåïŒïŒãé²åºãããããšã«ããå³ç³ã¯é·æ³¢é·æåã®ã¿ãééãããã
The
ããªããäžèšã§ã¯å¯åãã¹ã¯ïŒïŒãæå®è§åºŠã ã軞å転ããããšã«ããïŒã€ã®ç¶æ
ãäœãå Žåã説æããããããã«éå®ãããªããäŸãã°ãã¹ã©ã€ãåäœã«ããå¯åãã¹ã¯ïŒïŒãç§»åãããŠïŒã€ã®ç¶æ
ãäœããã®ã§ããããå転åäœåã¯ã¹ã©ã€ãåäœã¯ãäŸãã°ãã°ãããæ©æ§ããå§é»æ©æ§ãªã©ã§å®çŸå¯èœã§ãããé«éæ§ãèä¹
æ§ãèæ
®ããŠé©åãªãã®ãéžæããã°ããã
In the above description, the case where two states are formed by rotating the
ã以äžã®å®æœåœ¢æ
ã«ããã°ãæ®åè£
眮ïŒå
èŠé¡è£
眮ïŒã¯ãæ®åçŽ åïŒïŒãšãçµåå
åŠç³»ïŒïŒãšãåºå®ãã¹ã¯ïŒïŒãšãå¯åãã¹ã¯ïŒïŒãšããå«ããçµåå
åŠç³»ïŒïŒã¯ãæ®åçŽ åïŒïŒã«è¢«åäœïŒãçµåããããåºå®ãã¹ã¯ïŒïŒã¯ãçµåå
åŠç³»ïŒïŒã®ç³ãåå²ãã第ïŒïœç¬¬ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒãïŒïŒïŒãšã第ïŒã®æ³¢é·åž¯åïŒå³ïŒã®ïŒ°ïœåã¯å³ïŒã®ïŒ°ïœïŒïŒãééããã第ïŒã®ãã£ã«ã¿ïŒŠïŒ¬ãšã第ïŒã®æ³¢é·åž¯åïŒå³ïŒã®ïŒ°ïœåã¯å³ïŒã®ïŒ°ïœïŒïŒãééããã第ïŒã®ãã£ã«ã¿ïŒŠïŒ²ãšãæãããå¯åãã¹ã¯ïŒïŒã¯ãé®å
éšïŒïŒãšã第ïŒïœç¬¬ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒãïŒïŒïŒã«å¯Ÿå¿ããŠé®å
éšïŒïŒã«èšãããã第ïŒïœç¬¬ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒãïŒïŒïŒãšãæããçµåå
åŠç³»ïŒïŒã«å¯ŸããŠå¯åã§ããããããŠã第ïŒã®ãã£ã«ã¿ïŒŠïŒ¬ã¯ã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«èšããããã第ïŒã®ãã£ã«ã¿ïŒŠïŒ²ã¯ã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«èšããããã第ïŒã®éå£ïŒçµãåïŒïŒïŒã¯ãçµåå
åŠç³»ïŒïŒã®å
軞äžã«èšããããã
According to the above embodiment, the imaging device (endoscope device) includes the
ããã®ãããªæ§æã«ããããšã§ãå³ïŒïŒïœå³ïŒïŒã§èª¬æãããããªèŠ³æž¬ã¢ãŒããšã¹ãã¬ãªèšæž¬ã¢ãŒãã®åãæ¿ããå¯èœãšãªãããŸãã«ã©ãŒäœçžå·®æ³ã«ãããèŠå·®ç»åãåæã«ïŒæåå²ã§ãªãïŒååŸã§ããã®ã§ãæ£ç¢ºãªã¹ãã¬ãªèšæž¬ãå¯èœã«ãªãããŸããå¯åéšã§ããå¯åãã¹ã¯ïŒïŒãïŒã€ãªã®ã§ãåãæ¿ãã®é«éåããé§åæ©æ§ã®ç°¡çŽ åãã¢ãŒãåãæ¿ãã«ãããæ
éããšã©ãŒã®æå¶ãå®çŸã§ããããŸããå¯åãã¹ã¯ïŒïŒã«ã¯é®å
éšïŒïŒã«éå£ïŒçµãåïŒïŒãïŒïŒãïŒïŒïŒãèšããããç°¡çŽ ãªæ§æã§ãããåãæ¿ãã®æ¯åã«ãããã£ã«ã¿å€ããªã©ã®ãã©ãã«ãæå¶ã§ããããŸããåºå®ãã¹ã¯ïŒïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã«ããå·Šå³ã®ç³ãæç¢ºã«åããããããã¹ãã¬ãªèšæž¬ã«ãããåºç·é·ïŒå³ïŒïŒãå³ïŒïŒã®ïœïŒã倧ãããšãããããæ£ç¢ºãªè·é¢æž¬å®ãå¯èœã«ãªãã
With this configuration, it is possible to switch between the observation mode and the stereo measurement mode as described with reference to FIGS. Moreover, since the parallax images in the color phase difference method can be acquired simultaneously (not in time division), accurate stereo measurement can be performed. In addition, since there is one
ããŸããäŸãã°å·Šå³ç³ã®äžæ¹ã䜿ã£ãŠäžçŒã®èгå¯ç»åãæ®åœ±ããæ¯èŒäŸãèãããšãå
軞ããå€ããç³ã§ã®æ®åœ±ã«ãªãããã®ç¹ãæ¬å®æœåœ¢æ
ã§ã¯ãåºå®ãã¹ã¯ïŒïŒã«ïŒã€ã®éå£ïŒçµãåïŒïŒãïŒïŒãïŒïŒïŒãèšãããã®ãã¡ã®ïŒã€ãå
軞äžã«èšããããšã§ã芳å¯ç»åãç³äžå¿ç»åã«ãªããããã«ãããå
ç·ã®ã±ã©ã¬ãå°ãããªããåºèŠéè§ãªèгå¯ç»åãååŸã§ããããŸããé«å質ãªïŒäŸãã°æªã¿ãå°ãªãïŒçµåãåŸãããšãã§ããã
Also, for example, when considering a comparative example in which an observation image of a single eye is photographed using one of the left and right pupils, photographing is performed with a pupil off the optical axis. In this respect, in this embodiment, the fixed
ããŸãã被åäœïŒäžã®äœçœ®ãšç»çŽ äœçœ®ãšã®å¯Ÿå¿ãèããå Žåãã¹ãã¬ãªèšæž¬ã§ã®äœçžå·®ïŒå³ïŒïŒã®ïœïŒã®äžå¿ïŒïœïŒïŒã®äœçœ®ïŒã¯ãç³äžå¿ãéãå
ç·ãšäžèŽãããå³ã¡ãæ¬å®æœåœ¢æ
ã§ã¯ã芳å¯ç»åãšè·é¢ãããã®åäžç»çŽ ã¯ã被åäœïŒäžã®åäžäœçœ®ã«å¯Ÿå¿ãããäžæ¹ãäžèšã®æ¯èŒäŸã§ã¯èгå¯ç»åãå·Šã«èŠå·®ãæã£ãŠãããç³äžå¿ã§ãªãããã被åäœïŒäžã®åäžäœçœ®ã«å¯ŸããŠã芳å¯ç»åãšè·é¢ãããã®ç°ãªãç»çŽ ã察å¿ããŠããŸãã芳å¯ç»åãšïŒæ¬¡å
æ
å ±ãéããŠè¡šç€ºããå Žåãªã©ã«ãããŠãæ¬å®æœåœ¢æ
ã®æ¹ãæå©ã§ããã
Also, when considering the correspondence between the position on the
ãæ¬å®æœåœ¢æ
ã§ã¯ã第ïŒã®éå£ïŒçµãåïŒïŒïŒãå·Šç³ã«å¯Ÿå¿ãã第ïŒã®éå£ïŒçµãåïŒïŒïŒã¯å³ç³ã«å¯Ÿå¿ãã第ïŒã®éå£ïŒçµãåïŒïŒïŒãç³äžå¿ã«å¯Ÿå¿ããããªãã第ïŒã®éå£ãå³ç³ã«å¯Ÿå¿ãã第ïŒã®éå£ãå·Šç³ã«å¯Ÿå¿ãããããŸãã䟿å®çã«ã¹ãã¬ãªèšæž¬ã«ãããç³ãå·Šå³ã«åé¢ããŠããããç³ã®å颿¹åã¯å·Šå³ã«éããªããæ¬å®æœåœ¢æ
ã§ã¯éå£ãçµãåãšåŒãã§ããããéå£ã¯çµããšããŠã®æ©èœïŒç³ãééããå
æã®æé¢ç©ãå¶éããæ©èœïŒãå¿
ãããæããªããŠããããäŸãã°ã芳å¯ã¢ãŒãã«ãããŠçµãåïŒïŒãïŒïŒãéãªãããçµãåïŒïŒã®æ¹ãå°ããå Žåã«ã¯çµãåïŒïŒãçµãã®æ©èœãæããçµãåïŒïŒã®æ¹ãå°ããå Žåã«ã¯çµãåïŒïŒãçµãã®æ©èœãæããããšã«ãªãã
In the present embodiment, the first aperture (diaphragm aperture 21) corresponds to the left pupil, the second aperture (diaphragm aperture 22) corresponds to the right pupil, and the third aperture (diaphragm aperture 23) is at the pupil center. Correspond. Note that the first opening may correspond to the right pupil, and the second opening may correspond to the left pupil. Moreover, although the pupil in stereo measurement is separated into right and left for convenience, the separating direction of the pupil is not limited to right and left. In the present embodiment, the aperture is referred to as a diaphragm aperture. However, the aperture does not necessarily have a function as a diaphragm (a function of limiting the cross-sectional area of the light beam passing through the pupil). For example, the
ãããã§ç³ãšã¯ãçµåå
åŠç³»ïŒïŒã«ããçµåå
è·¯ãåé¢ïŒåã¯èŠå®ïŒãããã®ã§ãããå
è·¯ãšã¯ãæ®åçŽ åïŒïŒã«çµåããå
ããå
åŠç³»ã®å¯Ÿç©åŽããå
¥å°ããŠæ®åçŽ åïŒïŒã«å°éãããŸã§ã®çµè·¯ã®ããšã§ãããå³ã¡ãçµåå
åŠç³»ïŒïŒãšåºå®ãã¹ã¯ïŒïŒã®çµãåïŒïŒãïŒïŒïŒã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯æŽã«å¯åãã¹ã¯ïŒïŒã®çµãåïŒïŒãïŒïŒïŒãééããå
è·¯ã第ïŒã第ïŒã®å
è·¯ã§ããããŸããçµåå
åŠç³»ïŒïŒãšåºå®ãã¹ã¯ïŒïŒã®çµãåïŒïŒïŒèгå¯ã¢ãŒãã§ã¯æŽã«å¯åãã¹ã¯ïŒïŒã®çµãåïŒïŒïŒãééããå
è·¯ã第ïŒã®å
è·¯ã§ããã
Here, the pupil is for separating (or defining) the imaging optical path by the imaging
ãããã§ãã¹ã¯ãšã¯ããã¹ã¯ã«å
¥å°ããå
ãé®èœãããšå
±ã«äžéšã®å
ãééãããéšæãéšåã®ããšã§ãããæ¬å®æœåœ¢æ
ã®åºå®ãã¹ã¯ïŒïŒãå¯åãã¹ã¯ïŒïŒã§ã¯ãé®å
éšïŒïŒãïŒïŒãå
ãé®èœãããšå
±ã«çµãåïŒïŒãïŒïŒãïŒïŒãïŒïŒãïŒïŒãïŒïŒãå
ïŒå
šåž¯ååã¯ãäžéšã®åž¯åïŒãééãããã
Here, the mask is a member or component that shields light incident on the mask and allows part of the light to pass. In the fixed
ããŸãæ¬å®æœåœ¢æ
ã§ã¯ãæ®åè£
眮ã¯ãå¯åãã¹ã¯ïŒïŒãå¶åŸ¡ããå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒïŒå³ïŒïŒïŒãå«ããå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒã¯ãéã¹ãã¬ãªã¢ãŒãïŒèгå¯ã¢ãŒãïŒã«ãããŠãå
軞æ¹åã«èŠãå Žåã«é®å
éšïŒïŒã第ïŒã第ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã«éãªããšå
±ã«ç¬¬ïŒã®éå£ïŒçµãåïŒïŒïŒã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«éãªã第ïŒã®ç¶æ
ïŒç¬¬ïŒã®äœçœ®ïŒã«ãå¯åãã¹ã¯ïŒïŒãèšå®ãããäžæ¹ãã¹ãã¬ãªã¢ãŒãïŒã¹ãã¬ãªèšæž¬ã¢ãŒãïŒã«ãããŠãå
軞æ¹åã«èŠãå Žåã«ç¬¬ïŒã第ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã第ïŒã第ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã«éãªããšå
±ã«é®å
éšïŒïŒã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«éãªã第ïŒã®ç¶æ
ïŒç¬¬ïŒã®äœçœ®ïŒã«ãå¯åãã¹ã¯ïŒïŒãèšå®ããã
In the present embodiment, the imaging apparatus includes a movable mask control unit 340 (FIG. 18) that controls the
ããã®ãããªå¯åãã¹ã¯ïŒïŒã®é§åå¶åŸ¡ãè¡ãããšã§ãå³ïŒïŒãå³ïŒïŒã®èгå¯ã¢ãŒããšå³ïŒïŒãå³ïŒïŒã®ã¹ãã¬ãªèšæž¬ã¢ãŒãã®åãæ¿ãå¶åŸ¡ãå®çŸã§ãããå³ã¡ãå¯åãã¹ã¯ïŒïŒã第ïŒã®ç¶æ
ã«èšå®ããå Žåã«ã¯ã第ïŒã第ïŒã®éå£ãé®å
éšïŒïŒã§é®èœãããã®ã§ç¬¬ïŒã®éå£ã®ã¿ã§ã®æ®åœ±ãšãªãã第ïŒã®éå£ã«ã¯åå
ãã£ã«ã¿ãæ¿å
¥ãããªãã®ã§é垞芳å¯çšã®ç»åïŒçœè²å
ç»åïŒãæ®åœ±ããããšãå¯èœãšãªããäžæ¹ãå¯åãã¹ã¯ïŒïŒã第ïŒã®ç¶æ
ã«èšå®ããå Žåã«ã¯ã第ïŒã®éå£ã«ç¬¬ïŒã®ãã£ã«ã¿ïŒŠïŒ¬ãåºå®ããã第ïŒã®éå£ã«ç¬¬ïŒã®ãã£ã«ã¿ïŒŠïŒ²ãåºå®ãããŠããã®ã§ãã«ã©ãŒäœçžå·®æ³ã«ãããèŠå·®ç»åãæ®åœ±ããããšãå¯èœãšãªãã
By controlling the driving of the
ãïŒïŒåºå®ãã¹ã¯ãå¯åãã¹ã¯ã®ç¬¬ïŒã®è©³çŽ°ãªæ§æäŸ
ãå³ïŒïŒãå³ïŒïŒã«åºå®ãã¹ã¯ïŒïŒãå¯åãã¹ã¯ïŒïŒã®ç¬¬ïŒã®è©³çŽ°ãªæ§æäŸã瀺ããå³ïŒïŒãå³ïŒïŒã«ã¯ãçµåå
åŠç³»ïŒïŒãšåºå®ãã¹ã¯ïŒïŒãšå¯åãã¹ã¯ïŒïŒã®æé¢å³ãšãåºå®ãã¹ã¯ïŒïŒãšå¯åãã¹ã¯ïŒïŒãå
軞æ¹åã«èŠãå³ïŒçµååŽããèŠãèé¢å³ïŒãšãã瀺ãã
8). Second Detailed Configuration Example of Fixed Mask and Movable Mask FIGS. 15 and 16 show a second detailed configuration example of the fixed
ãå¯åãã¹ã¯ïŒïŒã¯ãé®å
éšïŒïŒãšãé®å
éšïŒïŒã«èšããããçµãåïŒïŒãïŒïŒãšããå«ããçµãåïŒïŒãïŒïŒã¯éæŸç¶æ
ïŒã¹ã«ãŒããŒã«ïŒã§ãããå転軞ïŒïŒãäžå¿ãšããŠåäžåäžã«äžŠã¶ãçµãåïŒïŒã¯ããã®åäžåã®ååšæ¹åã«äŒžã³ã圢ç¶ã§ããã芳å¯ã¢ãŒãã«ãããŠåºå®ãã¹ã¯ïŒïŒã®çµãåïŒïŒã«éãªããšå
±ã«ã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããŠåºå®ãã¹ã¯ïŒïŒã®çµãåïŒïŒã«éãªããããªåœ¢ç¶ãšãªã£ãŠããã
The
ãåºå®ãã¹ã¯ïŒïŒã¯ãé®å
éšïŒïŒãšãé®å
éšïŒïŒã«èšããããïŒã€ã®çµãåïŒïŒãïŒïŒãïŒïŒãšããå«ããçµãåïŒïŒãïŒïŒã«ã¯ãåå
ãã£ã«ã¿ïŒŠïŒ¬ããèšãããããçµãåïŒïŒã¯éæŸç¶æ
ïŒã¹ã«ãŒããŒã«ïŒã§ãã£ãŠãããããå³ïŒã®åž¯åïœïŒãïœïŒãïœïœãééãããåå
ãã£ã«ã¿ïŒŠïŒ£ãèšããããŠããããçµãåïŒïŒãïŒïŒãïŒïŒã¯ãå転軞ïŒïŒãäžå¿ãšããŠåäžåäžã«äžŠã¶ã
The fixed
ã芳å¯ã¢ãŒãã§ã¯ãåºå®ãã¹ã¯ïŒïŒã®ç³äžå¿ã®çµãåïŒïŒãå¯åãã¹ã¯ïŒïŒã®çµãåïŒïŒã«ããéæŸç¶æ
ãšãªããåºå®ãã¹ã¯ïŒïŒã®å·Šå³ç³ã®çµãåïŒïŒãïŒïŒãå¯åãã¹ã¯ïŒïŒã®é®å
éšïŒïŒã§é®å
ãããåçŒã«ããçœè²å
ã®ç»åãæ®åããããã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ãåºå®ãã¹ã¯ïŒïŒã®å·Šå³ç³ã®çµãåïŒïŒãïŒïŒãå¯åãã¹ã¯ïŒïŒã®çµãåïŒïŒãïŒïŒã«ããéæŸç¶æ
ãšãªããåºå®ãã¹ã¯ïŒïŒã®ç³äžå¿ã®çµãåïŒïŒãå¯åãã¹ã¯ïŒïŒã®é®å
éšïŒïŒã§é®å
ãããã«ã©ãŒäœçžå·®æ³ã«ããèŠå·®ç»åïŒèµ€è²ç»åãéè²ç»åïŒãæ®åãããã
In the observation mode, the
ã以äžã®å®æœåœ¢æ
ã«ããã°ãæ®åè£
眮ïŒå
èŠé¡è£
眮ïŒã¯ãæ®åçŽ åïŒïŒãšçµåå
åŠç³»ïŒïŒãšåºå®ãã¹ã¯ïŒïŒãšå¯åãã¹ã¯ïŒïŒãšãå«ããçµåå
åŠç³»ïŒïŒã¯ãæ®åçŽ åïŒïŒã«è¢«åäœïŒãçµåããããåºå®ãã¹ã¯ïŒïŒã¯ãçµåå
åŠç³»ïŒïŒã®ç³ãåå²ãã第ïŒïœç¬¬ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒãïŒïŒïŒãšã第ïŒã®æ³¢é·åž¯åïŒå³ïŒã®ïŒ°ïœåã¯å³ïŒã®ïŒ°ïœïŒïŒãééããã第ïŒã®ãã£ã«ã¿ïŒŠïŒ¬ãšã第ïŒã®æ³¢é·åž¯åïŒå³ïŒã®ïŒ°ïœåã¯å³ïŒã®ïŒ°ïœïŒïŒãééããã第ïŒã®ãã£ã«ã¿ïŒŠïŒ²ãšãæãããå¯åãã¹ã¯ïŒïŒã¯ãé®å
éšïŒïŒãšã第ïŒã第ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã«å¯Ÿå¿ããŠé®å
éšïŒïŒã«èšãããã第ïŒã®éå£ïŒçµãåïŒïŒïŒãšã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«å¯Ÿå¿ããŠé®å
éšïŒïŒã«èšãããã第ïŒã®éå£ïŒçµãåïŒïŒïŒãšãæããçµåå
åŠç³»ïŒïŒã«å¯ŸããŠå¯åã§ããããããŠã第ïŒã®ãã£ã«ã¿ïŒŠïŒ¬ã¯ã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«èšããããã第ïŒã®ãã£ã«ã¿ïŒŠïŒ²ã¯ã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«èšããããã第ïŒã®éå£ïŒçµãåïŒïŒïŒã¯ãçµåå
åŠç³»ïŒïŒã®å
軞äžã«èšããããã
According to the above embodiment, the imaging device (endoscope device) includes the
ãå
·äœçã«ã¯ãæ®åè£
眮ã¯ãå¯åãã¹ã¯ïŒïŒãå¶åŸ¡ããå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒãå«ããå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒã¯ãéã¹ãã¬ãªã¢ãŒãïŒèгå¯ã¢ãŒãïŒã«ãããŠãå
軞æ¹åã«èŠãå Žåã«é®å
éšïŒïŒã第ïŒã第ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã«éãªããšå
±ã«ç¬¬ïŒã®éå£ïŒçµãåïŒïŒïŒã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«éãªã第ïŒã®ç¶æ
ã«ãå¯åãã¹ã¯ïŒïŒãèšå®ãããäžæ¹ãã¹ãã¬ãªã¢ãŒãïŒã¹ãã¬ãªèšæž¬ã¢ãŒãïŒã«ãããŠãå
軞æ¹åã«èŠãå Žåã«ç¬¬ïŒã第ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã第ïŒã第ïŒã®éå£ïŒçµãåïŒïŒãïŒïŒïŒã«éãªããšå
±ã«é®å
éšïŒïŒã第ïŒã®éå£ïŒçµãåïŒïŒïŒã«éãªã第ïŒã®ç¶æ
ã«ãå¯åãã¹ã¯ïŒïŒãèšå®ããã
Specifically, the imaging apparatus includes a movable
ããã®ãããªæ§æã«ãã£ãŠãã芳枬ã¢ãŒããšã¹ãã¬ãªèšæž¬ã¢ãŒãã®åãæ¿ãããã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããèŠå·®ç»åã®åæååŸãã¢ãŒãåãæ¿ãã®é«éåãå¯åãã¹ã¯ïŒïŒã®é§åæ©æ§ã®ç°¡çŽ åãã¢ãŒãåãæ¿ãã«ãããæ
éããšã©ãŒã®æå¶ãã¹ãã¬ãªèšæž¬ã«ãããåºç·é·ã®ç¢ºä¿ãªã©ãå®çŸã§ããã
Even with such a configuration, switching between the observation mode and the stereo measurement mode, simultaneous acquisition of parallax images in the stereo measurement mode, speeding up of mode switching, simplification of the driving mechanism of the
ãïŒïŒã¹ãã¬ãªïŒæ¬¡å
èšæž¬ã®åç
ãã¹ãã¬ãªèšæž¬ã¢ãŒãã«ãããã¹ãã¬ãªèšæž¬ã®åçã«ã€ããŠèª¬æãããå³ïŒïŒã«ç€ºãããã«ãå·ŠçŒãšå³çŒã®å
è·¯ãç¬ç«ããŠæ§æããã被åäœïŒããã®åå°ç»åã¯ããããå
è·¯ãä»ããŠæ®åã»ã³ãµé¢ïŒåå
é¢ïŒã«çµåãããïŒæ¬¡å
空éã®åº§æšç³»ïŒžããã以äžã®ããã«å®çŸ©ãããå³ã¡ãæ®åã»ã³ãµé¢ã«æ²¿ã£ãŠïŒžè»žãšã軞ã«çŽäº€ãã軞ãšãèšå®ããæ®åã»ã³ãµé¢ã«çŽäº€ãäžã€å
軞ãšå¹³è¡ãªæ¹åã§è¢«åäœã«åããæ¹åã«ïŒºè»žãèšå®ããã軞ã¯ïŒžè»žã軞ãšãŒãç¹ã«ãŠäº€å·®ãããã®ãšããããªããããã§ã¯äŸ¿å®äžïŒ¹è»žã¯çç¥ããã
9. Principle of Stereo 3D Measurement The principle of stereo measurement in the stereo measurement mode will be described. As shown in FIG. 17, the optical paths of the left eye and the right eye are configured independently, and the reflected image from the subject 5 forms an image on the imaging sensor surface (light receiving surface) through these optical paths. A coordinate system X, Y, and Z of the three-dimensional space is defined as follows. That is, the X axis and the Y axis orthogonal to the X axis are set along the imaging sensor surface, and the Z axis is set in a direction toward the subject in a direction orthogonal to the imaging sensor surface and parallel to the optical axis AXC. The Z axis intersects the X axis and Y axis at the zero point. Note that the Y-axis is omitted here for convenience.
ãçµåã¬ã³ãºïŒïŒãšæ®åã»ã³ãµé¢ã®è·é¢ãïœãšããçµåã¬ã³ãºïŒïŒãã被åäœïŒã®ä»»æç¹ïŒ±ïŒïœïŒïœïŒãŸã§ã®è·é¢ãïœãšãããç³ã®äžå¿ç·ïŒ©ïŒ£ïŒãïŒãšïŒºè»žãŸã§ã®è·é¢ãåäžãšããåïœïŒïŒãšãããã€ãŸãã¹ãã¬ãªèšæž¬ã«ãããåºç·é·ã¯ïœãšãªãã被åäœïŒã®ä»»æç¹ïŒ±ïŒïœïŒïœïŒãçµåã¬ã³ãºïŒïŒã«ããæ®åã»ã³ãµé¢ã«çµåããã察å¿ç¹ã®ïŒžåº§æšããšãã被åäœïŒã®ä»»æç¹ïŒ±ïŒïœïŒïœïŒãçµåã¬ã³ãºïŒïŒã«ããæ®åã»ã³ãµé¢ã«çµåããã察å¿ç¹ã®ïŒžåº§æšããšãããä»»æç¹ïŒ±ïŒïœïŒïœïŒãšåº§æšïŒžïŒ¬ãã«å²ãŸããäžè§åœ¢å
ã«ã§ããè€æ°ã®éšåçãªçŽè§äžè§åœ¢ã®çžäŒŒé¢ä¿ã䜿ã£ãŠäžåŒïŒïŒïŒãåŸãããšãã§ããã
ãããã§ãäžåŒïŒïŒïŒãïŒïŒïŒãæãç«ã€ã
ãããã«ãããäžåŒïŒïŒïŒã®çµ¶å¯Ÿå€ãäžåŒïŒïŒïŒã®ããã«å€ãããšãã§ããã
ãäžåŒïŒïŒïŒãïœã«ã€ããŠè§£ããšäžåŒïŒïŒïŒãšãªãã
ãäžåŒïŒïŒïŒã®ïœãäžåŒïŒïŒïŒã«ä»£å
¥ãããšãäžåŒïŒïŒïŒãåŸãããïœãæ±ããããšãã§ããã
ãïœãïœã¯æ¢ç¥ã®èšå®å€ã§ãããæªç¥æ°ïŒžïŒ¬ãïŒžïŒ²ã¯æ¬¡ã®ããã«ããŠæ±ãããããå³ã¡ãæ®åã»ã³ãµé¢ã®äœçœ®ïŒžïŒ¬ãåºæºã«èãïŒå·Šç»åã®ç»çŽ äœçœ®ããšèŠãªãïŒãäœçœ®ïŒžïŒ¬ã«å¯Ÿå¿ããäœçœ®ïŒžïŒ²ããããã³ã°åŠçïŒçžé¢æŒç®ïŒã«ããæ€åºãããåäœçœ®ïŒžïŒ¬ã«ã€ããŠè·é¢ïœãèšç®ããããšã§è¢«åäœã®åœ¢ç¶ãèšæž¬ã§ããããªãããããã³ã°ãè¯å¥œã§ãªãå Žåã«ã¯è·é¢ïœãæ±ããããªãå¯èœæ§ãããããäŸãã°åšå²ã®ç»çŽ ã®è·é¢ïœããè£éããããšçã«ããæ±ããŠãããã D and b are known set values, and the unknowns XL and XR are obtained as follows. That is, the position XL on the imaging sensor surface is considered as a reference (the pixel position of the left image is regarded as XL), and the position XR corresponding to the position XL is detected by matching processing (correlation calculation). By calculating the distance z for each position XL, the shape of the subject can be measured. If the matching is not good, the distance z may not be obtained, but may be obtained by interpolation from the distance z of surrounding pixels, for example.
ãïŒïŒïŒå
èŠé¡è£
眮
ãå³ïŒïŒã«ãæ¬å®æœåœ¢æ
ã®å
èŠé¡è£
眮ïŒåºçŸ©ã«ã¯ãæ®åè£
眮ïŒã®æ§æäŸã瀺ããå
èŠé¡è£
眮ã¯ãã¹ã³ãŒãéšïŒïŒïŒïŒæ®åéšïŒãæ¬äœéšïŒïŒïŒïŒå¶åŸ¡è£
眮ïŒãå«ããã¹ã³ãŒãéšïŒïŒïŒã¯ãçµåå
åŠç³»ïŒïŒãåºå®ãã¹ã¯ïŒïŒãå¯åãã¹ã¯ïŒïŒãæ®åçŽ åïŒïŒãé§åéšïŒïŒãç
§æéšïŒïŒãå«ããæ¬äœéšïŒïŒïŒã¯ãåŠçéšïŒïŒïŒãã¢ãã¿è¡šç€ºéšïŒïŒïŒãæ®ååŠçéšïŒïŒïŒãå«ããåŠçéšïŒïŒïŒã¯ãå
æºé§åå¶åŸ¡éšïŒïŒïŒãç»åéžæéšïŒïŒïŒïŒç»åãã¬ãŒã éžæéšïŒãã«ã©ãŒç»åçæéšïŒïŒïŒïŒç»ååºåéšïŒãäœçžå·®æ€åºéšïŒïŒïŒãå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒïŒå¯åãã¹ã¯é§åå¶åŸ¡éšïŒãå¯åãã¹ã¯äœçœ®æ€åºéšïŒïŒïŒãè·é¢æ
å ±ç®åºéšïŒïŒïŒãïŒæ¬¡å
æ
å ±çæéšïŒïŒïŒãå«ãã
10. Endoscopic Device FIG. 18 shows a configuration example of the endoscope device (imaging device in a broad sense) of the present embodiment. The endoscope apparatus includes a scope unit 100 (imaging unit) and a main body unit 200 (control device). The
ããªãæ¬äœéšïŒïŒïŒã¯ãäžå³ç€ºã®æ§æèŠçŽ ãšããŠãæ¬äœéšïŒïŒïŒãæäœããæäœéšãå€éšæ©åšãšæ¥ç¶ããã€ã³ã¿ãŒãã§ãŒã¹éšçãå«ãã§ããããã¹ã³ãŒãéšïŒïŒïŒã¯ãäžå³ç€ºã®æ§æèŠçŽ ãšããŠãäŸãã°ã¹ã³ãŒãéšïŒïŒïŒãæäœããæäœéšããåŠçœ®å
·çãå«ãã§ãããã
The
ãå
èŠé¡è£
眮ãšããŠã¯ãå·¥æ¥çšãå»ççšã®ãããããããªã¹ã³ãŒãïŒæ®åçŽ åãå
èµããå
èŠé¡è£
眮ïŒãæ³å®ã§ãããã¹ã³ãŒãéšïŒïŒïŒã湟æ²å¯èœã«æ§æãããè»æ§é¡ãã¹ã³ãŒãéšïŒïŒïŒãã¹ãã£ãã¯ç¶ã«æ§æããã硬æ§é¡ããããã«ãæ¬çºæãé©çšã§ãããäŸãã°å·¥æ¥çšã®è»æ§é¡ã®å Žåãæ¬äœéšïŒïŒïŒåã³æ®åéšïŒïŒïŒã¯æã¡éã³å¯èœãªããŒã¿ãã«æ©åšãšããŠæ§æãããŠãããå·¥æ¥è£œåã®è£œé æ€æ»ãã¡ã³ããã³ã¹æ€æ»ã建ç¯ç©ãé
管ã®ã¡ã³ããã³ã¹æ€æ»çã«çšããããã
As the endoscope apparatus, an industrial and medical so-called video scope (an endoscope apparatus incorporating an image sensor) can be assumed. The present invention can be applied to both a flexible mirror in which the
ãé§åéšïŒïŒã¯ãå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒããã®å¶åŸ¡ä¿¡å·ã«åºã¥ããŠå¯åãã¹ã¯ïŒïŒãé§åãã第ïŒã®ç¶æ
ïŒèгå¯ã¢ãŒãïŒãšç¬¬ïŒã®ç¶æ
ïŒã¹ãã¬ãªèšæž¬ã¢ãŒãïŒãåãæ¿ãããäŸãã°ãé§åéšïŒïŒã¯ããšãŸçŽ åããã°ãããæ©æ§ã«ããã¢ã¯ãã¥ãšãŒã¿ã§æ§æãããã
The driving
ãæ®ååŠçéšïŒïŒïŒã¯ãæ®åçŽ åïŒïŒããã®ä¿¡å·ã«å¯ŸããŠæ®ååŠçãè¡ããæ®åç»åïŒäŸãã°ãã€ã€ç»åçïŒãåºåãããäŸãã°ãçžé¢ïŒéãµã³ããªã³ã°åŠçãã²ã€ã³ã³ã³ãããŒã«åŠçãïŒïŒ€å€æåŠçãã¬ã³ãè£æ£ãè²è£æ£ããã€ãºäœæžçãè¡ããæ®ååŠçéšïŒïŒïŒã¯ãäŸãã°ïŒ¡ïŒ³ïŒ©ïŒ£çã®ãã£ã¹ã¯ãªãŒãïŒ©ïŒ£ã§æ§æãããŠãããããæãã¯æ®åçŽ åïŒïŒïŒã»ã³ãµãããïŒãåŠçéšïŒïŒïŒã«å
èµãããŠãããã
The
ãã¢ãã¿è¡šç€ºéšïŒïŒïŒã¯ãã¹ã³ãŒãéšïŒïŒïŒãæ®åããç»åãã被åäœïŒã®ïŒæ¬¡å
åœ¢ç¶æ
å ±çã衚瀺ãããäŸãã°ãã¢ãã¿è¡šç€ºéšïŒïŒïŒã¯ãæ¶²æ¶ãã£ã¹ãã¬ã€ãïŒElectro-LuminescenceïŒãã£ã¹ãã¬ã€çã«ããæ§æãããã
The
ã以äžãå
èŠé¡è£
眮ã®åäœã説æãããç
§æéšïŒïŒã¯ãäžè¿°ããéãã¿ãŒã³å
ãšãã¿ãŒã³å
ã®åæå
ã被åäœïŒãžç
§å°ãããå
æºé§åå¶åŸ¡éšïŒïŒïŒã¯ãæ®ååŠçéšïŒïŒïŒããã®ä¿¡å·ã«åºã¥ããŠéãã¿ãŒã³å
ãšãã¿ãŒã³å
ã®åå
éãæé©ã«å¶åŸ¡ïŒãããã調å
å¶åŸ¡ïŒãããäŸãã°ãæ®åç»åã®èŒåºŠãæ±ãããã®èŒåºŠãæå®ã®ç¯å²å
ãšãªãããã«å
éãå¶åŸ¡ããã
Hereinafter, the operation of the endoscope apparatus will be described. The
ãå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒã¯ãé§åéšïŒïŒãå¶åŸ¡ããŠå¯åãã¹ã¯ïŒïŒã®äœçœ®ãåãæ¿ãããå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒãå¯åãã¹ã¯ïŒïŒã芳å¯ã¢ãŒãã«èšå®ããå Žåã被åäœïŒããã®åå°å
ãç³äžå¿å
è·¯ãä»ããŠæ®åçŽ åïŒïŒã«çµåããããæ®ååŠçéšïŒïŒïŒã¯ãæ®åçŽ åïŒïŒã«çµåãããç»åã®ç»çŽ å€ãèªã¿åºããïŒïŒ€å€æçãè¡ã£ãŠç»åéžæéšïŒïŒïŒã«ç»åããŒã¿ãåºåããã
The movable
ãç»åéžæéšïŒïŒïŒã¯ãå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒããã®å¶åŸ¡ä¿¡å·ã«åºã¥ããŠå¯åãã¹ã¯ïŒïŒã®ç¶æ
ã芳å¯ã¢ãŒãã§ããããšãæ€ç¥ããæ®åç»åããïœïŒ¶ïœãïœãïœïœãéžæããŠã«ã©ãŒç»åçæéšïŒïŒïŒã«åºåãããã«ã©ãŒç»åçæéšïŒïŒïŒã¯ãã¢ã¶ã€ãã³ã°åŠçïŒãã€ã€ç»åããç»åãçæããåŠçïŒãåçš®ç»ååŠçãè¡ããïŒæ¿ååè²ç»åãã¢ãã¿è¡šç€ºéšïŒïŒïŒã«åºåãããã¢ãã¿è¡šç€ºéšïŒïŒïŒã¯ããã®ã«ã©ãŒç»åã衚瀺ããã
The
ãå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒãå¯åãã¹ã¯ïŒïŒãã¹ãã¬ãªèšæž¬ã¢ãŒãã«èšå®ããå Žåã被åäœïŒããã®åå°å
ãå·Šç³å
è·¯åã³å³ç³å
è·¯ãä»ãæ®åçŽ åïŒïŒã«åæã«çµåããããæ®ååŠçéšïŒïŒïŒã¯ãæ®åçŽ åïŒïŒã«çµåãããç»åã®ç»çŽ å€ãèªã¿åºããïŒïŒ€å€æçãè¡ã£ãŠç»åéžæéšïŒïŒïŒã«ç»åããŒã¿ãåºåããã
When the movable
ãç»åéžæéšïŒïŒïŒã¯ãå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒããã®å¶åŸ¡ä¿¡å·ã«åºã¥ããŠå¯åãã¹ã¯ïŒïŒã®ç¶æ
ãã¹ãã¬ãªèšæž¬ã¢ãŒãã§ããããšãæ€ç¥ããæ®åç»åããïœïŒïœïŒïŒïœïœãéžæããŠäœçžå·®æ€åºéšïŒïŒïŒã«åºåãããäœçžå·®æ€åºéšïŒïŒïŒã¯ãåé¢ãããïŒã€ã®ç»åïŒïœãïŒïœã«å¯ŸããŠãããã³ã°åŠçãè¡ããç»çŽ æ¯ã«äœçžå·®ïŒäœçžããïŒãæ€åºããããŸãäœçžå·®æ€åºéšïŒïŒïŒã¯ãäœçžå·®æ€åºãä¿¡é Œã§ãããåŠãã®å€æãè¡ããä¿¡é Œã§ããªããšå€æããå Žåã¯ãšã©ãŒãã©ã°ãç»çŽ æ¯ã«åºåãããåŸæ¥ããïŒã€ã®é¡äŒŒæ³¢åœ¢ã®ããéïŒäœçžå·®ïŒãæ±ããããã®ãããã³ã°è©äŸ¡æ¹æ³ã¯ïŒºïŒ®ïŒ£ïŒ£ïŒZero-mean Normalized Cross-CorrelationïŒã«ä»£è¡šãããæ£èŠåçžäºçžé¢æŒç®æ³ãçžäºã®å·®å絶察å€ã®åèšã«ããïŒSum of Absolute DifferenceïŒãªã©ãçš®ã
ææ¡ãããŠããã®ã§é©å®å©çšãå¯èœã§ããã
The
ããªããæåå²ãšãªã被åäœãã¬ãæ®åç³»ã®ãã¬ã®åœ±é¿ã¯åãããã®ã®èŠå·®ç»åãšãªãïœãšïŒïœã䜿ã£ãŠãäœçžããïŒäœçžå·®ïŒãæ€åºããããšãã§ããã被åäœïŒã®åå°ãéè²æåã¯å°ãªãèµ€è²æåãå€ãå ŽåãïŒïœãšïŒïœã§ã¯æ€åºãé£ãã被åäœïŒã§ãã£ãŠãå ±ã«èµ€è²æåãæããïœãšïŒïœãªãã°èšæž¬ãå¯èœãšãªãã It should be noted that the phase shift (phase difference) can be detected by using Vr and Mr, which are parallax images, although they are time-divisionally affected by subject blur and imaging system blur. When the reflection of the subject 5 has a small blue component and a large red component, even if the subject 5 is difficult to detect with Mr and Mb, measurement is possible if both Vr and Mr have a red component.
ãäœçžå·®æ€åºéšïŒïŒïŒã¯ãæ€åºããäœçžå·®æ
å ±ãšãšã©ãŒãã©ã°ãè·é¢æ
å ±ç®åºéšïŒïŒïŒã«åºåãããè·é¢æ
å ±ç®åºéšïŒïŒïŒã¯ã被åäœïŒã®è·é¢æ
å ±ïŒäŸãã°å³ïŒïŒã®è·é¢ïœïŒãåç»çŽ ã«ã€ããŠèšç®ãããã®è·é¢æ
å ±ãïŒæ¬¡å
æ
å ±çæéšïŒïŒïŒã«åºåããããšã©ãŒãã©ã°ãç«ã£ãŠããç»çŽ ã¯ãäŸãã°è¢«åäœïŒã®å¹³åŠéšïŒãšããžæåãå°ãªãé åïŒãšèŠãªããŠãäŸãã°åšå²ã®ç»çŽ ã®è·é¢æ
å ±ããè£éããŠããããïŒæ¬¡å
æ
å ±çæéšïŒïŒïŒã¯ãè·é¢æ
å ±ïŒåã¯ãè·é¢æ
å ±ãšã«ã©ãŒç»åçæéšïŒïŒïŒããã®ïŒ²ïŒ§ïŒ¢ç»åïŒããïŒæ¬¡å
æ
å ±ãçæãããïŒæ¬¡å
æ
å ±ã¯ãäŸãã°ïŒºå€ãããïŒè·é¢ãããïŒãããªãŽã³ãç䌌çãªïŒæ¬¡å
衚瀺ç»åïŒäŸãã°ã·ã§ãŒãã£ã³ã°çã«ãã圢ç¶åŒ·èª¿ïŒçãçš®ã
ã®æ
å ±ãæ³å®ã§ãããïŒæ¬¡å
æ
å ±çæéšïŒïŒïŒã¯ãçæããïŒæ¬¡å
ç»åãïŒæ¬¡å
ããŒã¿ãæãã¯ããããšèгå¯ç»åãšãéç³ãã衚瀺ç»åãªã©ãå¿
èŠã«å¿ãçæããã¢ãã¿è¡šç€ºéšïŒïŒïŒãžåºåãããã¢ãã¿è¡šç€ºéšïŒïŒïŒã¯ããã®ïŒæ¬¡å
æ
å ±ã衚瀺ããã
The phase
ãå¯åãã¹ã¯äœçœ®æ€åºéšïŒïŒïŒã¯ãã¹ãã¬ãªèšæž¬ã¢ãŒãæã«åŸãããç»åïœïŒïœïŒïŒïœïœã䜿ã£ãŠãå¯åãã¹ã¯ïŒïŒã芳å¯ã¢ãŒãã®äœçœ®ã«ãããã¹ãã¬ãªèšæž¬ã¢ãŒãã®äœçœ®ã«ããããæ€åºããããããŠãå¯åãã¹ã¯ïŒïŒã®ç¶æ
ãã¢ãŒãã«äžèŽããŠããªããšå€æããå Žåã«ã¯ãå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒã«äœçœ®ãšã©ãŒãã©ã°ãåºåãããå¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒã¯ãäœçœ®ãšã©ãŒãã©ã°ãåããŠãå¯åãã¹ã¯ïŒïŒãæ£ããç¶æ
ïŒç»åéžæã«å¯Ÿå¿ããç¶æ
ïŒã«ä¿®æ£ãããäŸãã°å¯åãã¹ã¯å¶åŸ¡éšïŒïŒïŒãã¹ãã¬ãªèšæž¬ã¢ãŒãã®å¶åŸ¡ä¿¡å·ãåºåããŠããã«ãé¢ããããç»åïœïŒïœïŒïŒïœïœã«è²ãããç¡ããšå€æãããå Žåãå®éã®å¯åãã¹ã¯ïŒïŒã¯èгå¯ã¢ãŒãã®äœçœ®ã«ãªã£ãŠããããã®å Žåãå¶åŸ¡ä¿¡å·ãšå¯åãã¹ã¯ïŒïŒã®äœçœ®ãäžèŽãããä¿®æ£ãè¡ãããªããä¿®æ£åäœãããŠãæ£ããç¶æ
ã«ãªããªãå Žåã¯ãäœããã®æ
éãçºçãããšå€æããŠå
šäœã®æ©èœã忢ãããã
The movable
ãå¯åãã¹ã¯ïŒïŒã¯ã¡ã«ãã«ã«ãªæ©æ§ããæ§æãããã®ã§åãæ¿ãåäœã«äžå
·åãçºçããããšãèãããããæ¬å®æœåœ¢æ
ã«ããã°ãåãæ¿ãäœçœ®ã芳å¯ã¢ãŒããªã®ãèšæž¬ã¢ãŒããªã®ããæ€åºã§ããã®ã§ãåãæ¿ãåäœã®äžå
·åã«å¯Ÿå¿ã§ããã
Since the
ãå¯åãã¹ã¯ïŒïŒã芳å¯ã¢ãŒãã®äœçœ®ã«ãããã¹ãã¬ãªèšæž¬ã¢ãŒãã®äœçœ®ã«ãããã®æ€åºã倿ã¯ãäŸãã°ä»¥äžã®ããã«è¡ããå³ã¡ãç»åïŒïœãšç»åïŒïœã®å€æãšãªã¢ã§ã®ã¬ãã«ïŒå¹³åã¬ãã«ãªã©ïŒãåãããåŸãç»åïŒïœãšç»åïŒïœã®çµ¶å¯Ÿå·®åå€åã«ãã倿ïŒç¬¬ïŒææ³ïŒããç»åïŒïœãšç»åïŒïœã®çžé¢ä¿æ°ã«ãã倿ïŒç¬¬ïŒææ³ïŒãªã©ã«ãããäœçœ®ãšã©ãŒã®å€æãè¡ãã
Detecting or judging whether the
ãç¬¬ïŒææ³ã§ã¯ãåç»çŽ ã§ç»çŽ å€ã®å·®åå€ã®çµ¶å¯Ÿå€ãæ±ãããããå šç»çŽ åã¯éšåç»çŽ çŸ€ã§ç©ç®ããããã®çµæãæå®ã®éŸå€ãè¶ããå Žåã¯ãã¹ãã¬ãªèšæž¬ã¢ãŒãã®ç»åãšå€æãããã®çµæãæå®ã®éŸå€ä»¥äžã§ããå Žåã¯ã芳å¯ã¢ãŒãã®ç»åãšå€æãããã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ç»åïŒïœãšç»åïŒïœã¯åºæ¬çã«è²ãããèµ·ãããŠããç»åãªã®ã§ãæå®éã®å·®åå€ãåŸãããããšãå©çšããŠããã In the first method, the absolute value of the difference value of the pixel value is obtained for each pixel, and it is integrated in all pixels or a partial pixel group. If the result exceeds a predetermined threshold, it is determined as an image in the stereo measurement mode, and if the result is less than the predetermined threshold, it is determined as an image in the observation mode. In the stereo measurement mode, the image Mr and the image Mb are basically images that have undergone color misregistration, so that the fact that a predetermined amount of difference value is obtained is used.
ãç¬¬ïŒææ³ã§ã¯ãç»åïŒïœãšç»åïŒïœã®æå®ç¯å²ã«ãããçžé¢ä¿æ°ãèšç®ãããã®çµæãæå®ã®éŸå€ä»¥äžã®å Žåã¯ãã¹ãã¬ãªèšæž¬ã¢ãŒãã®ç»åãšå€æãããã®çµæãæå®ã®éŸå€ãè¶ããå Žåã¯ã芳å¯ã¢ãŒãã®ç»åãšå€æãããããã¯ã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ç»åïŒïœãšç»åïŒïœã¯åºæ¬çã«è²ãããèµ·ãããŠããç»åãªã®ã§çžé¢ä¿æ°ãå°ããã®ã«å¯Ÿãã芳å¯ã¢ãŒãã§ã¯ç»åïŒïœãšç»åïŒïœã¯ã»ãŒäžèŽããç»åãªã®ã§çžé¢ä¿æ°ã倧ããããšãå©çšããŠããã In the second method, the correlation coefficient in the predetermined range between the image Mr and the image Mb is calculated, and when the result is equal to or smaller than the predetermined threshold, it is determined as an image in the stereo measurement mode, and the result exceeds the predetermined threshold. In this case, it is determined that the image is an observation mode. In the stereo measurement mode, the image Mr and the image Mb are basically images that have undergone color misregistration, so the correlation coefficient is small, whereas in the observation mode, the image Mr and the image Mb are almost the same image, so the correlation coefficient is Take advantage of big things.
ããªããæ¬å®æœåœ¢æ ã®å èŠé¡è£ çœ®ãæ®åè£ çœ®çã¯ãããã»ããµãšã¡ã¢ãªãå«ãã§ããããããã§ã®ããã»ããµã¯ãäŸãã°ïŒ£ïŒ°ïŒµïŒCentral Processing UnitïŒã§ãã£ãŠãããããã ãããã»ããµã¯ïŒ£ïŒ°ïŒµã«éå®ããããã®ã§ã¯ãªããïŒGraphics Processing UnitïŒãæãã¯ïŒ€ïŒ³ïŒ°ïŒDigital Signal ProcessorïŒçãåçš®ã®ããã»ããµãçšããããšãå¯èœã§ããããŸãããã»ããµã¯ïŒ¡ïŒ³ïŒ©ïŒ£ã«ããããŒããŠã§ã¢åè·¯ã§ãããããŸããã¡ã¢ãªã¯ã³ã³ãã¥ãŒã¿ã«ããèªã¿åãå¯èœãªåœä»€ãæ ŒçŽãããã®ã§ãããåœè©²åœä»€ãããã»ããµã«ããå®è¡ãããããšã§ãæ¬å®æœåœ¢æ ã«ä¿ãå èŠé¡è£ çœ®ãæ®åè£ çœ®çã®åéšïŒäŸãã°åŠçéšïŒïŒïŒã®åéšçïŒãå®çŸãããããšã«ãªããããã§ã®ã¡ã¢ãªã¯ãïŒãïŒãªã©ã®åå°äœã¡ã¢ãªã§ãã£ãŠãããããã¬ãžã¹ã¿ãŒãããŒããã£ã¹ã¯çã§ãããããŸããããã§ã®åœä»€ã¯ãããã°ã©ã ãæ§æããåœä»€ã»ããã®åœä»€ã§ãããããããã»ããµã®ããŒããŠã§ã¢åè·¯ã«å¯ŸããŠåäœãæç€ºããåœä»€ã§ãã£ãŠãããã Note that the endoscope apparatus, the imaging apparatus, and the like of the present embodiment may include a processor and a memory. The processor here may be, for example, a CPU (Central Processing Unit). However, the processor is not limited to the CPU, and various processors such as a GPU (GraphicsProcessing Unit) or a DSP (Digital Signal Processor) can be used. The processor may be an ASIC hardware circuit. The memory stores instructions that can be read by a computer. When the instructions are executed by the processor, each unit (for example, each unit of the processing unit 210) of the endoscope apparatus, the imaging apparatus, and the like according to the present embodiment. Etc.) will be realized. The memory here may be a semiconductor memory such as SRAM or DRAM, or a register or a hard disk. Further, the instruction here may be an instruction of an instruction set constituting the program, or an instruction for instructing an operation to the hardware circuit of the processor.
ãïŒïŒïŒã¢ãŒãåãæ¿ãã·ãŒã±ã³ã¹
ãå³ïŒïŒã«ãåç»æ®åœ±ã«ãããŠèгå¯ã¢ãŒããšã¹ãã¬ãªèšæž¬ã¢ãŒããåãæ¿ããã·ãŒã±ã³ã¹ïŒåäœã¿ã€ãã³ã°ãã£ãŒãïŒã瀺ãã
11. Mode Switching Sequence FIG. 19 shows a sequence (operation timing chart) for switching between the observation mode and the stereo measurement mode in moving image shooting.
ãäžè¿°ããã¹ãã¬ãªèšæž¬ã¢ãŒãã§ã¯ãåãããã被åäœã«å¯ŸããŠãé«ç²ŸåºŠãªã¹ãã¬ãªåæèšæž¬ãå®çŸã§ããããè²ããç»åãšãªã£ãŠããŸãã®ã§é«åäœãªèгå¯ç»åã«ã¯äœ¿ããªããããã§èгå¯ã¢ãŒããšã¹ãã¬ãªèšæž¬ã¢ãŒããé«éã«åãæ¿ããããšã«ããããã®åé¡ã解決ã§ããã»ãŒãªã¢ã«ã¿ã€ã ã«è¿ãç¶æ ã§èгå¯ç»åã衚瀺ãã€ã€ã¹ãã¬ãªèšæž¬ãå®è¡å¯èœã§ããã In the stereo measurement mode described above, high-precision simultaneous measurement can be realized even for a moving subject. However, since it becomes a color shift image, it cannot be used for a high-quality observation image. Therefore, this problem can be solved by switching between the observation mode and the stereo measurement mode at high speed, and stereo measurement can be performed while displaying the observation image in a state almost in real time.
ãå³ïŒïŒã«ç€ºãããã«ãå¯åãã¹ã¯ïŒïŒã®ç¶æ
ã®åãæ¿ããšæ®åã¿ã€ãã³ã°ãšæ®åç»åã®éžæã¯é£åããŠãããïŒãïŒã«ç€ºãããã«ã芳å¯ã¢ãŒãã®ãã¹ã¯ç¶æ
ãšã¹ãã¬ãªèšæž¬ã¢ãŒãã®ãã¹ã¯ç¶æ
ã亀äºã«ç¹°ãè¿ããïŒãïŒã«ç€ºãããã«ãåãã¹ã¯ç¶æ
ã§ïŒåãã€æ®åãè¡ããããïŒã«ç€ºãããã«ã芳å¯ã¢ãŒãã®ãã¹ã¯ç¶æ
ã«ãããšãã«æ®åçŽ åïŒïŒã«ããé²å
æ®åãããç»åã¯èгå¯ç»åãšããŠéžæããããïŒã«ç€ºãããã«ãã¹ãã¬ãªèšæž¬ã¢ãŒãã®ãã¹ã¯ç¶æ
ã«ãããšãã«æ®åçŽ åïŒïŒã«ããé²å
æ®åãããç»åã¯èšæž¬ç»åãšããŠéžæãããã
As shown in FIG. 19, switching of the state of the
ããã®ããã«èгå¯ã¢ãŒããšã¹ãã¬ãªèšæž¬ã¢ãŒãã亀äºã«ç¹°ãè¿ãããšã«ãããã»ãŒãªã¢ã«ã¿ã€ã ã«è¿ãç¶æ
ã§èгå¯ç»åãšèšæž¬ç»åãé£ç¶çã«åŸãããšãã§ããã®ã§ã被åäœïŒã«åããããå Žåã芳å¯ãšèšæž¬ãäž¡æ¹å®çŸããããšãã§ããã芳å¯ã¢ãŒãã®ç»åã衚瀺ãã€ã€ãããã«å¿
èŠã«å¿ããŠèšæž¬ãããæ
å ±ãéããŠåãããŠè¡šç€ºããã°ããŠãŒã¶ã«å¯ŸããŠç®èŠæ€æ»ãšå®éæ€æ»ãåæã«æäŸããããšãã§ããæçšãªæ
å ±æäŸãå¯èœãšãªãã
By alternately repeating the observation mode and the stereo measurement mode in this way, an observation image and a measurement image can be obtained continuously in a state almost in real time, so both observation and measurement can be performed even when the
ã以äžãæ¬çºæãé©çšãã宿œåœ¢æ ããã³ãã®å€åœ¢äŸã«ã€ããŠèª¬æããããæ¬çºæã¯ãå宿œåœ¢æ ããã®å€åœ¢äŸãã®ãŸãŸã«éå®ããããã®ã§ã¯ãªãã宿œæ®µéã§ã¯ãçºæã®èŠæšãéžè±ããªãç¯å²å ã§æ§æèŠçŽ ãå€åœ¢ããŠå ·äœåããããšãã§ããããŸããäžèšããå宿œåœ¢æ ãå€åœ¢äŸã«é瀺ãããŠããè€æ°ã®æ§æèŠçŽ ãé©å®çµã¿åãããããšã«ãã£ãŠãçš®ã ã®çºæã圢æããããšãã§ãããäŸãã°ãå宿œåœ¢æ ãå€åœ¢äŸã«èšèŒããå šæ§æèŠçŽ ããããã€ãã®æ§æèŠçŽ ãåé€ããŠããããããã«ãç°ãªã宿œã®åœ¢æ ãå€åœ¢äŸã§èª¬æããæ§æèŠçŽ ãé©å®çµã¿åãããŠãããããã®ããã«ãçºæã®äž»æšãéžè±ããªãç¯å²å ã«ãããŠçš®ã ã®å€åœ¢ãå¿çšãå¯èœã§ããããŸããæçްæžåã¯å³é¢ã«ãããŠãå°ãªããšãäžåºŠãããåºçŸ©ãŸãã¯å矩ãªç°ãªãçšèªãšå ±ã«èšèŒãããçšèªã¯ãæçްæžåã¯å³é¢ã®ãããªãç®æã«ãããŠãããã®ç°ãªãçšèªã«çœ®ãæããããšãã§ããã As mentioned above, although embodiment and its modification which applied this invention were described, this invention is not limited to each embodiment and its modification as it is, and in the range which does not deviate from the summary of invention in an implementation stage. The component can be modified and embodied. Further, various inventions can be formed by appropriately combining a plurality of constituent elements disclosed in the above-described embodiments and modifications. For example, some constituent elements may be deleted from all the constituent elements described in each embodiment or modification. Furthermore, you may combine suitably the component demonstrated in different embodiment and modification. Thus, various modifications and applications are possible without departing from the spirit of the invention. In addition, a term described together with a different term having a broader meaning or the same meaning at least once in the specification or the drawings can be replaced with the different term anywhere in the specification or the drawings.
ïŒã被åäœãïŒïŒãçµåå
åŠç³»ãïŒïŒãåºå®ãã¹ã¯ã
ïŒïŒïŒïŒïŒïŒïŒïŒãéå£ïŒçµãåïŒãïŒïŒãé®å
éšãïŒïŒãå¯åãã¹ã¯ã
ïŒïŒïŒïŒïŒïŒïŒïŒãéå£ïŒçµãåïŒãïŒïŒãé®å
éšãïŒïŒãå転軞ã
ïŒïŒãæ®åçŽ åãïŒïŒãé§åéšãïŒïŒãç
§æéšãïŒïŒïŒãã¹ã³ãŒãéšã
ïŒïŒïŒãæ®åéšãïŒïŒïŒãæ¬äœéšãïŒïŒïŒãåŠçéšãïŒïŒïŒãã¢ãã¿è¡šç€ºéšã
ïŒïŒïŒãæ®ååŠçéšãïŒïŒïŒãå
æºé§åå¶åŸ¡éšãïŒïŒïŒãç»åéžæéšã
ïŒïŒïŒãã«ã©ãŒç»åçæéšãïŒïŒïŒãäœçžå·®æ€åºéšã
ïŒïŒïŒãå¯åãã¹ã¯å¶åŸ¡éšãïŒïŒïŒãå¯åãã¹ã¯äœçœ®æ€åºéšã
ïŒïŒïŒãè·é¢æ
å ±ç®åºéšãïŒïŒïŒãïŒæ¬¡å
æ
å ±çæéšãïŒïŒïŒãçœè²å
æºã
ïŒïŒïŒãå°å
éšæãïŒïŒïŒãç
§æçšã¬ã³ãºãïŒïŒïŒãèµ€è²ã¬ãŒã¶ãŒå
æºã
ïŒïŒïŒãéè²ã¬ãŒã¶ãŒå
æºãïŒïŒïŒïŒïŒïŒïŒããã€ã¯ãã€ãã¯ããªãºã ã
ïŒïŒïŒãå°å
éšæãïŒïŒïŒããã¹ã¯ãã¿ãŒã³ãïŒïŒïŒãæåœ±ã¬ã³ãºã
ïŒïŒïŒãçœè²å
æºãïŒïŒïŒãåå
çŽ åãïŒïŒïŒãéè²ã¬ãŒã¶ãŒå
æºã
ïŒïŒïŒãèµ€è²ã¬ãŒã¶ãŒå
æºãïŒïŒïŒïŒïŒïŒïŒããã€ã¯ãã€ãã¯ããªãºã ã
ïŒïŒïŒããã¹ã¯ãã¿ãŒã³ãïŒïŒïŒãåå
çŽ åãïŒïŒïŒãããªãºã ã
ïŒïŒïŒãå°å
éšæãïŒïŒïŒãæåœ±ã¬ã³ãºã
ã第ïŒã®ãã£ã«ã¿ãã第ïŒã®ãã£ã«ã¿ãïŒïœïŒïŒïœãç»åã
ïœïŒïŒ°ïœïŒã第ïŒã®æ³¢é·åž¯åãïœïŒïŒ°ïœïŒã第ïŒã®æ³¢é·åž¯åã
ïœïŒïŒžïŒ¬ïŒãäœçžå·®ãïœïŒïŒ¶ïœïŒïŒ¶ïœãç»å
5 subject, 10 imaging optical system, 20 fixed mask,
21, 22, 23 Opening (aperture hole), 24 light shielding part, 30 movable mask,
31, 32, 33 Aperture (diaphragm hole), 34 Shading part, 35 Rotating shaft
40 image sensor, 50 drive unit, 60 illumination unit, 100 scope unit,
110 imaging unit, 200 main body unit, 210 processing unit, 220 monitor display unit,
230 imaging processing unit, 305 light source drive control unit, 310 image selection unit,
320 color image generation unit, 330 phase difference detection unit,
340 movable mask control unit, 350 movable mask position detection unit,
360 distance information calculation unit, 370 three-dimensional information generation unit, 401 white light source,
402 light guide member, 403 illumination lens, 404 red laser light source,
405 Blue laser light source, 406, 407 dichroic prism,
408 light guide member, 409 mask pattern, 410 projection lens,
451 white light source, 452 polarizing element, 453 blue laser light source,
454 red laser light source, 455,456 dichroic prism,
457 mask pattern, 458 polarizing element, 458 prism,
460 light guide member, 461 projection lens,
FL first filter, FR second filter, Mb, Mr image,
Pb, Pb1 first wavelength band, Pr, Pr2 second wavelength band,
s (XL) phase difference, Vb, Vg, Vr image
Claims (14)
ãåèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åãåã³åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åã«ãããŠæäžã®å éååžãæãããã¿ãŒã³ç §æã被åäœã«ç §å°ããç §æéšãšã
ããå«ãããšãç¹åŸŽãšããæ®åè£ çœ®ã Capturing a captured image including a first color image, a second color image having a longer wavelength than the first color, and a third color image having a longer wavelength than the second color; An imaging unit capable of imaging one color image and the third color image as a stereo image;
A first wavelength band that is included in the wavelength band of the first color and not included in the wavelength band of the second color, and that is included in the wavelength band of the third color and included in the wavelength band of the second color. An illumination unit that irradiates a subject with pattern illumination having a given light amount distribution in a second wavelength band that is not present;
An imaging apparatus comprising:
ãåèšæ®åéšã¯ã
ãåèšã¹ãã¬ãªç»åãæ®åããã¹ãã¬ãªã¢ãŒããšãåçŒã«ããåèšæ®åç»åãæ®åããéã¹ãã¬ãªã¢ãŒããšãåãæ¿ããããšãç¹åŸŽãšããæ®åè£ çœ®ã In claim 1,
The imaging unit
An imaging apparatus characterized by switching between a stereo mode for capturing the stereo image and a non-stereo mode for capturing the captured image with a single eye.
ãåèšéã¹ãã¬ãªã¢ãŒãã«ãããåèšåçŒã¯ãåèšç¬¬ïŒè²ãšåèšç¬¬ïŒè²ãšåèšç¬¬ïŒè²ã®æ³¢é·åž¯åã®ãã¡åèšç¬¬ïŒã®æ³¢é·åž¯ååã³åèšç¬¬ïŒã®æ³¢é·åž¯åãé€ãæ³¢é·åž¯åãééãããããšãç¹åŸŽãšããæ®åè£ çœ®ã In claim 2,
The monocular in the non-stereo mode allows a wavelength band excluding the first wavelength band and the second wavelength band among the wavelength bands of the first color, the second color, and the third color to pass. An imaging device that is characterized.
ãåèšã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åãããåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãšã®éã®äœçžå·®ãæ€åºããäœçžå·®æ€åºéšãšã
ãåèšéã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åãããåèšæ®åç»åã«åºã¥ããŠèгå¯çšã®ç»åãåºåããç»ååºåéšãšã
ããå«ãããšãç¹åŸŽãšããæ®åè£ çœ®ã In claim 3,
A phase difference detector that detects a phase difference between the first color image and the third color image captured in the stereo mode;
An image output unit that outputs an image for observation based on the captured image captured in the non-stereo mode;
An imaging apparatus comprising:
ãåèšéã¹ãã¬ãªã¢ãŒãã«ãããåèšåçŒã¯ãåèšç¬¬ïŒè²ãšåèšç¬¬ïŒè²ãšåèšç¬¬ïŒè²ã®æ³¢é·åž¯åãå«ãæ³¢é·åž¯åãééãããããšãç¹åŸŽãšããæ®åè£ çœ®ã In claim 2,
The imaging apparatus according to claim 1, wherein the monocular in the non-stereo mode passes a wavelength band including the wavelength bands of the first color, the second color, and the third color.
ãåèšã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åãããåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãšã®éã®äœçžå·®ãæ€åºããäœçžå·®æ€åºéšãšã
ãåèšéã¹ãã¬ãªã¢ãŒãã«ãããŠæ®åãããåèšæ®åç»åã«åºã¥ããŠèгå¯çšã®ç»åãåºåããç»ååºåéšãšã
ããå«ã¿ã
ãåèšç»ååºåéšã¯ã
ãåèšæäžã®å éååžã«ããåèšç¬¬ïŒè²ã®ç»ååã³åèšç¬¬ïŒè²ã®ç»åã®ç»çŽ å€ã®å€åããåèšç¬¬ïŒè²ã®ç»åã«åºã¥ããŠè£æ£ããããšãç¹åŸŽãšããæ®åè£ çœ®ã In claim 5,
A phase difference detector that detects a phase difference between the first color image and the third color image captured in the stereo mode;
An image output unit that outputs an image for observation based on the captured image captured in the non-stereo mode;
Including
The image output unit includes:
An imaging apparatus, wherein a change in pixel values of the first color image and the third color image due to the given light amount distribution is corrected based on the second color image.
ãåèšãã¿ãŒã³ç §æã¯ãåèšç¬¬ïŒè²ã®æ³¢é·åž¯åã®ãã¡åèšç¬¬ïŒã®æ³¢é·åž¯åãé€ãæ³¢é·åž¯åãšãåèšç¬¬ïŒè²ã®æ³¢é·åž¯åãšãåèšç¬¬ïŒè²ã®æ³¢é·åž¯åã®ãã¡åèšç¬¬ïŒã®æ³¢é·åž¯åãé€ãæ³¢é·åž¯åãšã«ãããŠãã©ãããªå éååžã§ããããšãç¹åŸŽãšããæ®åè£ çœ®ã In any one of Claims 1 thru | or 6.
The pattern illumination includes a wavelength band excluding the first wavelength band among the wavelength bands of the first color, a wavelength band of the second color, and a second wavelength band of the wavelength bands of the third color. An image pickup apparatus having a flat light amount distribution in a wavelength band excluding.
ãåèšç¬¬ïŒè²ã¯éè²ã§ãããåèšç¬¬ïŒè²ã¯ç·è²ã§ãããåèšç¬¬ïŒè²ã¯èµ€è²ã§ããããšãç¹åŸŽãšããæ®åè£ çœ®ã In any one of Claims 1 thru | or 7,
The imaging apparatus according to claim 1, wherein the first color is blue, the second color is green, and the third color is red.
ãæ®åçŽ åãšã
ãåèšæ®åçŽ åã«è¢«åäœãçµåãããçµåå åŠç³»ãšã
ãåèšçµåå åŠç³»ã®ç³ãåå²ãã第ïŒïœç¬¬ïŒã®éå£ãšãåèšç¬¬ïŒã®æ³¢é·åž¯åãééããã第ïŒã®ãã£ã«ã¿ãšãåèšç¬¬ïŒã®æ³¢é·åž¯åãééããã第ïŒã®ãã£ã«ã¿ãšãæããåºå®ãã¹ã¯ãšã
ãé®å éšãšãåèšç¬¬ïŒïœç¬¬ïŒã®éå£ã«å¯Ÿå¿ããŠåèšé®å éšã«èšãããã第ïŒïœç¬¬ïŒã®éå£ãšãæããåèšçµåå åŠç³»ã«å¯ŸããŠå¯åã§ããå¯åãã¹ã¯ãšã
ããå«ã¿ã
ãåèšç¬¬ïŒã®ãã£ã«ã¿ã¯ãåèšç¬¬ïŒã®éå£ã«èšãããã
ãåèšç¬¬ïŒã®ãã£ã«ã¿ã¯ãåèšç¬¬ïŒã®éå£ã«èšãããã
ãåèšç¬¬ïŒã®éå£ã¯ãåèšçµåå åŠç³»ã®å 軞äžã«èšããããããšãç¹åŸŽãšããæ®åè£ çœ®ã In any one of Claims 1 thru | or 8.
An image sensor;
An imaging optical system for imaging a subject on the image sensor;
First to third apertures that divide the pupil of the imaging optical system, a first filter that passes the first wavelength band, and a second filter that passes the second wavelength band A fixed mask,
A movable mask having a light shielding part and fourth to sixth openings provided in the light shielding part corresponding to the first to third openings, and movable with respect to the imaging optical system;
Including
The first filter is provided in the first opening;
The second filter is provided in the second opening;
The image pickup apparatus, wherein the third opening is provided on an optical axis of the imaging optical system.
ãåèšå¯åãã¹ã¯ãå¶åŸ¡ããå¯åãã¹ã¯å¶åŸ¡éšãå«ã¿ã
ãåèšå¯åãã¹ã¯å¶åŸ¡éšã¯ã
ãéã¹ãã¬ãªã¢ãŒãã«ãããŠãåèšå 軞æ¹åã«èŠãå Žåã«åèšé®å éšãåèšç¬¬ïŒã第ïŒã®éå£ã«éãªããšå ±ã«åèšç¬¬ïŒã®éå£ãåèšç¬¬ïŒã®éå£ã«éãªã第ïŒã®ç¶æ ã«ãåèšå¯åãã¹ã¯ãèšå®ãã
ãã¹ãã¬ãªã¢ãŒãã«ãããŠãåèšå 軞æ¹åã«èŠãå Žåã«åèšç¬¬ïŒã第ïŒã®éå£ãåèšç¬¬ïŒã第ïŒã®éå£ã«éãªããšå ±ã«åèšé®å éšãåèšç¬¬ïŒã®éå£ã«éãªã第ïŒã®ç¶æ ã«ãåèšå¯åãã¹ã¯ãèšå®ããããšãç¹åŸŽãšããæ®åè£ çœ®ã In claim 9,
A movable mask control unit for controlling the movable mask;
The movable mask controller is
In the non-stereo mode, when viewed in the optical axis direction, the movable portion is in a first state in which the light shielding portion overlaps the first and second openings and the sixth opening overlaps the third opening. Set the mask,
In the stereo mode, when viewed in the optical axis direction, the fourth and fifth openings overlap the first and second openings, and the light shielding portion overlaps the third opening. An imaging apparatus, wherein the movable mask is set.
ãæ®åçŽ åãšã
ãåèšæ®åçŽ åã«è¢«åäœãçµåãããçµåå åŠç³»ãšã
ãåèšçµåå åŠç³»ã®ç³ãåå²ãã第ïŒïœç¬¬ïŒã®éå£ãšãåèšç¬¬ïŒã®æ³¢é·åž¯åãééããã第ïŒã®ãã£ã«ã¿ãšãåèšç¬¬ïŒã®æ³¢é·åž¯åãééããã第ïŒã®ãã£ã«ã¿ãšãæããåºå®ãã¹ã¯ãšã
ãé®å éšãšãåèšç¬¬ïŒã第ïŒã®éå£ã«å¯Ÿå¿ããŠåèšé®å éšã«èšãããã第ïŒã®éå£ãšãåèšç¬¬ïŒã®éå£ã«å¯Ÿå¿ããŠåèšé®å éšã«èšãããã第ïŒã®éå£ãšãæããåèšçµåå åŠç³»ã«å¯ŸããŠå¯åã§ããå¯åãã¹ã¯ãšã
ããå«ã¿ã
ãåèšç¬¬ïŒã®ãã£ã«ã¿ã¯ãåèšç¬¬ïŒã®éå£ã«èšãããã
ãåèšç¬¬ïŒã®ãã£ã«ã¿ã¯ãåèšç¬¬ïŒã®éå£ã«èšãããã
ãåèšç¬¬ïŒã®éå£ã¯ãåèšçµåå åŠç³»ã®å 軞äžã«èšããããããšãç¹åŸŽãšããæ®åè£ çœ®ã In any one of Claims 1 thru | or 8.
An image sensor;
An imaging optical system for imaging a subject on the image sensor;
First to third apertures that divide the pupil of the imaging optical system, a first filter that passes the first wavelength band, and a second filter that passes the second wavelength band A fixed mask,
A light shielding portion, a fourth opening provided in the light shielding portion corresponding to the first and third openings, and a fifth opening provided in the light shielding portion corresponding to the second opening. And a movable mask movable with respect to the imaging optical system,
Including
The first filter is provided in the first opening;
The second filter is provided in the second opening;
The image pickup apparatus, wherein the third opening is provided on an optical axis of the imaging optical system.
ãåèšå¯åãã¹ã¯ãå¶åŸ¡ããå¯åãã¹ã¯å¶åŸ¡éšãå«ã¿ã
ãåèšå¯åãã¹ã¯å¶åŸ¡éšã¯ã
ãéã¹ãã¬ãªã¢ãŒãã«ãããŠãåèšå 軞æ¹åã«èŠãå Žåã«åèšé®å éšãåèšç¬¬ïŒã第ïŒã®éå£ã«éãªããšå ±ã«åèšç¬¬ïŒã®éå£ãåèšç¬¬ïŒã®éå£ã«éãªã第ïŒã®ç¶æ ã«ãåèšå¯åãã¹ã¯ãèšå®ãã
ãã¹ãã¬ãªã¢ãŒãã«ãããŠãåèšå 軞æ¹åã«èŠãå Žåã«åèšç¬¬ïŒã第ïŒã®éå£ãåèšç¬¬ïŒã第ïŒã®éå£ã«éãªããšå ±ã«åèšé®å éšãåèšç¬¬ïŒã®éå£ã«éãªã第ïŒã®ç¶æ ã«ãåèšå¯åãã¹ã¯ãèšå®ããããšãç¹åŸŽãšããæ®åè£ çœ®ã In claim 11,
A movable mask control unit for controlling the movable mask;
The movable mask controller is
In the non-stereo mode, when viewed in the optical axis direction, the movable portion is in a first state in which the light shielding portion overlaps the first and second openings and the fourth opening overlaps the third opening. Set the mask,
In the stereo mode, when viewed in the optical axis direction, the fourth and fifth openings overlap the first and second openings, and the light shielding portion overlaps the third opening. An imaging apparatus, wherein the movable mask is set.
ãåèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åãåã³åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸãããšå ±ã«åèšç¬¬ïŒè²ã®æ³¢é·åž¯åã«å«ãŸããªã第ïŒã®æ³¢é·åž¯åã«ãããŠæäžã®å éååžãæãããã¿ãŒã³ç §æã被åäœã«ç §å°ãã
ãåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãšåèšç¬¬ïŒè²ã®ç»åãšãå«ãæ®åç»åãæ®åããããšãç¹åŸŽãšããæ®åæ¹æ³ã Of the first color image, the second color image longer than the first color, and the third color image longer than the second color, the first color image and the When a third color image can be captured as a stereo image,
A first wavelength band that is included in the wavelength band of the first color and not included in the wavelength band of the second color, and that is included in the wavelength band of the third color and included in the wavelength band of the second color. Irradiating the subject with pattern illumination having a given light intensity distribution in the second wavelength band not present,
An imaging method, comprising: capturing a captured image including the first color image, the second color image, and the third color image.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2015/078871 WO2017064746A1 (en) | 2015-10-13 | 2015-10-13 | Imaging device, endoscopic device and imaging method |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2015/078871 WO2017064746A1 (en) | 2015-10-13 | 2015-10-13 | Imaging device, endoscopic device and imaging method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017064746A1 true WO2017064746A1 (en) | 2017-04-20 |
Family
ID=58517404
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/078871 Ceased WO2017064746A1 (en) | 2015-10-13 | 2015-10-13 | Imaging device, endoscopic device and imaging method |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2017064746A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003047028A (en) * | 2001-08-01 | 2003-02-14 | Olympus Optical Co Ltd | Imaging apparatus and stereogram-photographing method |
| JP2005223812A (en) * | 2004-02-09 | 2005-08-18 | Canon Inc | Imaging device |
| JP2013124985A (en) * | 2011-12-15 | 2013-06-24 | Ricoh Co Ltd | Compound-eye imaging apparatus and distance measuring device |
| JP2015502558A (en) * | 2011-10-03 | 2015-01-22 | ã€ãŒã¹ããã³ ã³ãã㯠ã«ã³ãã㌠| Stereo projector using spectrally adjacent color bands |
| JP2015513686A (en) * | 2012-01-17 | 2015-05-14 | ã€ãŒã¹ããã³ ã³ãã㯠ã«ã³ãã㌠| Stereoscopic glasses with tilt filter |
-
2015
- 2015-10-13 WO PCT/JP2015/078871 patent/WO2017064746A1/en not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003047028A (en) * | 2001-08-01 | 2003-02-14 | Olympus Optical Co Ltd | Imaging apparatus and stereogram-photographing method |
| JP2005223812A (en) * | 2004-02-09 | 2005-08-18 | Canon Inc | Imaging device |
| JP2015502558A (en) * | 2011-10-03 | 2015-01-22 | ã€ãŒã¹ããã³ ã³ãã㯠ã«ã³ãã㌠| Stereo projector using spectrally adjacent color bands |
| JP2013124985A (en) * | 2011-12-15 | 2013-06-24 | Ricoh Co Ltd | Compound-eye imaging apparatus and distance measuring device |
| JP2015513686A (en) * | 2012-01-17 | 2015-05-14 | ã€ãŒã¹ããã³ ã³ãã㯠ã«ã³ãã㌠| Stereoscopic glasses with tilt filter |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR101798656B1 (en) | 3-d imaging using telecentric defocus | |
| US10542875B2 (en) | Imaging device, endoscope apparatus, and imaging method | |
| JP5053468B2 (en) | Stereoscopic image capturing apparatus and endoscope | |
| CN108073016A (en) | Image forming apparatus | |
| US20220346643A1 (en) | Ophthalmic imaging apparatus and system | |
| WO2017168986A1 (en) | Control device, endoscope image pickup device, control method, program, and endoscope system | |
| WO2018051679A1 (en) | Measurement assistance device, endoscope system, processor for endoscope system, and measurement assistance method | |
| US20180098053A1 (en) | Imaging device, endoscope apparatus, and imaging method | |
| US11025812B2 (en) | Imaging apparatus, imaging method, and imaging system | |
| JP2015231498A (en) | Endoscope device | |
| JP5953443B2 (en) | Endoscope system | |
| JP6706026B2 (en) | Endoscope system and operating method of endoscope apparatus | |
| US20180092516A1 (en) | Imaging device, endoscope apparatus, and imaging method | |
| JP7609953B2 (en) | Ophthalmic device and control method thereof | |
| JP2015046019A (en) | Image processing apparatus, imaging apparatus, imaging system, image processing method, program, and storage medium | |
| JP6948407B2 (en) | Equipment and methods for determining surface topology and associated colors | |
| US10798332B1 (en) | Dual pass-through imaging system and method | |
| JP2011250976A (en) | Image processor, image processing method, and program | |
| JPS63246716A (en) | Method for taking-in of endoscope image | |
| WO2017064746A1 (en) | Imaging device, endoscopic device and imaging method | |
| KR20230106593A (en) | Imaging systems and laparoscopes for imaging objects | |
| US11648080B2 (en) | Medical observation control device and medical observation system that correct brightness differences between images acquired at different timings | |
| JP7583551B2 (en) | Imaging device | |
| WO2017212577A1 (en) | Imaging device, endoscope device, and imaging method | |
| WO2024200351A1 (en) | An extraoral scanner system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15906211 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 15906211 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |