WO2013031392A1 - 3D imaging device - Google Patents
3D imaging device
- Publication number
- WO2013031392A1 (PCT/JP2012/067786, JP2012067786W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- imaging
- stereoscopic
- image
- monocular
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/218—Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/286—Image signal generators having separate monoscopic and stereoscopic modes
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
Definitions
- the present invention relates to a stereoscopic imaging apparatus, and more particularly to a stereoscopic imaging apparatus provided with a plurality of optical systems.
- stereoscopic display devices have been developed for displaying a stereoscopic image.
- the cross point of a stereoscopic image is adjusted according to conditions such as the size of the display device and the distance between the viewer and the display device.
- stereoscopic cameras capable of acquiring stereoscopic image data have been used.
- stereoscopic image data is acquired using two (left and right) lenses and one image sensor having different sensitivity depending on the incident angle.
- the imaging device described in Document 3 includes a plurality of optical systems, and can switch between monocular imaging and compound-eye imaging so that a stereoscopic image can be acquired.
- Patent Document 1 adjusts the cross point according to the conditions at the time of image reproduction.
- the cross point is shifted, and the shift is noticeable to the user, which gives an unnatural feeling.
- the settable shooting modes are limited, and images cannot be acquired in shooting modes arbitrarily selected from various shooting modes.
- the present invention has been made based on such circumstances, and an object thereof is to provide a stereoscopic imaging apparatus capable of displaying an image with a natural feeling when the shooting mode is switched. It is another object of the present invention to provide a stereoscopic imaging apparatus that can acquire various images according to a user's request.
- In order to achieve the above object, a stereoscopic imaging apparatus according to a first aspect of the present invention includes: a plurality of imaging units that capture an image of a subject, the plurality of imaging units including at least one monocular stereoscopic imaging unit having an imaging element with a plurality of pixel groups, each pixel group photoelectrically converting a light beam that has passed through a different region of a single imaging optical system; an image generation unit that generates a stereoscopic image of the subject from the imaging signals of the plurality of imaging units; and a cross-point control unit that controls the cross point of the stereoscopic image.
- The image generation unit has a monocular stereoscopic imaging function that composes a stereoscopic image from a plurality of viewpoint images obtained by shooting with the monocular stereoscopic imaging unit, and a compound-eye stereoscopic imaging function that composes a stereoscopic image from viewpoint images obtained by different imaging units among the plurality of imaging units.
- When switching from shooting with the compound-eye stereoscopic imaging function to shooting with the monocular stereoscopic imaging function, and when switching from shooting with the monocular stereoscopic imaging function to shooting with the compound-eye stereoscopic imaging function, the cross-point control unit controls the image generation unit so that the cross point of the displayed stereoscopic image does not change before and after the switching.
- In monocular stereoscopic imaging, a point that is in focus in the image becomes the cross point, whereas in compound-eye stereoscopic imaging the in-focus position in the image generally differs from the cross point. Therefore, when switching between monocular stereoscopic imaging and compound-eye stereoscopic imaging, the cross point changes before and after the switching, and the viewer may feel the discomfort of the subject suddenly jumping forward or receding backward. This sense of incongruity is felt particularly strongly when the pop-out state of the main subject, which is usually in focus, changes.
- In the first aspect, therefore, the cross-point control unit controls the image generation unit so that, when switching from imaging with the compound-eye stereoscopic imaging function to imaging with the monocular stereoscopic imaging function, and when switching from imaging with the monocular stereoscopic imaging function to imaging with the compound-eye stereoscopic imaging function, the cross point of the stereoscopic image displayed on the image display unit does not change before and after the switching. As a result, an image can be displayed with a natural feeling when the shooting mode is switched.
- the “cross point” means a point where parallax is zero in a stereoscopic image.
- A subject located in front of the cross point appears to protrude forward from the screen, and a subject located behind the cross point appears to recede behind the screen.
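- As a concrete illustration (not part of the patent text), the following minimal Python/NumPy sketch shows how horizontally shifting one viewpoint image changes which subject has zero parallax, i.e., where the cross point lies; the helper name and the zero-fill behaviour are assumptions made for this example.

```python
import numpy as np

def shift_horizontally(image, dx):
    """Shift a viewpoint image by dx pixels along the horizontal axis.

    Positive dx moves content to the right; vacated columns are zero-filled.
    """
    shifted = np.zeros_like(image)
    if dx > 0:
        shifted[:, dx:] = image[:, :-dx]
    elif dx < 0:
        shifted[:, :dx] = image[:, -dx:]
    else:
        shifted[:] = image
    return shifted

# A subject imaged with disparity d between the left and right views has
# disparity d - dx after the right view is shifted by dx, so choosing dx = d
# places that subject at the cross point (zero parallax).
```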
- Preferably, while shooting with the compound-eye stereoscopic imaging function continues, the cross-point control unit detects a first in-focus area in one viewpoint image among the viewpoint images constituting the stereoscopic image obtained by shooting with the compound-eye stereoscopic imaging function, detects a first corresponding area corresponding to the detected first in-focus area in the other viewpoint image among the viewpoint images constituting the stereoscopic image, calculates a first positional shift amount between the one viewpoint image and the other viewpoint image based on the detected first corresponding area, and, by shifting the other viewpoint image by the first positional shift amount, controls the image generation unit so that the cross point of the stereoscopic image obtained by shooting with the compound-eye stereoscopic imaging function coincides with the first in-focus area, which becomes the cross point of the stereoscopic image obtained by shooting with the monocular stereoscopic imaging function.
- In this way, the cross point of the stereoscopic image obtained by shooting with the compound-eye stereoscopic imaging function is shifted in advance so as to coincide with the first in-focus area, that is, the cross point of the stereoscopic image that will be obtained by shooting with the monocular stereoscopic imaging function. Consequently, the cross point does not change suddenly when the switch from the compound-eye stereoscopic imaging function to the monocular stereoscopic imaging function actually occurs, and an image can be displayed with a natural feeling.
- Preferably, while shooting with the monocular stereoscopic imaging function continues, the cross-point control unit detects a second in-focus area, which is the cross point of the stereoscopic image obtained by shooting with the monocular stereoscopic imaging function, detects a second corresponding area corresponding to the detected second in-focus area in a viewpoint image obtained by an imaging unit used for shooting with the compound-eye stereoscopic imaging function, calculates a second positional shift amount between the one viewpoint image and the other viewpoint image based on the detected second corresponding area, and updates and stores the already stored positional shift amount with this second positional shift amount. When the monocular stereoscopic imaging function is switched to the compound-eye stereoscopic imaging function, the cross-point control unit may control the image generation unit so that a stereoscopic image is composed from the one viewpoint image and an image obtained by shifting the other viewpoint image by the stored second positional shift amount, whereby the cross point of the stereoscopic image obtained by shooting with the compound-eye stereoscopic imaging function matches the second in-focus area, that is, the cross point of the stereoscopic image obtained by shooting with the monocular stereoscopic imaging function.
- In this way, the second positional shift amount is calculated and updated in advance while shooting with the monocular stereoscopic imaging function continues, and when the switch from the monocular stereoscopic imaging function to the compound-eye stereoscopic imaging function occurs, the stereoscopic image is composed after shifting the viewpoint image newly obtained with the compound-eye stereoscopic imaging function by the stored second positional shift amount. Consequently, the cross point does not change suddenly at the moment of switching.
- The stereoscopic imaging apparatus may further include an imaging function automatic switching unit that automatically switches between the monocular stereoscopic imaging function and the compound-eye stereoscopic imaging function.
- For example, the imaging function automatic switching unit operates the monocular stereoscopic imaging function while the in-focus position is nearer than a predetermined distance, switches to the compound-eye stereoscopic imaging function and operates the cross-point control unit when the in-focus position becomes farther than the predetermined distance, and switches back to the monocular stereoscopic imaging function when the in-focus position again becomes nearer than the predetermined distance or when the user instructs a switch to the monocular stereoscopic imaging function.
- During operation of the compound-eye stereoscopic imaging function, the image generation unit may perform pixel signal addition processing on the imaging element included in the monocular stereoscopic imaging unit, in which the pixel signals of the pixel groups constituting the plurality of pixel groups are added at each pixel position and the addition result is used as the pixel signal at that pixel position, as sketched below.
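- A minimal sketch of this pixel signal addition, assuming the main-pixel and sub-pixel signals are available as two aligned NumPy arrays (the function name and dtype handling are illustrative, not from the patent):

```python
import numpy as np

def add_pixel_groups(main_pixels: np.ndarray, sub_pixels: np.ndarray) -> np.ndarray:
    """Add main- and sub-pixel signals at each pixel position.

    Both inputs are assumed to be already aligned to a common grid.
    Summing two independent readings of the same light level improves the
    signal-to-noise ratio by roughly sqrt(2), at the cost of discarding the
    per-group parallax information.
    """
    assert main_pixels.shape == sub_pixels.shape
    return main_pixels.astype(np.int32) + sub_pixels.astype(np.int32)
```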
- A stereoscopic imaging apparatus according to a second aspect of the present invention includes: a plurality of imaging units that capture an image of a subject, the plurality of imaging units including at least one monocular stereoscopic imaging unit having an imaging element with a plurality of pixel groups, each pixel group photoelectrically converting a light beam that has passed through a different region of a single imaging optical system; an image generation unit that generates a stereoscopic image of the subject from the imaging signals of the plurality of imaging units; and an imaging mode setting unit that sets an imaging mode based on a user instruction input.
- According to the number of imaging units to be used for shooting among the plurality of imaging units and the number of viewpoint images to be acquired, the imaging mode setting unit sets an imaging mode from among imaging modes that include a two-dimensional imaging mode and a monocular stereoscopic imaging mode using the at least one monocular stereoscopic imaging unit, as well as a two-dimensional imaging mode and a compound-eye stereoscopic imaging mode using the at least one monocular stereoscopic imaging unit together with an imaging unit other than the at least one monocular stereoscopic imaging unit among the plurality of imaging units.
- By changing the number of imaging units used for shooting among the plurality of imaging units and the number of viewpoint images to be acquired (the number of viewpoints), various imaging modes that differ in the amount of parallax, the number of viewpoints, and so on can be set. For example, the two-dimensional imaging mode or a stereoscopic imaging mode can be selected, and within the stereoscopic imaging mode either the monocular stereoscopic imaging mode or the compound-eye stereoscopic imaging mode can be selected.
- the stereoscopic imaging device according to the second aspect can acquire various images according to the user's request.
- In the two-dimensional imaging mode using the at least one monocular stereoscopic imaging unit and in the compound-eye stereoscopic imaging mode using the at least one monocular stereoscopic imaging unit together with an imaging unit other than the at least one monocular stereoscopic imaging unit among the plurality of imaging units, the image generation unit may, based on a user instruction input, perform pixel signal addition processing in which the pixel signals of the pixel groups constituting the plurality of pixel groups of the imaging element included in the at least one monocular stereoscopic imaging unit are added at each pixel position and the addition result is used as the pixel signal at that position. In this way, the noise of the generated image is reduced by adding a plurality of pixel signals, and an even greater variety of images can be acquired according to the user's request.
- The number of the plurality of imaging units may be two, and the imaging unit other than the at least one monocular stereoscopic imaging unit may also be a monocular stereoscopic imaging unit.
- the stereoscopic imaging apparatus may further include a stereoscopic image display unit that displays the generated stereoscopic image.
- According to the stereoscopic imaging device of the present invention, it is possible to display an image with a natural feeling when the shooting mode is switched, and to acquire various images according to a user's request.
- FIG. 1 is a block diagram showing a configuration of a stereoscopic imaging apparatus 10 according to the first embodiment of the present invention.
- Figure 2 is an image view showing an appearance of a stereoscopic imaging device 10.
- FIG. 3 is a diagram illustrating a configuration of an imaging element used in the monocular stereoscopic imaging unit.
- FIG. 4 is a diagram showing one main pixel and one sub-pixel of the image sensor shown in FIG. 3.
- FIG. 5A is a diagram showing a configuration of a normal CCD.
- Figure 5B is a diagram showing an example of a configuration of a monocular 3D sensor.
- FIG. 5C is a diagram illustrating another example of the configuration of the monocular 3D sensor.
- FIG. 6 is a block diagram illustrating a main part of the stereoscopic imaging apparatus according to the first embodiment.
- FIG. 7 is a flowchart showing the cross point control at the time of switching from the compound eye stereoscopic imaging function to the monocular stereoscopic imaging function.
- FIG. 8A is a conceptual diagram illustrating a relationship between a cross point and a focal point during compound eye stereoscopic imaging.
- FIG. 8B is a conceptual diagram illustrating a relationship between a cross point and a focal point during monocular stereoscopic imaging.
- FIG. 9 is another conceptual diagram showing the cross point control at the time of switching from the compound eye stereoscopic imaging function to the monocular stereoscopic imaging function.
- FIG. 10 is a flowchart showing the cross point control at the time of switching from the monocular stereoscopic imaging function to the compound eye stereoscopic imaging function.
- FIG. 11 is a conceptual diagram showing a relationship between a cross point and a focal point during monocular / compound eye stereoscopic imaging.
- FIG. 12 is a flowchart showing a process for automatically switching between the monocular / compound-eye stereoscopic imaging function.
- FIG. 13 is a table showing shooting modes that can be set in the stereoscopic imaging apparatus according to the first embodiment.
- FIG. 14 is a conceptual diagram illustrating a procedure for setting a shooting mode in the stereoscopic imaging apparatus according to the first embodiment.
- FIG. 15 is a conceptual diagram illustrating pixel signal addition processing in the stereoscopic imaging apparatus according to the first embodiment.
- FIG. 16 is a flowchart illustrating a procedure for selecting a shooting mode in consideration of pixel signal addition processing.
- FIG. 17 is a block diagram illustrating a main part of a stereoscopic imaging apparatus according to the second embodiment of the present invention.
- FIG. 18 is a table showing shooting modes that can be set in the stereoscopic imaging apparatus according to the second embodiment.
- FIG. 1 is a block diagram illustrating an embodiment of a stereoscopic imaging apparatus 10 according to the present invention
- FIG. 2 is an image diagram illustrating an external appearance of the stereoscopic imaging apparatus 10.
- the stereoscopic imaging apparatus 10 displays a captured image on a liquid crystal monitor (LCD) 30 or records it on a memory card 54 (hereinafter also referred to as “media”).
- The overall operation of the apparatus is controlled by a central processing unit (CPU) 40.
- the stereoscopic imaging device 10 is provided with operation units 38 such as a shutter button, a mode dial, a playback button, a MENU / OK key, a cross key, and a BACK key.
- A signal from the operation unit 38 is input to the CPU 40, and the CPU 40 controls each circuit of the stereoscopic imaging device 10 based on the input signal; for example, it performs lens driving control, aperture driving control, photographing operation control, image processing control, image data recording/reproduction control, display control of the liquid crystal monitor 30 for stereoscopic display, and the like.
- the shutter button is an operation button for inputting an instruction to start shooting, and is configured by a two-stroke switch having an S1 switch that is turned on when half-pressed and an S2 switch that is turned on when fully pressed.
- The mode dial is selection means for selecting a 2D shooting mode, a 3D shooting mode, an auto shooting mode, a manual shooting mode, a scene position such as person, landscape, or night view, a macro mode, a moving image mode, and a parallax-priority shooting mode according to the present invention.
- the playback button is a button for switching to a playback mode in which a still image or a moving image of a stereoscopic image (3D image) or a planar image (2D image) that has been recorded is displayed on the liquid crystal monitor 30.
- The MENU/OK key is an operation key having both a function as a menu button for instructing display of a menu on the screen of the liquid crystal monitor 30 and a function as an OK button for instructing confirmation and execution of the selected contents.
- The cross key is an operation unit for inputting instructions in four directions (up, down, left, and right), and functions as a button (cursor moving operation means) for selecting an item from the menu screen or instructing selection of various setting items from each menu.
- the up / down key of the cross key functions as a zoom switch for shooting or a playback zoom switch in playback mode
- the left / right key functions as a frame advance (forward / reverse feed) button in playback mode.
- the BACK key is used to delete a desired object such as a selection item, cancel an instruction content, or return to the previous operation state.
- Image light representing the subject is imaged, via the photographing lenses 12 (12-1, 12-2) including a focus lens and a zoom lens and the diaphragms 14 (14-1, 14-2), onto the light receiving surface of a solid-state imaging device 16 (16-1, 16-2; hereinafter referred to as a "monocular 3D sensor"), which is a phase-difference image sensor.
- the photographing lenses 12 (12-1, 12-2) are driven by a lens driving unit 36 (36-1, 36-2) controlled by the CPU 40, and focus control, zoom control, and the like are performed.
- the diaphragm 14 (14-1, 14-2) is composed of, for example, five diaphragm blades, and is driven by a diaphragm driver 34 (34-1, 34-2) controlled by the CPU 40.
- Aperture control is performed in six steps in 1 AV increments, from an aperture value of F1.4 to F11.
- The CPU 40 controls the diaphragm 14 (14-1, 14-2) via the diaphragm driving unit 34 (34-1, 34-2), and, via the CCD control unit 32 (32-1, 32-2), controls the charge accumulation time (shutter speed) of the monocular 3D sensor 16, the readout of image signals from the monocular 3D sensor 16, and the like.
- FIG. 3 is a diagram illustrating a configuration example of the monocular 3D sensor 16.
- the monocular 3D sensor 16 includes odd-line pixels (main pixels) and even-line pixels (sub-pixels) arranged in a matrix.
- image signals for the two surfaces photoelectrically converted by these main and sub-pixels can be read independently.
- Among the pixels on the odd lines (1, 3, 5, ...), lines with a GRGR... pixel arrangement and lines with a BGBG... pixel arrangement are provided alternately.
- Among the pixels on the even lines (2, 4, 6, ...), lines with a GRGR... pixel arrangement and lines with a BGBG... pixel arrangement are likewise provided alternately, and the even-line pixels are arranged shifted by one-half pitch in the line direction with respect to the pixels on the odd lines.
- FIG. 4 is a diagram showing the photographing lens 12 (photographing optical system), the diaphragm 14, and one main pixel PDa and one sub-pixel PDb of the monocular 3D sensor 16, and FIGS. 5A to 5C are enlarged views of the main part of FIG. 4.
- the light beam passing through the exit pupil enters the normal CCD pixel (photodiode PD) via the microlens L without being restricted.
- The monocular 3D sensor 16 shown in FIG. 5B includes a microlens L that collects the light beam that has passed through the photographing lens 12, photodiodes PD (the main pixel PDa and the sub-pixel PDb) that receive the light beam that has passed through the microlens L, and a light shielding member 16A that partially shields the light receiving surfaces of the photodiodes PD.
- the right half or the left half of the light receiving surfaces of the main pixel PDa and the subpixel PDb is shielded by the light shielding member 16A. That is, the light shielding member 16A functions as a pupil division member.
- the monocular 3D sensor 16 having the above-described configuration is configured such that the main pixel PDa and the sub-pixel PDb have different regions (right half and left half) where the light beam is limited by the light shielding member 16A.
- Alternatively, as shown in FIG. 5C, the microlens L and the photodiodes PD (PDa, PDb) may be shifted relative to each other in the left-right direction (pupil division direction) without providing the light shielding member 16A, so that the light beam incident on each photodiode PD is limited by the displacement between the optical axis Ic of the microlens L and the optical axes Pc of the photodiodes PDa and PDb. Further, the light flux incident on each pixel may be limited by providing one microlens for two pixels (a main pixel and a sub-pixel).
- the signal charge accumulated in the monocular 3D sensor 16 (16-1, 16-2) is read out as a voltage signal corresponding to the signal charge based on the readout signal applied from the CCD controller 32.
- The voltage signals read from the monocular 3D sensor 16 (16-1, 16-2) are applied to the analog signal processing unit 18 (18-1, 18-2), where the R, G, and B signals are sampled and held, amplified by a gain designated by the CPU 40 (corresponding to the ISO sensitivity), and then applied to the A/D converter 20 (20-1, 20-2).
- The A/D converter 20 (20-1, 20-2) sequentially converts the input R, G, and B signals into digital R, G, and B signals and outputs them to the image input controller 22 (22-1, 22-2).
- The first photographing lens 12-1, the first diaphragm 14-1, the first monocular 3D sensor 16-1, the first analog signal processing unit 18-1, the first A/D converter 20-1, the first image input controller 22-1, the first CCD control unit 32-1, the first diaphragm driving unit 34-1, and the first lens driving unit 36-1 constitute the first imaging unit 11-1.
- Likewise, the second photographing lens 12-2, the second diaphragm 14-2, the second monocular 3D sensor 16-2, the second analog signal processing unit 18-2, the second A/D converter 20-2, the second image input controller 22-2, the second CCD control unit 32-2, the second diaphragm driving unit 34-2, and the second lens driving unit 36-2 constitute the second imaging unit 11-2.
- The digital signal processing unit 24 performs predetermined signal processing on the digital image signals input via the image input controller 22, such as offset processing, gain control processing including white balance correction and sensitivity correction, gamma correction processing, synchronization processing, YC processing, and sharpness correction.
- The EEPROM 46 is a non-volatile memory that stores a camera control program, defect information of the monocular 3D sensor 16, various parameters and tables used for image processing, program diagrams, a plurality of parallax-priority program diagrams according to the present invention, and the like.
- The main image data read from the odd-line main pixels of the monocular 3D sensor 16 are processed as left viewpoint image data, and the sub-image data read from the even-line sub-pixels are processed as right viewpoint image data.
- the left viewpoint image data and right viewpoint image data (3D image data) processed by the digital signal processing unit 24 are input to the VRAM 50.
- the VRAM 50 includes an A area and a B area each storing 3D image data representing a 3D image for one frame.
- 3D image data representing a 3D image for one frame is rewritten alternately in the A area and the B area.
- the written 3D image data is read from an area other than the area in which the 3D image data is rewritten in the A area and the B area of the VRAM 50.
- The 3D image data read from the VRAM 50 are encoded by the video encoder 28 and output to the stereoscopic display liquid crystal monitor 30 provided on the back of the camera, whereby the 3D subject image is displayed on the display screen of the liquid crystal monitor 30.
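- A minimal sketch of the A/B double buffering described above (the class and method names are illustrative assumptions; the actual VRAM 50 is hardware, not a Python object):

```python
class DoubleBufferedVram:
    """Two frame areas (A and B); writes alternate, reads use the other area."""

    def __init__(self):
        self.areas = {"A": None, "B": None}
        self.write_target = "A"

    def write_frame(self, frame_3d):
        # Rewrite one area with the newest 3D frame, then flip the target.
        self.areas[self.write_target] = frame_3d
        self.write_target = "B" if self.write_target == "A" else "A"

    def read_frame(self):
        # Read from the area that is not about to be rewritten, so the
        # display never sees a half-written frame.
        read_area = "B" if self.write_target == "A" else "A"
        return self.areas[read_area]
```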
- the liquid crystal monitor (LCD) 30 is a stereoscopic display unit that can display a stereoscopic image (a left viewpoint image and a right viewpoint image) as a directional image having a predetermined directivity by a parallax barrier, but is not limited thereto.
- the left viewpoint image and the right viewpoint image may be viewed separately by using a lenticular lens or by wearing dedicated glasses such as polarized glasses or liquid crystal shutter glasses.
- In the present embodiment, the case where the stereoscopic imaging apparatus 10 includes the liquid crystal monitor 30 capable of displaying a stereoscopic image has been described. However, the stereoscopic imaging apparatus 10 need not include the liquid crystal monitor 30; a stereoscopic image may instead be viewed on another stereoscopic image display device using the image data recorded on the memory card 54.
- When the shutter button is half-pressed, the AF operation and the AE operation are started using the output of the monocular 3D sensor 16, and the focus lens in the photographing lens 12 is controlled via the lens driving unit 36 so that it comes to the in-focus position.
- the image data output from the A / D converter 20 when the shutter button is half-pressed is taken into the AE detection unit 44.
- the AE detection unit 44 integrates the G signals of the entire screen or integrates the G signals that are weighted differently in the central portion and the peripheral portion of the screen, and outputs the integrated value to the CPU 40.
- The CPU 40 calculates the brightness of the subject (shooting EV value) from the integrated value input from the AE detection unit 44, determines the aperture value of the diaphragm 14 and the electronic shutter (shutter speed) of the monocular 3D sensor 16 based on the shooting EV value according to a predetermined program diagram, controls the diaphragm 14 via the aperture drive unit 34 based on the determined aperture value, and controls the charge accumulation time of the monocular 3D sensor 16 via the CCD control unit 32 based on the determined shutter speed.
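- The following sketch illustrates the kind of AE computation described above: a centre-weighted integration of the G signal and a lookup of aperture and shutter speed from a program diagram. The weighting scheme, the table format, and the nearest-row lookup are assumptions for illustration, not the patent's actual values.

```python
import numpy as np

def weighted_g_integral(g_plane: np.ndarray, center_weight: float = 2.0) -> float:
    """Integrate the G signal, weighting the centre of the screen more heavily."""
    h, w = g_plane.shape
    weights = np.ones((h, w), dtype=np.float32)
    weights[h // 4: 3 * h // 4, w // 4: 3 * w // 4] = center_weight
    return float((g_plane.astype(np.float32) * weights).sum())

def choose_exposure(ev: float, program_diagram):
    """Pick (f-number, shutter seconds) from a program diagram.

    program_diagram is a list of (ev, f_number, shutter_s) rows; the row whose
    EV is closest to the measured EV is used (placeholder strategy).
    """
    return min(program_diagram, key=lambda row: abs(row[0] - ev))[1:]
```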
- the AF processing unit 42 is a part that performs contrast AF processing.
- A high-frequency component of the image data in a predetermined focus area is extracted from at least one of the left viewpoint image data and the right viewpoint image data, and an AF evaluation value indicating the in-focus state is calculated by integrating the high-frequency component.
- AF control is performed by controlling the focus lens in the photographic lens 12 so that the AF evaluation value is maximized.
- phase difference AF processing may be performed. In this case, the phase difference between the image data corresponding to the main pixel and the sub-pixel in the predetermined focus area of the left viewpoint image data and the right viewpoint image data is detected.
- the defocus amount is obtained based on the information indicating the phase difference.
- AF control is performed by controlling the focus lens in the taking lens 12 so that the defocus amount becomes zero.
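- A minimal sketch of the contrast-AF evaluation described above: high-pass filter the focus area, integrate the magnitude, and step the focus lens toward the maximum (hill climbing). The capture and lens-drive hooks, the focus window, and the step sizes are hypothetical placeholders, not part of the patent.

```python
import numpy as np

def af_evaluation(image: np.ndarray, focus_area) -> float:
    """Integrate the high-frequency component inside the focus area."""
    roi = image[focus_area].astype(np.float32)
    # Simple horizontal high-pass: difference of neighbouring pixels.
    return float(np.abs(np.diff(roi, axis=1)).sum())

def hill_climb_focus(capture_frame, move_lens, steps=50, step_size=1):
    """Move the focus lens until the AF evaluation value stops increasing.

    capture_frame() and move_lens(step) are hypothetical hooks to the imaging
    and lens-drive units.
    """
    area = (slice(200, 300), slice(300, 400))  # example focus window
    best = af_evaluation(capture_frame(), area)
    for _ in range(steps):
        move_lens(step_size)
        value = af_evaluation(capture_frame(), area)
        if value < best:
            move_lens(-step_size)  # stepped past the peak; back off and stop
            break
        best = value
```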
- The two pieces of image data temporarily stored in the memory 48 are read out as appropriate by the digital signal processing unit 24, which performs predetermined signal processing including generation of luminance data and color difference data (YC processing).
- the YC processed image data (YC data) is stored in the memory 48 again. Subsequently, the two pieces of YC data are respectively output to the compression / decompression processing unit 26 and subjected to predetermined compression processing such as JPEG (joint photographic experts group), and then stored in the memory 48 again.
- A multi-picture file (MP file: a file in a format in which a plurality of images are concatenated) is generated from the two pieces of YC data (compressed data) stored in the memory 48, and the MP file is recorded in the memory card 54 via the media controller 52.
- As described above, the stereoscopic imaging device 10 having the configuration shown in FIG. 1 includes the first imaging unit 11-1 and the second imaging unit 11-2 that image a subject, and the CPU 40 as a control unit that controls the first imaging unit 11-1 and the second imaging unit 11-2.
- Both the first imaging unit 11-1 and the second imaging unit 11-2 include an image sensor (the monocular 3D sensors 16-1 and 16-2) having a plurality of pixel groups that photoelectrically convert light beams that have passed through different areas of the exit pupil of the photographing lens 12 (12-1, 12-2).
- The stereoscopic imaging device 10 has a compound-eye stereoscopic imaging function of displaying, on the liquid crystal monitor 30 as a stereoscopic image, a viewpoint image (image information) obtained by the first imaging unit 11-1 together with a viewpoint image (image information) obtained by the second imaging unit 11-2, and a monocular stereoscopic imaging function of displaying, on the liquid crystal monitor 30 as a stereoscopic image, a plurality of viewpoint images (image information) obtained by one imaging unit having a plurality of pixel groups among the first imaging unit 11-1 and the second imaging unit 11-2.
- In the following description, the imaging unit on the right side is referred to as the "first imaging unit" and the imaging unit on the left side as the "second imaging unit".
- FIGS. 8A and 8B are conceptual diagrams showing the relationship between the cross point and the focus during monocular / compound eye stereoscopic imaging.
- the cross point means a point where parallax is zero in a stereoscopic image.
- In compound-eye stereoscopic imaging, the left and right image data can be shifted electronically, so the cross point can be set freely. Therefore, as shown in the example of FIG. 8A, the cross point (object B) and the focal point (object A) often differ.
- the stereoscopic imaging apparatus 10 can perform the following crosspoint control. Such cross-point control is particularly effective when shooting continuously, that is, when shooting moving images and acquiring so-called through images. Note that whether or not to perform crosspoint control may be determined by a user input via the operation unit 38.
- FIG. 7 is a flowchart showing the cross point control at the time of switching from the compound eye stereoscopic imaging function to the monocular stereoscopic imaging function
- FIG. 9 is a conceptual diagram showing the state of the cross point control.
- First, the compound-eye stereoscopic shooting mode (compound-eye 3D mode) is set, and left and right viewpoint images are acquired in S104.
- Here, it is assumed that the left viewpoint image (L in (a) of FIG. 9) is acquired by the left channel of the first imaging unit 11-1 and the right viewpoint image (R in (a) of FIG. 9) is acquired by the right channel of the second imaging unit 11-2.
- The contrast AF process is performed using the left viewpoint image L acquired by the left channel of the first imaging unit 11-1. At this time, the cross point is the object B and the focal point is the object A, as in FIG. 8A.
- In S106, the in-focus area in the left viewpoint image L is detected; this is the region S1 in (b) of FIG. 9, where the contrast is maximum.
- In S108, a region corresponding to the region S1 is detected in the right viewpoint image R; this is the region S2 in (c) of FIG. 9. This corresponding-area detection can be performed by various methods such as a correlation method or template matching.
- Next, the amount by which the right viewpoint image R must be moved so that the cross point becomes the object A (the shift amount ΔX) is calculated, the right viewpoint image R is moved by ΔX (the right viewpoint image R' in (d) of FIG. 9), and a stereoscopic image composed of the left viewpoint image L and the right viewpoint image R' is displayed on the liquid crystal monitor 30. The stereoscopic image may also be recorded on the memory card 54.
- As a result, the cross point of the stereoscopic image obtained by compound-eye stereoscopic shooting becomes the object A, which matches the in-focus area.
- Such processing is performed while the compound eye stereoscopic photography is continued, and the state where the cross point and the in-focus point coincide with each other is maintained until switching from the compound eye stereoscopic photography to the monocular stereoscopic photography.
- Such processing may be performed at predetermined time intervals, for example, 100 msec intervals.
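- A minimal NumPy sketch of this flow, assuming grayscale viewpoint images: the maximum-contrast block stands in for the in-focus region S1, a sum-of-absolute-differences search stands in for the corresponding-region detection of S108, and the returned offset corresponds to the shift amount ΔX. All function names, block sizes, and search ranges are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def max_contrast_block(image, block=64):
    """Return the (row, col) of the block with the largest local variance."""
    h, w = image.shape
    best, best_pos = -1.0, (0, 0)
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            v = image[r:r + block, c:c + block].var()
            if v > best:
                best, best_pos = v, (r, c)
    return best_pos

def match_horizontal(left, right, pos, block=64, search=80):
    """Find the horizontal offset of the corresponding block in the right view
    by minimising the sum of absolute differences (a simple template match)."""
    r, c = pos
    template = left[r:r + block, c:c + block].astype(np.float32)
    best_dx, best_cost = 0, np.inf
    for dx in range(-search, search + 1):
        cc = c + dx
        if cc < 0 or cc + block > right.shape[1]:
            continue
        cost = np.abs(right[r:r + block, cc:cc + block].astype(np.float32) - template).sum()
        if cost < best_cost:
            best_dx, best_cost = dx, cost
    return best_dx

# Shift amount that brings the in-focus region to zero parallax:
#   dx = match_horizontal(left, right, max_contrast_block(left))
#   right_shifted = shift_horizontally(right, -dx)  # helper sketched earlier
```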
- FIG. 10 is a flowchart showing the procedure of the cross-point control at the time of switching from the monocular stereoscopic imaging function to the compound-eye stereoscopic imaging function
- FIG. 11 is a conceptual diagram showing the state of the crosspoint control.
- In monocular stereoscopic shooting, the stereoscopic image is composed of two viewpoint images with small parallax, as shown in (a) of FIG. 11.
- the cross point and the focal point coincide (object A in FIG. 11A).
- the viewpoint image obtained in this way is displayed on the liquid crystal monitor 30 as a stereoscopic image.
- the in-focus area is detected in the image acquired by the first imaging unit 11-1.
- Since the cross point coincides with the focal point in monocular stereoscopic shooting, the detected in-focus area is the cross point, as shown in (b) and (c) of FIG. 11. The in-focus area may be detected in either of the left and right viewpoint images.
- a viewpoint image by the second imaging unit 11-2 is also acquired in S210 ((d) in FIG. 11).
- an area corresponding to the detected in-focus area is detected in the viewpoint image acquired by the second imaging unit 11-2 (S212).
- the corresponding area can be detected by an algorithm such as a correlation method as described above.
- a deviation amount between the viewpoint image acquired by the first imaging unit 11-1 and the viewpoint image acquired by the second imaging unit 11-2 is calculated ((e) in FIG. 11).
- This shift amount is an image shift amount for causing the cross point and the in-focus area to coincide with each other in the stereoscopic image obtained when switching to the compound eye stereoscopic shooting.
- Such processing is performed while monocular stereoscopic photography continues, and the calculation, update, and recording of the deviation amount are continued until switching from monocular stereoscopic photography to compound eye stereoscopic photography.
- Such processing may be performed at predetermined time intervals, for example, 100 msec intervals.
- When the switch to compound-eye stereoscopic shooting is made, the viewpoint image acquired by the second imaging unit 11-2 in S218 is shifted by the above deviation amount ((f) in FIG. 11), and a stereoscopic image is displayed on the liquid crystal monitor 30 together with the viewpoint image acquired by the first imaging unit 11-1 (S220).
- Because the viewpoint image acquired by the second imaging unit 11-2 is shifted by this amount, the cross point of the stereoscopic image obtained when switching to compound-eye stereoscopic shooting coincides with the in-focus area, just as in monocular stereoscopic shooting. Therefore, even when switching from monocular stereoscopic shooting to compound-eye stereoscopic shooting, the viewer does not feel that the cross point has changed suddenly, and an image can be displayed with a natural feeling when the shooting mode is switched.
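- A small stateful sketch of this behaviour, reusing the helpers from the earlier sketches; the class name, the periodic update cadence, and the hook signatures are assumptions for illustration:

```python
class CrossPointKeeper:
    """Keeps the cross point stable across a monocular -> compound-eye switch.

    update() is assumed to be called periodically (e.g. every 100 ms) while the
    monocular stereoscopic function is active; on_switch() is called with the
    first pair of compound-eye frames.
    """

    def __init__(self):
        self.stored_dx = 0

    def update(self, monocular_view, second_unit_view):
        # The in-focus region of the monocular image is already the cross point.
        pos = max_contrast_block(monocular_view)
        self.stored_dx = match_horizontal(monocular_view, second_unit_view, pos)

    def on_switch(self, first_unit_view, second_unit_view):
        # Shift the second unit's view by the last stored amount so the cross
        # point does not jump at the moment of switching.
        return first_unit_view, shift_horizontally(second_unit_view, -self.stored_dx)
```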
- Note that for short-distance shooting the cross-point adjustment function needs to be activated only when switching to long-distance shooting, whereas for long-distance shooting the function must always be kept active (so that the cross point is adjusted continuously), because a switch between monocular and compound-eye stereoscopic shooting can be triggered not only by the subject coming closer but also by a change request from the user.
- First, the user selects whether to switch automatically between monocular and compound-eye stereoscopic shooting (S302); if automatic switching is not selected, the user chooses either monocular or compound-eye stereoscopic shooting directly (S304). If YES in S302, the process proceeds to S306, where the in-focus position is detected; if it is nearer than a predetermined threshold (for example, 70 cm) (YES in S306), monocular stereoscopic shooting is performed (S308). This determination is repeated at a predetermined time interval (for example, 100 msec) (S310).
- If the in-focus position is farther than the threshold, the compound-eye stereoscopic shooting state with the cross-point automatic adjustment function activated is entered in S316. It is then determined at a predetermined time interval (S318) whether to switch to monocular stereoscopic shooting (S320). If in S320 the in-focus distance is shorter than the predetermined threshold, or if a change to monocular stereoscopic shooting is requested by a user instruction via the operation unit 38, the process returns to S308 and monocular stereoscopic shooting is performed. If neither condition is satisfied in S320, the process returns to S316 and the compound-eye stereoscopic shooting state with the cross-point adjustment function kept active is continued.
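- A minimal sketch of this automatic switching loop, with the 70 cm threshold and 100 ms interval taken from the example values above; the firmware hook functions and mode identifiers are assumptions:

```python
import time

NEAR_THRESHOLD_M = 0.7   # example threshold from the description (70 cm)
CHECK_INTERVAL_S = 0.1   # example check interval (100 ms)

def auto_switch_loop(get_focus_distance, user_requested_monocular, set_mode, running):
    """Periodically choose monocular or compound-eye stereoscopic shooting.

    All four callables are hypothetical hooks supplied by the camera firmware.
    """
    while running():
        if get_focus_distance() < NEAR_THRESHOLD_M or user_requested_monocular():
            set_mode("monocular_3d")
        else:
            # Compound-eye shooting with the cross-point adjustment kept active
            # so a later switch back to monocular does not move the cross point.
            set_mode("compound_eye_3d_with_crosspoint_adjustment")
        time.sleep(CHECK_INTERVAL_S)
```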
- both the first imaging unit 11-1 and the second imaging unit 11-2 have a monocular stereoscopic imaging function. Therefore, if left and right viewpoint images are acquired by both of the two imaging units, a total of four viewpoints can be obtained. If the number of viewpoints is large, an improvement in performance can be expected when taking the corresponding points and measuring the amount of parallax.
- In the above example, the first imaging unit 11-1 acquires the left viewpoint image and the second imaging unit 11-2 acquires the right viewpoint image; however, the left viewpoint image may instead be acquired by the second imaging unit 11-2 and the right viewpoint image by the first imaging unit 11-1. In the latter case the parallax obtained is slightly smaller.
- If pixel addition is performed as indicated by the dotted lines in FIG. 15, two viewpoint images with reduced noise can be obtained.
- A stereoscopic image composed and displayed from viewpoint images with small parallax has less presence than a stereoscopic image with large parallax, but it is less tiring to view, and, when displayed on a 3D (three-dimensional) TV, it has the advantage of being seen as a normal 2D (two-dimensional) image rather than a double image by a viewer who is not wearing glasses.
- As described above, in the stereoscopic imaging apparatus 10, the number of viewpoints, the amount of parallax, the amount of noise, and the like differ depending on the combination of the number of imaging units used for shooting, the number of viewpoint images to be acquired, and the presence or absence of pixel addition, so images matching the user's request can be acquired.
- the table shown in FIG. 13 summarizes such shooting modes that can be set by the stereoscopic imaging apparatus 10.
- FIG. 14 is a diagram showing an example of a specific procedure for setting the shooting mode shown in FIG.
- the interface shown in FIG. 14 can be displayed on the liquid crystal monitor 30, and the shooting mode can be set by a user instruction input via the operation unit 38.
- the stereoscopic imaging apparatus 10 first displays the screen shown in FIG. 14A on the liquid crystal monitor 30 and prompts the user to input the number of viewpoints (1, 2, or 4).
- Depending on the input number of viewpoints, the stereoscopic imaging apparatus 10 prompts the user for further input: for one viewpoint, which of the first and second imaging units 11-1 and 11-2 is to be used ((b) of FIG. 14); for two viewpoints, whether the two viewpoint images are to be acquired by monocular stereoscopic shooting or by compound-eye stereoscopic shooting ((c) of FIG. 14).
- the shooting mode [7] or [8] is set depending on whether the first stereoscopic imaging unit 11-1 or the second stereoscopic imaging unit 11-2 is used.
- When compound-eye stereoscopic shooting is selected, the stereoscopic imaging device 10 prompts for the input of the amount of parallax ((d) of FIG. 14), and one of the shooting modes [2] to [4] is set according to the input amount of parallax ((e) of FIG. 14).
- the shooting mode setting is not limited to the above example.
- For example, when one viewpoint is selected, the first imaging unit 11-1 may be chosen unconditionally without asking the user to select between the first and second imaging units, because the imaging unit on the side opposite the shutter button is expected to be less prone to the user's finger covering the lens. A sketch of the selection dialogue follows.
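- An illustrative sketch of this selection dialogue; the returned labels and the association with mode numbers [2]-[8] are assumptions based on the description of FIG. 13 and FIG. 14, since the actual table contents are not reproduced in the text:

```python
def select_shooting_mode(num_viewpoints, unit=None, method=None, parallax=None):
    """Illustrative mode selection mirroring the FIG. 14 dialogue.

    The returned labels are placeholders; the real mode numbers are defined by
    the table of FIG. 13, which is not reproduced here.
    """
    if num_viewpoints == 1:
        return f"2D shooting with imaging unit {unit}"            # e.g. modes [7]/[8]
    if num_viewpoints == 2:
        if method == "monocular":
            return "monocular stereoscopic shooting"              # small parallax
        return f"compound-eye stereoscopic shooting, parallax={parallax}"  # modes [2]-[4]
    if num_viewpoints == 4:
        return "compound-eye shooting using both monocular 3D units (4 viewpoints)"
    raise ValueError("unsupported number of viewpoints")
```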
- FIG. 15 is a conceptual diagram regarding addition of pixel signals.
- the amount of noise can be reduced by adding the pixel signals of the two pixels of the image sensor 16 as indicated by the dotted line in FIG.
- On the other hand, pixel addition may dull the high-frequency signal and lower the resolution. Therefore, during two-viewpoint stereoscopic shooting, the stereoscopic imaging device 10 can automatically select, according to the brightness of the screen, the contrast of the signal, and so on, between "shoot with compound-eye stereoscopic imaging and use the result of adding the pixel signals of two pixels of each imaging unit as the viewpoint image" and "shoot with monocular stereoscopic imaging and do not add pixel signals".
- FIG. 16 is a flowchart showing an example of such an automatic selection process of the stereoscopic shooting mode.
- Starting from S400, it is determined in S402 whether the luminance (Bv value) of the entire screen is equal to or greater than a predetermined threshold. If it is (YES in S402), the process proceeds to S406, where it is determined whether the response of the contrast extraction filter is equal to or greater than a predetermined threshold. If YES in S406, monocular stereoscopic shooting is performed with the shooting mode set to [5] or [6] (S408); if NO, compound-eye stereoscopic shooting is performed with the shooting mode set to [3] (S404).
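- A minimal sketch of this automatic selection; the threshold values are placeholders, and the mode-number comments follow the description above:

```python
def choose_stereo_capture(bv_value, contrast_response,
                          bv_threshold=5.0, contrast_threshold=100.0):
    """Illustrative version of the FIG. 16 selection; thresholds are placeholders."""
    if bv_value >= bv_threshold and contrast_response >= contrast_threshold:
        # Bright, high-contrast scene: keep resolution, no pixel addition
        # (corresponds to shooting mode [5] or [6] in the description).
        return {"capture": "monocular_3d", "pixel_addition": False}
    # Otherwise: add main/sub pixel signals to suppress noise
    # (corresponds to shooting mode [3] in the description).
    return {"capture": "compound_eye_3d", "pixel_addition": True}
```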
- The first embodiment of the present invention has been described above; however, embodiments of the present invention are not limited to such a configuration.
- In the stereoscopic imaging device of the present invention, at least one of the plurality of imaging units may be a monocular stereoscopic imaging unit, and the types of the other imaging units are not particularly limited.
- For example, a normal imaging unit (an imaging unit that is not a monocular stereoscopic imaging unit) may be used, or an imaging device that can separate the left and right incident light may be used.
- FIG. 17 is a block diagram illustrating a main part of the stereoscopic imaging apparatus 10 ′ according to the second embodiment.
- the first imaging unit 11-1 is a monocular stereoscopic imaging unit
- the second imaging unit 11-2 ′ is an imaging unit having a normal sensor 17. Since the configuration other than this is the same as that of the stereoscopic imaging apparatus 10 according to the first embodiment, the same reference numerals as those of the stereoscopic imaging apparatus 10 are used, and detailed description thereof is omitted.
- In the stereoscopic imaging apparatus 10′, as in the stereoscopic imaging apparatus 10 according to the first embodiment, both monocular and compound-eye stereoscopic shooting are possible, and the cross-point control at the time of switching between monocular and compound-eye stereoscopic shooting described above, as well as the automatic activation of the cross-point adjustment function, can be performed.
- the second imaging unit is a normal imaging unit, and thus the shooting modes that can be set are different from those in the stereoscopic imaging device 10 according to the first embodiment.
- the shooting mode can be set according to the number of viewpoints, presence / absence of pixel addition, and the like as in the case of the stereoscopic imaging apparatus 10.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Studio Devices (AREA)
Abstract
In one of its embodiments, the present invention relates to a 3D imaging device (10) comprising first and second imaging units (11-1, 11-2), both of which are monocular 3D imaging units. The 3D imaging device (10) according to the invention further comprises a cross-point control unit (40), and the cross-point control unit controls an image generation unit so that the cross point of the 3D images displayed on an image display unit (30) does not change before or after switching, when switching is performed from imaging with a compound-eye 3D imaging function to a monocular 3D imaging function, and when switching is performed from imaging with the monocular 3D imaging function to the compound-eye 3D imaging function. The technical solution described in the present invention makes it possible to suppress large changes in the protrusion/recession state resulting from changes in the cross point. It also makes it possible to display a natural-looking image when switching between imaging modes.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011187559 | 2011-08-30 | ||
| JP2011-187559 | 2011-08-30 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013031392A1 true WO2013031392A1 (fr) | 2013-03-07 |
Family
ID=47755899
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/067786 Ceased WO2013031392A1 (fr) | 2011-08-30 | 2012-07-12 | Dispositif imageur en 3d |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2013031392A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2015001788A1 (fr) * | 2013-07-05 | 2015-01-08 | 株式会社ニコン | Dispositif imageur |
- 2012-07-12: WO PCT/JP2012/067786 patent/WO2013031392A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002185844A (ja) * | 2001-11-07 | 2002-06-28 | Olympus Optical Co Ltd | カメラシステム |
| JP2005250396A (ja) * | 2004-03-08 | 2005-09-15 | Fuji Photo Film Co Ltd | カメラ付き携帯端末 |
| JP2010154310A (ja) * | 2008-12-25 | 2010-07-08 | Fujifilm Corp | 複眼カメラ及び撮影方法 |
| WO2011024423A1 (fr) * | 2009-08-28 | 2011-03-03 | パナソニック株式会社 | Dispositif de commande d'affichage d'image stéréoscopique et dispositif d'imagerie pour images stéréoscopiques |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2015001788A1 (fr) * | 2013-07-05 | 2015-01-08 | 株式会社ニコン | Dispositif imageur |
| CN105359519A (zh) * | 2013-07-05 | 2016-02-24 | 株式会社尼康 | 摄像装置 |
| JPWO2015001788A1 (ja) * | 2013-07-05 | 2017-02-23 | 株式会社ニコン | 撮像装置 |
| CN105359519B (zh) * | 2013-07-05 | 2017-07-04 | 株式会社尼康 | 摄像装置 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5595499B2 (ja) | 単眼立体撮像装置 | |
| JP5425554B2 (ja) | 立体撮像装置及び立体撮像方法 | |
| JP5722975B2 (ja) | 撮像装置、撮像装置用シェーディング補正方法及び撮像装置用プログラム | |
| JP5788518B2 (ja) | 単眼立体撮影装置、撮影方法及びプログラム | |
| JP5269252B2 (ja) | 単眼立体撮像装置 | |
| JP5469258B2 (ja) | 撮像装置および撮像方法 | |
| US20110234767A1 (en) | Stereoscopic imaging apparatus | |
| JP2011205374A (ja) | 表示装置 | |
| JP5871989B2 (ja) | 撮影装置、撮影方法及びプログラム | |
| JP2011199755A (ja) | 撮像装置 | |
| JP2011259168A (ja) | 立体パノラマ画像撮影装置 | |
| JP5160460B2 (ja) | 立体撮像装置および立体撮像方法 | |
| JP5449551B2 (ja) | 画像出力装置、方法およびプログラム | |
| US9077979B2 (en) | Stereoscopic image capture device and method | |
| JP2010237582A (ja) | 立体撮像装置および立体撮像方法 | |
| JP5580486B2 (ja) | 画像出力装置、方法およびプログラム | |
| WO2012043003A1 (fr) | Dispositif d'affichage d'image en trois dimensions et procédé d'affichage d'image en trois dimensions | |
| JP2010200024A (ja) | 立体画像表示装置および立体画像表示方法 | |
| JP2012124650A (ja) | 撮像装置および撮像方法 | |
| JP5351298B2 (ja) | 複眼撮像装置 | |
| WO2013031392A1 (fr) | Dispositif imageur en 3d | |
| JP2011077680A (ja) | 立体撮影装置および撮影制御方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12827546 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 12827546 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |