WO2013069279A1 - Image capture device - Google Patents
Image capture device
- Publication number
- WO2013069279A1 (PCT/JP2012/007149)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- focus
- unit
- digital
- focus position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
Definitions
- the present disclosure relates to an imaging apparatus that calculates depth information from a captured image.
- Patent Document 1 discloses a technique for generating a parallax image based on an input image and depth information.
- Depth from Focus (hereinafter referred to as "DFF") is a technique that calculates depth information by comparing the degree of focus among images captured at different in-focus positions.
- Patent Document 2 calculates depth information in an image by mathematically analyzing the degree of blur of the image, using a technique called Depth from Defocus (hereinafter referred to as "DFD").
- Patent Document 2 discloses a method of changing the focus position based on conditions such as the subject being close to the camera and the captured image being a distant view.
- the method described in Patent Document 2 is effective in ensuring the accuracy of subject distance calculation near the subject, because an image focused near the assumed subject position is captured.
- both DFF and DFD rely on the correlation between depth and the degree of blur. Therefore, depth information cannot be calculated accurately from blur information in situations where the degree of blur does not change even though the depth differs. If depth information is calculated in such a situation, an inappropriate value is obtained, which may adversely affect subsequent processing.
- the present disclosure has been made in view of the above, and an object of the present disclosure is to provide an imaging device that solves the above problem by shooting while excluding focus positions at which it is difficult to calculate the subject distance.
- an imaging apparatus according to the present disclosure includes: an imaging unit that images a subject and acquires a digital image; an optical system that forms an optical image of the subject at a predetermined position on the imaging unit; a setting unit that sets an allowable range for the focus position of a captured image; a focus adjustment unit that drives the optical system to adjust the focus position of the captured image within the allowable range set by the setting unit; and an information generation unit that generates information about the distance from the imaging apparatus to the subject based on a plurality of digital images of the same scene acquired by the imaging unit at different in-focus positions. The setting unit sets the allowable range of the focus position of a digital image to be captured later, among the plurality of digital images, based on the in-focus range of the digital image captured first.
- with this imaging apparatus, without using additional hardware such as a rangefinder, it is possible to prevent images from being captured in situations where it is difficult to calculate depth information, and to improve the accuracy of the depth information. In addition, post-processing based on incorrect depth information can be prevented.
- Schematic diagram showing the configuration of the digital camera according to the first embodiment of the present disclosure
- Schematic diagram showing the relationship between focus position (distance) and degree of focus
- Flowchart showing the flow of the depth information calculation processing
- Schematic diagram showing the relationship between the in-focus position (distance) and the degree of focus used for calculating depth information
- Flowchart showing an example of the process by which the controller determines whether to calculate depth information
- Flowchart showing a modification of the process by which the controller determines whether to calculate depth information
- Schematic diagram showing the relationship between the size of a subject in real space and the size of its image
- Flowchart showing a further modification of the process by which the controller determines whether to calculate depth information
- Flowchart showing yet another modification of the process by which the controller determines whether to calculate depth information
- FIG. 2A schematically shows the principle of the DFF described above.
- DFF is realized by digital image processing of the captured images.
- the degree of focus is compared for each pixel between the images, and the depth information is calculated based on the in-focus position assigned to the image with a higher degree of focus.
- for a pixel whose degree of focus is higher in the image I1, the depth information is set to d1 (the in-focus position of I1).
- for a pixel whose degree of focus is higher in the image I2, the depth information is set to d2 (the in-focus position of I2).
- the camera has a depth of field.
- the depth of field is the range in front of and behind the focused subject within which a sharp image can be captured. Because such a depth of field exists, the degree of focus takes a substantially maximum value throughout the depth-of-field range, as shown in FIG. 2B. For this reason, in FIG. 2B, within the range where the degree of focus does not differ between the two images (the range where the degree of focus is near its maximum in both images), it cannot be determined which image has the higher degree of focus.
- the present inventors found that, as a result, depth information cannot be uniquely determined in such a range. Specifically, this problem occurs when the in-focus positions of the two images are extremely close, for example when shooting a scene in which a background wall lies just behind a person, and the accuracy of the calculated depth information is reduced.
- FIG. 1 is a block diagram showing the configuration of the digital camera 1 according to the first embodiment.
- the digital camera 1 includes an optical system 110, a zoom motor 120, an OIS actuator 130, a focus motor 140, a CCD image sensor 150, an image processing unit 160, a memory 200, a controller 210, a gyro sensor 220, a card slot 230, a memory card 240, an operation member 250, a zoom lever 260, a liquid crystal monitor 270, an internal memory 280, and a mode setting button 290.
- the optical system 110 includes a zoom lens 111, an OIS 112, and a focus lens 113.
- the zoom lens 111 can enlarge or reduce the subject image by moving along the optical axis of the optical system.
- the zoom lens 111 is controlled by a zoom motor 120.
- the OIS 112 has a correction lens that can move in a plane perpendicular to the optical axis.
- the OIS 112 reduces the blur of the subject image by driving the correction lens in a direction that cancels the blur of the digital camera 1.
- the correction lens can move within the OIS 112 by a maximum of L from the center.
- the OIS 112 is controlled by the OIS actuator 130.
- the focus lens 113 adjusts the focus of the subject image by moving along the optical axis of the optical system.
- the focus lens 113 is driven by a focus motor 140.
- a focus motor 140 that drives the focus lens 113 adjusts the focus position of the captured image under the instruction of the focus adjustment unit 213 of the controller 210.
- the zoom motor 120 drives and controls the zoom lens 111.
- the zoom motor 120 may be realized by a pulse motor, a DC motor, a linear motor, a servo motor, or the like.
- the zoom motor 120 may drive the zoom lens 111 via a mechanism such as a cam mechanism or a ball screw.
- the OIS actuator 130 drives and controls the correction lens in the OIS 112 in a plane perpendicular to the optical axis.
- the OIS actuator 130 can be realized by a planar coil or an ultrasonic motor.
- the focus motor 140 drives and controls the focus lens 113.
- the focus motor 140 may be realized by a pulse motor, a DC motor, a linear motor, a servo motor, or the like.
- the focus motor 140 may drive the focus lens 113 via a mechanism such as a cam mechanism or a ball screw.
- the CCD image sensor 150 captures a subject image formed by the optical system 110 and generates an image signal.
- the CCD image sensor 150 performs various operations such as exposure, transfer, and electronic shutter.
- the image processing unit 160 performs various processes on the image signal generated by the CCD image sensor 150.
- the image processing unit 160 processes the image signal and generates image data to be displayed on the liquid crystal monitor 270 (hereinafter, an image displayed based on this image data is referred to as a "review image"), or generates an image signal to be stored in the memory card 240.
- the image processing unit 160 performs various image processing such as gamma correction, white balance correction, and flaw correction on the image signal.
- the image processing unit 160 compresses the processed image signal using a compression format conforming to the JPEG standard or the like.
- the image processing unit 160 includes a depth information calculation unit 162 that calculates depth information of the shooting scene from images shot at different in-focus positions, a focus determination unit 163 that determines whether or not a shot image is in focus, and a face detection unit 164 that detects a human face image in the shot image. A specific procedure for calculating the depth information will be described later.
- the image processing unit 160 also performs various image processing, such as parallax image generation and blurring processing, according to the depth information calculated by the depth information calculation unit 162.
- the image processing unit 160 can be realized by a DSP or a microcomputer.
- the resolution of the review image may be set to the screen resolution of the liquid crystal monitor 270, or may be set to the resolution of image data that is compressed and formed by a compression format or the like conforming to the JPEG standard.
- the memory 200 functions as a work memory for the image processing unit 160 and the controller 210.
- the memory 200 temporarily stores an image signal processed by the image processing unit 160 or image data input from the CCD image sensor 150 before being processed by the image processing unit 160.
- the memory 200 temporarily stores shooting conditions of the optical system 110 and the CCD image sensor 150 at the time of shooting.
- the shooting conditions indicate subject distance, field angle information, ISO sensitivity, shutter speed, EV value, F value, distance between lenses, shooting time, OIS shift amount, and the like.
- the memory 200 can be realized by, for example, a DRAM or a ferroelectric memory.
- the controller 210 is a control means for controlling the entire operation of the digital camera 1.
- the controller 210 can be realized by a semiconductor element or the like.
- the controller 210 may be configured only by hardware, or may be realized by combining hardware and software.
- the controller 210 can be realized by a microcomputer or the like.
- the controller 210 includes a setting unit 212 that sets an allowable range of a focus position of a captured image, and a focus adjustment unit 213 that operates the focus motor 140.
- the setting unit 212 sets the allowable range of the focus position of a digital image to be captured later, among a plurality of digital images captured while changing the focus position, based on the in-focus range of the digital image captured first.
- the focus adjustment unit 213 operates the focus motor 140 in accordance with the allowable range set by the setting unit 212.
- the focus adjustment unit 213 performs a focus operation using a contrast AF method that searches the in-focus position of the captured image by driving the focus lens 113 based on the contrast of the captured image.
- the focus method may instead be a phase difference detection AF method, in which light entering from the optical system is split in two and guided to a dedicated sensor, and the direction and amount of defocus are determined from the interval between the two images formed on the sensor.
- the gyro sensor 220 is composed of a vibration material such as a piezoelectric element.
- the gyro sensor 220 vibrates a vibration material such as a piezoelectric element at a constant frequency, converts a force generated by the Coriolis force into a voltage, and obtains angular velocity information.
- the gyro sensor 220 may be any device that can measure at least the angular velocity information of the pitch angle. Further, when the gyro sensor 220 can measure the angular velocity information of the roll angle, it is possible to consider the rotation when the digital camera 1 moves in a substantially horizontal direction.
- the memory card 240 can be inserted into the card slot 230.
- the card slot 230 can be mechanically and electrically connected to the memory card 240.
- the memory card 240 includes a flash memory, a ferroelectric memory, and the like, and can store data.
- the operation member 250 includes a release button.
- the release button receives a user's pressing operation. When the release button is pressed halfway, AF control and AE control are started via the controller 210. When the release button is fully pressed, the subject is photographed.
- the zoom lever 260 is a member that receives a zoom magnification change instruction from the user.
- the liquid crystal monitor 270 is a display device that can display an image signal generated by the CCD image sensor 150 or an image signal read from the memory card 240.
- the liquid crystal monitor 270 can display various setting information of the digital camera 1.
- the liquid crystal monitor 270 can display EV values, F values, shutter speeds, ISO sensitivity, and the like, which are shooting conditions at the time of shooting.
- the internal memory 280 is configured by a flash memory, a ferroelectric memory, or the like.
- the internal memory 280 stores a control program for controlling the entire digital camera 1 and the like.
- the mode setting button 290 is a button for setting a shooting mode when shooting with the digital camera 1.
- the "shooting mode" indicates the shooting scene assumed by the user, for example, 2D shooting modes including (1) portrait mode, (2) child mode, (3) pet mode, (4) macro mode, and (5) landscape mode, as well as (6) a 3D shooting mode. Note that a 3D shooting mode may be provided for each of (1) to (5).
- the digital camera 1 performs shooting by setting appropriate shooting parameters based on this shooting mode. Note that a camera automatic setting mode in which the digital camera 1 automatically sets shooting parameters may be included.
- the mode setting button 290 also serves as a button for setting a reproduction mode of an image signal recorded on the memory card 240.
- the distance measuring unit 300 measures the in-focus position at that time.
- as a method for measuring the in-focus position, for example, a method of reading the position of the focus lens 113 or a method of obtaining the in-focus position from the driving amount at the time the focus lens 113 comes into focus can be used.
- FIG. 3 is a flowchart showing a procedure for calculating depth information by DFF.
- a plurality of images I1 and I2 having different in-focus positions are acquired by the optical system 110 and the CCD image sensor 150 (step S101).
- in the following, a case where two images are used will be described, but three or more images may be used. How the focus position of each image is selected will be described later.
- next, a degree of focus indicating how well the image is focused is calculated (step S102).
- the degree of focusing is defined for each pixel or a block of a plurality of pixels.
- as a specific method for calculating the degree of focus, for example, as disclosed in Non-Patent Document 1, the entropy or variance of the pixel values within a certain range centered on each pixel can be used.
- the degree of focus can be calculated using other commonly used indicators.
- the degree of focus is then compared between corresponding pixels or blocks of the two images, and a depth is assigned (step S103). Specifically, the in-focus position of the image whose degree of focus is higher is assigned as the depth information of that pixel or block. Likewise, when three or more images are used, the in-focus position of the image with the maximum degree of focus is assigned as the depth information.
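- the per-pixel comparison in steps S102 and S103 can be illustrated with a short sketch. This is not the patent's implementation: it assumes a local-variance focus measure (one of the options mentioned above) and two grayscale captures of the same scene supplied as NumPy arrays, together with their in-focus positions d1 and d2.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def focus_measure(image, window=7):
    """Degree of focus per pixel: variance of the pixel values in a window
    centered on each pixel (one of the measures the text mentions; entropy
    would be another option)."""
    img = image.astype(np.float64)
    mean = uniform_filter(img, size=window)
    mean_sq = uniform_filter(img * img, size=window)
    return mean_sq - mean * mean  # local variance: E[x^2] - E[x]^2

def dff_depth_map(images, focus_positions, window=7):
    """Assign to each pixel the in-focus position of the image in which its
    degree of focus is largest (steps S102-S103)."""
    measures = np.stack([focus_measure(im, window) for im in images])
    best = np.argmax(measures, axis=0)  # index of the sharpest image per pixel
    return np.asarray(focus_positions, dtype=np.float64)[best]

# usage with two hypothetical captures I1 (focused at d1) and I2 (focused at d2):
# depth = dff_depth_map([I1, I2], focus_positions=[d1, d2])
```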
- for this comparison to work, the highly focused ranges resulting from the depth of field of each image must not overlap; the focus positions are set so that, when the degrees of focus of corresponding pixels or blocks in the two captured images are compared, it can be determined which image has the higher degree of focus.
- the setting unit 212 sets the allowable range of the focus position of the digital image to be captured later, among the plurality of digital images, so that the depth-of-field range of the digital image captured first does not overlap the depth-of-field range of the digital image captured later.
- in order to prevent a reduction in the accuracy of the depth information caused by comparing blurred regions, the setting unit 212 sets the allowable range of the in-focus position of the digital image to be captured later so that the depth-of-field range of the digital image captured first and the depth-of-field range of the digital image captured later are adjacent to each other.
- as shown in FIG. 2C, no region is then left in which the degree of focus is low in both of the two images.
- hyperfocal distance H is obtained by the following [Equation 1].
- f represents the focal length of the optical system 110
- F represents the F number of the optical system 110
- C represents the allowable circle of confusion.
- the near end DN and the far end DF of the depth of field when focusing on an arbitrary position d are expressed by the following [Expression 2] using the hyperfocal distance H.
- in this way, the depth-of-field range with a high degree of focus does not overlap with the depth-of-field range of the first image.
- the in-focus positions can be determined in the same manner even when three or more images are used.
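- [Equation 1] through Equation [3] are not reproduced in this text, but the quantities they relate are standard depth-of-field expressions. The sketch below assumes the usual approximations H = f²/(F·C), DN = H·d/(H + d) and DF = H·d/(H − d); whether the patent uses exactly these forms is not confirmed here. Under these assumptions, if the image captured first is focused at infinity (sharp from H outward), the image captured later must be focused no farther than H/2 so that its far depth-of-field limit does not pass H.

```python
def hyperfocal_distance(f, F, c):
    """Hyperfocal distance H in the same length unit as f and c
    (standard approximation, assumed to correspond to [Equation 1])."""
    return f * f / (F * c)

def depth_of_field(d, H):
    """Near limit DN and far limit DF of the depth of field when focused at
    distance d, given H (assumed to correspond to [Expression 2])."""
    near = H * d / (H + d)
    far = H * d / (H - d) if d < H else float("inf")
    return near, far

def max_later_focus(H):
    """If the first image is focused at infinity (sharp from H outward),
    focusing the later image at H/2 makes its far limit exactly H, so the
    two depth-of-field ranges are adjacent without overlapping."""
    return H / 2.0

# example with assumed values: f = 25 mm, F = 2.8, circle of confusion c = 0.015 mm
H = hyperfocal_distance(25.0, 2.8, 0.015)   # about 14900 mm, i.e. roughly 15 m
d1 = max_later_focus(H)                     # about 7.4 m: allowable focus positions are at or nearer than this
print(H, d1, depth_of_field(d1, H))         # the far limit at d1 comes out equal to H
```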
- in this example, the focus adjustment unit 213 uses a contrast AF method that drives the focus lens 113 based on the contrast of the optical image and searches for the in-focus position of the captured image.
- FIG. 5 is a flowchart showing the flow of processing in which the controller 210 controls the operation of the image processing unit 160.
- the second image is photographed at a focus position at infinity, and then the first image is focused.
- the digital camera 1 captures the second image at a focus position of infinity (step S200). That is, the focus adjustment unit 213 drives the focus lens 113 and captures the second image in a state where the focus lens 113 is focused at a position farther than the above-described hyperfocal distance.
- the focus adjustment unit 213 then checks the contrast of the captured image while driving the focus lens 113, thereby searching for an in-focus position (step S201).
- the range in which the in-focus position may be searched is limited to positions nearer than d1, expressed by Equation [3].
- in step S202, it is determined whether or not an in-focus position exists within the permitted search range. Whether the subject is in focus is determined by the focus determination unit 163 using a known method; here, as described above, the contrast of the captured image is derived, and the focus lens 113 position at which the contrast is highest is determined to be in focus.
- if it is determined in step S202 that focus has been achieved, the first image is captured, triggered by the release button of the operation member 250 being fully pressed (step S203). On the other hand, if it is not determined in step S202 that the subject is in focus, for example because there is no subject in the near region, the process ends.
- depth information is calculated based on the first image and the second image (step S204). Specifically, the degree of focus is derived for each pixel of both images, the degrees of focus of the two images are compared pixel by pixel, and depth information is set for each pixel of the first and second images.
- in this way, capturing the first image at a focus position whose depth-of-field range overlaps that of the image focused at infinity, that is, a focus position inappropriate for calculating depth information, can be prevented. The accuracy of the resulting depth information is thereby improved.
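- the control flow of steps S200 to S204 can be summarized as in the sketch below. It is only a schematic: capture_at() and contrast_af_search() are hypothetical stand-ins for the focus adjustment unit 213 and the contrast AF search, not actual APIs of the device, and dff_depth_map() is the routine sketched earlier.

```python
def capture_depth_pair(camera, H):
    """Sketch of Example 1: shoot the second image focused at infinity first,
    then allow the first image only in the range where its depth of field
    cannot overlap that of the second image (nearer than H/2)."""
    second = camera.capture_at(float("inf"))                   # step S200
    d_limit = H / 2.0                                          # allowable range (assumed, cf. [Equation 10])
    d_first = camera.contrast_af_search(max_distance=d_limit)  # step S201: search only within the allowed range
    if d_first is None:                                        # step S202: nothing in focus within the range
        return None                                            # give up rather than produce unreliable depth
    first = camera.capture_at(d_first)                         # step S203: triggered by a full press of the release button
    return dff_depth_map([first, second],                      # step S204: per-pixel DFF comparison
                         focus_positions=[d_first, float("inf")])
```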
- the operation of capturing the second image first and then capturing the first image has been described.
- the present embodiment is not limited to the above, and an operation of capturing the first image first and then capturing the second image may be performed.
- in that case, the range for searching for focus in step S201 is limited to positions farther than d2, expressed by Equation [4].
- in the above description, when step S202 does not determine that the subject is in focus, the process is terminated as it is. However, instead, the shooting conditions may be changed so that a focus position is obtained at which the depth-of-field range does not overlap the in-focus range at infinity, and the processing from step S201 may then be repeated.
- for example, the shooting conditions are changed by adjusting the aperture or the zoom so that the in-focus range and the depth-of-field range no longer overlap and focus can be achieved.
- the aperture and zoom adjustments may be performed by the user according to an appropriate guidance message or the like, or may be performed automatically by the controller 210.
- FIG. 6 is a flowchart illustrating a flow of a modification of the process in which the controller 210 controls the operation of the image processing unit 160.
- in this modification, the face detection unit 164 detects a human face image in the first image, and the result is used to determine whether or not to calculate depth information.
- steps S300, S301, S302, S303, and S305 of FIG. 6 are the same as steps S200, S201, S202, S203, and S204 of Example 1, respectively, and thus description thereof is omitted.
- in step S304, the face detection unit 164 detects a human face image in the first image, and if a face image larger than a predetermined size is detected, the processing for calculating depth information is executed.
- the threshold value for the size of the face image is determined as follows. As shown in FIG. 7, an object having a size e at a distance d from the principal point of the optical system forms an image having a size e′ at a position d′ from the principal point. Since the object and its image are similar triangles, Formula [6] holds.
- the interocular distance e′ on the image is obtained by using, for example, a typical human interocular distance of approximately 60 mm to 65 mm as e. The value e′ obtained from Equation [9] is an image size, and can be converted into a number of pixels according to the size of one pixel of the CCD image sensor 150.
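- a minimal sketch of this conversion, assuming the thin-lens relation d′ = f·d/(d − f) so that e′ = e·f/(d − f); Formulas [6] to [9] are not reproduced in the text, and the function names and numeric values below are illustrative only.

```python
def image_size_of_object(e, d, f):
    """Image size e' of an object of size e at subject distance d for focal
    length f. Similar triangles give e' = e * d'/d, and the thin-lens relation
    d' = f*d/(d - f) reduces this to e' = e*f/(d - f)
    (assumed to correspond to Formulas [6]-[9])."""
    return e * f / (d - f)

def face_pixel_threshold(max_distance_mm, f_mm=25.0, eye_gap_mm=62.0, pixel_pitch_mm=0.0015):
    """Minimum interocular distance, in pixels, that a detected face must show
    for the subject to be considered nearer than max_distance_mm.
    f_mm, eye_gap_mm and pixel_pitch_mm are illustrative values only."""
    e_prime_mm = image_size_of_object(eye_gap_mm, max_distance_mm, f_mm)
    return e_prime_mm / pixel_pitch_mm  # convert the image size to a pixel count

# e.g. require the person to be within about 2 m:
# threshold_px = face_pixel_threshold(2000.0)  # roughly 520 pixels with the assumed values
```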
- in this way, an index that can be obtained relatively stably, such as the size of a person's face image, is added to the determination of whether or not the subject is in focus.
- focusing by autofocus may focus on subjects other than the main subject depending on the algorithm.
- when a human face image, which is an important subject, is detected to exceed a predetermined size, it can be considered that the autofocus has focused on the correct subject. If the face image is larger than the predetermined size, the person who is the subject can be considered to be at a close distance from the digital camera 1.
- in this example, depth information is calculated when focus is achieved and a face of a predetermined size is detected.
- this may be changed so that depth information is calculated when focus is achieved or a face of a predetermined size is detected.
- in this example as well, the focus adjustment unit 213 uses a contrast AF method in which the focus lens 113 is driven based on the contrast of the optical image to search for the in-focus position of the captured image.
- in the previous examples, the range searched by the contrast AF method was limited to positions nearer than d1, expressed by Formula [5]. In this example, however, a mode will be described in which no such search range is provided and the in-focus position is instead fixed at d1.
- FIG. 8 is a flowchart showing a flow of processing for controlling the operation of the image processing unit 160 by the controller 210.
- steps S400, S403, and S404 in FIG. 8 are the same as steps S200, S203, and S204 in Example 1, respectively, and a description thereof is omitted.
- in step S401, focusing is performed in order to capture the first image at the predetermined focus position d1. At this time, the in-focus position is not searched for by driving the focus lens 113; instead, it is fixed at d1, expressed by Formula [5].
- next, it is determined from the contrast value of the captured image whether or not the subject is in focus at the focus position d1 (step S402). When it is determined that the subject is in focus at d1, the process advances to the capture of the first image (step S403). On the other hand, if it is determined that nothing is in focus at d1, for example because there is no subject near the focus position d1, the process ends.
- in this way, capturing the first image at an in-focus position whose depth-of-field range overlaps the in-focus range of the second image, that is, an in-focus position inappropriate for calculating depth information, can be suppressed. The accuracy of the resulting depth information is thereby improved.
- FIG. 9 is a flowchart showing the flow of processing for controlling the operation of the image processing unit 160 by the controller 210.
- the focus position of the second image is set to infinity and the first image is focused.
- the digital camera 1 captures the second image at a focus position of infinity (step S500). That is, the focus adjustment unit 213 drives the focus lens 113 and captures the second image in a state where the focus lens 113 is focused at a position farther than the above-described hyperfocal distance.
- the controller 210 then splits the light entering from the optical system 110 in two, guides it to a dedicated sensor, and derives the in-focus position of the first image (the direction and amount by which the focus lens 113 must move) by phase difference detection, which analyzes the interval between the two images formed on the sensor (step S501).
- the controller 210 determines whether or not the in-focus position of the first image derived in step S501 is within an allowable range (S502).
- the allowable range is, for example, the range nearer than d1 expressed by [Expression 4].
- if the in-focus position is within the allowable range, the process proceeds to step S503; otherwise, the processing ends.
- if it is determined in step S502 that the in-focus position of the first image is within the allowable range, the focus adjustment unit 213 moves the focus lens 113 based on the result derived in step S501 to adjust the focus position, and the first image is captured (step S503).
- depth information is calculated based on the first image and the second image (step S504).
- specifically, the degree of focus is derived for each pixel of both images, the degrees of focus are compared pixel by pixel, and depth information is set for each pixel of the first and second images.
- in this way as well, capturing the first image at an in-focus position whose depth-of-field range overlaps the in-focus range of the second image, that is, an in-focus position inappropriate for calculating depth information, can be suppressed, and the accuracy of the resulting depth information is improved.
- in the above description, if it is determined in step S502 that the in-focus position of the first image is outside the allowable range, the process is terminated as it is. However, instead of ending the process, the shooting conditions may be changed so that a focus position is obtained at which the depth-of-field range does not overlap the in-focus range at infinity, and the processing from step S501 may then be repeated. The same applies to Examples 1 and 2.
- the digital camera 1 according to the first embodiment also functions effectively when the focus position of the second image is so-called overinf, that is, when the second image is focused beyond infinity.
- the focus lens 113 is driven by the focus motor 140 and moves along the optical axis of the optical system 110 to adjust the focus of the subject image. In doing so, the focus lens 113 may move from a focus position corresponding to a subject at a finite distance, through the focus position corresponding to a subject at infinity, to a position beyond the focus position corresponding to infinity. This lens state is called overinf.
- FIG. 10 is a schematic diagram showing the relationship between the focus position (distance) and the focus degree when the image I2 is in the overinf state.
- also in this case, the allowable range of the in-focus position of the image I1 to be captured later is set so that the depth-of-field range of the image I2 captured first and the depth-of-field range of the image I1 are adjacent to each other. As a result, as shown in FIG. 2C, no region is left in which the degree of focus is low in both of the two images.
- the in-focus position on which the depth information is based is therefore always determined in a state where the degree of focus of at least one of the images is high. As a result, comparisons in regions where the accuracy of the degree of focus itself is low are avoided, and more accurate depth information is obtained.
- the digital image to which the generated depth information is assigned, or on which various post-processing is performed based on the generated depth information, may be a digital image different from the plurality of digital images used to obtain the depth information. That is, substantially the same scene may be shot separately from the plurality of digital images used to obtain the depth information, and depth information may be assigned to that image or post-processing may be performed on it using the depth information.
- as described above, the digital camera 1 includes the CCD image sensor 150 (an example of an imaging unit) that captures a subject and acquires a digital image, the optical system 110 that forms an optical image of the subject at a predetermined position on the CCD image sensor 150, the setting unit 212 that sets an allowable range for the focus position of a captured image, the focus adjustment unit 213 that drives the optical system 110 to adjust the focus position of the captured image within the allowable range set by the setting unit 212, and the depth information calculation unit 162 (an example of an information generation unit) that generates information about the distance from the digital camera 1 to the subject based on a plurality of digital images of the same scene acquired at different in-focus positions. The setting unit 212 sets the allowable range of the focus position of a digital image to be captured later, among the plurality of digital images, based on the in-focus range of the digital image captured first.
- the setting unit 212 may set the allowable range of the focus position of the digital image to be captured later, among the plurality of digital images, so that the depth-of-field range of the digital image captured first and the depth-of-field range of the digital image captured later do not overlap.
- the setting unit 212 may also set the allowable range of the focus position of the digital image to be captured later so that the depth-of-field range of the digital image captured first and the depth-of-field range of the digital image captured later are adjacent to each other.
- when the in-focus range of the second image (the digital image captured first) is infinity, the setting unit 212 may set the allowable range of the in-focus position of the first image (the digital image to be captured later) closer to the digital camera 1 than the position d given by the following [Equation 10].
- f is the focal length of the imaging unit
- F is the F number of the imaging unit
- C is the allowable circle of confusion.
- as a result, the in-focus range of the second image (the digital image captured first) and the depth-of-field range of the first image (the digital image to be captured later) do not overlap.
- the focus adjustment unit 213 may search the in-focus position of the captured image by driving the optical system 110 based on the imaging state of the optical image on the CCD image sensor 150.
- the optical system 110 may be driven based on the phase difference detection method.
- the present disclosure can be applied to the digital camera 1 regardless of the type of the autofocus method employed by the focus adjustment unit 213.
- the digital camera 1 may further include a focus determination unit 163 that determines whether or not a digital image acquired by the CCD image sensor 150 is in focus, and when the focus determination unit 163 determines that the digital image captured later is in focus, the depth information calculation unit 162 may generate information about the distance from the digital camera 1 to the subject based on the plurality of digital images.
- in this way, depth information can be obtained with high accuracy, using the high degree of focus in at least a part of the first image (the digital image captured later).
- the digital camera 1 may further include a face detection unit 164 that detects a human face image in a digital image acquired by the CCD image sensor 150, and when the face detection unit 164 detects a face image larger than a predetermined size in the first image (the digital image captured later), the depth information calculation unit 162 may generate information about the distance from the digital camera 1 (an example of an imaging device) to the subject based on the plurality of digital images.
- in this way, using the relatively stable index of the face image size in the digital image, it can be ensured that the first image (the digital image captured later) is captured with the subject at a sufficiently short distance.
- this makes it easier to keep the in-focus positions of the second image (the digital image captured first) and the first image (the digital image captured later) well separated. Therefore, the accuracy of the information about the distance from the digital camera 1 to the subject, that is, the depth information, is improved.
- the first embodiment has been described as an example of the technique disclosed in the present application.
- the technology in the present disclosure is not limited to this, and can also be applied to an embodiment in which changes, replacements, additions, omissions, and the like are appropriately performed.
- each block may be individually made into one chip by a semiconductor device such as an LSI, or may be made into one chip so as to include a part or the whole.
- although referred to here as an LSI, it may also be called an IC, a system LSI, a super LSI, or an ultra LSI depending on the degree of integration.
- the method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
- an FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacturing, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- each process of the above embodiment may be realized by hardware or software. Further, it may be realized by mixed processing of software and hardware. Needless to say, when the digital camera according to the above-described embodiment is realized by hardware, it is necessary to adjust timing for performing each process. In the above embodiment, for convenience of explanation, details of timing adjustment of various signals generated in actual hardware design are omitted.
- the execution order of the processing method in the above embodiment is not necessarily limited to the order described, and the execution order can be changed without departing from the gist of the invention.
- steps S302 and S303 may be interchanged and the face image detection process may be performed first.
- alternatively, instead of capturing the second image after making the determination on the first image, the second image may be captured after the first image is captured, and the determination may then be made.
- in the above examples, the focus position of the second image is set to infinity, but it may be set to an arbitrary position nearer than infinity. In that case as well, the in-focus range of the first image is determined based on Formula [3].
- although the examples of the operation of the controller 210 have been described using two images, three or more images may be used. In that case, the in-focus range of each image is determined by applying Formula [3] in turn. The more images are used, the finer the depth information that can be obtained.
- the depth of field needs to be narrowed so that the depth-of-field ranges in each image do not overlap.
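- as an illustration of applying the adjacency rule in order as described above, the sketch below plans focus positions for a stack of images so that successive depth-of-field ranges touch without overlapping, under the same standard formulas assumed earlier; starting from infinity, the positions come out as H/2, H/4, H/6, and so on. This is an illustration of the idea, not the patent's Formula [3].

```python
def focus_positions_for_stack(H, n_images):
    """Plan focus positions (far to near, infinity first) so that the
    depth-of-field ranges of successive images are adjacent but do not
    overlap, using the standard DoF formulas assumed earlier."""
    positions = [float("inf")]
    near_limit = H                                # near DoF limit of the infinity-focused image
    for _ in range(n_images - 1):
        d = H * near_limit / (H + near_limit)     # focus so that the far limit equals near_limit
        positions.append(d)
        near_limit = H * d / (H + d)              # near limit of the image just planned
    return positions

# with H of about 15 m: [inf, ~7.4 m, ~3.7 m, ~2.5 m, ...]
```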
- a compact camera has a large depth of field, but a single-lens reflex camera can set a small depth of field.
- with the imaging apparatus of the present disclosure, shooting under conditions suitable for calculating depth information is ensured, so that depth information is prevented from being calculated from an image shot at an inappropriate focus position.
- the present disclosure can be applied to all imaging devices including digital cameras (including single-lens reflex cameras and compact cameras).
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automatic Focus Adjustment (AREA)
- Studio Devices (AREA)
- Focusing (AREA)
Abstract
The present invention relates to an image capture device comprising: an image capture unit for capturing an image of a subject and obtaining a digital image thereof; an optical system for forming an optical image of the subject at a predetermined position on the image capture unit; a setting unit for setting the allowable range of the focus position of the captured image; a focus adjustment unit for driving the optical system and adjusting the focus position of the captured image within the allowable range set by the setting unit; and an information generation unit for generating, on the basis of a plurality of digital images of the same scene obtained by the image capture unit at different focus positions, information on the distance from the image capture device to the subject. Among the plurality of digital images, the setting unit sets the allowable range of the focus position of the digital image to be captured later on the basis of the in-focus range of the digital image captured earlier.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2011245140A JP2015017999A (ja) | 2011-11-09 | 2011-11-09 | 撮像装置 |
| JP2011-245140 | 2011-11-09 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013069279A1 true WO2013069279A1 (fr) | 2013-05-16 |
Family
ID=48289490
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2012/007149 Ceased WO2013069279A1 (fr) | 2011-11-09 | 2012-11-07 | Dispositif de capture d'image |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2015017999A (fr) |
| WO (1) | WO2013069279A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106791372A (zh) * | 2016-11-30 | 2017-05-31 | 努比亚技术有限公司 | 一种多点清晰成像的方法及移动终端 |
| WO2019115560A1 (fr) * | 2017-12-12 | 2019-06-20 | Bircher Reglomat Ag | Détermination de distance basée sur différentes zones de profondeur de champ à différentes mises au point d'un objectif |
| WO2020124517A1 (fr) * | 2018-12-21 | 2020-06-25 | 深圳市大疆创新科技有限公司 | Procédé de commande d'équipement de photographie, dispositif de commande d'équipement de photographie et équipement de photographie |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6569157B1 (ja) * | 2018-06-27 | 2019-09-04 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 制御装置、撮像装置、移動体、制御方法、及びプログラム |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002333571A (ja) * | 2001-05-09 | 2002-11-22 | Ricoh Co Ltd | 自動合焦装置、自動合焦方法、およびその方法をコンピュータが実行するためのプログラム |
| JP2008070640A (ja) * | 2006-05-10 | 2008-03-27 | Canon Inc | 点調節装置、撮像装置、焦点調節装置の制御方法及びプログラム及び記憶媒体 |
| JP2011188454A (ja) * | 2010-03-11 | 2011-09-22 | Sharp Corp | 撮像モジュール |
-
2011
- 2011-11-09 JP JP2011245140A patent/JP2015017999A/ja active Pending
-
2012
- 2012-11-07 WO PCT/JP2012/007149 patent/WO2013069279A1/fr not_active Ceased
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002333571A (ja) * | 2001-05-09 | 2002-11-22 | Ricoh Co Ltd | 自動合焦装置、自動合焦方法、およびその方法をコンピュータが実行するためのプログラム |
| JP2008070640A (ja) * | 2006-05-10 | 2008-03-27 | Canon Inc | 点調節装置、撮像装置、焦点調節装置の制御方法及びプログラム及び記憶媒体 |
| JP2011188454A (ja) * | 2010-03-11 | 2011-09-22 | Sharp Corp | 撮像モジュール |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106791372A (zh) * | 2016-11-30 | 2017-05-31 | 努比亚技术有限公司 | 一种多点清晰成像的方法及移动终端 |
| CN106791372B (zh) * | 2016-11-30 | 2020-06-30 | 努比亚技术有限公司 | 一种多点清晰成像的方法及移动终端 |
| WO2019115560A1 (fr) * | 2017-12-12 | 2019-06-20 | Bircher Reglomat Ag | Détermination de distance basée sur différentes zones de profondeur de champ à différentes mises au point d'un objectif |
| WO2020124517A1 (fr) * | 2018-12-21 | 2020-06-25 | 深圳市大疆创新科技有限公司 | Procédé de commande d'équipement de photographie, dispositif de commande d'équipement de photographie et équipement de photographie |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2015017999A (ja) | 2015-01-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3318054B1 (fr) | Systèmes et procédés pour un déclenchement de mise au point automatique | |
| US9635280B2 (en) | Image pickup apparatus, method of controlling image pickup apparatus, and non-transitory computer-readable storage medium | |
| US8760567B2 (en) | Photographing apparatus and method to reduce auto-focus time | |
| JP2009048125A (ja) | 撮影装置および撮影装置の制御方法 | |
| JP2009036986A (ja) | 撮影装置および撮影装置の制御方法 | |
| JP6432038B2 (ja) | 撮像装置 | |
| JP2008026802A (ja) | 撮像装置 | |
| JP2009065582A (ja) | 拡大表示機能付きカメラおよびカメラの制御方法 | |
| CN111868474B (zh) | 测距摄像机 | |
| JP5963552B2 (ja) | 撮像装置 | |
| WO2013069279A1 (fr) | Dispositif de capture d'image | |
| JP2009069170A (ja) | 撮影装置および撮影装置の制御方法 | |
| US20160275657A1 (en) | Imaging apparatus, image processing apparatus and method of processing image | |
| JP2016142924A (ja) | 撮像装置及びその制御方法、プログラム、記憶媒体 | |
| JP2009036985A (ja) | 撮影装置および撮影装置の制御方法 | |
| JP5023750B2 (ja) | 測距装置および撮像装置 | |
| JP2020008785A (ja) | 撮像装置 | |
| JP2013061560A (ja) | 測距装置および撮像装置 | |
| JP2014134697A (ja) | 撮像装置 | |
| JP5420034B2 (ja) | 撮像装置、その制御方法、プログラム及び記憶媒体 | |
| JP2009048123A (ja) | 撮影装置および撮影装置の制御方法 | |
| US10530985B2 (en) | Image capturing apparatus, image capturing system, method of controlling image capturing apparatus, and non-transitory computer-readable storage medium | |
| JP2005164983A (ja) | デジタルカメラ | |
| JP2009036987A (ja) | 撮影装置および撮影装置の制御方法 | |
| JP7288227B2 (ja) | 測距カメラ |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12847086 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 12847086 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |