WO2019193889A1 - Image alignment assisting device, method, and program, and imaging device - Google Patents
- Publication number
- WO2019193889A1 (application PCT/JP2019/008299)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- images
- alignment
- condition
- imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
Definitions
- The present invention relates to an image alignment assisting device, method, and program, and an imaging apparatus, and more particularly to a technique applied when a plurality of images are aligned and synthesized.
- A technique is known that combines a plurality of images captured by an imaging apparatus such as a digital camera under different imaging conditions to generate a composite image with improved image quality.
- Examples include HDR (high dynamic range) synthesis, which expands the dynamic range, and depth synthesis (focus stacking), which brings the entire image into focus.
- The KLT (Kanade-Lucas-Tomasi) feature tracker is a known technique for detecting and tracking feature points between images.
- Conventionally, techniques for aligning and synthesizing a plurality of images of this type are described in Patent Documents 1 and 2.
- In Patent Document 1, when a plurality of images are superimposed and synthesized into a single image, the relative displacement between the images is obtained, a plurality of images to be used for the overlay synthesis are selected based on the reliability of the obtained displacement, and the selected images are aligned and combined according to the relative displacement between them.
- In another invention described in Patent Document 1, instead of selecting the images used for overlay synthesis based on the reliability of the displacement, a displacement whose detection accuracy (reliability) between images is low is replaced by a displacement obtained via image pairs other than that pair, and the plurality of images are aligned and synthesized so that the subject matches, based on the relative displacements between the images including the replaced displacement.
- In Patent Document 2, N images (N is an integer of 3 or more) captured at different focus positions are arranged in the order of focus position, and N/2 images are set as feature point extraction images so that at least one of every two adjacent images is a feature point extraction image. Corresponding points corresponding to the feature points extracted from the feature point extraction images are searched for in the other images, a correction parameter for matching the corresponding positions of the N images is calculated from the positional relationship between the feature points and the corresponding points, and the other images are corrected with reference to the image having the narrowest angle of view among the N images.
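As a minimal illustration of calculating a correction parameter from the positional relationship between feature points and corresponding points: assuming the correction is a pure translation (the patent does not restrict the parameter to one), the least-squares estimate is simply the mean per-pair offset. All point coordinates below are hypothetical.

```python
def estimate_translation(feature_pts, corresponding_pts):
    """Estimate the (dx, dy) translation that maps feature points in one
    image onto their corresponding points in another image.

    For a pure translation, the least-squares solution is the mean of
    the per-pair coordinate offsets."""
    n = len(feature_pts)
    dx = sum(cx - fx for (fx, _), (cx, _) in zip(feature_pts, corresponding_pts)) / n
    dy = sum(cy - fy for (_, fy), (_, cy) in zip(feature_pts, corresponding_pts)) / n
    return dx, dy

# Hypothetical feature points and their corresponding points, offset by (2, 3).
correction = estimate_translation([(0, 0), (10, 0), (0, 10)],
                                  [(2, 3), (12, 3), (2, 13)])  # (2.0, 3.0)
```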
- Patent Document 1: JP 2007-272459 A
- Patent Document 2: JP 2015-099975 A
- Since the first invention described in Patent Document 1 selects the images used for overlay synthesis based on the reliability of the relative displacement between images, an unselected image (an image whose relative displacement has low reliability) cannot be used for the overlay synthesis.
- Patent Document 1 discloses a conventional technique of acquiring a plurality of images with different exposure amounts and synthesizing their signals, thereby avoiding so-called overexposure and underexposure and substantially expanding the dynamic range.
- However, the detailed description of the invention of Patent Document 1 does not describe how the plurality of images are captured.
- Therefore, the first invention described in Patent Document 1 has a problem in that the intended dynamic range expansion cannot be achieved when an image is left unselected.
- In the other invention described in Patent Document 1, a displacement whose detection accuracy between images is low is replaced by a displacement obtained via images other than that pair.
- However, when the reliability of a movement vector between images is low (when the accuracy of identifying the movement vector is low because of a detection error in a feature point common to the images), it is difficult to increase the reliability of the movement vector by this replacement. This is because, when calculating a movement vector between certain images, an error in the detection of a feature point common to those images also appears in the correspondences of the same feature point with other images, so the movement vector obtained via another image likewise has low detection accuracy.
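This error propagation can be shown numerically. The sketch below (with hypothetical coordinates) demonstrates that a movement vector computed via an intermediate image carries exactly the same feature-detection error as the direct one, so the detour does not improve reliability.

```python
def movement_vector(p_from, p_to):
    """Movement vector of a feature point from one image to another."""
    return (p_to[0] - p_from[0], p_to[1] - p_from[1])

# True positions of one common feature point in images B and C
# (an intermediate image); all coordinates are hypothetical.
true_b = (14.0, 12.0)
true_c = (12.0, 11.0)

# The feature point is mis-detected in image A with an error of (1, 0).
detected_a = (11.0, 10.0)  # true position was (10.0, 10.0)

direct = movement_vector(detected_a, true_b)
chained = tuple(ac + cb for ac, cb in zip(movement_vector(detected_a, true_c),
                                          movement_vector(true_c, true_b)))
# The intermediate position cancels out: the chained vector carries
# exactly the same detection error as the direct one.
```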
- One of the features of the invention described in Patent Document 2 is the method of setting, among the N images captured at different focus positions and arranged in the order of focus position, which images serve as feature point extraction images for extracting feature points and which serve as the other images in which corresponding points are searched for. However, it is assumed that feature points and their corresponding points can be extracted at least between adjacent images arranged in the order of focus position.
- the present invention has been made in view of such circumstances.
- An object of the present invention is to provide an image alignment assisting device, method, and program, and an imaging device, capable of acquiring a plurality of images that can be aligned with high accuracy and from which a desired composite image can be generated when a plurality of images are aligned and synthesized.
- In order to achieve the above object, an image alignment assisting apparatus according to one aspect of the present invention includes an image acquisition unit that acquires a plurality of first images captured under a stepwise first imaging condition composed of a plurality of stepwise different conditions, and a determination unit that determines whether or not the acquired plurality of first images satisfy the alignment condition necessary for alignment of the composite image.
- It further includes an imaging condition calculation unit that, when it is determined that the alignment condition is not satisfied between two first images, calculates a second imaging condition that internally divides the two first imaging conditions of those two first images based on the first imaging conditions; the image acquisition unit then acquires, in addition to the plurality of first images, a second image captured under the second imaging condition.
- According to this aspect, it is determined whether or not the plurality of first images satisfy the alignment condition necessary for alignment of the composite image.
- When it is determined that the alignment condition is not satisfied between two first images, a second imaging condition that internally divides the two first imaging conditions of those two first images is calculated based on the first imaging conditions.
- the second image captured under the second imaging condition is acquired.
- Preferably, when the determination unit determines that the alignment condition is not satisfied between the two first images, the imaging condition calculation unit determines a step size that internally divides the two first imaging conditions based on those first imaging conditions and the determined alignment condition, and calculates the second imaging condition based on the determined step size. As a result, the second imaging condition of the second image necessary for increasing the alignment accuracy is obtained.
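As an illustration, assuming an imaging condition can be expressed as a single scalar (e.g. an EV value or a focus distance, both of which the text supports), intermediate imaging conditions that internally divide two first imaging conditions with a chosen step size could be computed as follows. The function name and step-size choice are illustrative, not from the patent.

```python
def internally_divide(cond_a, cond_b, num_steps=1):
    """Return intermediate imaging-condition values (e.g. EV values or
    focus distances) that internally divide the interval between two
    first imaging conditions into num_steps + 1 equal steps."""
    step = (cond_b - cond_a) / (num_steps + 1)  # the determined step size
    return [cond_a + step * (i + 1) for i in range(num_steps)]

# Two bracketed exposures 4 EV apart failed to satisfy the alignment
# condition; a single second image is inserted at the midpoint EV.
mid_ev = internally_divide(-2.0, 2.0)  # [0.0]
```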
- Preferably, the alignment condition is an evaluation value based on the feature points of the two first images and the feature points corresponding to each other between them.
- The evaluation value is preferably a value corresponding to the number of pairs of feature points corresponding to each other. This is because the larger the number of such pairs, the higher the alignment accuracy between the images.
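A toy sketch of such an evaluation value, using scalar stand-ins for feature descriptors and greedy one-to-one matching (real implementations would use 2-D keypoints and descriptor vectors; all names and thresholds here are hypothetical):

```python
def count_corresponding_pairs(descs_a, descs_b, max_distance):
    """Count feature-point pairs whose (toy scalar) descriptors match
    within max_distance, using greedy one-to-one matching."""
    remaining = list(descs_b)
    pairs = 0
    for d in descs_a:
        best = min(remaining, key=lambda e: abs(e - d), default=None)
        if best is not None and abs(best - d) <= max_distance:
            remaining.remove(best)
            pairs += 1
    return pairs

def satisfies_alignment_condition(descs_a, descs_b, max_distance, min_pairs):
    """Alignment condition: the evaluation value (number of corresponding
    pairs) must reach the required minimum."""
    return count_corresponding_pairs(descs_a, descs_b, max_distance) >= min_pairs
```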
- Preferably, the image acquisition unit is an imaging unit that continuously captures the same subject under different imaging conditions, and the determination unit makes its determination between the two first images.
- Preferably, the image alignment assisting apparatus includes a storage unit that stores images; the image acquisition unit acquires the plurality of first images captured under the first imaging condition from the storage unit and, when the determination unit determines that the alignment condition is not satisfied between the two first images, further acquires from the storage unit a second image matching the second imaging condition calculated by the imaging condition calculation unit. According to this, it is not necessary to capture the plurality of first images and the second image again.
- Preferably, the first imaging condition and the second imaging condition are an exposure condition or an in-focus condition corresponding to an in-focus distance.
- When the first imaging condition and the second imaging condition are exposure conditions, the plurality of first images and the second image can be used for HDR synthesis.
- When the first imaging condition and the second imaging condition are in-focus distances, the plurality of first images and the second image can be used for depth synthesis.
- Preferably, when the image acquisition unit has acquired the plurality of first images and the second image, the determination unit determines whether or not the alignment condition is satisfied between two adjacent images under the stepwise imaging conditions obtained by adding the second imaging condition to the stepwise first imaging conditions, that is, between two adjacent first and second images or between two second images.
- When the determination unit determines that the alignment condition is not satisfied between the two first and second images or between the two second images, the imaging condition calculation unit calculates a third imaging condition that internally divides the first imaging condition of the first image and the second imaging condition of the second image, or the second imaging conditions of the two second images.
- Preferably, when it is determined that the alignment condition is not satisfied between the first image and the second image or between the two second images, the image acquisition unit acquires, in addition to the plurality of first images and the second image, a third image captured under the third imaging condition.
- Even after the second image is acquired, the alignment condition may still not be satisfied between some images.
- In this case, a third image (an image captured under the third imaging condition) is further acquired between the images that do not satisfy the alignment condition, so that the alignment condition is satisfied between each of the plurality of first images, the second image, and the third image.
- Preferably, the image alignment assisting apparatus includes an alignment unit that, when the determination unit determines that all of the plurality of first images satisfy the alignment condition, uses one of the plurality of first images as a reference image and aligns the first images other than the reference image with respect to the reference image. Accordingly, the first images other than the reference image can be aligned with the single reference image.
- Preferably, when the determination unit determines that not all of the plurality of first images satisfy the alignment condition, the alignment unit aligns the first images other than the reference image, together with the second image, with respect to the reference image. As a result, even a first image that cannot by itself be aligned with the reference image with high accuracy can be aligned with high accuracy, because the second image intervenes as an auxiliary image for alignment.
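The auxiliary-image idea can be sketched with homogeneous 2-D transforms: if a first image cannot be registered to the reference image directly but both it and the reference can be registered to the intervening second image, the first-to-reference transform is the composition of the two partial transforms. The translations below are hypothetical estimates; the patent does not prescribe a transform model.

```python
def matmul3(a, b):
    """Multiply two 3x3 matrices (homogeneous 2-D transforms)."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def translation(dx, dy):
    """Homogeneous 3x3 matrix for a 2-D translation."""
    return [[1.0, 0.0, dx], [0.0, 1.0, dy], [0.0, 0.0, 1.0]]

# Hypothetical alignment estimates: the first image matches the second
# (auxiliary) image, and the second image matches the reference image.
second_to_ref = translation(3.0, 4.0)
first_to_second = translation(1.0, 2.0)

# Chaining through the auxiliary image yields the first-to-reference
# transform as the product of the two partial transforms.
first_to_ref = matmul3(second_to_ref, first_to_second)
```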
- Preferably, the image alignment assisting apparatus includes a composite image generation unit that generates a composite image based on the plurality of first images after alignment by the alignment unit. According to this, a composite image can be generated using only the plurality of first images.
- Preferably, the image alignment assisting device includes a composite image generation unit that generates a composite image based on the plurality of first images and the second image after alignment by the alignment unit. According to this, a composite image can be generated using the plurality of first images and the second image, which is effective in particular when a desired composite image cannot be obtained from the plurality of first images alone.
- An imaging device preferably includes the above-described image alignment assisting device.
- An image alignment assisting method according to another aspect of the present invention includes a step in which the image acquisition unit acquires a plurality of first images captured under a stepwise first imaging condition composed of a plurality of stepwise different conditions.
- It also includes a step in which the determination unit determines whether or not the acquired plurality of first images satisfy the alignment condition necessary for alignment of the composite image.
- When it is determined that the alignment condition is not satisfied between two first images, the imaging condition calculation unit calculates a second imaging condition that internally divides the two first imaging conditions of the two first images based on their first imaging conditions, and the image acquisition unit acquires, in addition to the plurality of first images, a second image captured under the second imaging condition.
- Preferably, when it is determined that the alignment condition is not satisfied between the two first images, the step of calculating the second imaging condition determines a step size that internally divides the two first imaging conditions based on those first imaging conditions and the determined alignment condition, and calculates the second imaging condition based on the determined step size.
- Preferably, when the determination unit determines that the alignment condition is not satisfied between the two first images and the second image, or between two second images, the imaging condition calculation unit calculates a third imaging condition that internally divides the first imaging condition of the first image and the second imaging condition of the second image, or the second imaging conditions of the two second images, and the image acquisition unit acquires, in addition to the plurality of first images and the second image, a third image captured under the third imaging condition.
- Preferably, when it is determined that all of the plurality of first images satisfy the alignment condition, the alignment unit sets one of the first images as a reference image and aligns the first images other than the reference image with respect to the reference image.
- Preferably, when it is determined that not all of the plurality of first images satisfy the alignment condition, the alignment unit aligns the first images other than the reference image and the second image with respect to the reference image.
- Preferably, the method includes a step in which the composite image generation unit generates a composite image based on the plurality of first images after alignment.
- Preferably, the step of generating a composite image generates the composite image based on the plurality of first images and the second image after alignment.
- An image alignment assisting program according to still another aspect of the present invention causes a computer to realize: a function of acquiring a plurality of first images captured under a stepwise first imaging condition composed of a plurality of stepwise different conditions; a function of determining, between two first images whose stepwise first imaging conditions are adjacent, whether or not the acquired plurality of first images satisfy the alignment condition necessary for alignment of the composite image; a function of calculating, when it is determined that the alignment condition is not satisfied between the two first images, a second imaging condition that internally divides the two first imaging conditions of the two first images based on the first imaging conditions; and a function of acquiring, in addition to the plurality of first images, a second image captured under the second imaging condition.
- FIG. 1 is a perspective view of an imaging apparatus according to the present invention as viewed obliquely from the front.
- FIG. 2 is a rear view of the imaging apparatus.
- FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus.
- FIG. 4 is a diagram illustrating a configuration example of the image sensor.
- FIG. 5 is a schematic view showing a cross section of a part of the phase difference pixel row of the image sensor.
- FIG. 6 is a functional block diagram showing a first embodiment of the main body side CPU 220 that mainly performs imaging in the composite image capturing mode.
- FIG. 7 is a diagram used to describe an embodiment of the operation of the main body side CPU 220 that executes imaging in the composite image capturing mode.
- FIG. 8 is a diagram used to describe another embodiment of the operation of the main body side CPU 220 that executes imaging in the composite image capturing mode.
- FIG. 9 is a functional block diagram showing a second embodiment of the main body side CPU 220 that mainly performs imaging in the composite image capturing mode.
- FIG. 10 is a functional block diagram showing a third embodiment of the main body side CPU 220 that mainly executes imaging in the composite image capturing mode.
- FIG. 11 is a flowchart showing an embodiment of the image alignment assisting method according to the present invention.
- FIG. 12 is an external view of a smartphone which is an embodiment of an imaging apparatus according to the present invention.
- FIG. 13 is a block diagram showing a configuration of the smartphone.
- FIG. 1 is a perspective view of an image pickup apparatus provided with an image alignment assisting device according to the present invention, as viewed obliquely from the front.
- FIG. 2 is a rear view of the image pickup apparatus.
- the imaging apparatus 10 is a mirrorless digital single-lens camera including an interchangeable lens 100 and a camera body 200 to which the interchangeable lens 100 can be attached and detached.
- A body mount 260 to which the interchangeable lens 100 is attached, a finder window 20 of an optical finder, and the like are provided on the front surface of the camera body 200, while a shutter release switch 22, a shutter speed dial 23, an exposure correction dial 24, a power lever 25, and a built-in flash 30 are mainly provided on the upper surface of the camera body 200.
- As shown in FIG. 2, a liquid crystal monitor 216, an optical viewfinder eyepiece 26, a MENU/OK key 27, a cross key 28, a playback button 29, and the like are mainly provided on the back of the camera body 200.
- the liquid crystal monitor 216 functions as a display unit for displaying various menu screens in addition to displaying a live view image in the shooting mode, reproducing and displaying an image captured in the playback mode, and displaying various information to the user.
- The MENU/OK key 27 is an operation key having both a function as a menu button for instructing display of a menu on the screen of the liquid crystal monitor 216 and a function as an OK button for instructing confirmation and execution of the selected contents.
- the cross key 28 is an operation unit for inputting instructions in four directions, up, down, left, and right, and functions as a multi-function key for selecting an item from the menu screen and instructing selection of various setting items from each menu.
- The up and down keys of the cross key 28 function as a zoom switch at the time of imaging or a playback zoom switch in the playback mode, and the left and right keys function as frame advance (forward and reverse) buttons in the playback mode. The cross key 28 also functions as an operation unit for designating an arbitrary subject for focus adjustment from among a plurality of subjects displayed on the liquid crystal monitor 216.
- Various shooting modes can be set, including a continuous shooting mode in which still images are continuously captured, a composite image capturing mode for capturing a plurality of images with different exposures used for HDR synthesis to expand the dynamic range, and a moving image capturing mode for capturing moving images.
- the playback button 29 is a button for switching to a playback mode in which the recorded still image or moving image is displayed on the liquid crystal monitor 216.
- FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus 10.
- The interchangeable lens 100, which functions as the imaging optical system of the imaging apparatus 10, is manufactured in accordance with the communication standard of the camera body 200 and is an interchangeable lens capable of communicating with the camera body 200 as described later.
- The interchangeable lens 100 includes an imaging optical system 102, a focus lens control unit 116, an aperture control unit 118, a lens side CPU (Central Processing Unit) 120, a flash ROM (Read Only Memory) 126, a lens side communication unit 150, and a lens mount 160.
- the imaging optical system 102 of the interchangeable lens 100 includes a lens group 104 including a focus lens and a diaphragm 108.
- the focus lens control unit 116 moves the focus lens in accordance with a command from the lens side CPU 120 and controls the position (focus position) of the focus lens.
- the diaphragm control unit 118 controls the diaphragm 108 in accordance with a command from the lens side CPU 120.
- the lens-side CPU 120 controls the interchangeable lens 100 as a whole, and includes a ROM 124 and a RAM (Random Access Memory) 122.
- the flash ROM 126 is a nonvolatile memory that stores programs downloaded from the camera body 200.
- the lens-side CPU 120 performs overall control of each part of the interchangeable lens 100 using the RAM 122 as a work area according to a control program stored in the ROM 124 or the flash ROM 126.
- The lens side communication unit 150 communicates with the camera body 200 via a plurality of signal terminals (lens side signal terminals) provided on the lens mount 160 in a state where the lens mount 160 is attached to the body mount 260 of the camera body 200. That is, in accordance with commands from the lens side CPU 120, the lens side communication unit 150 transmits and receives request signals and response signals (bidirectional communication) to and from the main body side communication unit 250 of the camera body 200 connected via the lens mount 160 and the body mount 260, and notifies the camera body 200 of lens information (focus lens position information, focal length information, aperture information, and the like) of each optical member of the imaging optical system 102.
- the interchangeable lens 100 also includes a detection unit (not shown) that detects focus lens position information and aperture information.
- the aperture information is information indicating the aperture value (F value) of the aperture 108, the aperture diameter of the aperture 108, and the like.
- The lens side CPU 120 preferably stores various lens information, including the detected focus lens position information and aperture information, in the RAM 122. The lens information is detected when there is a request for lens information from the camera body 200, when an optical member is driven, or at a fixed period (a period sufficiently shorter than the frame period of a moving image), and the detection result can be held.
- [Camera body] As shown in FIG. 3, the camera body 200 includes an image sensor 201, an image sensor control unit 202, an analog signal processing unit 203, an A/D (Analog/Digital) converter 204, an image input controller 205, a digital signal processing unit 206, a focal-plane shutter (FPS) 280, an FPS control unit 296, a flash emitting unit 270, a flash control unit 272, the body mount 260, and the built-in flash 30 (FIG. 1).
- the image sensor 201 is composed of a complementary metal-oxide semiconductor (CMOS) type color image sensor.
- the image sensor 201 is not limited to the CMOS type, but may be an XY address type or a CCD (Charge Coupled Device) type image sensor.
- The image sensor 201 has color filters of red (R), green (G), and blue (B) arranged in a periodic color array on a plurality of pixels configured by photoelectric conversion elements (photodiodes) arranged two-dimensionally in the x direction (horizontal direction) and the y direction (vertical direction).
- The image sensor 201 includes phase difference pixels (first phase difference pixels PA and second phase difference pixels PB) and normal pixels for capturing still images or moving images (pixels other than the phase difference pixels).
- A normal pixel row in which only normal pixels are arranged in the horizontal direction (row direction) includes RG rows of pixels having an R filter (R pixels) and pixels having a G filter (G pixels), and GB rows of G pixels and pixels having a B filter (B pixels).
- the RG rows and GB rows are alternately arranged in the vertical direction (column direction).
- the image sensor 201 has a phase difference pixel row in which the first phase difference pixel PA and the second phase difference pixel PB are provided, and a normal pixel row in which only the normal pixels are provided.
- the phase difference pixel rows of the image sensor 201 are periodically set in a specific GB row of the Bayer array, with a pair of the first phase difference pixel PA and the second phase difference pixel PB and one normal pixel as one cycle. Arranged in the row direction. Therefore, in the phase difference pixel row, the G pixel and the B pixel are alternately arranged every two pixels (a pair of the first phase difference pixel PA and the second phase difference pixel PB) in the row direction.
- The phase difference pixel row of this example is provided in a GB row.
- the periodic color array is not limited to the Bayer array, and may be another color filter array such as an X-Trans (registered trademark) array.
- FIG. 5 is an enlarged view of a main part showing the configuration of the first phase difference pixel PA and the second phase difference pixel PB which are arranged adjacent to each other in the phase difference pixel row shown in FIG.
- A light shielding film MA is disposed on the front side (microlens ML side) of the photodiode PD of the first phase difference pixel PA, while a light shielding film MB is disposed on the front side of the photodiode PD of the second phase difference pixel PB.
- the microlens ML and the light shielding films MA and MB have a pupil division function.
- the light shielding film MA shields the left half of the light receiving surface of the photodiode PD. Therefore, only the light beam passing through the left region of the optical axis among the light beams passing through the exit pupil of the imaging optical system 102 is selectively received by the first phase difference pixel PA.
- a G filter is disposed below the microlens ML.
- the light shielding film MB shields the right half of the light receiving surface of the photodiode PD of the second phase difference pixel PB. Therefore, only the light beam passing through the right region of the optical axis among the light beams passing through the exit pupil of the imaging optical system 102 is selectively received by the second phase difference pixel PB.
- The light beam passing through the exit pupil is divided into left and right by the microlens ML having the pupil division function and the light shielding films MA and MB, and is incident on the first phase difference pixel PA and the second phase difference pixel PB, respectively.
- the pupil division direction of the first phase difference pixel PA and the second phase difference pixel PB of the image sensor 201 of this example is the horizontal direction (x direction) (the left and right direction in FIG. 4).
- the phase difference pixel row may be arranged over the entire imaging region of the image sensor 201, or may be arranged only in the central region of the imaging region.
- the optical image of the subject formed on the light receiving surface of the image sensor 201 by the imaging optical system 102 of the interchangeable lens 100 is converted into an electrical signal by the image sensor 201.
- Charges corresponding to the amount of incident light are accumulated in each pixel of the image sensor 201, and an electric signal corresponding to the amount of charge (signal charge) accumulated in each pixel is read from the image sensor 201 as an image signal.
- The image sensor control unit 202 performs readout control of the image signal from the image sensor 201 in accordance with commands from the main body side CPU 220. When a still image is captured, the image sensor control unit 202 controls the exposure time by opening and closing the FPS 280 and then reads out all lines of the image sensor 201 with the FPS 280 closed. The image sensor 201 and the image sensor control unit 202 of this example can also sequentially perform an exposure operation for each line or each pixel (that is, sequentially reset each line or each pixel to accumulate charge and sequentially read out the accumulated charge).
- the analog signal processing unit 203 performs various types of analog signal processing on an analog image signal obtained by imaging the subject with the image sensor 201.
- the analog signal processing unit 203 includes a sampling hold circuit, a color separation circuit, an AGC (Automatic Gain Control) circuit, and the like.
- the AGC circuit functions as a sensitivity adjustment unit that adjusts the sensitivity (ISO (International Organization for Standardization) sensitivity) at the time of imaging; it adjusts the gain of an amplifier that amplifies the input image signal so that the signal level of the image signal falls within the proper range.
- the A / D converter 204 converts the analog image signal output from the analog signal processing unit 203 into a digital image signal.
- Image data (mosaic image data) for each RGB pixel, output via the image sensor 201, the analog signal processing unit 203, and the A / D converter 204 when capturing a still image or moving image, is transferred from the image input controller 205 to the RAM 207 and temporarily stored there.
- When the image sensor 201 is a CMOS type image sensor, the analog signal processing unit 203 and the A / D converter 204 are often built into the image sensor 201 itself.
- the digital signal processing unit 206 performs various types of digital signal processing on the image data stored in the RAM 207.
- the digital signal processing unit 206 appropriately reads out image data stored in the RAM 207 and performs digital signal processing on the read image data, such as offset processing, gain control processing including sensitivity correction, gamma correction processing, demosaicing processing (also called synchronization processing), and RGB / YCrCb conversion processing, and then stores the image data after the digital signal processing in the RAM 207 again.
- the demosaicing process calculates full color information for every pixel from the mosaic image corresponding to the color filter array of the image sensor; for example, in the case of an image sensor with RGB color filters, it generates RGB three-plane image data from the mosaic data (dot-sequential RGB data).
- the RGB / YCrCb conversion process is a process of converting the synchronized RGB data into luminance data (Y) and color difference data (Cr, Cb).
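- The RGB / YCrCb conversion described above can be sketched per sample as follows; the ITU-R BT.601 coefficients used here are a common choice assumed for illustration (the coefficients actually used by the digital signal processing unit 206 are not specified in this description).

```python
def rgb_to_ycrcb(r, g, b):
    """Convert one synchronized RGB sample to luminance / color-difference
    data, using ITU-R BT.601 coefficients (an assumption for illustration)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luminance data (Y)
    cr = 0.713 * (r - y)                    # color difference data (Cr)
    cb = 0.564 * (b - y)                    # color difference data (Cb)
    return y, cr, cb

# A neutral gray sample keeps its level in Y and has zero color difference.
y, cr, cb = rgb_to_ycrcb(128, 128, 128)     # y == 128, cr == 0, cb == 0
```

Since 0.299 + 0.587 + 0.114 = 1.0, an achromatic input maps entirely onto the luminance plane, which is why the subsequent JPEG compression can subsample the Cr / Cb planes with little visible loss.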
- the compression / decompression processing unit 208 performs compression processing on the uncompressed luminance data Y and color difference data Cb, Cr once stored in the RAM 207 when recording a still image or a moving image.
- A still image is compressed in, for example, the JPEG (Joint Photographic Experts Group) format, and a moving image is compressed in, for example, the H.264 format.
- the image data compressed by the compression / decompression processing unit 208 is recorded on the memory card 212 via the media control unit 210.
- the compression / decompression processing unit 208 performs decompression processing on the compressed image data obtained from the memory card 212 via the media control unit 210 in the playback mode, and generates uncompressed image data.
- the media control unit 210 performs control to record the image data compressed by the compression / decompression processing unit 208 in the memory card 212. In addition, the media control unit 210 performs control for reading compressed image data from the memory card 212.
- the display control unit 214 performs control to display uncompressed image data stored in the RAM 207 on the liquid crystal monitor 216.
- the liquid crystal monitor 216 is configured by a liquid crystal display device, but a display device such as an organic electroluminescence display may be used instead.
- When a live view image is displayed on the liquid crystal monitor 216, digital image signals continuously generated by the digital signal processing unit 206 are temporarily stored in the RAM 207.
- the display control unit 214 converts the digital image signal temporarily stored in the RAM 207 into a display signal format and sequentially outputs it to the liquid crystal monitor 216. Thereby, the captured image is displayed on the liquid crystal monitor 216 in real time, and the liquid crystal monitor 216 can be used as an electronic viewfinder.
- the shutter release switch 22 is an imaging instruction unit for inputting an imaging instruction of a still image or a moving image, and is configured by a two-stage stroke type switch composed of so-called “half press” and “full press”.
- an S1 ON signal is output when the shutter release switch 22 is half-pressed, and an S2 ON signal is output when it is subsequently fully pressed.
- When the S1 ON signal is output, the main body side CPU 220 executes shooting preparation processing such as AF control (automatic focus adjustment) and AE control (automatic exposure control); when the S2 ON signal is output, it executes still image shooting processing and recording processing.
- Needless to say, AF control and AE control are performed automatically when the auto mode is set by the operation unit 222, and are not performed when the manual mode is set.
- In the moving image capturing mode, when the S2 ON signal is output by fully pressing the shutter release switch 22, the camera body 200 enters a moving image recording mode in which recording and image processing of the moving image are started; when the shutter release switch 22 is fully pressed again and the S2 ON signal is output, the camera body 200 enters a standby state and pauses the moving image recording process.
- the shutter release switch 22 is not limited to a two-stage stroke type switch operated by half-pressing and full-pressing; a single operation may output the S1 ON signal and the S2 ON signal, or individual switches may be provided to output the S1 ON signal and the S2 ON signal separately.
- Alternatively, a touch panel may serve as these operation means, and the operation instruction may be output by touching an area of the screen corresponding to the operation instruction displayed on the touch panel.
- the form of the operation means is not limited to these as long as the preparation process or the imaging process is instructed.
- the still image or moving image acquired by imaging is compressed by the compression / decompression processing unit 208, and the compressed image data is converted into an image file in which required attached information (imaging date and time, GPS information, and imaging conditions such as the F value, shutter speed, and ISO sensitivity) is added to the header, and is then stored in the memory card 212 via the media control unit 210.
- the main body side CPU 220 controls the overall operation of the camera main body 200, the driving of the optical members of the interchangeable lens 100, and the like, based on input from the operation unit 222 including the shutter release switch 22.
- the clock unit 224 measures time based on a command from the main body CPU 220 as a timer.
- the clock unit 224 measures the current date and time as a calendar.
- the flash ROM 226 is a non-volatile memory that can be read and written, and stores setting information.
- the ROM 228 stores a camera control program executed by the main body side CPU 220, an image alignment auxiliary program for executing imaging in the compositing image capturing mode according to the present invention, defect information of the image sensor 201, and various parameters and tables used for image processing and the like.
- the main body side CPU 220 controls each part of the camera main body 200 and the interchangeable lens 100 using the RAM 207 as a work area in accordance with a camera control program stored in the ROM 228 or an image alignment auxiliary program.
- the AF control unit 230, which functions as an automatic focus adjustment unit, calculates the defocus amount necessary for controlling phase difference AF and, based on the calculated defocus amount, notifies the interchangeable lens 100 of a command indicating the position (focus position) to which the focus lens should move, via the main body side CPU 220 and the main body side communication unit 250.
- the AF control unit 230 includes a phase difference detection unit and a defocus amount calculation unit.
- the phase difference detection unit acquires pixel data (first pixel values and second pixel values) from the first phase difference pixel group consisting of the first phase difference pixels PA and the second phase difference pixel group consisting of the second phase difference pixels PB in the AF area of the image sensor 201 (an area where the main subject specified by the user exists, an area of the main subject automatically detected by face detection, or an area set by default), and detects a phase difference based on the first pixel values and the second pixel values.
- This phase difference is obtained as the shift amount in the pupil division direction between the first pixel values and the second pixel values when the correlation between the plurality of first pixel values of the first phase difference pixels PA and the plurality of second pixel values of the second phase difference pixels PB is maximized (that is, when the integrated value of the absolute differences between the plurality of first pixel values and the plurality of second pixel values is minimized).
- the defocus amount calculation unit calculates the defocus amount by multiplying the phase difference detected by the phase difference detection unit by a coefficient corresponding to the current F value (ray angle) of the interchangeable lens 100.
- the focus lens position command corresponding to the defocus amount calculated by the AF control unit 230 is notified to the interchangeable lens 100, and the lens side CPU 120 of the interchangeable lens 100 that receives the command moves the focus lens via the focus lens control unit 116, thereby controlling the position (focus position) of the focus lens.
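- The phase difference detection and defocus amount calculation described above can be sketched as follows. This is a simplified one-dimensional illustration; the sample signals, the SAD (sum of absolute differences) search, and the F-value coefficient are assumptions for illustration, not the device's actual implementation.

```python
def detect_phase_difference(pa, pb, max_shift):
    """Find the shift (in pixels) of the second phase-difference signal pb
    relative to pa that minimizes the integrated absolute difference,
    i.e. maximizes the correlation between the two pixel groups."""
    best_shift, best_sad = 0, float("inf")
    n = len(pa)
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, -s), min(n, n - s)      # overlapping index range
        sad = sum(abs(pa[i] - pb[i + s]) for i in range(lo, hi))
        sad /= (hi - lo)                        # normalize by overlap length
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return best_shift

def defocus_amount(phase_diff, f_value_coeff):
    """Defocus = phase difference x coefficient determined by the current
    F value (ray angle); the coefficient here is a placeholder."""
    return phase_diff * f_value_coeff

pa = [0, 0, 10, 50, 90, 50, 10, 0, 0, 0]
pb = [0, 0, 0, 0, 10, 50, 90, 50, 10, 0]   # pa shifted right by 2 pixels
shift = detect_phase_difference(pa, pb, 4)  # -> 2
```

A larger F value narrows the ray angle, so the same defocus produces a smaller phase shift; this is why the coefficient depends on the current aperture.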
- the AE control unit 232 is a part that detects the brightness of the subject (subject brightness) and calculates the numerical value (exposure value (EV value)) corresponding to the subject brightness that is necessary for AE control and AWB (Auto White Balance) control.
- the AE control unit 232 calculates an EV value based on the brightness of the image acquired via the image sensor 201, the shutter speed when the brightness of the image is acquired, and the F value.
- the main body side CPU 220 can determine the F value, shutter speed, and ISO sensitivity from a predetermined program diagram based on the EV value obtained from the AE control unit 232, and perform AE control.
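- As a numerical illustration of this EV calculation, the textbook relation EV = log2(F^2 / T) at ISO 100 links the F value and the shutter speed T to the exposure value. The formula below and the toy program diagram are illustrative assumptions; the actual program diagram of the device is not specified here.

```python
import math

def exposure_value(f_number, shutter_speed, iso=100):
    """EV at ISO 100 is log2(N^2 / t); a higher ISO shifts the EV upward.
    This standard photographic relation is assumed for illustration."""
    return math.log2(f_number ** 2 / shutter_speed) + math.log2(iso / 100)

ev = exposure_value(8.0, 1 / 125)    # F8 at 1/125 s, ISO 100: about 12.97

def settings_from_program_diagram(ev):
    """A toy 'program diagram': fix F5.6 and derive the shutter speed that
    realizes the metered EV (a placeholder policy, not the camera's)."""
    f_number = 5.6
    shutter_speed = f_number ** 2 / (2 ** ev)
    return f_number, shutter_speed
```

A real program diagram would additionally trade off shutter speed against ISO sensitivity at the dim and bright ends of its range.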
- the white balance correction unit 234 calculates white balance gains (WB (White Balance) gains) Gr, Gg, Gb for the respective color data of the RGB data (R data, G data, and B data), and performs white balance correction by multiplying the R data, G data, and B data by the calculated WB gains Gr, Gg, and Gb, respectively.
- As a method of calculating the WB gains Gr, Gg, and Gb, a method is conceivable in which the type of light source illuminating the subject is specified by scene recognition (outdoor / indoor determination, etc.) based on the brightness (EV value) of the subject, the color temperature of ambient light, and the like, and the WB gain corresponding to the specified light source type is read out from a storage unit in which appropriate WB gains are stored in advance for each light source type; other known methods that determine Gr, Gg, and Gb using at least the EV value are also conceivable.
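- White balance correction as described above amounts to a per-channel gain multiplication. In the sketch below, the gain values (for a tungsten-like light source) and the 8-bit clipping range are illustrative assumptions.

```python
def apply_white_balance(rgb, gains):
    """Multiply the R, G, B data by the WB gains (Gr, Gg, Gb) respectively,
    clipping the result to the valid 8-bit signal range."""
    return tuple(min(255, max(0, round(c * g))) for c, g in zip(rgb, gains))

# Hypothetical gains for a warm (tungsten-like) light source: damp R, boost B.
gr, gg, gb = 0.80, 1.00, 1.60
corrected = apply_white_balance((200, 150, 100), (gr, gg, gb))  # (160, 150, 160)
```

After correction the three channels of this warm-tinted sample are nearly equal, i.e. a neutral subject is rendered neutral.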
- the wireless communication unit 236 is a part that performs short-range wireless communication of a standard such as Wi-Fi (Wireless Fidelity) (registered trademark) or Bluetooth (registered trademark), and transmits and receives necessary information to and from peripheral digital devices (mobile terminals such as smartphones).
- the GPS receiving unit 238 receives GPS signals transmitted from a plurality of GPS satellites in accordance with an instruction from the main body side CPU 220, executes positioning calculation processing based on the received GPS signals, and acquires GPS information consisting of the latitude, longitude, and altitude of the camera main body 200.
- the acquired GPS information can be recorded in the header of the image file as attached information indicating the imaging position of the captured image.
- the power supply control unit 240 applies power supply voltage supplied from the battery 242 to each unit of the camera main body 200 in accordance with a command from the main body side CPU 220.
- the power supply control unit 240 supplies the power supply voltage supplied from the battery 242 to each unit of the interchangeable lens 100 via the main body mount 260 and the lens mount 160 in accordance with a command from the main body side CPU 220.
- the lens power switch 244 switches on and off the power supply voltage applied to the interchangeable lens 100 via the main body mount 260 and the lens mount 160 and switches the level in accordance with a command from the main body side CPU 220.
- the main body side communication unit 250 transmits and receives request signals and response signals (performs bidirectional communication) to and from the lens side communication unit 150 of the interchangeable lens 100 connected via the main body mount 260 and the lens mount 160, in accordance with commands from the main body side CPU 220.
- the main body mount 260 is provided with a plurality of terminals 260A as shown in FIG. 1, and when the interchangeable lens 100 is attached to the camera main body 200 (when the lens mount 160 and the main body mount 260 are connected), the plurality of terminals 260A (FIG. 1) provided on the main body mount 260 and a plurality of terminals (not shown) provided on the lens mount 160 are electrically connected, enabling bidirectional communication between the main body side communication unit 250 and the lens side communication unit 150.
- the built-in flash 30 (FIG. 1) is, for example, a TTL (Through The Lens) automatic dimming flash, and includes a flash light emitting unit 270 and a flash control unit 272.
- the flash control unit 272 has a function of adjusting the light emission amount (guide number) of flash light emitted from the flash light emitting unit 270. That is, the flash control unit 272 causes the flash light emitting unit 270 to emit light in synchronization with the flash imaging instruction from the main body side CPU 220, and the reflected light (including ambient light) incident through the imaging optical system 102 of the interchangeable lens 100. Photometry is started, and when the photometric value reaches the standard exposure value, the flash light emission from the flash light emitting unit 270 is stopped.
- the FPS 280 constitutes a mechanical shutter of the imaging apparatus 10 and is disposed immediately before the image sensor 201.
- the FPS control unit 296 controls the opening and closing of the front and rear curtains of the FPS 280 based on input information (the S2 ON signal, shutter speed, etc.) from the main body side CPU 220, thereby controlling the exposure time (shutter speed) of the image sensor 201.
- FIG. 6 is a functional block diagram showing a first embodiment of the main body side CPU 220, mainly concerning imaging in the compositing image capturing mode, and particularly shows the portions that function during imaging in the compositing image capturing mode.
- Imaging in the compositing image capturing mode according to the present invention captures a plurality of images with different exposures, used for HDR composition that expands the dynamic range; in particular, when the plurality of images does not satisfy the alignment condition necessary for aligning the images for composition, a plurality of images to which an auxiliary image has been added so as to satisfy the alignment condition (the plurality of images increased by the auxiliary image) is captured again.
- When imaging in the compositing image capturing mode, the main body side CPU 220 mainly performs imaging control for the compositing image capturing mode and functions as the determination unit 220A and the imaging condition calculation unit 220B.
- When capturing an image in the compositing image capturing mode, the user operates the operation unit 222 to set the compositing image capturing mode. For example, by operating the operation unit 222 to display a menu screen on the liquid crystal monitor 216 and selecting “compositing image capturing mode” on the menu screen, the compositing image capturing mode can be set.
- When the compositing image capturing mode is set, a menu screen for receiving the number of images (first images) used for HDR composition and the imaging condition (first imaging condition) of each first image is displayed on the liquid crystal monitor 216.
- the user can set an arbitrary “number of sheets” and “first imaging condition” by an on-screen interactive method.
- In this example, the plurality of first images used for the HDR composition consists of three images: a first image (reference image) S 0 with proper exposure, an underexposed first image S −1 , and an overexposed first image S +1 .
- the first imaging conditions of the first images S −1 , S 0 , S +1 are exposure conditions: the exposure condition of the first image S 0 is the proper exposure value, the exposure condition of the underexposed first image S −1 is an exposure value of −3 EV with respect to the proper exposure value, and the exposure condition of the overexposed first image S +1 is an exposure value of +3 EV with respect to the proper exposure value. That is, the exposure conditions are set in stages.
- the “number of images” and “first imaging condition” of the plurality of first images are not limited to being set by the user, and may be “number of images” and / or “first imaging condition” set by default.
- When the main body side CPU 220 receives an imaging instruction by operation of the shutter release switch 22, it performs the following processing.
- the main body side CPU 220 controls the exposure of the interchangeable lens 100, the image sensor 201, and the FPS 280, which function as the imaging unit (image acquisition unit 221), based on the first imaging conditions (the under-exposure, proper-exposure, and over-exposure exposure conditions), continuously shoots the three first images S −1 , S 0 , S +1 , and temporarily stores the three acquired first images in the RAM 207.
- the determination unit 220A of the main body side CPU 220 determines, for the three first images S −1 , S 0 , S +1 temporarily stored in the RAM 207, whether the alignment condition necessary for aligning the images for composition (HDR composition) is satisfied between each two first images whose first imaging conditions (first exposure conditions) are adjacent (in this example, between the first images S −1 and S 0 and between the first images S 0 and S +1 ). Note that “adjacent imaging conditions” represent two imaging conditions with different levels, and “adjacent first imaging conditions” represent two first imaging conditions with different levels.
- the alignment condition determined by the determination unit 220A is an evaluation value based on feature points that correspond to each other between the two first images. In this example, the evaluation value based on the feature points is a value corresponding to the number of pairs of feature points that correspond to each other.
- When determining whether or not the alignment condition is satisfied between the two first images S −1 and S 0 whose exposure conditions are adjacent, the determination unit 220A extracts feature points from the two first images S −1 and S 0 .
- For the feature point extraction, a feature point algorithm such as SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), or ORB (Oriented FAST and Rotated BRIEF) can be used. It is assumed that a plurality of feature points is detected for each of the first images S −1 and S 0 by this processing.
- the feature amount is preferably information acquired based on information about the region surrounding the feature point. Whether a feature point of one of the two first images S −1 and S 0 matches a feature point of the other is evaluated by the similarity of the feature amounts corresponding to the feature points. For example, if the feature amount is a SIFT descriptor, two feature points are regarded as matching when the Euclidean distance between their real-valued vectors is smaller than a certain threshold value.
- the determination unit 220A determines that the alignment condition is satisfied when the evaluation value (the number of pairs) is equal to or greater than a reference value, and determines that the alignment condition is not satisfied when it is less than the reference value.
- the determination method for determining whether or not the alignment condition is satisfied based on the number of feature point pairs corresponding to each other is merely an example, and the present invention is not limited to this.
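- The pair-counting decision described above can be sketched as follows. A real implementation would extract SIFT or ORB descriptors from the two images; here the descriptors are given as plain vectors, and the distance threshold, the reference value, and the simple nearest-neighbor matching rule are illustrative assumptions.

```python
import math

def count_matching_pairs(desc_a, desc_b, dist_threshold):
    """Count feature points of image A whose nearest descriptor in image B
    lies within the Euclidean-distance threshold (a plain nearest-neighbor
    match; ratio tests and cross-checks are omitted for brevity)."""
    pairs = 0
    for da in desc_a:
        best = min(math.dist(da, db) for db in desc_b)
        if best < dist_threshold:
            pairs += 1
    return pairs

def alignment_condition_satisfied(desc_a, desc_b, dist_threshold, reference_value):
    """The determination unit's rule: satisfied iff the number of
    corresponding feature point pairs reaches the reference value."""
    return count_matching_pairs(desc_a, desc_b, dist_threshold) >= reference_value
```

With the reference value of 100 used later in the description, two images with fewer than 100 such pairs would be judged NG and trigger the capture of an auxiliary image.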
- When the determination unit 220A determines that the alignment condition is not satisfied between two first images, the imaging condition calculation unit 220B of the main body side CPU 220 calculates, based on the first imaging conditions (first exposure conditions) of the two first images, a second imaging condition (second exposure condition) that internally divides the two first exposure conditions. For example, an exposure value that internally divides the proper exposure value, which is the first exposure condition of the first image S 0 , and the first exposure condition of the adjacent first image is calculated as the second exposure condition.
- In this case, it is preferable to determine a step width that internally divides the two first imaging conditions (first exposure conditions) based on the two first imaging conditions (first exposure conditions) and the determined alignment condition, and to calculate the second imaging condition (second exposure condition) based on the determined step width. Thereby, the second imaging condition (second exposure condition) of the second image necessary for improving the alignment accuracy can be calculated.
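- For exposure values, "internally dividing" two exposure conditions at a given step width can be sketched as follows; the function name and the step-width policy are illustrative (for example, a 1 EV step between −3 EV and 0 EV yields two intermediate auxiliary exposures).

```python
def internal_division_evs(ev_low, ev_high, step):
    """Return the exposure values strictly between ev_low and ev_high
    spaced at the given step width (EV)."""
    evs = []
    ev = ev_low + step
    while ev < ev_high - 1e-9:   # small epsilon guards float drift
        evs.append(round(ev, 6))
        ev += step
    return evs

# Between -3 EV and 0 EV with a 1 EV step width:
internal_division_evs(-3.0, 0.0, 1.0)   # -> [-2.0, -1.0]
```

By contrast, inserting only the single average (center) value, as in the other embodiment described later, corresponds to a step width of half the interval.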
- Next, the image acquisition unit 221 acquires the plurality of first images captured under the first imaging conditions (first exposure conditions) and the second image captured under the second imaging condition (second exposure condition).
- That is, the interchangeable lens 100, the image sensor 201, and the FPS 280, which function as the image acquisition unit 221 under the control of the main body side CPU 220, perform exposure control based on the first imaging conditions (first exposure conditions) and the second imaging condition (second exposure condition), and the three first images S −1 , S 0 , S +1 and one or more second images serving as auxiliary images are acquired by continuous shooting.
- Note that the image acquisition unit 221 may additionally acquire only the second image, serving as the auxiliary image, captured under the second imaging condition (second exposure condition). However, when the second image is captured separately from the first images, the misalignment between the first images and the second image tends to increase, and when the misalignment increases, the second image may stop functioning as an auxiliary image for alignment.
- a plurality of images (the first images, or the first images and the second image) acquired as described above is HDR-combined after alignment as described later, and may be recorded in the memory card 212 in association with each other as a plurality of images for HDR composition.
- FIG. 7 is a diagram used to describe an embodiment of the operation of the main body side CPU 220 that executes imaging in the compositing image capturing mode, and FIGS. 7A, 7B, and 7C schematically show the operations at the time of the first imaging, the second imaging, and the third imaging in the compositing image capturing mode, respectively.
- First, the main body side CPU 220 acquires the first images S −1 , S 0 , S +1 having different first imaging conditions (first exposure conditions) from the image acquisition unit 221, which functions as an imaging unit, as a plurality of images for HDR composition.
- FIG. 7A shows the three first images S −1 , S 0 , S +1 and the first exposure conditions of the first images S −1 , S 0 , S +1 (the exposure value (proper exposure value) of the first image S 0 and the exposure correction values of the first images S −1 and S +1 with respect to that value).
- the determination unit 220A of the main body side CPU 220 determines whether the alignment condition necessary for the HDR composition alignment is satisfied between each two first images whose first exposure conditions are adjacent among the three first images S −1 , S 0 , S +1 (between the first images S −1 and S 0 , and between the first images S 0 and S +1 ).
- In this example, the determination unit 220A determines whether the alignment condition is satisfied based on whether or not the number of feature point pairs corresponding to each other between the two first images is equal to or greater than a reference value. If the reference value is 100, the determination unit 220A determines that the alignment condition is satisfied when the number of feature point pairs is 100 or more, and that it is not satisfied when the number of feature point pairs is less than 100. The reference value of 100 is an example, and the present invention is not limited to this value.
- In the example shown in FIG. 7A, the determination unit 220A determines that the alignment condition is satisfied (OK) between the two first images S 0 and S +1 . On the other hand, the determination unit 220A determines that the alignment condition is not satisfied (NG) between the two first images S −1 and S 0 .
- When the determination unit 220A determines that the alignment condition is not satisfied between two first images, the imaging condition calculation unit 220B of the main body side CPU 220 calculates, based on the first exposure conditions of the two first images, a second exposure condition that internally divides the two first exposure conditions.
- In this example, the imaging condition calculation unit 220B calculates, as the second exposure conditions, an exposure value 1 EV smaller than the exposure value of the first image S 0 and an exposure value 1 EV larger than the exposure value of the first image S −1 .
- Next, as shown in FIG. 7B, the image acquisition unit 221 acquires, by continuous shooting, the three first images S −1 , S 0 , S +1 and the two second images A 1 and A 2 according to the second exposure conditions.
- Subsequently, based on the newly acquired three first images S −1 , S 0 , S +1 and two second images A 1 , A 2 (a total of five images), the determination unit 220A determines again whether or not the alignment condition is satisfied between each two images whose exposure conditions are adjacent.
- In the example shown in FIG. 7B, the determination unit 220A determines that the alignment condition is satisfied between the first image S 0 and the second image A 1 and between the first image S −1 and the second image A 2 . On the other hand, the determination unit 220A determines that the alignment condition is not satisfied between the two second images A 1 and A 2 .
- When the determination unit 220A determines that the alignment condition is not satisfied between two images (in this example, the two second images A 1 and A 2 ), the imaging condition calculation unit 220B calculates, based on the second exposure conditions of the two second images A 1 and A 2 , a third exposure condition (third imaging condition) that internally divides the two second exposure conditions.
- In this example, the imaging condition calculation unit 220B calculates, as the third exposure conditions, an exposure value (1/3) EV smaller than the exposure value of the second image A 1 and an exposure value (1/3) EV larger than the exposure value of the second image A 2 .
- Next, as shown in FIG. 7C, the image acquisition unit 221 acquires, by continuous shooting, the three first images S −1 , S 0 , S +1 , the two second images A 1 and A 2 according to the second exposure conditions, and the two third images B 1 and B 2 according to the third exposure conditions.
- Subsequently, based on the newly acquired three first images S −1 , S 0 , S +1 , two second images A 1 , A 2 , and two third images B 1 , B 2 (a total of seven images), the determination unit 220A determines again whether or not the alignment condition is satisfied between each two images whose exposure conditions are adjacent.
- In the example shown in FIG. 7C, the determination unit 220A determines that the alignment condition is satisfied between these images. Needless to say, the alignment condition is also satisfied between the other images.
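- The iterative behavior just described (insert auxiliary exposures between adjacent images and re-judge until every adjacent pair satisfies the alignment condition) can be simulated as follows. The midpoint insertion and the alignment predicate are stand-ins: real alignment success depends on image content, not only on the EV gap.

```python
def refine_exposure_set(evs, pair_aligns):
    """Repeatedly insert the midpoint (average / center) between every
    adjacent pair of exposures that fails the alignment predicate, until
    all adjacent pairs succeed; returns the sorted final exposure set."""
    evs = sorted(evs)
    while True:
        inserts = [(a + b) / 2 for a, b in zip(evs, evs[1:])
                   if not pair_aligns(a, b)]
        if not inserts:
            return evs
        evs = sorted(evs + inserts)

# Stand-in predicate: assume alignment succeeds when the EV gap is <= 1 EV.
final = refine_exposure_set([-3, 0, +3], lambda a, b: abs(a - b) <= 1.0)
# -> [-3, -2.25, -1.5, -0.75, 0, 0.75, 1.5, 2.25, 3]
```

Because each round at most halves the largest gap, the loop terminates as long as the predicate accepts some positive gap, mirroring how the device converges on a set of auxiliary images that bridges every NG pair.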
- FIG. 8 is a diagram used for explaining another embodiment of the operation of the main body side CPU 220 that performs imaging in the compositing image capturing mode, and FIGS. 8A, 8B, and 8C schematically show the operations at the time of the first imaging, the second imaging, and the third imaging in the compositing image capturing mode, respectively.
- the operation of the main body side CPU 220 shown in FIG. 8 differs from the embodiment of the main body side CPU 220 shown in FIG. 7 in the imaging condition used to capture the auxiliary image (the imaging condition of the auxiliary image that internally divides the two imaging conditions) when it is determined that the alignment condition is not satisfied between two images whose exposure conditions are adjacent. Since the operation at the first imaging shown in FIG. 8A is the same as the operation at the first imaging shown in FIG. 7A, the description thereof is omitted.
- In the example shown in FIG. 8B, the imaging condition calculation unit 220B calculates the average (center) of the first exposure conditions of the two first images S −1 and S 0 as the second exposure condition.
- That is, the imaging condition calculation unit 220B calculates, as the second exposure condition, an exposure value (3/2) EV smaller than the exposure value of the first image S 0 (that is, (3/2) EV larger than the exposure value of the first image S −1 ).
- Next, as shown in FIG. 8B, the image acquisition unit 221 acquires, by continuous shooting, the three first images S −1 , S 0 , S +1 and one second image A according to the second exposure condition.
- Subsequently, based on the newly acquired three first images S −1 , S 0 , S +1 and one second image A (a total of four images), the determination unit 220A determines again whether or not the alignment condition is satisfied between each two images whose exposure conditions are adjacent.
- In the example shown in FIG. 8B, the determination unit 220A determines that the alignment condition is satisfied between the first image S 0 and the second image A. On the other hand, the determination unit 220A determines that the alignment condition is not satisfied between the first image S −1 and the second image A.
- When the determination unit 220A determines that the alignment condition is not satisfied between two images (in this example, the first image S −1 and the second image A), the imaging condition calculation unit 220B calculates, based on the first exposure condition of the first image S −1 and the second exposure condition of the second image A, a third exposure condition that is the average (center) of the first exposure condition of the first image S −1 and the second exposure condition of the second image A.
- That is, the imaging condition calculation unit 220B calculates, as the third exposure condition, an exposure value (9/4) EV smaller than the exposure value of the first image S 0 .
- Next, as shown in FIG. 8C, the image acquisition unit 221 acquires, by continuous shooting, the three first images S −1 , S 0 , S +1 , one second image A under the second exposure condition, and one third image B under the third exposure condition.
- Subsequently, based on the newly acquired three first images S −1 , S 0 , S +1 , one second image A, and one third image B (a total of five images), the determination unit 220A determines again whether or not the alignment condition is satisfied between each two images whose exposure conditions are adjacent.
- the determination unit 220A determines that the alignment condition is satisfied between these images. Needless to say, the alignment condition is also satisfied between the other images.
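The refinement loop of this first embodiment — whenever two exposure-adjacent images fail the alignment condition, capture an extra image at the average (center) exposure and re-check — can be sketched as follows. This is a minimal illustration, not the patent's implementation; `aligns` is a hypothetical predicate standing in for the corresponding-feature-point test performed by the determination unit.

```python
def refine_exposures(evs, aligns, max_rounds=5):
    """Insert intermediate exposure values (EV) between adjacent shots
    that fail the alignment condition.

    evs    -- sorted list of exposure values already captured
    aligns -- aligns(ev_a, ev_b) -> bool; True if two shots taken at
              these EVs can be aligned (hypothetical stand-in for the
              corresponding-feature-point count test)
    """
    for _ in range(max_rounds):
        new_evs = []
        for a, b in zip(evs, evs[1:]):
            if not aligns(a, b):
                new_evs.append((a + b) / 2.0)  # average (center) exposure
        if not new_evs:
            return evs                         # every adjacent pair aligns
        evs = sorted(evs + new_evs)
    return evs

# Example: shots at -3, 0, +3 EV; suppose pairs more than 2 EV apart fail.
result = refine_exposures([-3.0, 0.0, 3.0], lambda a, b: abs(a - b) <= 2.0)
print(result)  # → [-3.0, -1.5, 0.0, 1.5, 3.0]
```

One refinement round inserts the two midpoint exposures, after which every adjacent pair is within the alignable gap.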
- FIG. 9 is a functional block diagram showing a second embodiment of the main body side CPU 220 that mainly performs imaging in the composite image capturing mode.
- parts that are the same as those in the first embodiment shown in FIG. 6 are given the same reference numerals, and detailed descriptions thereof are omitted.
- in the second embodiment, a plurality of images for composition having different imaging conditions are captured in advance under imaging conditions that satisfy the alignment condition.
- the plurality of first images used for HDR composition are the three first images S-1, S0, S+1: the imaging condition (exposure condition) of the first image S0 is the proper exposure value, the underexposed first image S-1 is captured at -3 EV with respect to the proper exposure value, and the overexposed first image S+1 is captured at +3 EV with respect to the proper exposure value.
- the subject at the same position is continuously shot over the ±3 EV range with a step width narrower than that of the first imaging conditions and satisfying the alignment condition (for example, (1/3) EV). If the ±3 EV range is continuously shot under exposure conditions with a (1/3) EV step, 19 images, including the properly exposed image, are captured.
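The frame count quoted above follows from simple arithmetic: covering ±3 EV in (1/3) EV steps gives 2 × 3 / (1/3) + 1 = 19 frames. A trivial check (illustrative only, not part of the patent):

```python
def bracket_count(ev_range, step):
    # frames needed to cover -ev_range .. +ev_range in `step` EV increments,
    # including the properly exposed frame at 0 EV
    return round(2 * ev_range / step) + 1

print(bracket_count(3, 1 / 3))  # → 19
```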
- the 19 images thus captured are temporarily stored in the RAM 207 in the camera body 200.
- the image acquisition unit 223 in the main body side CPU 220 shown in FIG. 9 acquires three first images S ⁇ 1 , S 0 , S +1 from the RAM 207.
- the three first images S-1, S0, and S+1 are the images captured at -3 EV, ±0 EV, and +3 EV with respect to the proper exposure among the 19 images stored in the RAM 207.
- the determination unit 220A determines whether the alignment condition necessary for HDR composition alignment is satisfied between each pair of the three first images S-1, S0, S+1 whose first exposure conditions are adjacent (between the first images S-1 and S0, and between the first images S0 and S+1).
- the determination unit 220A determines that the alignment condition is not satisfied (NG) between the two first images S-1 and S0.
- when it is determined that the alignment condition is not satisfied between the two first images S-1 and S0, the imaging condition calculation unit 220B calculates, based on the first exposure conditions of these two first images, second exposure conditions that internally divide the two first exposure conditions.
- that is, the imaging condition calculation unit 220B calculates, as the second exposure conditions, an exposure value 1 EV smaller than that of the first image S0 and an exposure value 1 EV larger than that of the first image S-1.
- the image acquisition unit 223 reads the two second images A 1 and A 2 corresponding to the second exposure condition from the RAM 207.
- the determination unit 220A determines again, based on the three first images S-1, S0, S+1 and the two newly acquired second images A1, A2 (a total of five images), whether the alignment condition is satisfied between each pair of images whose exposure conditions are adjacent.
- the determination unit 220A determines that the alignment condition is not satisfied between the two second images A1 and A2.
- when the determination unit 220A determines that the alignment condition is not satisfied between the two second images A1 and A2, the imaging condition calculation unit 220B calculates, based on the second exposure conditions of these two second images, third exposure conditions that internally divide the two second exposure conditions.
- the image acquisition unit 223 reads the two third images B 1 and B 2 corresponding to the third exposure condition from the RAM 207.
- the determination unit 220A determines again, based on the three first images S-1, S0, S+1, the two second images A1, A2, and the two third images B1, B2 (a total of seven images), whether the alignment condition is satisfied between each pair of images whose exposure conditions are adjacent.
- the determination unit 220A determines that the alignment condition is satisfied between these images. Needless to say, the alignment condition is also satisfied between the other images.
- FIG. 10 is a functional block diagram illustrating a third embodiment of the main body side CPU 220 that mainly performs imaging in the composite image capturing mode.
- FIG. 10 parts that are the same as those in the first embodiment shown in FIG. 6 are given the same reference numerals, and detailed descriptions thereof are omitted.
- the third embodiment shown in FIG. 10 is different from the first embodiment shown in FIG. 6 in that an alignment unit 220C and a composite image generation unit 220D are added.
- the alignment unit 220C performs alignment of the plurality of finally acquired images on the assumption that the alignment condition is satisfied between all the images.
- the alignment unit 220C uses, for example, the first image S0 captured at the proper exposure as the reference image for alignment, and geometrically transforms the first images S-1 and S+1 so that their feature points match the feature points of the reference image.
- this geometric transformation can be performed by projective transformation using projective transformation parameters, affine transformation using affine transformation parameters, Helmert transformation using Helmert transformation parameters, or the like.
- when the first images S-1 and S+1 are aligned by projective transformation with respect to the first image S0, which is the reference image, a matrix H that projects (projectively transforms) the first images S-1 and S+1 onto the coordinate plane of the first image S0 is obtained, and between each pair of corresponding points (x, y) and (x', y'), the relationship shown in equation [1] (the projective mapping of coordinates by the matrix H) is established.
- the matrix H is obtained by applying equation [1] to the matched pairs of corresponding points and solving the resulting simultaneous equations for the elements of the matrix H.
- since the coefficient h 33 of the matrix H is set to 1, eight unknown coefficients remain.
- a matrix H having eight coefficients (projective transformation parameters) can therefore be obtained by solving the eight simultaneous equations derived from the coordinates of four feature-point pairs (a total of eight equations).
- it is preferable to obtain an optimum matrix H by estimating it from a plurality of highly reliable feature-point pairs using the RANSAC (Random Sample Consensus) method or an M-estimation method.
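With h 33 fixed to 1, the eight remaining coefficients follow from four point pairs exactly as described: each pair contributes two linear equations. A minimal NumPy sketch (the point coordinates are made up for the sanity check; production code would estimate H robustly from many pairs with RANSAC, for example via OpenCV's `findHomography`):

```python
import numpy as np

def homography_from_4pairs(src, dst):
    """Solve the eight simultaneous equations for the projective-transform
    matrix H (h33 fixed to 1) from four corresponding feature-point pairs.

    src, dst -- arrays of shape (4, 2): (x, y) in the image to warp and
                the matching (x', y') in the reference image.
    """
    A, b = [], []
    for (x, y), (xp, yp) in zip(src, dst):
        # x' = (h11*x + h12*y + h13) / (h31*x + h32*y + 1), similarly y'
        A.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y])
        b.append(xp)
        A.append([0, 0, 0, x, y, 1, -yp * x, -yp * y])
        b.append(yp)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # h33 = 1

# Sanity check with a pure translation by (5, -2):
src = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
dst = src + np.array([5, -2], float)
H = homography_from_4pairs(src, dst)
```

For a pure translation the recovered matrix is the identity with the shift in the last column; four points in general position (no three collinear) guarantee a unique solution.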
- the alignment unit 220C geometrically transforms the first images S ⁇ 1 and S +1 using the conversion parameters as described above, and performs alignment on the first image S 0 that is the reference image.
- as in the example described above, when the alignment condition is not satisfied between the first images S-1 and S0, the alignment unit 220C performs alignment using the alignment auxiliary images, namely the two second images A1 and A2 and the two third images B1 and B2.
- the second image A 1 is aligned with the first image S 0 as the reference image in the same manner as described above.
- the third image B 1 is aligned with the second image A 1 after alignment.
- that is, common feature points are extracted between the aligned second image A1 and the third image B1, and alignment is performed based on the extracted feature points.
- the third image B 2 is aligned with the third image B 1 after alignment
- the second image A 2 is aligned with the third image B 2 after alignment.
- finally, the first image S-1 is aligned with the aligned second image A2.
- the first image S-1 aligned in this way is, as a result, aligned with the reference first image S0.
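The chain of pairwise alignments just described (S0 → A1 → B1 → B2 → A2 → S-1) can be composed so that every image ends up in the reference image's coordinates. A sketch under the assumption that each pairwise alignment is expressed as a 3×3 projective matrix (the matrices here are illustrative translations, not real estimates):

```python
import numpy as np

def chain_homographies(pairwise):
    """Compose pairwise alignment transforms along a chain so that every
    image is expressed in the coordinates of the reference image (the
    first element of the chain).

    pairwise -- list of 3x3 matrices; pairwise[i] maps image i+1 in the
                chain onto the already-aligned image i (hypothetical
                per-pair estimates from common feature points).
    Returns the cumulative transform for each image in the chain.
    """
    cumulative = [np.eye(3)]          # the reference image itself
    for H in pairwise:
        # warping image i+1 by H lands it on image i, which is already
        # in reference coordinates, so the cumulative map is the product
        cumulative.append(cumulative[-1] @ H)
    return cumulative

# Toy example: each of the 5 links S0->A1->B1->B2->A2->S-1 shifts by (1, 0).
T = np.array([[1, 0, 1], [0, 1, 0], [0, 0, 1]], float)
maps = chain_homographies([T] * 5)    # maps for S0, A1, B1, B2, A2, S-1
```

The last cumulative map carries S-1 all the way into S0's coordinates, which is exactly the property stated above.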
- the composite image generation unit 220D performs HDR composition based on the three first images S-1, S0, S+1 after alignment by the alignment unit 220C. For example, for pixels in the highlight (bright) portions of the image, the pixels of the underexposed first image S-1 are assigned (or given larger weights); for pixels in the shadow (dark) portions, the pixels of the overexposed first image S+1 are assigned (or given larger weights); and for pixels in the midtones between highlight and shadow, the pixels of the properly exposed first image S0 are assigned (or given larger weights) when compositing.
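The highlight/shadow/midtone assignment described above can be sketched as a per-pixel weighted blend. The triangular weights below are an illustrative choice, not the patent's exact formula; the inputs are three aligned 8-bit grayscale frames:

```python
import numpy as np

def hdr_merge(under, proper, over):
    """Blend three aligned 8-bit frames: shadows take the overexposed
    frame, highlights the underexposed frame, midtones the proper one."""
    lum = proper.astype(float) / 255.0          # 0 = shadow, 1 = highlight
    w_under = np.clip(2 * lum - 1, 0, 1)        # weight for highlights
    w_over = np.clip(1 - 2 * lum, 0, 1)         # weight for shadows
    w_proper = 1 - w_under - w_over             # weight for midtones
    out = (w_under * under.astype(float)
           + w_proper * proper.astype(float)
           + w_over * over.astype(float))
    return np.clip(out, 0, 255).astype(np.uint8)

under = np.full((2, 2), 50, np.uint8)           # toy -3 EV frame
proper = np.array([[0, 255], [128, 128]], np.uint8)
over = np.full((2, 2), 200, np.uint8)           # toy +3 EV frame
merged = hdr_merge(under, proper, over)
# shadow pixel (proper == 0) takes the overexposed value 200;
# highlight pixel (proper == 255) takes the underexposed value 50
```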
- FIG. 11 is a flowchart showing an embodiment of the image alignment assisting method according to the present invention.
- the main body side CPU 220 uses the interchangeable lens 100 and the image sensor 201, functioning as the image capturing unit, to capture a plurality of images under different exposure conditions for HDR composition, and a plurality of first images (images for composition) are acquired (step S10).
- the determination unit 220A determines whether an auxiliary image (second image, third image) other than the images for composition is necessary (step S12). As described above, this determination can be made based on whether or not the number of corresponding feature points between two first images whose exposure conditions are adjacent is equal to or greater than a reference value.
- if it is determined that the auxiliary image is unnecessary, the process proceeds to step S18.
- if it is determined that the auxiliary image is necessary, the imaging condition calculation unit 220B calculates the imaging condition (second imaging condition) of the required auxiliary image (step S14).
- an auxiliary image captured under the second imaging condition is acquired in addition to the images for composition (step S16).
- in this way, (M + N + 1) images are prepared (step S18), where M is the number of images captured under exposure conditions with exposure values smaller than that of the properly exposed first image S0 serving as the reference image, and N is the number of images captured under exposure conditions with exposure values larger than that of the first image S0.
- the (M + N + 1) images are arranged in order of exposure value as images Sn (-M ≤ n ≤ N), and the parameter n specifying an image is set to 0 (step S20).
- the image Sn and the image Sn+1 are compared and a correction amount (conversion parameters for alignment) is calculated (step S22), and the image Sn+1 is corrected (geometrically transformed) based on the calculated correction amount (step S24).
- similarly, the image S-n and the image S-(n+1) are compared and a correction amount is calculated (step S26), and the image S-(n+1) is corrected based on the calculated correction amount (step S28).
- in step S30, it is determined whether or not the condition n < N or n < M is satisfied; if "Yes", the parameter n is incremented by 1 and the process returns to steps S22 and S26.
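Steps S20 to S30 walk outward from the reference image, aligning each frame to its already-corrected neighbour on both the underexposed and overexposed sides. A runnable sketch with the correction reduced to toy scalar arithmetic (`compare` and `correct` are hypothetical stand-ins for the feature matching and geometric transformation of steps S22 to S28):

```python
def align_sequence(images, compare, correct):
    """Align images S_{-M}..S_{+N} (indexed by exposure order) to the
    reference image S_0 by walking outward in both directions.

    images  -- dict mapping index n in -M..N to an image; 0 is reference
    compare -- compare(a, b) -> correction amount (alignment parameters)
    correct -- correct(img, amount) -> corrected image
    """
    M = -min(images)
    N = max(images)
    n = 0
    while n < N or n < M:                        # step S30
        if n + 1 <= N:                           # steps S22/S24, plus side
            amount = compare(images[n], images[n + 1])
            images[n + 1] = correct(images[n + 1], amount)
        if n + 1 <= M:                           # steps S26/S28, minus side
            amount = compare(images[-n], images[-(n + 1)])
            images[-(n + 1)] = correct(images[-(n + 1)], amount)
        n += 1
    return images

# Toy example: "images" are scalar offsets; correction subtracts the gap,
# so every frame should end up at the reference offset 0.
imgs = {-2: 7, -1: 3, 0: 0, 1: 4, 2: 9}
aligned = align_sequence(imgs,
                         compare=lambda a, b: b - a,
                         correct=lambda img, d: img - d)
```

Because each frame is corrected against its already-corrected neighbour, the corrections propagate outward from S0 exactly as in the flowchart.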
- the imaging device 10 is a mirrorless digital single-lens camera, but is not limited thereto, and may be a single-lens reflex camera, a lens-integrated imaging device, a digital video camera, or the like.
- the present invention is also applicable to mobile devices having other functions (call function, communication function, and other computer functions).
- Other modes to which the present invention can be applied include, for example, mobile phones and smartphones having a camera function, PDAs (Personal Digital Assistants), and portable game machines.
- FIG. 12 shows the appearance of a smartphone 500 that is an embodiment of the imaging apparatus of the present invention.
- the smartphone 500 illustrated in FIG. 12 includes a flat housing 502 and a display input unit 520 in which a display panel 521 as a display unit and an operation panel 522 as an input unit are integrated on one surface of the housing 502.
- the casing 502 includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541.
- the configuration of the housing 502 is not limited to this, and, for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide mechanism may be employed.
- FIG. 13 is a block diagram showing the configuration of the smartphone 500 shown in FIG.
- as shown in FIG. 13, the main components of the smartphone 500 are a wireless communication unit 510 that performs mobile wireless communication via a base station and a mobile communication network, a display input unit 520, a call unit 530, a camera unit 541, a recording unit 550, an external input / output unit 560, a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.
- the wireless communication unit 510 performs wireless communication with a base station accommodated in the mobile communication network in accordance with an instruction from the main control unit 501. Using this wireless communication, transmission / reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
- the display input unit 520 displays images (still images and moving images), character information, and the like, and visually transmits information to the user under the control of the main control unit 501, and detects a user operation on the displayed information.
- This is a so-called touch panel, and includes a display panel 521 and an operation panel 522.
- the display panel 521 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
- the operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible, and that detects one or more coordinates operated by a user's finger or stylus. When this device is operated by a user's finger or stylus, a detection signal generated by the operation is output to the main control unit 501. Next, the main control unit 501 detects the operation position (coordinates) on the display panel 521 based on the received detection signal.
- the display panel 521 and the operation panel 522 of the smartphone 500, illustrated as an embodiment of the imaging apparatus of the present invention, integrally constitute the display input unit 520, with the operation panel 522 arranged so as to completely cover the display panel 521.
- the operation panel 522 may have a function of detecting a user operation even in an area outside the display panel 521.
- in that case, the operation panel 522 may include a detection area (hereinafter referred to as a display area) for the portion overlapping the display panel 521 and a detection area (hereinafter referred to as a non-display area) for the outer edge portion not overlapping the display panel 521.
- alternatively, the operation panel 522 may include two sensitive regions: the outer edge portion and the inner portion. Furthermore, the width of the outer edge portion is appropriately designed according to the size of the housing 502 and the like. Examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistance film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method, and any of these methods can be adopted.
- the call unit 530 includes a speaker 531 and a microphone 532; it converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501 and outputs it to the main control unit 501, and it decodes voice data received by the wireless communication unit 510 or the external input / output unit 560 and outputs it from the speaker 531.
- the speaker 531 and the microphone 532 can be mounted on the same surface as the surface on which the display input unit 520 is provided.
- the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
- the operation unit 540 is mounted on the side surface of the housing 502 of the smartphone 500 and is turned on when pressed with a finger or the like, and is turned off by a restoring force such as a spring when the finger is released. It is a push button type switch.
- the recording unit 550 stores a control program, control data, application software (including the image alignment assistance program according to the present invention), address data in which communication partner names and telephone numbers are associated, transmitted and received e-mail data, web data downloaded by web browsing, and downloaded content data, and temporarily stores streaming data and the like.
- the recording unit 550 includes an internal storage unit 551 built into the smartphone and an external storage unit 552 having a removable external memory slot.
- each of the internal storage unit 551 and the external storage unit 552 constituting the recording unit 550 is realized by using a recording medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, Micro SD (registered trademark) memory), a RAM (Random Access Memory), or a ROM (Read Only Memory).
- the external input / output unit 560 serves as an interface with all external devices connected to the smartphone 500, and is used to connect directly or indirectly to other external devices by communication (for example, universal serial bus (USB), IEEE 1394, etc.) or via a network (for example, the Internet, wireless LAN (Local Area Network), Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), etc.).
- examples of external devices connected to the smartphone 500 include a wired / wireless headset, a wired / wireless external charger, a wired / wireless data port, a memory card connected via a card socket, a SIM (Subscriber Identity Module) / UIM (User Identity Module) card, external audio / video equipment connected via an audio / video I / O (Input / Output) terminal, wirelessly connected external audio / video equipment, a smartphone connected by wire or wirelessly, a personal computer connected by wire or wirelessly, a PDA connected by wire or wirelessly, and an earphone.
- the external input / output unit can transmit the data transmitted from such an external device to each component inside the smartphone 500, or can transmit the data inside the smartphone 500 to the external device.
- the GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with an instruction from the main control unit 501, executes positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 500 consisting of latitude, longitude, and altitude.
- when the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input / output unit 560 (for example, a wireless LAN), it can also detect the position using that position information.
- the motion sensor unit 580 includes, for example, a triaxial acceleration sensor and a gyro sensor, and detects the physical movement of the smartphone 500 in accordance with an instruction from the main control unit 501. By detecting the physical movement of the smartphone 500, the moving direction and acceleration of the smartphone 500 are detected. This detection result is output to the main control unit 501.
- the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
- the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the recording unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 includes a mobile communication control function and an application processing function for controlling each unit of the communication system in order to perform voice communication and data communication through the wireless communication unit 510.
- the application processing function is realized by the main control unit 501 operating in accordance with application software stored in the recording unit 550.
- examples of the application processing function include an infrared communication function for controlling the external input / output unit 560 to perform data communication with a counterpart device, an e-mail function for sending and receiving e-mails, a web browsing function for browsing web pages, and the image alignment assistance function according to the present invention.
- the main control unit 501 has an image processing function such as displaying video on the display input unit 520 based on image data (still image or moving image data) such as received data or downloaded streaming data.
- the image processing function refers to a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
- the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
- by executing the display control, the main control unit 501 displays icons for starting application software and software keys such as a scroll bar, or displays a window for creating an e-mail.
- the scroll bar refers to a software key for accepting an instruction to move the display portion of a large image that does not fit in the display area of the display panel 521.
- by executing the operation detection control, the main control unit 501 detects a user operation through the operation unit 540, accepts an operation on an icon or an input of a character string in an input field of a window through the operation panel 522, and accepts a request to scroll the displayed image through the scroll bar.
- furthermore, by executing the operation detection control, the main control unit 501 determines whether the operation position on the operation panel 522 is in the portion overlapping the display panel 521 (display area) or in the outer edge portion not overlapping the display panel 521 (non-display area), and has a touch panel control function for controlling the sensitive area of the operation panel 522 and the display position of the software keys.
- the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function in accordance with the detected gesture operation.
- a gesture operation is not a conventional simple touch operation, but means an operation of drawing a trajectory with a finger or the like, specifying a plurality of positions simultaneously, or, by combining these, drawing a trajectory from at least one of a plurality of positions.
- the camera unit 541 is a digital camera that performs electronic photography using an imaging device such as a complementary metal oxide semiconductor (CMOS) or a charge-coupled device (CCD), and corresponds to the imaging device 10 illustrated in FIG.
- under the control of the main control unit 501, the camera unit 541 converts image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) and records the data in the recording unit 550, or outputs it through the external input / output unit 560 or the wireless communication unit 510. In the smartphone 500 shown in FIG. 12, the camera unit 541 is mounted on the same surface as the display input unit 520; however, the mounting position of the camera unit 541 is not limited to this, and it may be mounted on the back surface of the display input unit 520, or a plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, the camera unit 541 used for shooting can be switched to shoot alone, or a plurality of camera units 541 can be used simultaneously for shooting.
- the camera unit 541 can be used for various functions of the smartphone 500.
- an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image of the camera unit 541 can be used as one of operation inputs of the operation panel 522.
- the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541.
- furthermore, by referring to the image from the camera unit 541, the optical axis direction of the camera unit 541 of the smartphone 500 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor (gyro sensor), and the current usage environment can also be determined.
- the image from the camera unit 541 can be used in the application software.
- position information acquired by the GPS receiving unit 570, voice information acquired by the microphone 532 (which may be converted into text information by speech-to-text conversion performed by the main control unit or the like), posture information acquired by the motion sensor unit 580, and the like can be added to image data of a still image or a moving image and recorded in the recording unit 550, or output through the external input / output unit 560 or the wireless communication unit 510.
- in the above embodiments, the case of performing HDR composition of a plurality of images captured under different exposure conditions has been described, but the present invention is not limited to this.
- the present invention can be applied to a case where a plurality of images having different in-focus distances are acquired, and a case where alignment and depth composition of the acquired plurality of images are performed.
- the imaging condition for depth synthesis is a focusing condition corresponding to a plurality of focusing distances (focus lens positions).
- the hardware structure of the processing units that execute various kinds of processing is any of the various processors shown below.
- the various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing a specific process.
- one processing unit may be configured by one of these various processors, or by two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by a single processor. As a first example of configuring a plurality of processing units with one processor, as typified by computers such as clients and servers, one processor may be configured as a combination of one or more CPUs and software, and this processor may function as a plurality of processing units.
- as a second example, as typified by a System on Chip (SoC), there is a form of using a processor that realizes the functions of an entire system including a plurality of processing units with a single IC (Integrated Circuit) chip.
- various processing units are configured using one or more of the various processors as a hardware structure.
- the hardware structure of these various processors is more specifically an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
- the present invention also includes a program that, when installed in a computer in an imaging apparatus, causes the imaging apparatus to function as the imaging apparatus according to the present invention, and a recording medium on which the program is recorded.
- 10 Imaging device
- 20 Finder window
- 22 Shutter release switch
- 23 Shutter speed dial
- 24 Exposure compensation dial
- 25 Power lever
- 26 Eyepiece
- 27 MENU/OK key
- 28 Play button
- 30 Built-in flash
- 100 Interchangeable lens
- 102 Imaging optical system
- 104 Lens group
- 108 Aperture
- 116 Focus lens control unit
- 118 Aperture control unit
- 120 Lens-side CPU
- 122 RAM
- 124 ROM
- 126 Flash ROM
- 150 Lens-side communication unit
- 160 Lens mount
- 200 Camera body
- 201 Image sensor
- 202 Image sensor control unit
- 203 Analog signal processing unit
- 204 A/D converter
- 205 Image input controller
- 206 Digital signal processing unit
- 207 RAM
- 208 Compression / decompression processing unit
- 210 Memory control unit
- 212 Memory card
- 214 Display control unit
- 216 Liquid crystal monitor
- 220 Main body-side CPU
- 220A Determination unit
- 220B Imaging condition calculation unit
- 220C Alignment unit
- 220D Composite image generation unit
- 221, 223 Image acquisition unit
- 222 Operation unit
- 224 Clock
- 226 Flash ROM
- 228 ROM
Abstract
An image alignment assistance device, method, and program, and an imaging device, provided so that, when aligning and combining multiple images, the images can be aligned with high accuracy and multiple images from which to generate a desired composite image can be acquired. It is determined whether three first images S-1, S0, S+1 satisfy an alignment condition necessary for alignment for the composite image; when the two first images S-1, S0 are determined not to satisfy the alignment condition, a second imaging condition (EV value) that internally divides the two first imaging conditions (EV values) of the two first images S-1, S0 is calculated based on the first imaging conditions (EV values) of the two first images S-1, S0; and second images A1, A2 captured under the second imaging condition are acquired in addition to the three first images S-1, S0, S+1. Consequently, highly accurate alignment can be achieved, and multiple images from which to generate a desired composite image can also be acquired.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020511650A JP6810298B2 (ja) | 2018-04-05 | 2019-03-04 | 画像位置合わせ補助装置、方法及びプログラム並びに撮像装置 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018-072844 | 2018-04-05 | ||
| JP2018072844 | 2018-04-05 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019193889A1 true WO2019193889A1 (fr) | 2019-10-10 |
Family
ID=68100424
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/008299 Ceased WO2019193889A1 (fr) | 2018-04-05 | 2019-03-04 | Dispositif, procédé, et programme d'aide à l'alignement d'image, et dispositif d'imagerie |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6810298B2 (fr) |
| WO (1) | WO2019193889A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025070010A1 (fr) * | 2023-09-28 | 2025-04-03 | Fujifilm Corporation | Information processing system, three-dimensional model generation device, optical device, information processing method, and program therefor |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010273038A (ja) * | 2009-05-20 | 2010-12-02 | Hoya Corp | Imaging device |
| JP2015232620A (ja) * | 2014-06-09 | 2015-12-24 | Canon Inc | Imaging device, control method, and program |
2019
- 2019-03-04 WO PCT/JP2019/008299 patent/WO2019193889A1/fr not_active Ceased
- 2019-03-04 JP JP2020511650A patent/JP6810298B2/ja active Active
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010273038A (ja) * | 2009-05-20 | 2010-12-02 | Hoya Corp | Imaging device |
| JP2015232620A (ja) * | 2014-06-09 | 2015-12-24 | Canon Inc | Imaging device, control method, and program |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025070010A1 (fr) * | 2023-09-28 | 2025-04-03 | Fujifilm Corporation | Information processing system, three-dimensional model generation device, optical device, information processing method, and program therefor |
Also Published As
| Publication number | Publication date |
|---|---|
| JP6810298B2 (ja) | 2021-01-06 |
| JPWO2019193889A1 (ja) | 2021-02-25 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN105100594B (zh) | Imaging device and imaging method | |
| US11184523B2 (en) | Imaging apparatus with phase difference detecting element | |
| US9179059B2 (en) | Image capture device and image display method | |
| JP5567235B2 (ja) | Image processing device, imaging device, program, and image processing method | |
| JP4872797B2 (ja) | Imaging device, imaging method, and imaging program | |
| US9258478B2 (en) | Imaging apparatus, imaging method thereof, and computer readable recording medium | |
| JP5923670B2 (ja) | Imaging device and imaging method | |
| CN109417592B (zh) | Imaging device, imaging method, and imaging program | |
| WO2015045829A1 (fr) | Imaging device and method | |
| US11032483B2 (en) | Imaging apparatus, imaging method, and program | |
| JP6998454B2 (ja) | Imaging device, imaging method, program, and recording medium | |
| JP6810299B2 (ja) | Image processing device, method, and program, and imaging device | |
| JP2021192544A (ja) | Image processing device, imaging device, image processing method, and image processing program | |
| WO2020158069A1 (fr) | Imaging device, imaging method, and program | |
| JP7112529B2 (ja) | Imaging device, imaging method, and program | |
| CN109845241B (zh) | Imaging device, imaging method, and recording medium | |
| JP6622945B2 (ja) | Image processing device, image processing method, and program | |
| JP6810298B2 (ja) | Image alignment assistance device, method and program, and imaging device | |
| US10674092B2 (en) | Image processing apparatus and method, and image capturing apparatus | |
| WO2015129479A1 (fr) | Imaging device and method, and program | |
| JP5182395B2 (ja) | Imaging device, imaging method, and imaging program | |
| WO2019202983A1 (fr) | Image capture device, distance measurement method, distance measurement program, and recording medium | |
| WO2020003944A1 (fr) | Imaging device, imaging method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19780983; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2020511650; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19780983; Country of ref document: EP; Kind code of ref document: A1 |