
WO2020066341A1 - Focus degree detection device, depth map generation device, and electronic device - Google Patents

Focus degree detection device, depth map generation device, and electronic device

Info

Publication number
WO2020066341A1
WO2020066341A1
Authority
WO
WIPO (PCT)
Prior art keywords
focus
image
imaging
phase difference
difference detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/031620
Other languages
English (en)
Japanese (ja)
Inventor
幸也 田中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Semiconductor Solutions Corp
Original Assignee
Sony Semiconductor Solutions Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corp filed Critical Sony Semiconductor Solutions Corp
Priority to KR1020217007718A priority Critical patent/KR20210065939A/ko
Priority to DE112019004845.7T priority patent/DE112019004845T5/de
Publication of WO2020066341A1 publication Critical patent/WO2020066341A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/34Systems for automatic generation of focusing signals using different areas in a pupil plane
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/32Measuring distances in line of sight; Optical rangefinders by focusing the object, e.g. on a ground glass screen
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/957Light-field or plenoptic cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/958Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
    • H04N23/959Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B30/00Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles

Definitions

  • the present disclosure relates to a focus degree detection device, a depth map generation device, and an electronic device.
  • a method of estimating a depth value up to each point on an object surface by using a group of images of a measurement target object (i.e., a group of multifocal images) photographed while changing the focus is generally called the SFF/SFD (Shape From Focus/Defocus) method.
  • first phase difference detection pixel data and second phase difference detection pixel data are extracted from image data using the position information of the phase difference detection pixels. Based on the movement of a first window and a second window, a calculation process computes a first phase graph and a second phase graph from the first and second phase difference detection pixel data. Arithmetic processing then calculates a phase difference shift using the first phase graph and the second phase graph, and a depth map is generated based on the phase difference shift obtained by this calculation.
  • in this approach, the depth information (depth values) from which a depth map is generated is acquired from the image data by arithmetic processing using the position information of the phase difference detection pixels. The responsiveness is therefore poor, and the circuit scale required for the arithmetic processing is large.
  • the present disclosure provides a focus degree detection device that can acquire depth information in a short processing time with a small circuit scale, a depth map generation device that can generate a depth map in a short processing time, and an electronic device equipped with the depth map generation device.
  • a focus degree detection device of the present disclosure for achieving the above object comprises: an imaging optical system that captures image light from a subject; an imaging unit in which phase difference detection pixels, which obtain a first image and a second image based on light beams respectively passing through a first pupil region and a second pupil region included in the exit pupil of the imaging optical system, are arranged in a two-dimensional array; and a detection unit that detects, for each of a plurality of images captured at different in-focus distances, the phase difference detection pixels in which the luminance difference between the first image and the second image is equal to or less than a predetermined value.
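  • as a rough illustration of the detection criterion described above (a minimal sketch only; the array names, integer cast, and threshold value are assumptions for illustration, not part of the disclosure), the detection unit's per-pixel test can be written as:

```python
import numpy as np

def detect_in_focus_pixels(image_a, image_b, threshold):
    """Boolean mask of phase difference detection pixels whose luminance
    difference |A - B| is at or below the threshold (treated as in focus).
    Cast to a signed type first so unsigned pixel data cannot wrap around."""
    diff = np.abs(image_a.astype(np.int32) - image_b.astype(np.int32))
    return diff <= threshold
```

  • the mask would be computed once per shot in the multifocal image stack; pixels where it is True are the ones the detection unit reports as in focus.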
  • a depth map generation device of the present disclosure for achieving the above object comprises: an imaging optical system that captures image light from a subject; an imaging unit in which phase difference detection pixels, which obtain a first image and a second image based on light beams respectively passing through a first pupil region and a second pupil region included in the exit pupil of the imaging optical system, are arranged in a two-dimensional array; a detection unit that detects, for each of a plurality of images captured at different in-focus distances, the phase difference detection pixels in which the luminance difference between the first image and the second image is equal to or less than a predetermined value; and a first map generation unit that maps the position information of the phase difference detection pixels detected by the detection unit into a two-dimensional array for each of the plurality of images captured at different in-focus distances, to generate a two-dimensional focus degree map of the subject.
  • an electronic apparatus for achieving the above object is equipped with the depth map generation device having the above configuration.
  • FIG. 1 is a system configuration diagram schematically illustrating a configuration of an imaging system according to the first embodiment of the present disclosure.
  • FIG. 2 is a schematic configuration diagram illustrating an example of a configuration of an imaging unit and a phase difference detection pixel.
  • FIG. 3 is a diagram conceptually showing the exit pupil division direction and the exit pupil area of the imaging optical system.
  • FIG. 4 is a principle diagram of the focus degree detection for detecting the focus degree using the phase difference detection pixels.
  • FIG. 5 is a functional block diagram of a calculation unit having the functions of a detection unit that detects phase difference detection pixels in a focused state, a first map generation unit that generates a two-dimensional focus degree map of a subject, and a second map generation unit that generates a depth map.
  • FIG. 6 is a diagram illustrating a specific example of generating a two-dimensional focus degree map and a depth map in the imaging system according to the first embodiment.
  • FIG. 7 is a flowchart illustrating an example of a process of acquiring focusing information of a phase difference detection pixel in the imaging system according to the first embodiment.
  • FIG. 8 is a system configuration diagram illustrating an outline of a configuration of an imaging system according to the second embodiment of the present disclosure.
  • FIG. 9 is a diagram illustrating a specific example of generating a two-dimensional focus degree map and a depth map in the imaging system according to the second embodiment.
  • FIG. 10 is a flowchart illustrating an example of a process of acquiring focusing information of a phase difference detection pixel in the imaging system according to the second embodiment.
  • FIG. 11 is a flowchart illustrating another example of the process of acquiring the focus information of the phase difference detection pixels in the imaging system according to the second embodiment.
  • FIG. 12A is an external view of a smartphone according to a specific example of the electronic device of the present disclosure as viewed from the front side, and FIG. 12B is an external view as viewed from the back side.
  • a map generation unit (first map generation unit) maps the position information of the phase difference detection pixels detected for each of a plurality of images captured at different in-focus distances into a two-dimensional array, thereby generating a two-dimensional in-focus degree map of the subject.
  • the phase difference detection pixel may be configured to include a pair of light receiving elements. The pair of light receiving elements can be arranged side by side within a single pixel in a first direction, in a second direction orthogonal to the first direction, or in both the first and second directions.
  • in the focus degree detection device, the depth map generation device, and the electronic device of the present disclosure including the above-described preferred configuration, a single imaging lens whose position can be adjusted in the optical axis direction may be provided, and a plurality of images taken at different in-focus distances can be obtained by performing a plurality of shootings at different positions of the single imaging lens.
  • alternatively, the imaging optical system may include a plurality of imaging lenses having different focal positions, so that a plurality of images taken at different in-focus distances can be obtained by a single shooting with the plurality of imaging lenses.
  • FIG. 1 is a system configuration diagram schematically illustrating a configuration of an imaging system according to the first embodiment of the present disclosure.
  • the imaging system 10A according to the first embodiment has a function of focus degree detection and a function of generating a depth map based on the focus degree, in addition to a normal photographing (standard photographing) function.
  • the imaging system 10A includes an imaging optical system 11, a lens/aperture drive unit 12, an imaging device 13, an operation unit 14, a calculation unit 15, a storage unit 16, a display unit 17, and a system control unit 18.
  • in this system configuration, the imaging device 13, the calculation unit 15, the storage unit 16, the display unit 17, and the system control unit 18 are connected to each other via a control bus line 19A.
  • the imaging device 13, the calculation unit 15, the storage unit 16, and the display unit 17 are connected to each other via a data bus line 19B.
  • the imaging optical system 11 has an imaging lens 111 and an aperture 112, takes in image light from a subject (light of an image incident from the subject), and forms an image on an imaging surface of the imaging device 13.
  • the position of the imaging lens 111 in the direction of the optical axis O can be adjusted by driving the lens / aperture drive unit 12. That is, the imaging system 10A according to the first embodiment has a configuration of a so-called monocular system having the imaging lens 111 capable of adjusting the position in the optical axis direction.
  • in the focus degree detection mode and the accompanying depth map generation mode, the imaging system 10A can set a plurality of photographing positions as positions of the imaging lens 111 in the optical axis direction by driving the lens/aperture drive unit 12 under the control of the system control unit 18. A plurality of images taken at different in-focus distances can then be obtained by a plurality of shootings at the different positions of the imaging lens 111.
  • the imaging device 13 includes an imaging unit 131, an analog signal processing unit 132, an A (analog) / D (digital) conversion unit 133, a digital image signal processing unit 134, and a control unit 135.
  • the imaging unit 131 has a configuration in which phase difference detection pixels for detecting a phase difference between a pair of images based on image light from a subject are arranged in a two-dimensional array. More specifically, as shown in FIG. 2, the imaging unit 131 includes a pixel array unit 1311 in which the phase difference detection pixels 20 are arranged in a two-dimensional array, and is configured as a CMOS sensor having a row selection unit 1312 that selectively scans each pixel row of the pixel array unit 1311 and a column selection unit 1313 that selectively scans each pixel in each pixel column.
  • the imaging unit 131 is not limited to the CMOS sensor, and may be configured by a CCD sensor.
  • the phase difference detection pixel 20 acquires a first image based on a first light beam that has passed through the first pupil region included in the exit pupil of the imaging optical system 11, and a second image based on a second light beam that has passed through the second pupil region included in the exit pupil. The degree of focus of the imaging optical system 11 can then be detected from the luminance difference between the pair of images, that is, between the first image and the second image. Details of the focus degree detection (focus detection) will be described later.
  • the analog signal processing unit 132 performs, for example, signal processing such as sample and hold on a pixel signal output from each pixel (the phase difference detection pixel 20) of the pixel array unit 1311 for each pixel column.
  • the A/D converter 133 includes a set of analog-to-digital converters provided corresponding to the pixel columns of the pixel array unit 1311, and converts the analog pixel signals output from the analog signal processing unit 132 into digital signals. A well-known analog-to-digital converter can be used in the A/D converter 133; examples include a single-slope analog-to-digital converter, a successive approximation analog-to-digital converter, and a delta-sigma (ΔΣ) analog-to-digital converter, although the analog-to-digital converter is not limited to these.
  • the digital image signal processing unit 134 reads the digital pixel signals A/D-converted by the A/D conversion unit 133 for each pixel column, performs various signal processing such as amplification processing and arithmetic processing, and outputs the result from the imaging device 13 to the data bus line 19B.
  • the control unit 135 generates various timing signals, clock signals, control signals, and the like under the control of the system control unit 18, and based on these signals performs drive control of the row selection unit 1312 of the imaging unit 131, the analog signal processing unit 132, the A/D conversion unit 133, the digital image signal processing unit 134, and the like.
  • the operation unit 14 issues operation commands to the system control unit 18 for various functions of the imaging device 13 under the operation of the user.
  • the calculation unit 15 performs calculation processing for general camera processing, for example, white balance processing, demosaic processing, gamma correction processing, and the like.
  • the storage unit 16 is used for storing data as needed in the course of the arithmetic processing by the arithmetic unit 15.
  • the display unit 17 includes a panel-type display device such as a liquid crystal display device or an organic EL display device, and displays a moving image or a still image captured by the imaging device 13.
  • the system control unit 18 is configured by, for example, a microcomputer; it controls the lens/aperture drive unit 12 and the imaging device 13 based on operation commands from the operation unit 14 by the user, and further controls the calculation unit 15, the storage unit 16, and the display unit 17 via the control bus line 19A.
  • an image plane phase difference AF is known as one of AF (autofocus) methods for automatically adjusting the focus of a camera.
  • the phase difference detection pixel 20 is used to realize the image plane phase difference AF.
  • the degree of focus (focus) of the imaging optical system 11 can be detected as described later.
  • the phase difference detection pixel 20 has a pair of light receiving elements (for example, photodiodes) 20A and 20B provided in, for example, a single pixel.
  • FIG. 3 conceptually shows the dividing direction of the ranging pupil and the area of the ranging pupil of the phase difference detection pixel 20.
  • FIG. 3 is a diagram of the imaging system (camera) viewed obliquely from the front.
  • the ranging pupils of the phase difference detection pixel 20 are divided in the first direction (the row direction / left-right direction), and each ranging pupil is large enough to cover one of the left and right divided regions of the exit pupil of the imaging optical system 11.
  • the phase difference detection pixel 20 includes an on-chip micro lens 21 on the light receiving surface side (the imaging optical system 11 side) of the pair of light receiving elements 20A and 20B.
  • the exit pupil of the imaging optical system 11 and the pair of light receiving elements 20A and 20B are placed in a conjugate relationship by the on-chip microlens 21 in order to efficiently use the imaging light flux from the imaging optical system 11.
  • the phase difference detection pixel 20 is provided with a pupil division function.
  • in the phase difference detection pixel 20, a light beam passing through the right half of the imaging optical system 11 is guided to one light receiving element 20A, and a light beam passing through the left half of the imaging optical system 11 is guided to the other light receiving element 20B. An image captured by the plurality of light receiving elements 20A is referred to as a first image A, and an image captured by the plurality of light receiving elements 20B is referred to as a second image B; by detecting the degree of overlap between the first image A and the second image B, the degree of focus of the imaging optical system 11 can be detected. A part of the focus degree detection function may also be used as the autofocus function.
  • the pair of light receiving elements 20A and 20B of the phase difference detection pixel 20 are arranged side by side in the first direction (row direction / left / right direction).
  • the pair of light receiving elements 20A and 20B may be arranged side by side in a second direction (column direction / up / down direction) orthogonal to the first direction, or in the first direction and the second direction. May be arranged side by side in both directions.
  • the arrangement of the pair of light receiving elements 20A and 20B is not limited to placement within a single pixel; the pair may instead be distributed over a plurality of pixels each of which is half-masked.
  • FIG. 4 shows a principle diagram of the focus degree detection for detecting the focus degree using the phase difference detection pixel 20.
  • FIG. 4 shows a focusing state, a front focus state, and a rear focus state in the imaging optical system 11.
  • the “focused state” refers to a state in which the focus is on the imaging surface of the imaging unit 131,
  • the “front focus state” refers to a state in which the focus position is shifted to the front side (the imaging optical system 11 side) of the imaging surface of the imaging unit 131, and
  • the “back focus state” refers to a state in which the focus position is shifted rearward from the imaging surface of the imaging unit 131.
  • in the focused state, the first light beam that has passed through the first pupil region included in the exit pupil of the imaging optical system 11 and the second light beam that has passed through the second pupil region overlap most, and the luminance of the images based on the first and second light beams acquired by the pair of light receiving elements 20A and 20B of the phase difference detection pixel 20 becomes maximum.
  • the center of gravity of the first pupil region matches the center of gravity of the second pupil region, and the brightness of the first image A acquired by one light receiving element 20A and the other light receiving element 20B And the brightness of the second image B acquired by the above becomes equal.
  • here, “equal” covers not only the case where the luminance of the first image A and the luminance of the second image B are exactly equal, but also the case where they are substantially equal; variations arising from manufacturing are acceptable.
  • the state where the luminance difference between the first image A and the second image B is ideally zero is the in-focus state, but here, the luminance of the first image A and the second image B is A state where the difference is equal to or less than a predetermined value close to zero is defined as an in-focus state.
  • in the front focus state and the back focus state, the center of gravity of the first pupil region and the center of gravity of the second pupil region do not match and are located at different positions. In the front focus state, with respect to the optical center O, the first image A acquired by the plurality of light receiving elements 20A moves to the left, and the second image B acquired by the plurality of light receiving elements 20B moves to the right. In the back focus state, the second image B moves away from the optical center O to the left, and the first image A moves away to the right. In these states, a difference occurs between the output of the light receiving element 20A and the output of the light receiving element 20B.
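  • one way to read the shift direction described above is to estimate the displacement between the two images with a cross-correlation and inspect its sign (an illustrative sketch only; the disclosure itself works from the luminance difference, and the helper below is a hypothetical addition):

```python
import numpy as np

def estimate_shift(profile_a, profile_b):
    """Estimate the horizontal displacement between 1-D luminance profiles
    of image A and image B via full cross-correlation.
    Returns 0 near focus; with NumPy's correlation convention, the result
    is negative when B is displaced to the right of A, so the sign
    distinguishes the two defocus directions."""
    a = profile_a - np.mean(profile_a)
    b = profile_b - np.mean(profile_b)
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(profile_a) - 1)
```

  • in the focused state the profiles coincide and the estimated shift is zero, which is consistent with the luminance-difference criterion used by the detection unit.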
  • the process of detecting the degree of focus of the imaging optical system 11 based on the output signals of the pair of light receiving elements 20A and 20B of the phase difference detection pixel 20 is executed by, for example, the calculation unit 15 (see FIG. 1). That is, the calculation unit 15 detects, for each of a plurality of images captured at different in-focus distances, the phase difference detection pixels in which the luminance difference between the first image A and the second image B is equal to or smaller than a predetermined value.
  • the position of the imaging lens 111 is adjusted in the optical axis direction, and shooting is performed with the imaging lens 111 at different positions, that is, at different in-focus distances, to obtain a plurality of images.
  • for each of these images, the calculation unit 15 compares the luminance difference between the first image A and the second image B with the predetermined value to detect the phase difference detection pixels in the focused state.
  • in the present embodiment, the calculation unit 15 provided outside the imaging device 13 has the function of the detection unit that detects the phase difference detection pixels in the focused state, but the detection unit may instead be incorporated into the imaging device 13 as a part of the functional units of the CMOS sensor or CCD sensor.
  • in addition to the function of detecting phase difference detection pixels in the focused state, the calculation unit 15 has the function of a first map generation unit that maps the position information of the in-focus phase difference detection pixels, detected for each of a plurality of images taken at different in-focus distances, into a two-dimensional array, generating a two-dimensional focus degree map of the subject. In the calculation unit 15, the function of the first map generation unit is executed under the control of the system control unit 18.
  • the depth information (depth value) of the three-dimensional shape of the subject can be estimated from the focusing distance of each of a plurality of images.
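  • as one hedged illustration of how a depth value can be tied to an in-focus distance (the thin-lens relation below is standard optics used for illustration, not the disclosure's specific estimation method):

```python
def object_distance(focal_length, image_distance):
    """Solve the thin-lens relation 1/f = 1/d_o + 1/d_i for the object
    distance d_o. All distances share the same unit, and the image
    distance must exceed the focal length for a real object."""
    return 1.0 / (1.0 / focal_length - 1.0 / image_distance)
```

  • each lens position fixes an image distance, so each of the plurality of shots corresponds to one object distance of this kind, which is the sense in which depth information can be estimated from the in-focus distances.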
  • in addition to the functions of the detection unit and the first map generation unit, the calculation unit 15 further has the function of a second map generation unit that generates a depth map based on the depth information of the subject estimated from the in-focus distance of each of the plurality of images and the two-dimensional focus degree maps.
  • the function of the second map generation unit is executed under the control of the system control unit 18.
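  • the two map-generation steps can be sketched together as follows (an illustrative sketch only: the stack layout, the use of the minimum luminance difference to pick the best shot, and the NaN sentinel for never-focused pixels are assumptions, not the disclosed implementation):

```python
import numpy as np

def generate_depth_map(images_a, images_b, depths, threshold):
    """Given stacks of first/second images (one pair per in-focus distance)
    and the depth value estimated for each in-focus distance, assign each
    pixel the depth of the shot in which it was in focus.

    images_a, images_b: arrays of shape (K, H, W); depths: length-K sequence.
    Pixels that were never in focus in any shot are left as NaN."""
    diff = np.abs(images_a.astype(np.float64) - images_b.astype(np.float64))
    best = np.argmin(diff, axis=0)                # shot with smallest |A - B|
    in_focus = np.min(diff, axis=0) <= threshold  # was any shot in focus?
    return np.where(in_focus, np.asarray(depths)[best], np.nan)
```

  • the boolean masks implied by `diff <= threshold` play the role of the per-shot two-dimensional focus degree maps, and the final `np.where` plays the role of the second map generation unit.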
  • FIG. 5 shows a functional block diagram of the calculation unit 15.
  • the calculation unit 15 includes a detection unit 151 that detects phase difference detection pixels in a focused state, a first map generation unit 152 that generates a two-dimensional focus degree map of a subject, and a second map generation unit that generates a depth map.
  • in this example, four positions of the imaging lens 111 are set: the position P1 farthest from the imaging surface of the imaging unit 131 and, approaching the imaging surface, the position P2, the position P3, and the position P4 closest to the imaging surface.
  • let the in-focus distance when shooting at the position P1 be f1, the in-focus distance at the position P2 be f2, the in-focus distance at the position P3 be f3, and the in-focus distance at the position P4 be f4.
  • the positions of the phase difference detection pixels 20 that are in focus differ according to the three-dimensional shape of the subject.
  • taking as an example a pixel array of 4 pixels in the horizontal direction × 4 pixels in the vertical direction in the pixel array unit 1311, the generation of a two-dimensional focus degree map and a depth map will be described with reference to FIG. 6.
  • at the in-focus distance f1 corresponding to the three-dimensional shape of the subject, the phase difference detection pixels 20 at the two positions of the first row, first column and the fourth row, fourth column are assumed to be in focus.
  • at the in-focus distance f2 corresponding to the three-dimensional shape of the subject, the phase difference detection pixels 20 at the four positions of the second row, first column; the first row, second column; the fourth row, third column; and the third row, fourth column are assumed to be in focus.
  • at the in-focus distance f3, the phase difference detection pixels 20 at positions including the second row, fourth column are assumed to be in focus.
  • at the in-focus distance f4 corresponding to the three-dimensional shape of the subject, the phase difference detection pixels 20 at the four positions of the fourth row, first column; the third row, second column; the second row, third column; and the first row, fourth column are assumed to be in focus.
  • the function of the first map generation unit of the calculation unit 15 maps the position information of the phase difference detection pixels 20 detected for each of the four images captured at the different in-focus distances f1 to f4, so that a two-dimensional focus degree map of the subject can be generated for each image at the in-focus distances f1 to f4.
  • then, by the function of the second map generation unit, a depth map can be generated based on the depth information D1 to D4 of the three-dimensional shape of the subject estimated from the in-focus distances f1 to f4 and the two-dimensional focus degree maps.
  • the above-described specific example is one example of an algorithm for generating a depth map; the algorithm is not limited thereto, and an existing generation algorithm may also be used.
  • when the calculation unit 15 in the imaging system 10A detects the degree of focus using the phase difference detection pixels 20 and generates a depth map of the three-dimensional shape of the subject (for example, a face to be recognized), it only needs to detect the phase difference detection pixels in which the luminance difference between the first image A and the second image B is equal to or less than the predetermined value.
  • the arithmetic unit 15 simply performs focusing on each of a plurality of images captured at different in-focus distances without acquiring depth information based on which a depth map is generated from the image data by arithmetic processing.
  • the depth map can be generated only by detecting the phase difference detection pixels in the state. Therefore, acquisition of depth information and generation of a depth map can be realized in a short processing time with a small circuit scale.
  • FIG. 7 is a flowchart illustrating an example of a process of acquiring focusing information of the phase difference detection pixel 20 in the imaging system 10A according to the first embodiment.
  • In the above case, four positions P1 to P4 can be set as the position of the imaging lens 111 in the optical axis direction, and the position numbers N of the positions P1 to P4 are set to 1 to 4.
  • Let the coordinates of a phase difference detection pixel 20 in the focused state be PD(m, n).
  • The process of acquiring the focusing information of the phase difference detection pixels 20 in the focused state is part of the process for generating the two-dimensional focus degree map, and is executed by the calculation unit 15 under the control of the system control unit 18.
  • When an image is captured at the position P1, the system control unit 18 reads out the first image A and the second image B captured by the phase difference detection pixel 20 at the coordinates PD(1, 1) (step S14), and then determines whether or not the luminance difference |A − B| is equal to or less than a predetermined value.
  • If the luminance difference is equal to or less than the predetermined value, the system control unit 18 determines that the phase difference detection pixel 20 is in the focused state and outputs, for example, a focus degree based on the luminance difference |A − B| as its focusing information.
  • The focus degree output as the focusing information is a signal indicating the closeness of the luminance between the first image and the second image.
  • When the scan of one image is completed, the system control unit 18 increments the position number N, sets the initial value 1 as the coordinate positions m and n in the horizontal and vertical directions (step S22), and then drives the imaging lens 111 by a predetermined amount to the next position (step S23). Thereafter, the process returns to step S13. The system control unit 18 then repeatedly executes the processing from step S13 to step S21 until the position number N exceeds the maximum value Nmax.
  • Through the series of processes described above, the focus degree can be obtained as the focusing information of the phase difference detection pixels 20 in the focused state, which is used for generating the two-dimensional focus degree map, and the positions of those phase difference detection pixels 20 in the image (pixel array unit) can be specified. As a result, a two-dimensional focus degree map can be generated, and a depth map can be generated based on the two-dimensional focus degree map.
  • Although the focus degree based on the luminance difference |A − B| is output here as the focusing information of the phase difference detection pixels 20 in the focused state, position information may be output instead of the focus degree.
  • The focusing information of the phase difference detection pixels 20 in the focused state may be output as it is, or may be compressed before being output.
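For illustration only, the loop structure of the FIG. 7 flow can be sketched as nested loops over lens positions and pixel coordinates. This is a hedged sketch, not the implementation described in the patent: the `capture_pair` interface, the concrete focus-degree formula, and the threshold value are assumptions introduced here.

```python
# Hedged sketch of the FIG. 7 flow: for each lens position P_N, scan every
# phase difference detection pixel PD(m, n), and record a focus degree where
# the luminance difference |A - B| is at or below a predetermined value.

def acquire_focus_info(capture_pair, positions, rows, cols, threshold):
    """Return {position_index: {(m, n): focus_degree}} for in-focus pixels.

    capture_pair(p, m, n) -> (A, B): luminances of the first and second
    images read from the phase difference detection pixel PD(m, n) with
    the imaging lens driven to position p (assumed interface).
    """
    focus_info = {}
    for idx, p in enumerate(positions):          # drive lens to each P_N
        in_focus = {}
        for m in range(1, rows + 1):             # scan coordinates PD(m, n)
            for n in range(1, cols + 1):
                a, b = capture_pair(p, m, n)
                diff = abs(a - b)
                if diff <= threshold:            # |A - B| <= predetermined value
                    # Focus degree as closeness of luminance (assumed formula:
                    # 1.0 for identical luminances, approaching 0 at threshold).
                    in_focus[(m, n)] = 1.0 - diff / (threshold + 1e-9)
        focus_info[idx] = in_focus
    return focus_info
```

As in the text, the output could equally be the pixel positions alone, or a compressed form of this information.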
  • FIG. 8 is a system configuration diagram illustrating an outline of the configuration of an imaging system according to the second embodiment of the present disclosure. Like the imaging system 10A according to the first embodiment, the imaging system 10B according to the second embodiment has a normal imaging (standard imaging) function, a focus degree detection function, and a function of generating a depth map based on the detected focus degrees.
  • The imaging system 10B has a system configuration including an imaging optical system 11, an imaging device 13, an operation unit 14, a calculation unit 15, a storage unit 16, a display unit 17, and a system control unit 18.
  • The imaging device 13, the calculation unit 15, the storage unit 16, the display unit 17, and the system control unit 18 are connected to each other via a control bus line 19A.
  • The imaging device 13, the calculation unit 15, the storage unit 16, and the display unit 17 are connected to each other via a data bus line 19B.
  • The imaging optical system 11 has a so-called compound-eye configuration, having a plurality of imaging lenses 111 (a multi-lens array) that are arranged around the optical axis O and have different focal positions (focus depths).
  • The compound-eye configuration can also be called a multi-eye (multi-view) configuration.
  • The configurations and functions of the imaging device 13, the operation unit 14, the calculation unit 15, the storage unit 16, the display unit 17, and the system control unit 18 are basically the same as those of the corresponding components in the imaging system 10A according to the first embodiment. The principle of focus degree detection based on the first image A and the second image B acquired by the phase difference detection pixels 20 arranged in a two-dimensional array in the imaging unit 131 of the imaging device 13 is also as described in the first embodiment.
  • Since the imaging system 10B according to the second embodiment has a compound-eye configuration with a plurality of imaging lenses 111 having different focal positions, a plurality of images at different in-focus distances can be obtained by a single shot. Accordingly, the imaging system 10B according to the second embodiment is more advantageous than the imaging system 10A according to the first embodiment with respect to the time required to acquire a plurality of images captured at different in-focus distances. In addition, since the imaging system 10B has no mechanism for moving the imaging lenses 111 in the direction of the optical axis O, it can be made smaller in the direction of the optical axis O than the imaging system 10A according to the first embodiment.
  • FIG. 9 shows a specific example of generating a two-dimensional focus degree map and a depth map in the imaging system 10B according to the second embodiment.
  • The signals from the phase difference detection pixels 20 in the two-dimensional array are sequentially AD-converted, the luminance of the paired first image A is compared with that of the second image B, and the result is output as a focus degree indicating the closeness of the luminances.
  • By mapping the position information of the in-focus phase difference detection pixels 20 into a two-dimensional array, a two-dimensional focus degree map of the subject can be generated for each image at the in-focus distances f1 to f4.
  • Then, a depth map can be generated based on the depth information D1 to D4 of the three-dimensional shape of the subject estimated from the in-focus distances f1 to f4 and the two-dimensional focus degree maps.
  • FIG. 10 is a flowchart illustrating an example of a process of acquiring focusing information of the phase difference detection pixel 20 in the imaging system 10B according to the second embodiment.
  • As in the first embodiment, where the in-focus distances corresponded to the movement positions P1 to P4 of the imaging lens 111, the number of images captured at different in-focus distances is set to four, with the respective in-focus distances f1 to f4.
  • In the imaging system 10B according to the second embodiment, four imaging lenses 111 are arranged around the optical axis O in the imaging optical system 11, and the in-focus distances of the four imaging lenses 111 are defined as f1 to f4.
  • As in the imaging system 10A according to the first embodiment, the process of acquiring the focusing information of the phase difference detection pixels 20 is part of the process for generating the two-dimensional focus degree map, and is executed in the calculation unit 15 under the control of the system control unit 18.
  • A two-dimensional map is read and synthesized for each of the different areas at the different distances at which the compound eye is in focus.
  • The system control unit 18 sets the coordinate position m in the horizontal direction (row direction) and the coordinate position n in the vertical direction (column direction) to 1 (step S31), then acquires four captured images (step S32), and reads out the first image A and the second image B captured by the phase difference detection pixels 20 of the obtained images (step S33).
  • Next, the system control unit 18 determines whether or not the luminance difference |A − B| between the first image A and the second image B is equal to or less than a predetermined value.
  • Through the series of processes described above, the focus degree can be obtained as the focusing information of the phase difference detection pixels 20 in the focused state used for generating the two-dimensional focus degree map, and the positions of those phase difference detection pixels 20 in the image (pixel array unit) can be specified. As a result, a depth map can be generated based on the two-dimensional focus degree map.
  • FIG. 11 shows another example of the process of acquiring the focusing information of the phase difference detection pixel 20 in the imaging system 10B according to the second embodiment.
  • Here, the image numbers M of the images captured at the in-focus distances f1 to f4 are set to 1 to 4.
  • The two-dimensional map is read and synthesized for each of the different areas at the different distances at which the compound eye is in focus.
  • The system control unit 18 increments the image number M (step S41), then returns to step S33, and repeatedly executes the processing from step S34 to step S41 until the image number M exceeds the maximum value Mmax. When the image number M exceeds the maximum value Mmax (Yes in step S40), the system control unit 18 ends the series of processes for acquiring the focusing information of the phase difference detection pixels 20.
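In the compound-eye case, the same in-focus test can be applied per sub-image obtained from the single shot, iterating over the image number M instead of the lens position. The following is a minimal sketch under stated assumptions: the dict-based image layout, the pixel-pair representation, and the threshold are illustrative choices, not the patent's data format.

```python
# Hedged sketch of the FIG. 10/11 flow for the compound-eye system: one shot
# yields one sub-image per imaging lens (in-focus distances f1..f4), and each
# sub-image is scanned for pixels with |A - B| <= threshold.
# Each sub-image is modeled here as a dict {(m, n): (A, B)} of pixel pairs.

def detect_in_focus(sub_images, threshold):
    """Return, per image number M (1..Mmax), the coordinates found in focus."""
    result = {}
    for M, pixels in enumerate(sub_images, start=1):
        result[M] = sorted(
            coord for coord, (a, b) in pixels.items()
            if abs(a - b) <= threshold           # luminance closeness test
        )
    return result
```

Since all sub-images come from one capture, the loop over M replaces the lens-driving steps of the first embodiment, which is why the second embodiment needs no mechanism for moving the lens along the optical axis.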
  • In this way as well, the focus degree is acquired as the focusing information of the phase difference detection pixels 20 in the focused state, and the position information of those phase difference detection pixels 20 is obtained.
  • Here too, although the focus degree based on the luminance difference |A − B| is output as the focusing information of the phase difference detection pixels 20 in the focused state, position information may be output instead.
  • <Modification> The technology of the present disclosure has been described above based on preferred embodiments, but the technology of the present disclosure is not limited to those embodiments.
  • the configuration and structure of the imaging system described in each of the above embodiments are examples, and can be changed as appropriate.
  • For example, a hybrid system in which a depth sensing system using a multi-lens array with different focus depths is combined with a normal monocular imaging system can also be used.
  • Further, the pixel readout resolution and the focus determination resolution may be changed, and the resolution in the depth direction may be controlled by controlling the position of the imaging lens in the optical axis direction.
  • imaging, detection, and reading may be performed in consideration of the response speed of the position control of the imaging lens.
  • The imaging system 10A according to the first embodiment or the imaging system 10B according to the second embodiment, each including the focus degree detection device and the depth map generation device of the present disclosure described above, can be used as an imaging unit (imaging device) mounted on various electronic devices having a camera function (imaging function). Examples of electronic devices having a camera function include mobile devices such as smartphones, digital cameras, tablets, and personal computers. However, the electronic devices that can use the imaging system 10A according to the first embodiment or the imaging system 10B according to the second embodiment are not limited to mobile devices.
  • FIG. 12A shows an external view, as viewed from the front side, of a smartphone according to a specific example of the electronic device of the present disclosure, and FIG. 12B shows an external view as viewed from the back side.
  • the smartphone 100 includes the display unit 120 on the front side of the housing 110.
  • the smartphone 100 includes imaging units (cameras) 130 and 140 in an upper part on the front side and an upper part on the back side of the housing 110.
  • For the imaging units 130 and 140, the imaging system 10A according to the first embodiment or the imaging system 10B according to the second embodiment can be used. That is, the smartphone 100 according to this specific example is manufactured using the imaging system 10A according to the first embodiment or the imaging system 10B according to the second embodiment as the imaging units 130 and 140.
  • Since the smartphone 100 according to this specific example includes the imaging system 10A according to the first embodiment or the imaging system 10B according to the second embodiment as the imaging units 130 and 140, it has the functions of the focus degree detection device and the depth map generation device of the present disclosure described above, and can therefore have a function of recognizing the three-dimensional shape of a subject, for example, a face recognition function.
  • Focus degree detection device: [A-1] A focus degree detection device including: an imaging optical system that captures image light from a subject; an imaging unit in which phase difference detection pixels for obtaining a first image and a second image, based on light beams respectively passing through a first pupil region and a second pupil region included in the exit pupil of the imaging optical system, are arranged in a two-dimensional array; and a detection unit that detects, for each of a plurality of images captured at different in-focus distances, phase difference detection pixels for which the luminance difference between the first image and the second image is equal to or less than a predetermined value.
  • [A-2] The focus degree detection device according to [A-1], further including a map generation unit that maps, into a two-dimensional array, the position information of the phase difference detection pixels detected by the detection unit for each of the plurality of images captured at different in-focus distances, to generate a two-dimensional focus degree map of the subject.
  • [A-3] The focus degree detection device according to [A-1] or [A-2], wherein the phase difference detection pixel includes a pair of light receiving elements.
  • [A-4] The focus degree detection device according to [A-3], wherein the pair of light receiving elements are arranged in a single pixel side by side in a first direction, in a second direction orthogonal to the first direction, or in both the first direction and the second direction.
  • [A-5] The focus degree detection device according to any one of [A-1] to [A-4], wherein the imaging optical system has a single imaging lens whose position can be adjusted in the optical axis direction, and a plurality of images captured at different in-focus distances are obtained by capturing images a plurality of times with the single imaging lens at different positions.
  • [A-6] The focus degree detection device according to any one of [A-1] to [A-4], wherein the imaging optical system has a plurality of imaging lenses with different focal positions, and a plurality of images captured at different in-focus distances are obtained by a single shot with the plurality of imaging lenses.
  • Depth map generation device: [B-1] A depth map generation device including: an imaging optical system that captures image light from a subject; an imaging unit in which phase difference detection pixels for obtaining a first image and a second image, based on light beams respectively passing through a first pupil region and a second pupil region included in the exit pupil of the imaging optical system, are arranged in a two-dimensional array; a detection unit that detects, for each of a plurality of images captured at different in-focus distances, phase difference detection pixels for which the luminance difference between the first image and the second image is equal to or less than a predetermined value; and a first map generation unit that maps, into a two-dimensional array, the position information of the phase difference detection pixels detected by the detection unit for each of the plurality of images captured at different in-focus distances, to generate a two-dimensional focus degree map of the subject.
  • [B-2] The depth map generation device according to [B-1], wherein the phase difference detection pixel includes a pair of light receiving elements.
  • [B-3] The depth map generation device according to [B-2], wherein the pair of light receiving elements are arranged in a single pixel side by side in a first direction, in a second direction orthogonal to the first direction, or in both the first direction and the second direction.
  • [B-4] The depth map generation device according to any one of [B-1] to [B-3], wherein the imaging optical system has a single imaging lens whose position can be adjusted in the optical axis direction, and a plurality of images captured at different in-focus distances are obtained by capturing images a plurality of times with the single imaging lens at different positions.
  • [B-5] The depth map generation device according to any one of [B-1] to [B-3], wherein the imaging optical system has a plurality of imaging lenses having different focal positions, and a plurality of images captured at different in-focus distances are obtained by a single shot with the plurality of imaging lenses.
  • Electronic device: [C-1] An electronic device equipped with a depth map generation device, the depth map generation device including: an imaging optical system that captures image light from a subject; an imaging unit in which phase difference detection pixels for obtaining a first image and a second image, based on light beams respectively passing through a first pupil region and a second pupil region included in the exit pupil of the imaging optical system, are arranged in a two-dimensional array; a detection unit that detects, for each of a plurality of images captured at different in-focus distances, phase difference detection pixels for which the luminance difference between the first image and the second image is equal to or less than a predetermined value; and a first map generation unit that maps, into a two-dimensional array, the position information of the phase difference detection pixels detected by the detection unit for each of the plurality of images captured at different in-focus distances, to generate a two-dimensional focus degree map of the subject.
  • [C-2] The electronic device according to [C-1], wherein the phase difference detection pixel includes a pair of light receiving elements.
  • [C-3] The electronic device according to [C-2], wherein the pair of light receiving elements are arranged in a single pixel side by side in a first direction, in a second direction orthogonal to the first direction, or in both the first direction and the second direction.
  • [C-4] The electronic device according to any one of [C-1] to [C-3], wherein the imaging optical system has a single imaging lens whose position can be adjusted in the optical axis direction, and a plurality of images captured at different in-focus distances are obtained by capturing images a plurality of times with the single imaging lens at different positions.
  • [C-5] The electronic device according to any one of [C-1] to [C-3], wherein the imaging optical system has a plurality of imaging lenses having different focal positions, and a plurality of images captured at different in-focus distances are obtained by a single shot with the plurality of imaging lenses.
  • 10A: imaging system according to the first embodiment
  • 10B: imaging system according to the second embodiment
  • 11: imaging optical system
  • 12: lens/aperture drive unit
  • 14: operation unit
  • 15: calculation unit
  • 16: storage unit
  • 17: display unit
  • 18: system control unit
  • 19A: control bus line
  • 19B: data bus line
  • 20: phase difference detection pixel
  • 20A, 20B: pair of light receiving elements
  • 21: on-chip micro lens

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Computing Systems (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Focusing (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides a focus degree detection device including: an imaging optical system that captures image light from a subject; a pixel array unit in which phase difference detection pixels for obtaining a first image and a second image, based on light beams that have respectively passed through a first pupil region and a second pupil region of the imaging optical system, are arranged in a two-dimensional array; and a detection unit that detects, for each of a plurality of images captured at different in-focus distances, phase difference detection pixels for which the luminance difference between the first image and the second image is equal to or less than a predetermined value.
PCT/JP2019/031620 2018-09-28 2019-08-09 Focus degree detection device, depth map generation device, and electronic device Ceased WO2020066341A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020217007718A KR20210065939A (ko) 2018-09-28 2019-08-09 Focus degree detection device, depth map generation device, and electronic device
DE112019004845.7T DE112019004845T5 (de) 2018-09-28 2019-08-09 Focus degree detection device, depth map generation device, and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018184005 2018-09-28
JP2018-184005 2018-09-28

Publications (1)

Publication Number Publication Date
WO2020066341A1 true WO2020066341A1 (fr) 2020-04-02

Family

ID=69950435

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/031620 2018-09-28 2019-08-09 Focus degree detection device, depth map generation device, and electronic device Ceased WO2020066341A1 (fr)

Country Status (3)

Country Link
KR (1) KR20210065939A (fr)
DE (1) DE112019004845T5 (fr)
WO (1) WO2020066341A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023026372A (ja) * 2021-08-11 2023-02-24 ウープティックス ソシエダ リミターダ 波面の空間分布に関する情報を抽出するためのシステムおよび方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014016670A (ja) * 2012-07-05 2014-01-30 Nikon Corp 画像処理装置及び画像処理プログラム
JP2015164284A (ja) * 2014-01-28 2015-09-10 キヤノン株式会社 固体撮像素子、動き情報取得装置、および撮像装置
JP2016192621A (ja) * 2015-03-31 2016-11-10 キヤノン株式会社 撮像装置、画像処理演算装置、及び画像処理方法

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102299575B1 (ko) 2015-03-09 2021-09-07 삼성전자주식회사 위상 검출 픽셀들로부터 깊이 맵을 생성할 수 있는 이미지 신호 프로세서와 이를 포함하는 장치


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023026372A (ja) * 2021-08-11 2023-02-24 ウープティックス ソシエダ リミターダ 波面の空間分布に関する情報を抽出するためのシステムおよび方法
JP7549629B2 (ja) 2021-08-11 2024-09-11 ウープティックス ソシエダ リミターダ 波面の空間分布に関する情報を抽出するためのシステムおよび方法
US12108150B2 (en) 2021-08-11 2024-10-01 Wooptix S.L. System and method for extracting information on the spatial distribution of wavefronts

Also Published As

Publication number Publication date
DE112019004845T5 (de) 2021-07-15
KR20210065939A (ko) 2021-06-04

Similar Documents

Publication Publication Date Title
US11856291B2 (en) Thin multi-aperture imaging system with auto-focus and methods for using same
US10999544B2 (en) Image sensor including phase detection pixels and image pickup device
JP6555264B2 Compound-eye imaging device
JP5657184B2 Imaging device and signal processing method
JP2017158018A Image processing apparatus, control method therefor, and imaging apparatus
JP6594048B2 Imaging apparatus and control method therefor
WO2020066341A1 Focus degree detection device, depth map generation device, and electronic device
JP7378935B2 Image processing device
JP2016133595A Control device, imaging device, control method, program, and storage medium
US20240163581A1 Imaging element and imaging device
JP6891470B2 Imaging device
JP2016009024A Focus detection device, control method therefor, and imaging device
JP2019041300A Imaging device
JP2016024402A Focus adjustment device, focus adjustment method and program, and imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19866473

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 19866473

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP