
WO2021093528A1 - Focusing method and apparatus, and electronic device and computer readable storage medium - Google Patents


Info

Publication number
WO2021093528A1
WO2021093528A1 · PCT/CN2020/122301 · CN2020122301W
Authority
WO
WIPO (PCT)
Prior art keywords
phase difference
difference value
target
pixel
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2020/122301
Other languages
French (fr)
Chinese (zh)
Inventor
贾玉虎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of WO2021093528A1
Anticipated expiration legal-status: Critical
Current legal status: Ceased


Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 — Control of cameras or camera modules
    • H04N23/67 — Focus control based on electronic image sensor signals
    • H04N23/672 — Focus control based on electronic image sensor signals based on the phase difference signals

Definitions

  • This application relates to the field of image processing technology, and in particular to a focusing method and device, electronic equipment, and computer-readable storage medium.
  • Traditional focusing methods include Phase Detection Auto Focus (PDAF), which acquires a phase difference value for focusing.
  • However, the traditional focusing method has the problem of inaccurate focusing.
  • a focusing method, device, electronic device, and computer-readable storage medium are provided.
  • A focusing method, applied to an electronic device including a gyroscope, includes:
  • acquiring a phase difference value during shooting, where the phase difference value includes a phase difference value in a first direction and a phase difference value in a second direction, and the first direction and the second direction form a preset angle;
  • acquiring gyroscope data through the gyroscope;
  • determining a target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data; and
  • performing focusing based on the target phase difference value.
  • a focusing device applied to electronic equipment including a gyroscope including:
  • a phase difference value acquisition module, used to acquire a phase difference value during shooting, where the phase difference value includes a phase difference value in a first direction and a phase difference value in a second direction; the first direction and the second direction form a preset angle;
  • a gyroscope data acquisition module configured to acquire gyroscope data through the gyroscope
  • a target phase difference value determining module configured to determine a target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data;
  • the focusing module is used for focusing based on the target phase difference value.
  • An electronic device includes a memory and a processor, with a computer program stored in the memory; when the computer program is executed by the processor, the processor performs the operations of the above-mentioned focusing method.
  • A computer-readable storage medium has a computer program stored thereon; when the computer program is executed by a processor, the operations of the above-mentioned method are performed.
  • The above-mentioned focusing method and device, electronic equipment, and computer-readable storage medium acquire a phase difference value during shooting, where the phase difference value includes a phase difference value in a first direction and a phase difference value in a second direction, and the first direction and the second direction form a preset angle; gyroscope data are obtained through the gyroscope; according to the gyroscope data, the moving direction of the electronic device can be determined, so that a more accurate target phase difference value can be determined from the phase difference value in the first direction and the phase difference value in the second direction; and based on the target phase difference value, focusing can be performed more accurately.
  • Figure 1 is a schematic diagram of the principle of phase detection autofocus.
  • FIG. 2 is a schematic diagram of a part of the structure of an image sensor in an embodiment.
  • FIG. 3 is a schematic diagram of the structure of pixels in an embodiment.
  • Fig. 4 is a schematic structural diagram of an imaging device in an embodiment.
  • Fig. 5 is a schematic diagram of a filter set on a pixel point group in an embodiment.
  • Fig. 6 is a flowchart of a focusing method in an embodiment.
  • FIG. 7 is a flowchart of determining the target phase difference value in an embodiment.
  • Fig. 8 is a flowchart of focusing in an embodiment.
  • Fig. 9 is a flowchart of determining the phase difference value in an embodiment.
  • Fig. 10 is a schematic diagram of a pixel point group in an embodiment.
  • Fig. 11 is a flowchart of a focusing method in another embodiment.
  • Fig. 12 is a structural block diagram of a focusing device in an embodiment.
  • Fig. 13 is a schematic diagram of the internal structure of an electronic device in an embodiment.
  • The terms "first", "second", etc. used in this application can be used herein to describe various elements, but these elements are not limited by these terms; the terms are only used to distinguish one element from another. For example, without departing from the scope of this application, the first direction may be referred to as the second direction, and the second direction may be referred to as the first direction. Both the first direction and the second direction are directions, but they are not the same direction.
  • FIG. 1 is a schematic diagram of the principle of phase detection auto focus (PDAF).
  • M1 is the position of the image sensor when the imaging device included in the electronic device is in the in-focus state, where the in-focus state refers to the state of successful focusing.
  • In the in-focus state, the imaging light g reflected by the object W toward the lens Lens in different directions converges on the image sensor; that is, the light reflected in different directions is imaged at the same position on the image sensor, and the image formed by the image sensor is sharp.
  • M2 and M3 are the possible positions of the image sensor when the imaging device is not in focus.
  • When the image sensor is at the M2 position, the imaging light g reflected by the object W toward the lens Lens in different directions is imaged at different positions. Referring to Figure 1, the light is imaged at position A and position B respectively.
  • When the image sensor is at the M3 position, the imaging light g reflected by the object W toward the lens Lens in different directions is imaged at position C and position D respectively. At this time, the image formed by the image sensor is blurred.
  • The difference in position between the images formed on the image sensor by imaging light entering the lens from different directions can then be obtained; for example, the difference between position A and position B, or the difference between position C and position D. From this position difference and the geometric relationship between the lens and the image sensor in the camera, the defocus distance is obtained.
  • the so-called defocus distance refers to the distance between the current position of the image sensor and the position where the image sensor should be in the in-focus state; the imaging device can focus according to the obtained defocus distance.
  • The larger the calculated phase difference (PD) value, the farther the sensor is from the in-focus position; the smaller the value, the closer it is to the in-focus position.
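The relationship described above — a signed position difference on the sensor mapping to a defocus estimate — can be sketched as follows. This is an illustrative sketch, not part of the disclosure; the function name and the conversion factor `k` (which stands in for the lens/sensor geometry and would come from calibration) are assumptions.

```python
# Sketch: map the on-sensor position difference of the two images
# (position A vs. position B in Figure 1) to a signed defocus estimate.
# `k` is a hypothetical geometry/calibration factor.
def defocus_from_position_difference(pos_a: float, pos_b: float, k: float = 0.5) -> float:
    """Sign indicates which side of the in-focus plane the sensor sits on;
    magnitude grows with the separation of the two images."""
    phase_difference = pos_b - pos_a
    return k * phase_difference
```

In the in-focus state the two images coincide and the estimate is zero; a larger separation yields a larger defocus magnitude, matching the description above.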
  • the present application provides an imaging assembly.
  • the imaging component includes an image sensor.
  • The image sensor may be a Complementary Metal Oxide Semiconductor (CMOS) image sensor, a Charge-coupled Device (CCD) image sensor, a quantum thin-film sensor, or an organic sensor.
  • Fig. 2 is a schematic diagram of a part of the image sensor in an embodiment.
  • the image sensor includes a plurality of pixel point groups Z arranged in an array, each pixel point group Z includes a plurality of pixel points D arranged in an array, and each pixel point D corresponds to a photosensitive unit.
  • Each pixel point group includes M*N pixel points, where both M and N are natural numbers greater than or equal to 2.
  • Each pixel point D includes a plurality of sub-pixel points d arranged in an array; that is, each photosensitive unit can be composed of a plurality of photosensitive elements arranged in an array, where a photosensitive element is an element that converts light signals into electrical signals. Referring to FIG. 2, each pixel point group Z includes 4 pixel points D arranged in a 2*2 array, and each pixel point D may include 4 sub-pixel points d arranged in a 2*2 array.
  • the four sub-pixel points d jointly cover a microlens W.
  • each pixel point D includes 2*2 photodiodes, and the 2*2 photodiodes are arranged correspondingly to the 4 sub-pixel points d arranged in a 2*2 array.
  • Each photodiode is used to receive optical signals and perform photoelectric conversion, thereby converting the optical signals into electrical signals for output.
  • The 4 sub-pixels d included in each pixel D are set corresponding to the same color filter, so each pixel D corresponds to one color channel, such as the red channel R, the green channel G, or the blue channel B.
  • By combining and outputting the signals of sub-pixel point 1 and sub-pixel point 2, and the signals of sub-pixel point 3 and sub-pixel point 4, two PD pixel pairs along the second direction are constructed, and from their phase values the PD value (phase difference value) of each pixel point D in the second direction can be determined.
  • By combining and outputting the signals of sub-pixel point 1 and sub-pixel point 3, and the signals of sub-pixel point 2 and sub-pixel point 4, two PD pixel pairs along the first direction (i.e., the horizontal direction) are constructed, and from their phase values the PD value (phase difference value) of each pixel point D in the first direction can be determined.
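The sub-pixel combinations described above can be sketched for one 2*2 pixel. This is an illustrative sketch; the row-major sub-pixel numbering (1 top-left, 2 top-right, 3 bottom-left, 4 bottom-right) is assumed from Figures 3 and 10.

```python
import numpy as np

# Sketch: build PD pixel pairs from one 2x2 sub-pixel block.
# Columns combined (1+3 vs. 2+4) give a left/right pair for the
# horizontal phase difference; rows combined (1+2 vs. 3+4) give a
# top/bottom pair for the vertical one.
def pd_pixel_pairs(sub):
    """sub: 2x2 array of sub-pixel brightness values for one pixel D."""
    sub = np.asarray(sub)
    left = sub[0, 0] + sub[1, 0]     # sub-pixels 1 + 3
    right = sub[0, 1] + sub[1, 1]    # sub-pixels 2 + 4
    top = sub[0, 0] + sub[0, 1]      # sub-pixels 1 + 2
    bottom = sub[1, 0] + sub[1, 1]   # sub-pixels 3 + 4
    return (left, right), (top, bottom)
```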
  • Fig. 4 is a schematic structural diagram of an imaging device in an embodiment.
  • the imaging device includes a lens 40, a filter 42 and an imaging component 44.
  • The lens 40, the filter 42, and the imaging component 44 are sequentially located on the incident light path; that is, the lens 40 is disposed above the filter 42, and the filter 42 is disposed above the imaging component 44.
  • the imaging component 44 includes the image sensor in FIG. 2.
  • the image sensor includes a plurality of pixel point groups Z arranged in an array.
  • Each pixel point group Z includes a plurality of pixel points D arranged in an array.
  • Each pixel point D corresponds to a photosensitive unit, and each photosensitive unit can be composed of a plurality of photosensitive elements arranged in an array.
  • In this embodiment, each pixel point D includes 4 sub-pixel points d arranged in a 2*2 array, and each sub-pixel point d corresponds to one photodiode 442; that is, the 2*2 photodiodes 442 are arranged in correspondence with the 4 sub-pixel points d arranged in the 2*2 array.
  • The 4 sub-pixel points d share one microlens.
  • the filter 42 may include three types of red, green, and blue, and can only transmit light of corresponding wavelengths of red, green, and blue, respectively.
  • the four sub-pixel points d included in one pixel point D are arranged corresponding to the filters of the same color.
  • the filter may also be white, which facilitates the passage of light in a larger spectrum (wavelength) range and increases the luminous flux passing through the white filter.
  • the lens 40 is used to receive incident light and transmit the incident light to the filter 42. After the filter 42 performs filtering processing on the incident light, the filtered light signal is projected onto the imaging component 44.
  • the photosensitive unit in the image sensor included in the imaging component 44 converts the light incident from the filter 42 into a charge signal through the photoelectric effect, and generates a pixel signal consistent with the charge signal, and finally outputs an image after a series of processing.
  • the pixels included in the image sensor and the pixels included in the image are two different concepts.
  • the pixels included in the image refer to the smallest component unit of the image, which is generally represented by a sequence of numbers.
  • the sequence of numbers can be referred to as the pixel value of a pixel.
  • The embodiments of the present application involve both concepts: "pixels included in an image sensor" and "pixels included in an image". To facilitate the reader's understanding, a brief explanation is provided here.
  • Fig. 5 is a schematic diagram of a filter set on a pixel point group in an embodiment.
  • The pixel point group Z includes 4 pixel points D arranged in an array of two rows and two columns. The color channel of the pixel point in the first row and first column is green, that is, its filter is a green filter; the color channel of the pixel point in the first row and second column is red, that is, its filter is a red filter; the color channel of the pixel point in the second row and first column is blue, that is, its filter is a blue filter; and the color channel of the pixel point in the second row and second column is green, that is, its filter is a green filter.
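The 2*2 filter layout above (a GRBG/Bayer-style pattern) can be written out as a small lookup table. This is an illustrative sketch; the function name is an assumption.

```python
# The 2x2 color filter layout described above, as a repeating pattern:
# row 1: green, red; row 2: blue, green.
BAYER_PATTERN = [
    ["G", "R"],
    ["B", "G"],
]

def color_channel(row: int, col: int) -> str:
    """Color channel of the pixel point at (row, col) in the repeating 2x2 group."""
    return BAYER_PATTERN[row % 2][col % 2]
```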
  • Fig. 6 is a flowchart of a focusing method in an embodiment. As shown in FIG. 6, the focusing method includes operations 602 to 608.
  • Operation 602: Obtain a phase difference value during shooting, where the phase difference value includes the phase difference value in the first direction and the phase difference value in the second direction; the first direction and the second direction form a preset angle.
  • When an image is captured by the imaging device of the electronic device, a phase difference value is acquired; the phase difference value includes a phase difference value in the first direction and a phase difference value in the second direction.
  • the first direction and the second direction may form a preset included angle, and the preset included angle may be any angle other than 0 degrees, 180 degrees, and 360 degrees.
  • the gyroscope data is obtained through the gyroscope.
  • A gyroscope is an angular-motion detection device that uses the angular momentum of a high-speed rotor to sense rotation about one or two axes orthogonal to the spin axis, relative to inertial space.
  • Gyroscopes include fiber optic gyroscopes, laser gyroscopes, and MEMS (Micro Electro Mechanical systems) gyroscopes.
  • the gyroscope data may include angular velocity data, movement direction, and so on.
  • the gyroscope includes X-axis, Y-axis, and Z-axis, and can detect the gyroscope data of the X-axis, the gyroscope data of the Y-axis, and the gyroscope data of the Z-axis respectively.
  • The gyroscope data of the X-axis may represent left-right movement in the horizontal plane, and the gyroscope data of the Y-axis may represent back-forward movement in the horizontal plane. Therefore, the horizontal movement of the electronic device can be determined based on the gyroscope data of the X-axis and the Y-axis.
  • For example, according to the X-axis and Y-axis gyroscope data, it can be determined that the electronic device moves horizontally to the left, moves horizontally forward, or moves horizontally forward and to the right.
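The axis-to-direction mapping above can be sketched as follows. This is an illustrative sketch; the sign conventions, the threshold `eps`, and the function name are assumptions, not part of the disclosure.

```python
# Sketch: derive a coarse horizontal movement direction from X- and Y-axis
# gyroscope readings (X: left/right, Y: back/forward, as described above).
def movement_direction(gyro_x: float, gyro_y: float, eps: float = 1e-3) -> str:
    parts = []
    if gyro_y > eps:
        parts.append("forward")
    elif gyro_y < -eps:
        parts.append("backward")
    if gyro_x > eps:
        parts.append("right")
    elif gyro_x < -eps:
        parts.append("left")
    return "-".join(parts) if parts else "stationary"
```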
  • the target phase difference value is determined from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data.
  • When the moving direction in the gyroscope data is the horizontal direction, especially when the object to be photographed contains a vertical texture, the imaging of the pixel pairs arranged in the horizontal direction (left-right) is blurred, so the phase difference value obtained in the horizontal direction is inaccurate, which affects the accuracy of focusing.
  • Likewise, when the moving direction in the gyroscope data is the vertical direction, especially when the object to be photographed contains a horizontal texture, the imaging of the pixel pairs arranged in the vertical direction (up-down) is blurred, so the phase difference value obtained in the vertical direction is inaccurate, which affects the accuracy of focusing.
  • an accurate target phase difference value can be determined from the phase difference value in the first direction and the phase difference value in the second direction.
  • For example, suppose the first direction is the horizontal direction and the second direction is the vertical direction.
  • When the electronic device moves horizontally, the phase difference value in the second direction can be determined as the target phase difference value; when the electronic device moves vertically, the phase difference value in the first direction can be determined as the target phase difference value.
  • focusing is performed based on the target phase difference value.
  • Focusing refers to the process of driving the lens through a motor to change the object distance and image distance, so that the image of the object becomes sharp.
  • In this way, the phase difference value is acquired during shooting, including the phase difference value in the first direction and the phase difference value in the second direction, where the two directions form a preset angle; gyroscope data are obtained through the gyroscope; according to the gyroscope data, the moving direction of the electronic device can be determined, so that a more accurate target phase difference value can be determined from the two phase difference values; and based on the target phase difference value, focusing can be performed more accurately.
  • Determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data includes: determining the moving direction of the electronic device according to the gyroscope data; and determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the moving direction.
  • the direction in the gyroscope data may be used as the moving direction of the electronic device. For example, if the direction in the gyroscope data is moving horizontally to the left, the moving direction of the electronic device is moving horizontally to the left.
  • The directions in the gyroscope data can also be weighted to obtain the moving direction of the electronic device: the weighting factors of the X-axis, Y-axis, and Z-axis are obtained, and the moving direction of the electronic device is determined according to the gyroscope data of each axis and its corresponding weighting factor.
  • the gyroscope data of the X-axis is 50cm, which means it moves 50cm horizontally to the right;
  • the gyroscope data of the Y-axis is 40cm, which means it moves forward 40cm horizontally;
  • the gyroscope data of the Z-axis is 0;
  • the weighting factor of the X-axis is 1, the weighting factor of the Y-axis is 1.25, and the weighting factor of the Z-axis is 1.5.
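The worked example above can be sketched as follows. The weighting factors are those quoted in the text; the decision rule (largest weighted magnitude wins) and the function names are assumptions for illustration only.

```python
# Sketch: scale each axis reading by its weighting factor, then pick the
# dominant movement component. Weights are the values quoted above.
WEIGHTS = {"x": 1.0, "y": 1.25, "z": 1.5}

def weighted_movement(gyro: dict) -> dict:
    """Apply the per-axis weighting factors to raw gyroscope readings."""
    return {axis: gyro[axis] * WEIGHTS[axis] for axis in gyro}

def dominant_axis(gyro: dict) -> str:
    """Axis with the largest weighted magnitude (assumed decision rule)."""
    weighted = weighted_movement(gyro)
    return max(weighted, key=lambda a: abs(weighted[a]))
```

With the example readings (X moves 50 cm right, Y moves 40 cm forward, Z is 0), the weighted X and Y components come out equal at 50, illustrating how the weights rebalance the raw readings.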
  • the moving direction of the electronic device may also be determined according to other methods, and the specific method may be set according to the needs of the user, and is not limited to this.
  • The electronic device contains an image sensor, and the image sensor contains pixel pairs. When the moving direction of the electronic device is determined more accurately, the moving direction relative to each pixel pair is also known more accurately, so the pixel pairs whose imaging is blurred can be identified, the phase difference values corresponding to those blurred pixel pairs can be discarded, and a more accurate target phase difference value can be determined.
  • determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the moving direction of the electronic device includes:
  • When the electronic device moves in the first direction, the imaging of the pixel pairs arranged in the first direction in the image sensor is blurred, so the phase difference value obtained by the pixel pairs arranged in the first direction is inaccurate. The imaging of the pixel pairs arranged in the second direction is clear, and the phase difference value they obtain is accurate; therefore, the phase difference value in the second direction is determined as the target phase difference value.
  • For example, suppose the first direction is the horizontal direction and the second direction is the vertical direction. When the electronic device moves horizontally, the pixel pairs arranged in the horizontal direction (left-right) in the image sensor produce blurred imaging, while the pixel pairs arranged in the vertical direction (up-down) produce sharp imaging. That is, the phase difference value obtained by the horizontally arranged pixel pairs is inaccurate, while that obtained by the vertically arranged pixel pairs is accurate; therefore, the phase difference value in the vertical direction is determined as the target phase difference value.
  • When the electronic device moves in the second direction, the imaging of the pixel pairs arranged in the second direction in the image sensor is blurred, so the phase difference value obtained by the pixel pairs arranged in the second direction is inaccurate. The imaging of the pixel pairs arranged in the first direction is clear, and the phase difference value they obtain is accurate; therefore, the phase difference value in the first direction is determined as the target phase difference value.
  • For example, suppose the first direction is the vertical direction and the second direction is the horizontal direction. When the electronic device moves vertically, the pixel pairs arranged in the vertical direction (up-down) in the image sensor produce blurred imaging, while the pixel pairs arranged in the horizontal direction (left-right) produce sharp imaging. That is, the phase difference value obtained by the vertically arranged pixel pairs is inaccurate, while that obtained by the horizontally arranged pixel pairs is accurate; therefore, the phase difference value in the horizontal direction is determined as the target phase difference value.
  • In this way, an accurate phase difference value is selected from the phase difference value in the first direction and the phase difference value in the second direction as the target phase difference value, which avoids the inaccurate focusing caused by focusing on the basis of an inaccurate phase difference value, and thus improves the accuracy of focusing.
  • performing focusing based on the target phase difference value includes:
  • a defocus distance value is determined according to the target phase difference value.
  • the corresponding relationship between the target phase difference value and the defocus distance value can be obtained through calibration.
  • defocus = PD * slope(DCC), where the DCC (Defocus Conversion Coefficient) is obtained by calibration and PD is the target phase difference value.
  • The calibration process for the correspondence between the target phase difference value and the defocus distance value includes: dividing the effective focus stroke of the camera module into 10 equal parts, i.e., (near-focus DAC - far-focus DAC)/10, so as to cover the motor's focus range; focusing at each focus DAC position (the DAC value may range from 0 to 1023) and recording the phase difference at the current focus DAC position; after completing the motor focus stroke, taking the ratio of each of the 10 focus DAC values to its recorded PD value, which yields 10 similar ratios K; and fitting the two-dimensional data composed of DAC and PD to obtain a straight line with slope K.
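The calibration described above can be sketched with synthetic data: sample the focus stroke at evenly spaced DAC positions, record a PD value at each, and fit a straight line whose slope plays the role of the DCC in defocus = PD * slope(DCC). This is an illustrative sketch; a real calibration uses measured PD values, and the function names are assumptions.

```python
import numpy as np

# Sketch: least-squares fit of DAC position as a function of PD value,
# yielding the conversion slope (the DCC) used in defocus = PD * slope.
def fit_dcc(dac_positions, pd_values):
    slope, _intercept = np.polyfit(pd_values, dac_positions, 1)
    return slope

def defocus_from_pd(pd: float, slope: float) -> float:
    return pd * slope
```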
  • the lens is controlled to move to focus according to the defocus distance value.
  • The defocus distance value and the moving direction are determined according to the target phase difference value; the lens movement can then be controlled according to the defocus distance value and the moving direction to focus more accurately.
  • In an embodiment, determining the defocus distance value according to the target phase difference value includes: obtaining the confidence level of the target phase difference value; and, when the confidence level is greater than a confidence threshold, determining the corresponding defocus distance value from the correspondence between phase difference values and defocus distance values according to the target phase difference value.
  • The confidence level of the target phase difference value refers to its credibility. The higher the confidence, the more reliable, i.e., the more accurate, the target phase difference value; the lower the confidence, the less reliable, i.e., the less accurate, the target phase difference value.
  • When the confidence level is greater than the confidence threshold, the corresponding defocus distance value is determined from the correspondence between phase difference values and defocus distance values according to the target phase difference value.
  • When the confidence level is less than or equal to the confidence threshold, the confidence of the target phase difference value is low and the target phase difference value can be considered inaccurate; in that case, the method can return to the operation of obtaining the phase difference value during shooting and reacquire the phase difference in the first direction and the phase difference in the second direction.
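The confidence gate described above can be sketched as follows. This is an illustrative sketch; the function name, the linear PD-to-defocus lookup, and the threshold value are assumptions.

```python
# Sketch: only convert the target phase difference to a defocus value when
# its confidence clears the threshold; otherwise signal re-acquisition.
def defocus_if_confident(pd: float, confidence: float, slope: float,
                         threshold: float = 0.5):
    """Return a defocus value when confident, or None to signal that the
    phase difference should be re-acquired."""
    if confidence > threshold:
        return pd * slope
    return None
```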
  • For example, to calculate the phase difference value at a line coordinate x in the image, the brightness values of the 5 pixels x-2, x-1, x, x+1, x+2 in the left image are taken and slid over the right image, with the shift ranging from -10 to +10; the shift at which the similarity match is best gives the phase difference value.
  • For the upper and lower images, the brightness values of a column of pixels in the upper image can be compared with the brightness values of the same number of pixels in the lower image; the process of obtaining the confidence for the upper and lower images is similar to that for the left and right images, and is not repeated here.
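The sliding-window matching above can be sketched as follows. The text does not fix a similarity measure, so the sum of absolute differences (SAD) used here is one plausible choice, and the function name and parameters are assumptions.

```python
import numpy as np

# Sketch: slide a 5-pixel brightness window around coordinate x from one
# image over the other across shifts of -10..+10; the shift with the
# smallest sum of absolute differences is taken as the phase difference.
def phase_shift(left: np.ndarray, right: np.ndarray, x: int,
                half_window: int = 2, max_shift: int = 10) -> int:
    window = left[x - half_window: x + half_window + 1]
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        lo = x - half_window + s
        hi = x + half_window + 1 + s
        if lo < 0 or hi > len(right):
            continue  # candidate window falls outside the image
        cost = np.abs(window - right[lo:hi]).sum()
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift
```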
  • When the confidence of the target phase difference value is greater than the confidence threshold, the target phase difference value is accurate, and the defocus distance value determined from the correspondence between phase difference values and defocus distance values according to the target phase difference value is also accurate, so the accuracy of focusing can be improved.
  • the above method further includes: acquiring a first image; performing subject detection on the first image to obtain a region of interest.
  • Obtaining the phase difference value during shooting includes: obtaining the phase difference value in the region of interest during shooting.
  • Focusing based on the target phase difference value includes: focusing in the region of interest based on the target phase difference value to obtain the second image.
  • Subject detection refers to automatically processing the region of interest when facing a scene while selectively ignoring the regions that are not of interest.
  • the area of interest is called the subject area.
  • the visible light image refers to an RGB (Red, Green, Blue) image.
  • RGB Red, Green, Blue
  • a color image can be obtained by shooting any scene with a color camera, that is, an RGB image.
  • the visible light image may be stored locally by the electronic device, may also be stored by other devices, may also be stored on the network, or may be captured by the electronic device in real time, but is not limited to this.
  • the phase difference value of the region of interest in the first image is acquired.
  • The phase difference value of the region of interest also includes a phase difference value in the first direction and a phase difference value in the second direction, where the first direction and the second direction form the preset angle. Gyroscope data are obtained through the gyroscope; the target phase difference value is determined from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data; and focusing is performed in the region of interest based on the target phase difference value to obtain the second image.
  • The above focusing method, combined with subject detection, acquires the region of interest and performs focusing within it, which avoids focusing on the background area and improves the accuracy of focusing.
  • In an embodiment, the electronic device includes an image sensor, the image sensor includes a plurality of pixel point groups arranged in an array, and each pixel point group includes M*N pixel points arranged in an array; each pixel point corresponds to a photosensitive unit, where both M and N are natural numbers greater than or equal to 2.
  • obtaining the phase difference value includes:
  • a target brightness map is obtained according to the brightness values of the pixel points included in each pixel point group.
  • the brightness value of the pixel of the image sensor can be characterized by the brightness value of the sub-pixel included in the pixel.
  • the imaging device may obtain the target brightness map according to the brightness values of the sub-pixel points in the pixel points included in each pixel point group.
  • the brightness value of a sub-pixel point refers to the brightness value of the light signal received by the photosensitive element corresponding to the sub-pixel point.
  • each sub-pixel included in the image sensor is a photosensitive element that can convert light signals into electrical signals. The intensity of the light signal received by a sub-pixel can therefore be obtained from the electrical signal output by that sub-pixel, and the brightness value of the sub-pixel can be obtained from the intensity of the light signal it received.
  • the target brightness map in the embodiment of the present application is used to reflect the brightness value of the sub-pixels in the image sensor.
  • the target brightness map may include multiple pixels, where the pixel value of each pixel in the target brightness map is obtained from the brightness values of the sub-pixels in the image sensor.
  • Operation 904: perform segmentation processing on the target brightness map to obtain a first segmented brightness map and a second segmented brightness map, and determine the phase difference value of mutually matched pixels based on the position difference of the pixels that match each other in the first segmented brightness map and the second segmented brightness map.
  • the imaging device may perform segmentation processing on the target brightness map along the column direction (the y-axis direction in the image coordinate system); in the process of segmenting the target brightness map along the column direction, each dividing line of the segmentation is perpendicular to the column direction.
  • the imaging device may perform segmentation processing on the target brightness map along the row direction (the x-axis direction in the image coordinate system); in the process of segmenting the target brightness map along the row direction, each dividing line of the segmentation is perpendicular to the row direction.
  • the first segmented brightness map and the second segmented brightness map obtained after the target brightness map is segmented along the column direction can be referred to as the upper image and the lower image, respectively.
  • the first segmented brightness map and the second segmented brightness map obtained after the target brightness map is segmented along the row direction can be called the left image and the right image, respectively.
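The two segmentation schemes described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the 4x4 brightness map and its values are invented for the example.

```python
# Illustrative 4x4 target brightness map (values are made up).
brightness = [
    [10, 11, 12, 13],
    [20, 21, 22, 23],
    [30, 31, 32, 33],
    [40, 41, 42, 43],
]

def split_columns(m):
    """Split along the column direction: the dividing line is
    perpendicular to the columns, giving an upper and a lower image."""
    half = len(m) // 2
    return m[:half], m[half:]

def split_rows(m):
    """Split along the row direction: the dividing line is perpendicular
    to the rows, giving a left and a right image."""
    half = len(m[0]) // 2
    return [r[:half] for r in m], [r[half:] for r in m]

upper, lower = split_columns(brightness)
left, right = split_rows(brightness)
print(upper)  # [[10, 11, 12, 13], [20, 21, 22, 23]]
print(left)   # [[10, 11], [20, 21], [30, 31], [40, 41]]
```

Splitting in both directions is what allows a phase difference value to be computed along both the first and the second direction.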
  • "pixels that match each other" means that the pixel matrices formed by each pixel together with its surrounding pixels are similar to each other.
  • the pixel a and its surrounding pixels in the first segmented brightness map form a pixel matrix with 3 rows and 3 columns, and the pixel value of the pixel matrix is:
  • the pixel b and the surrounding pixels in the second segmented brightness map also form a pixel matrix with 3 rows and 3 columns, and the pixel value of the pixel matrix is:
  • the two matrices are similar, and it can be considered that the pixel a and the pixel b match each other.
  • the difference between the pixel values of each pair of corresponding pixels in the two pixel matrices can be calculated, the absolute values of these differences summed, and the sum used to determine whether the matrices are similar: if the sum is less than a preset threshold, the pixel matrices are considered similar; otherwise, they are considered dissimilar.
  • for example, calculate the difference between 1 and 2, between 15 and 15, between 70 and 70, and so on, and then sum the absolute values of these differences; here the sum is 3. If this sum of 3 is less than the preset threshold, the two pixel matrices with 3 rows and 3 columns are considered similar.
  • Another way to judge whether pixel matrices are similar is to extract edge features using a Sobel convolution kernel or a Laplacian operator, and judge similarity by the edge features.
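The sum-of-absolute-differences check described above can be sketched as follows. The neighbourhood values reuse the worked example (whose sum is 3); the threshold of 10 is an illustrative assumption, not a value from the patent.

```python
def sad(matrix_a, matrix_b):
    """Sum of absolute differences between two equally sized matrices."""
    return sum(
        abs(a - b)
        for row_a, row_b in zip(matrix_a, matrix_b)
        for a, b in zip(row_a, row_b)
    )

def matrices_match(matrix_a, matrix_b, threshold=10):
    """Two pixel matrices are considered similar (i.e. their centre
    pixels match each other) if their SAD is below the threshold."""
    return sad(matrix_a, matrix_b) < threshold

# 3x3 neighbourhood of pixel a in the first segmented brightness map
# and of pixel b in the second segmented brightness map.
around_a = [[1, 15, 70], [35, 60, 255], [60, 80, 120]]
around_b = [[2, 15, 70], [36, 60, 255], [60, 81, 120]]

print(sad(around_a, around_b))             # 3
print(matrices_match(around_a, around_b))  # True
```

Once two pixels are matched this way, their position difference across the two segmented brightness maps yields the phase difference value.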
  • the positional difference of pixels that match each other refers to the difference between the position of one pixel in the first segmented brightness map and the position of its matching pixel in the second segmented brightness map. As in the above example, the position difference between the mutually matched pixel a and pixel b is the difference between the position of pixel a in the first segmented brightness map and the position of pixel b in the second segmented brightness map.
  • pixels that match each other correspond to the different images formed on the image sensor by imaging light entering the lens from different directions.
  • the pixel a in the first split brightness map and the pixel b in the second split brightness map match each other, where the pixel a may correspond to the image formed at position A in FIG. 1, and the pixel b may correspond to The image formed at position B in Figure 1.
  • the phase difference of the matched pixels can be determined according to the position difference of the matched pixels.
  • phase difference value in the first direction and the phase difference value in the second direction are determined according to the phase difference values of the pixels that match each other.
  • the phase difference value in the first direction can be determined according to the phase difference of the pixel a and the pixel b that are matched with each other.
  • the second segmented brightness map includes the odd-numbered columns.
  • pixel a in the first segmented brightness map and pixel b in the second segmented brightness map match each other; based on the phase difference between the matched pixel a and pixel b, the phase difference value in the second direction can be determined.
  • the target brightness map is obtained from the brightness values of the pixel points in the above pixel point groups.
  • the phase difference values of the matching pixels can be quickly determined.
  • the rich phase difference values can improve the accuracy of the phase difference and improve the accuracy and stability of focusing.
  • each pixel point includes a plurality of sub-pixel points arranged in an array
  • obtaining the target brightness map according to the brightness values of the pixel points included in each pixel point group includes: for each pixel point group, obtaining a sub-brightness map corresponding to the pixel point group according to the brightness values of the sub-pixels at the same position in each pixel of the group; and generating the target brightness map from the sub-brightness maps corresponding to all the pixel point groups.
  • the sub-pixel points at the same position of each pixel point refer to the sub-pixel points that are arranged in the same position in each pixel point.
  • FIG. 10 is a schematic diagram of a pixel point group in an embodiment.
  • the pixel point group includes 4 pixels arranged in an array of two rows and two columns; the 4 pixels are the D1 pixel, D2 pixel, D3 pixel, and D4 pixel. Each pixel includes 4 sub-pixels arranged in an array of two rows and two columns; the sub-pixels are d11, d12, d13, d14, d21, d22, d23, d24, d31, d32, d33, d34, d41, d42, d43, and d44.
  • the sub-pixels d11, d21, d31, and d41 are arranged in the same position in each pixel, and they are all in the first row and first column.
  • the sub-pixels d12, d22, d32, and d42 are arranged in the same position in each pixel, all in the first row and second column; the sub-pixels d13, d23, d33, and d43 are arranged in the same position in each pixel, all in the second row and first column; and the sub-pixels d14, d24, d34, and d44 are arranged in the same position in each pixel, all in the second row and second column.
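The gathering of same-position sub-pixels into a sub-brightness map can be sketched as follows. The names follow FIG. 10, but the brightness values and the data structure are invented for the example.

```python
# Each pixel of the 2*2 group is a 2x2 grid of sub-pixel brightness
# values (illustrative numbers only).
pixel_group = {
    "D1": [[11, 12], [13, 14]],  # d11 d12 / d13 d14
    "D2": [[21, 22], [23, 24]],  # d21 d22 / d23 d24
    "D3": [[31, 32], [33, 34]],  # d31 d32 / d33 d34
    "D4": [[41, 42], [43, 44]],  # d41 d42 / d43 d44
}

def sub_brightness_map(group, row, col):
    """Collect the sub-pixel at (row, col) of every pixel in the group,
    keeping the pixels' own 2x2 arrangement (D1 D2 / D3 D4)."""
    return [
        [group["D1"][row][col], group["D2"][row][col]],
        [group["D3"][row][col], group["D4"][row][col]],
    ]

# Sub-pixels d11, d21, d31, d41 (first row, first column of each pixel):
print(sub_brightness_map(pixel_group, 0, 0))  # [[11, 21], [31, 41]]
```

Repeating this for all four sub-pixel positions and all pixel point groups yields the sub-brightness maps from which the target brightness map is generated.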
  • Operation 1102: use a 2*2 PDAF image sensor to capture light information; specifically, obtain a target brightness map according to the brightness value of each pixel point included in each pixel point group on the image sensor, and perform segmentation processing on the target brightness map to obtain the first segmented brightness map and the second segmented brightness map. Operation 1104: determine the phase difference value of mutually matched pixels according to the position difference of the pixels that match each other in the first and second segmented brightness maps, and determine the phase difference value in the first direction and the phase difference value in the second direction according to the phase difference values of the mutually matched pixels.
  • the operations in FIGS. 6-9 may include multiple sub-operations or multiple stages. These sub-operations or stages are not necessarily executed at the same time but may be executed at different times, and their execution order is not necessarily sequential; they may be executed in turn or alternately with at least part of the other operations, or of the sub-operations or stages of the other operations.
  • Fig. 12 is a structural block diagram of a focusing device according to an embodiment.
  • a focusing device 1200 is provided, which is applied to an electronic device including a gyroscope, and includes: a phase difference value acquisition module 1202, a gyroscope data acquisition module 1204, a target phase difference value determination module 1206, and a focus module 1208, of which:
  • the phase difference value obtaining module 1202 is used to obtain the phase difference value during shooting, the phase difference value includes the phase difference value in the first direction and the phase difference value in the second direction; the first direction and the second direction form a preset angle.
  • the gyroscope data acquisition module 1204 is used for acquiring gyroscope data through the gyroscope.
  • the target phase difference value determination module 1206 is configured to determine the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data.
  • the focusing module 1208 is used for focusing based on the target phase difference value.
  • the above focusing device acquires a phase difference value during shooting, the phase difference value including a phase difference value in a first direction and a phase difference value in a second direction, with the first direction and the second direction forming a preset angle; gyroscope data is obtained through the gyroscope; the moving direction of the electronic device can be determined from the gyroscope data, so that a more accurate target phase difference value can be determined from the phase difference value in the first direction and the phase difference value in the second direction; and focusing can be performed more accurately based on the target phase difference value.
  • the above-mentioned target phase difference value determination module 1206 is further used to determine the moving direction of the electronic device according to the gyroscope data, and to determine the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the moving direction of the electronic device.
  • the above-mentioned target phase difference value determination module 1206 is further configured to determine that the phase difference value in the second direction is the target phase difference value when the moving direction of the electronic device is the first direction, and to determine that the phase difference value in the first direction is the target phase difference value when the moving direction of the electronic device is the second direction.
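The selection rule described above can be sketched as follows. The direction labels and the function signature are assumptions for illustration; in the device, the moving direction would come from the gyroscope data.

```python
def select_target_phase_difference(moving_direction, pd_first, pd_second):
    """Pick the target phase difference value based on the device's
    moving direction: motion along one direction makes the phase
    difference along that same direction unreliable, so the value in
    the other direction is used."""
    if moving_direction == "first":   # e.g. horizontal motion
        return pd_second              # use the PD in the second direction
    if moving_direction == "second":  # e.g. vertical motion
        return pd_first               # use the PD in the first direction
    raise ValueError("unknown moving direction: %r" % moving_direction)

print(select_target_phase_difference("first", pd_first=0.8, pd_second=0.3))  # 0.3
```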
  • the above-mentioned focusing module 1208 is further configured to determine the defocus distance value and the moving direction according to the target phase difference value; and control the movement of the lens to focus according to the defocus distance value and the moving direction.
  • the above-mentioned focusing module 1208 is also used to obtain the confidence of the target phase difference value; when the confidence is greater than the confidence threshold, the corresponding defocus distance value is determined from the correspondence between phase difference values and defocus distance values according to the target phase difference value.
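A confidence-gated lookup of the defocus distance could look like the sketch below. The calibration table, the interpolation, and the threshold are invented for illustration; a real module would use per-sensor calibration data rather than these numbers.

```python
# Hypothetical calibrated (phase difference -> defocus distance) samples.
CALIBRATION = [(0.0, 0.0), (0.5, 10.0), (1.0, 20.0), (2.0, 40.0)]

def defocus_distance(target_pd, confidence, confidence_threshold=0.6):
    """Linearly interpolate the defocus distance for target_pd, but
    only when the phase difference is trusted enough."""
    if confidence <= confidence_threshold:
        return None  # PD unreliable; fall back to another focus method
    for (pd0, d0), (pd1, d1) in zip(CALIBRATION, CALIBRATION[1:]):
        if pd0 <= target_pd <= pd1:
            t = (target_pd - pd0) / (pd1 - pd0)
            return d0 + t * (d1 - d0)
    return CALIBRATION[-1][1]  # clamp beyond the calibrated range

print(defocus_distance(0.75, confidence=0.9))  # 15.0
```

The sign of the target phase difference value would additionally give the direction in which to move the lens.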
  • the above-mentioned focusing device 1200 further includes a subject detection module for acquiring a first image; subject detection is performed on the first image to obtain a region of interest.
  • Obtaining the phase difference value during shooting includes: obtaining the phase difference value in the region of interest during shooting.
  • Focusing based on the target phase difference value includes: focusing in the region of interest based on the target phase difference value to obtain the second image.
  • the electronic device includes an image sensor, and the image sensor includes a plurality of pixel point groups arranged in an array, and each pixel point group includes a plurality of pixel points arranged in an array; each pixel point corresponds to a photosensitive unit.
  • the above-mentioned phase difference value acquisition module 1202 is also used to acquire the target brightness map according to the brightness values of the pixels included in each pixel point group; perform segmentation processing on the target brightness map to obtain the first segmented brightness map and the second segmented brightness map; determine the phase difference value of mutually matched pixels according to the position difference of the matching pixels in the first and second segmented brightness maps; and determine the phase difference value in the first direction or the phase difference value in the second direction according to the phase difference values of the matching pixels.
  • the above-mentioned phase difference value acquisition module 1202 is further configured to, for each pixel point group, obtain the sub-brightness map corresponding to the pixel point group according to the brightness values of the sub-pixels at the same position in each pixel of the group, and to generate the target brightness map from the sub-brightness maps corresponding to all the pixel point groups.
  • the division of the modules in the above-mentioned focusing device is only used for illustration. In other embodiments, the focusing device can be divided into different modules as required to complete all or part of the functions of the above-mentioned focusing device.
  • Fig. 13 is a schematic diagram of the internal structure of an electronic device in an embodiment.
  • the electronic device includes a processor and a memory connected through a system bus.
  • the processor is used to provide computing and control capabilities to support the operation of the entire electronic device.
  • the memory may include a non-volatile storage medium and internal memory.
  • the non-volatile storage medium stores an operating system and a computer program.
  • the computer program can be executed by a processor to implement a focusing method provided in the following embodiments.
  • the internal memory provides a cached operating environment for the operating system and the computer program in the non-volatile storage medium.
  • the electronic device can be a mobile phone, a tablet computer, or a personal digital assistant or a wearable device.
  • each module in the focusing device provided in the embodiment of the present application may be in the form of a computer program.
  • the computer program can be run on a terminal or a server.
  • the program module composed of the computer program can be stored in the memory of the terminal or the server.
  • the embodiment of the present application also provides a computer-readable storage medium.
  • One or more non-volatile computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the operations of the focusing method.
  • a computer program product containing instructions that, when run on a computer, causes the computer to execute the focusing method.
  • Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM), which acts as external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Focusing (AREA)
  • Studio Devices (AREA)

Abstract

A focusing method and apparatus, and an electronic device and a computer readable storage medium. The focusing method comprises: obtaining a phase difference during photographing, the phase difference comprising a phase difference in a first direction and a phase difference in a second direction, and the first direction and the second direction forming a preset included angle; obtaining gyroscope data by means of a gyroscope; determining a target phase difference from the phase difference in the first direction and the phase difference in the second direction according to the gyroscope data; and focusing based on the target phase difference.

Description

Focusing method and device, electronic equipment, and computer-readable storage medium

Cross-reference to related applications

This application claims priority to the Chinese patent application filed with the Chinese Patent Office on November 12, 2019, with application number 201911101409.1 and invention title "Focusing method and device, electronic equipment, computer-readable storage medium", the entire contents of which are incorporated herein by reference.

Technical field

This application relates to the field of image processing technology, and in particular to a focusing method and device, electronic equipment, and a computer-readable storage medium.

Background

With the development of imaging technology, people are increasingly accustomed to capturing images or videos with image acquisition devices such as the cameras on electronic devices to record all kinds of information. While capturing images, the camera generally needs to focus on the object being photographed in order to obtain a clear image of it.

Traditional focusing methods include phase detection auto focus (PDAF), which acquires a phase difference value for focusing. However, traditional focusing methods suffer from the problem of inaccurate focusing.

Summary

Various embodiments of the present application provide a focusing method, a device, an electronic device, and a computer-readable storage medium.

A focusing method, applied to an electronic device including a gyroscope, includes:

acquiring a phase difference value during shooting, where the phase difference value includes a phase difference value in a first direction and a phase difference value in a second direction, and the first direction and the second direction form a preset angle;

obtaining gyroscope data through the gyroscope;

determining a target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data; and

focusing based on the target phase difference value.

A focusing device, applied to an electronic device including a gyroscope, includes:

a phase difference value acquisition module, configured to acquire a phase difference value during shooting, the phase difference value including a phase difference value in a first direction and a phase difference value in a second direction, the first direction and the second direction forming a preset angle;

a gyroscope data acquisition module, configured to obtain gyroscope data through the gyroscope;

a target phase difference value determination module, configured to determine a target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data; and

a focusing module, configured to focus based on the target phase difference value.

An electronic device includes a memory and a processor, the memory storing a computer program which, when executed by the processor, causes the processor to perform the operations of the above focusing method.

A computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the operations of the above method.

With the above focusing method and device, electronic device, and computer-readable storage medium, a phase difference value is acquired during shooting, the phase difference value including a phase difference value in a first direction and a phase difference value in a second direction, with the first direction and the second direction forming a preset angle; gyroscope data is obtained through the gyroscope; the moving direction of the electronic device can be determined from the gyroscope data, so that a more accurate target phase difference value can be determined from the phase difference value in the first direction and the phase difference value in the second direction; and focusing can be performed more accurately based on the target phase difference value.

Description of the drawings

In order to describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative work.

Figure 1 is a schematic diagram of the principle of phase detection auto focus.

Figure 2 is a schematic diagram of part of the structure of an image sensor in an embodiment.

Figure 3 is a schematic diagram of the structure of a pixel point in an embodiment.

Figure 4 is a schematic structural diagram of an imaging device in an embodiment.

Figure 5 is a schematic diagram of a filter arranged on a pixel point group in an embodiment.

Figure 6 is a flowchart of a focusing method in an embodiment.

Figure 7 is a flowchart of determining a target phase difference value in an embodiment.

Figure 8 is a flowchart of focusing in an embodiment.

Figure 9 is a flowchart of determining a phase difference value in an embodiment.

Figure 10 is a schematic diagram of a pixel point group in an embodiment.

Figure 11 is a flowchart of a focusing method in another embodiment.

Figure 12 is a structural block diagram of a focusing device in an embodiment.

Figure 13 is a schematic diagram of the internal structure of an electronic device in an embodiment.

Detailed description

In order to make the purpose, technical solutions, and advantages of this application clearer, the application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the application and are not intended to limit it.

It can be understood that the terms "first", "second", and so on used in this application may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, without departing from the scope of the present application, the first direction may be referred to as the second direction, and similarly, the second direction may be referred to as the first direction. The first direction and the second direction are both directions, but they are not the same direction.

Figure 1 is a schematic diagram of the principle of phase detection auto focus (PDAF). As shown in Figure 1, M1 is the position of the image sensor when the imaging device included in the electronic device is in the in-focus state, where the in-focus state refers to the state of successful focusing. When the image sensor is at position M1, the imaging light rays g reflected by the object W toward the lens in different directions converge on the image sensor; that is, the imaging light rays g reflected by the object W toward the lens in different directions form an image at the same position on the image sensor. At this time, the image formed on the image sensor is clear.

M2 and M3 are positions where the image sensor may be located when the imaging device is not in focus. As shown in Figure 1, when the image sensor is at position M2 or M3, the imaging light rays g reflected by the object W toward the lens in different directions form images at different positions. Referring to Figure 1, when the image sensor is at position M2, the imaging light rays g reflected by the object W toward the lens in different directions form images at position A and position B respectively; when the image sensor is at position M3, the imaging light rays g in different directions form images at position C and position D respectively. At this time, the image formed on the image sensor is not clear.

In PDAF technology, the difference in the positions of the images formed on the image sensor by imaging light entering the lens from different directions can be obtained; for example, as shown in Figure 1, the difference between position A and position B, or the difference between position C and position D, can be obtained. After this positional difference is obtained, the defocus distance can be derived from the difference and the geometric relationship between the lens and the image sensor in the camera. The defocus distance refers to the distance between the current position of the image sensor and the position where the image sensor should be in the in-focus state. The imaging device can then focus according to the obtained defocus distance.

It can thus be seen that, when in focus, the calculated PD value (phase difference value) is 0; conversely, the larger the calculated value, the farther from the in-focus position, and the smaller the value, the closer to it. When focusing with PDAF, the PD value is calculated, the defocus distance is obtained from the calibrated correspondence between PD values and defocus distances, and the lens is then moved to the in-focus position according to the defocus distance, thereby achieving focusing.

In one embodiment, the present application provides an imaging assembly. The imaging assembly includes an image sensor. The image sensor may be a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD), a quantum thin-film sensor, an organic sensor, or the like.

Figure 2 is a schematic diagram of part of an image sensor in an embodiment. The image sensor includes a plurality of pixel point groups Z arranged in an array; each pixel point group Z includes a plurality of pixel points D arranged in an array, and each pixel point D corresponds to a photosensitive unit. The plurality of pixel points includes M*N pixel points, where both M and N are natural numbers greater than or equal to 2. Each pixel point D includes a plurality of sub-pixel points d arranged in an array; that is, each photosensitive unit can be composed of a plurality of photosensitive elements arranged in an array, where a photosensitive element is an element that can convert light signals into electrical signals. Referring to Figure 3, the plurality of sub-pixel points d arranged in an array in each pixel point D jointly cover one microlens W. In one embodiment, the photosensitive element may be a photodiode. In this embodiment, each pixel point group Z includes 4 pixel points D arranged in a 2*2 array, and each pixel point D may include 4 sub-pixel points d arranged in a 2*2 array; the 4 sub-pixel points d jointly cover one microlens W. Each pixel point D includes 2*2 photodiodes, arranged in correspondence with the 4 sub-pixel points d of the 2*2 array. Each photodiode receives light signals and performs photoelectric conversion, converting the light signals into electrical signals for output. The 4 sub-pixel points d included in each pixel point D are set to correspond to a filter of the same color, so each pixel point D corresponds to one color channel, such as a red channel R, a green channel G, or a blue channel B.

As shown in FIG. 3, taking a pixel point D that includes sub-pixel point 1, sub-pixel point 2, sub-pixel point 3, and sub-pixel point 4 as an example, the signals of sub-pixel points 1 and 2 may be combined for output and the signals of sub-pixel points 3 and 4 combined for output, thereby constructing two PD pixel pairs along the second direction (the vertical direction); from the phase values of these two PD pixel pairs, the PD value (phase difference value) of the sub-pixel points in pixel point D along the second direction can be determined. Likewise, the signals of sub-pixel points 1 and 3 may be combined for output and the signals of sub-pixel points 2 and 4 combined for output, thereby constructing two PD pixel pairs along the first direction (the horizontal direction); from the phase values of these two PD pixel pairs, the PD value (phase difference value) of the sub-pixel points in pixel point D along the first direction can be determined.
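The sub-pixel combination described above can be sketched as follows. This is a minimal illustrative sketch, not taken from the patent itself; the function name and the representation of signals as plain numbers are assumptions.

```python
# Minimal sketch of combining the four sub-pixel signals of one pixel
# point D into phase-detection (PD) pixel pairs. Assumed sub-pixel
# layout within the 2*2 array:
#     1 2
#     3 4

def pd_pairs(s1, s2, s3, s4):
    """Return ((left, right), (top, bottom)) PD pair signals.

    Combining 1+3 and 2+4 yields a pair split along the first
    (horizontal) direction; combining 1+2 and 3+4 yields a pair
    split along the second (vertical) direction.
    """
    left, right = s1 + s3, s2 + s4    # first-direction (horizontal) pair
    top, bottom = s1 + s2, s3 + s4    # second-direction (vertical) pair
    return (left, right), (top, bottom)
```

The phase difference along each direction would then be obtained by comparing the two signals of the corresponding pair across neighboring pixel points.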

FIG. 4 is a schematic structural diagram of an imaging device in one embodiment. As shown in FIG. 4, the imaging device includes a lens 40, a filter 42, and an imaging assembly 44, which are located in sequence on the incident light path; that is, the lens 40 is disposed above the filter 42, and the filter 42 is disposed on the imaging assembly 44.

The imaging assembly 44 includes the image sensor of FIG. 2. The image sensor includes a plurality of pixel point groups Z arranged in an array, each pixel point group Z includes a plurality of pixel points D arranged in an array, each pixel point D corresponds to one photosensitive unit, and each photosensitive unit may be composed of a plurality of photosensitive elements arranged in an array. In this embodiment, each pixel point D includes four sub-pixel points d arranged in a 2*2 array, and each sub-pixel point d corresponds to one photodiode 442; that is, the 2*2 photodiodes 442 are arranged in correspondence with the four sub-pixel points d of the 2*2 array. The four sub-pixel points d share one microlens.

The filter 42 may be one of three types, red, green, and blue, which transmit only light of the wavelengths corresponding to red, green, and blue respectively. The four sub-pixel points d included in one pixel point D are provided with filters of the same color. In other embodiments, the filter may also be white, which allows light over a wider spectral (wavelength) range to pass and increases the luminous flux through the white filter.

The lens 40 receives incident light and transmits it to the filter 42. After filtering the incident light, the filter 42 projects the filtered light signal onto the imaging assembly 44.

The photosensitive units in the image sensor of the imaging assembly 44 convert the light incident from the filter 42 into charge signals through the photoelectric effect and generate pixel signals consistent with the charge signals; after a series of processing, an image is finally output.

As can be seen from the above description, the pixel points included in the image sensor and the pixels included in an image are two different concepts: a pixel of an image is the smallest constituent unit of the image and is generally represented by a sequence of numbers, usually referred to as the pixel value of the pixel. The embodiments of the present application involve both concepts, "pixel points included in the image sensor" and "pixels included in an image", and this brief explanation is provided here for the reader's convenience.

FIG. 5 is a schematic diagram of filters disposed on a pixel point group in one embodiment. The pixel point group Z includes four pixel points D arranged in an array of two rows and two columns. The color channel of the pixel point in the first row and first column is green, that is, its filter is a green filter; the color channel of the pixel point in the first row and second column is red, that is, its filter is a red filter; the color channel of the pixel point in the second row and first column is blue, that is, its filter is a blue filter; and the color channel of the pixel point in the second row and second column is green, that is, its filter is a green filter.

FIG. 6 is a flowchart of a focusing method in one embodiment. As shown in FIG. 6, the focusing method includes operations 602 to 608.

Operation 602: obtain phase difference values during shooting, the phase difference values including a phase difference value in a first direction and a phase difference value in a second direction, the first direction and the second direction forming a preset included angle.

Specifically, when an image is captured by the imaging device of an electronic device, phase difference values are obtained, including a phase difference value in the first direction and a phase difference value in the second direction. The first direction and the second direction may form a preset included angle, which may be any angle other than 0 degrees, 180 degrees, and 360 degrees.

Operation 604: obtain gyroscope data through a gyroscope.

A gyroscope is a device that uses the angular momentum of a high-speed rotor to detect angular motion of its housing, relative to inertial space, about one or two axes orthogonal to the spin axis. Gyroscopes include fiber-optic gyroscopes, laser gyroscopes, MEMS (Micro-Electro-Mechanical Systems) gyroscopes, and the like.

The gyroscope data may include angular velocity data, a movement direction, and so on. The gyroscope has an X axis, a Y axis, and a Z axis, and can separately detect the gyroscope data of the X axis, the Y axis, and the Z axis.

The X-axis gyroscope data may represent left-right movement in the horizontal plane, and the Y-axis gyroscope data may represent forward-backward movement in the horizontal plane. Therefore, the horizontal movement of the electronic device can be determined from the X-axis gyroscope data and the Y-axis gyroscope data.

For example, when the X-axis gyroscope data indicates leftward movement and the Y-axis gyroscope data is zero, it can be determined from the two that the electronic device is moving horizontally to the left. When the X-axis gyroscope data is zero and the Y-axis gyroscope data indicates forward movement, it can be determined that the electronic device is moving horizontally forward. When the X-axis gyroscope data indicates rightward movement and the Y-axis gyroscope data indicates forward movement, it can be determined that the electronic device is moving horizontally toward the front right.

Operation 606: determine a target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data.

It can be understood that when the movement direction in the gyroscope data is horizontal, and especially when the photographed object contains vertical texture, the image formed by the horizontally arranged (left-right) pixel pairs is blurred, so the resulting phase difference value in the horizontal direction is inaccurate, which affects the accuracy of focusing. Similarly, when the movement direction in the gyroscope data is vertical, and especially when the photographed object contains horizontal texture, the image formed by the vertically arranged (up-down) pixel pairs is blurred, so the resulting phase difference value in the vertical direction is inaccurate, which likewise affects the accuracy of focusing.

In the embodiments of the present application, however, an accurate target phase difference value can be determined from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data.

For example, suppose the first direction is horizontal and the second direction is vertical. When the gyroscope data indicates movement in the horizontal direction, the phase difference value in the second direction can be determined as the target phase difference value; when the gyroscope data indicates movement in the vertical direction, the phase difference value in the first direction can be determined as the target phase difference value.

Operation 608: perform focusing based on the target phase difference value.

Focusing refers to the process of driving the lens with a motor to change the object distance and image distance so that the photographed object is imaged clearly.

In the above focusing method, phase difference values are obtained during shooting, including a phase difference value in a first direction and a phase difference value in a second direction, the two directions forming a preset included angle; gyroscope data is obtained through a gyroscope; the movement direction of the electronic device can be determined from the gyroscope data, so that a more accurate target phase difference value can be determined from the phase difference values in the first and second directions; and focusing based on the target phase difference value can therefore be performed more accurately.

In one embodiment, determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data includes: determining the movement direction of the electronic device according to the gyroscope data; and determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the movement direction of the electronic device.

In one embodiment, the direction in the gyroscope data may be used directly as the movement direction of the electronic device. For example, if the direction in the gyroscope data is horizontal leftward movement, the movement direction of the electronic device is horizontal leftward movement.

In another embodiment, the directions in the gyroscope data may be weighted to obtain the movement direction of the electronic device. Specifically, weighting factors for the X axis, Y axis, and Z axis of the gyroscope data are obtained, and the movement direction of the electronic device is determined according to the X-axis, Y-axis, and Z-axis gyroscope data and their corresponding weighting factors.

For example, suppose the X-axis gyroscope data is 50 cm (a horizontal movement of 50 cm to the right), the Y-axis gyroscope data is 40 cm (a horizontal movement of 40 cm forward), and the Z-axis gyroscope data is 0; the weighting factor of the X axis is 1, that of the Y axis is 1.25, and that of the Z axis is 1.5. After weighting, the X-axis data is 50*1=50 cm and the Y-axis data is 40*1.25=50 cm, so it can be determined that the electronic device is moving at 45 degrees toward the front right.
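The weighting in the example above can be sketched as follows. This is a minimal sketch; the function name is an assumption, the default weighting factors are simply the illustrative values from the example, and the angle is measured in degrees from the rightward horizontal axis (0 = right, 90 = forward).

```python
import math

def movement_direction(gx, gy, wx=1.0, wy=1.25):
    """Weight the X-axis (left-right) and Y-axis (forward-backward)
    gyroscope displacements and return the horizontal movement angle
    in degrees.

    The default weighting factors match the illustrative example;
    they are not prescribed by the method itself.
    """
    x = gx * wx  # weighted rightward displacement
    y = gy * wy  # weighted forward displacement
    return math.degrees(math.atan2(y, x))
```

With the example values, `movement_direction(50, 40)` weights the axes to 50 and 50 and yields 45 degrees, i.e. movement toward the front right.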

In other embodiments, the movement direction of the electronic device may also be determined in other ways, which may be set according to user needs and are not limited to the above.

The electronic device contains an image sensor, and the image sensor contains the pixel pairs. When the movement direction of the electronic device is determined more accurately, the movement direction of each pixel pair can be determined more accurately, so the pixel pairs whose imaging is blurred can be identified, the phase difference values corresponding to those blurred pixel pairs can be discarded, and a more accurate target phase difference value can be determined.

In one embodiment, as shown in FIG. 7, determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the movement direction of the electronic device includes:

Operation 702: when the movement direction of the electronic device is the first direction, determine the phase difference value in the second direction as the target phase difference value.

When the movement direction of the electronic device is the first direction, and especially when the photographed object has texture in the second direction, the image formed by the pixel pairs arranged in the first direction in the image sensor of the electronic device is blurred, so the phase difference value obtained from those pixel pairs is inaccurate. The image formed by the pixel pairs arranged in the second direction, however, is clear, and the phase difference value obtained from them is accurate; the phase difference value in the second direction is therefore determined as the target phase difference value.

For example, suppose the first direction is horizontal and the second direction is vertical. When the electronic device moves horizontally, and especially when the photographed object has vertical texture, the image formed by the horizontally arranged (left-right) pixel pairs in the image sensor is blurred, while the image formed by the vertically arranged (up-down) pixel pairs is clear. That is, the phase difference value obtained from the horizontally arranged pixel pairs is inaccurate, while that obtained from the vertically arranged pixel pairs is accurate. Therefore, the phase difference value in the vertical direction is determined as the target phase difference value.

Operation 704: when the movement direction of the electronic device is the second direction, determine the phase difference value in the first direction as the target phase difference value.

When the movement direction of the electronic device is the second direction, and especially when the photographed object has texture in the first direction, the image formed by the pixel pairs arranged in the second direction in the image sensor of the electronic device is blurred, so the phase difference value obtained from those pixel pairs is inaccurate. The image formed by the pixel pairs arranged in the first direction, however, is clear, and the phase difference value obtained from them is accurate; the phase difference value in the first direction is therefore determined as the target phase difference value.

For example, suppose the first direction is vertical and the second direction is horizontal. When the electronic device moves vertically, and especially when the photographed object has horizontal texture, the image formed by the vertically arranged (up-down) pixel pairs in the image sensor is blurred, while the image formed by the horizontally arranged (left-right) pixel pairs is clear. That is, the phase difference value obtained from the vertically arranged pixel pairs is inaccurate, while that obtained from the horizontally arranged pixel pairs is accurate. Therefore, the phase difference value in the horizontal direction is determined as the target phase difference value.
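The selection rule of operations 702 and 704 can be sketched as follows. This is a minimal sketch; the function name and the string labels for the two directions are illustrative assumptions.

```python
def select_target_pd(move_direction, pd_first, pd_second):
    """Pick the reliable phase difference given the movement direction.

    Movement along the first direction blurs the pixel pairs arranged
    in that direction, so the second-direction value is used, and
    vice versa.
    """
    if move_direction == "first":     # e.g. horizontal movement
        return pd_second              # vertical PD stays reliable
    if move_direction == "second":    # e.g. vertical movement
        return pd_first               # horizontal PD stays reliable
    raise ValueError("unknown movement direction")
```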

In the above focusing method, the accurate phase difference value is selected from the phase difference value in the first direction and the phase difference value in the second direction as the target phase difference value according to the movement direction of the electronic device, which avoids the inaccurate focusing caused by focusing on an inaccurate phase difference value and improves the accuracy of focusing.

In one embodiment, as shown in FIG. 8, performing focusing based on the target phase difference value includes:

Operation 802: determine a defocus distance value according to the target phase difference value.

The correspondence between the target phase difference value and the defocus distance value can be obtained through calibration.

The correspondence between the defocus distance value and the target phase difference value is as follows:

defocus = PD * slope(DCC), where DCC (Defocus Conversion Coefficient) is obtained through calibration and PD is the target phase difference value.

The calibration process for the correspondence between the target phase difference value and the defocus distance value includes: dividing the effective focusing stroke of the camera module into 10 equal parts, namely (near-focus DAC - far-focus DAC)/10, so as to cover the focusing range of the motor; focusing at each focusing DAC position (the DAC may range from 0 to 1023) and recording the phase difference at that position; and, after completing the motor focusing stroke, taking the group of 10 focusing DAC values and the obtained PD values and computing their ratios, which yields 10 similar ratios K; fitting the two-dimensional data composed of DAC and PD then gives a straight line with slope K.
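The calibration and conversion described above can be sketched as follows. This is a minimal sketch, assuming a zero-intercept least-squares fit as one plausible realization of the line-fitting step; the function names are assumptions.

```python
def fit_dcc_slope(pd_values, dac_positions):
    """Fit a line through the origin to the (PD, DAC) calibration
    samples and return its slope K, approximating the per-position
    DAC/PD ratios described above."""
    num = sum(p * d for p, d in zip(pd_values, dac_positions))
    den = sum(p * p for p in pd_values)
    return num / den

def defocus(pd, slope):
    """defocus = PD * slope(DCC)"""
    return pd * slope
```

For perfectly proportional calibration data the fitted slope equals each individual DAC/PD ratio; with noisy samples it averages them in a least-squares sense.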

Operation 804: control the lens to move according to the defocus distance value so as to focus.

In the above focusing method, the defocus distance value and the movement direction are determined according to the target phase difference value, and controlling the lens movement according to the defocus distance value and the movement direction allows more accurate focusing.

In one embodiment, determining the defocus distance value according to the target phase difference value includes: obtaining a confidence level of the target phase difference value; and, when the confidence level is greater than a confidence threshold, determining the corresponding defocus distance value from the correspondence between phase difference values and defocus distance values according to the target phase difference value.

The confidence level of the target phase difference value refers to how credible the target phase difference value is. The higher the confidence level, the more credible and thus the more accurate the target phase difference value; the lower the confidence level, the less credible and thus the less accurate the target phase difference value.

When the confidence level is greater than the confidence threshold, the confidence of the target phase difference value is high and the value can be considered accurate; the corresponding defocus distance value is then determined from the correspondence between phase difference values and defocus distance values according to the target phase difference value.

When the confidence level is less than or equal to the confidence threshold, the confidence of the target phase difference value is low and the value can be considered inaccurate; in that case, the operation of obtaining phase difference values during shooting may be performed again to reacquire the phase difference value in the first direction and the phase difference value in the second direction.

In this embodiment, taking the calculation of the phase difference value in the horizontal direction as an example, to calculate the phase difference value at a coordinate x in a row of the image, the brightness values of the five pixels x-2, x-1, x, x+1, x+2 are taken from the left image, and a window is moved over the right image, with a movement range of -10 to +10. That is:

compare the right-image brightness values Rx-12, Rx-11, Rx-10, Rx-9, Rx-8 with x-2, x-1, x, x+1, x+2 for similarity;

compare the right-image brightness values Rx-11, Rx-10, Rx-9, Rx-8, Rx-7 with x-2, x-1, x, x+1, x+2 for similarity;

...

compare the right-image brightness values Rx-2, Rx-1, Rx, Rx+1, Rx+2 with x-2, x-1, x, x+1, x+2 for similarity;

compare the right-image brightness values Rx-1, Rx, Rx+1, Rx+2, Rx+3 with x-2, x-1, x, x+1, x+2 for similarity;

...

compare the right-image brightness values Rx+7, Rx+8, Rx+9, Rx+10, Rx+11 with x-2, x-1, x, x+1, x+2 for similarity;

compare the right-image brightness values Rx+8, Rx+9, Rx+10, Rx+11, Rx+12 with x-2, x-1, x, x+1, x+2 for similarity.

Taking the five right-image pixel values Rx-2, Rx-1, Rx, Rx+1, Rx+2 and the five left-image pixel values x-2, x-1, x, x+1, x+2 as an example, the similarity matching cost may be |Rx-2 - x-2| + |Rx-1 - x-1| + |Rx - x| + |Rx+1 - x+1| + |Rx+2 - x+2|. The smaller this value, the higher the similarity; and the higher the similarity, the higher the confidence. The best-matching pixel values are taken as matched pixels to obtain the phase difference. For the upper and lower images, the brightness values of a column of pixels in the upper image may similarly be compared with the brightness values of a column of the same number of pixels in the lower image; the confidence acquisition process for the upper and lower images is similar to that for the left and right images and is not repeated here.
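The window comparison above amounts to a sum-of-absolute-differences (SAD) search. The following is a minimal sketch, assuming simple Python lists for the image rows and the fixed 5-pixel window and ±10 search range from the example; the function names are assumptions.

```python
def sad(window, reference):
    """Sum of absolute differences between two equal-length windows."""
    return sum(abs(a - b) for a, b in zip(window, reference))

def best_shift(left_row, right_row, x, half=2, search=10):
    """Slide a (2*half+1)-pixel window centered at x from the left
    image over the right image for shifts in [-search, +search].

    The shift with the smallest SAD is taken as the phase difference;
    the minimum SAD itself can serve as an inverse confidence measure
    (smaller cost = higher similarity = higher confidence).
    """
    ref = left_row[x - half : x + half + 1]
    best = None
    for s in range(-search, search + 1):
        lo = x + s - half
        if lo < 0 or lo + 2 * half + 1 > len(right_row):
            continue  # window would fall outside the row
        cost = sad(right_row[lo : lo + 2 * half + 1], ref)
        if best is None or cost < best[1]:
            best = (s, cost)
    return best  # (shift, minimum SAD)
```

For a right row that is the left row shifted by a few pixels, the returned shift recovers that displacement with zero cost.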

In the above focusing method, when the confidence level of the target phase difference value is greater than the confidence threshold, the target phase difference value is accurate; the corresponding defocus distance value determined from the correspondence between phase difference values and defocus distance values according to the target phase difference value is then also accurate, so the accuracy of focusing can be improved.

In one embodiment, the above method further includes: acquiring a first image; and performing subject detection on the first image to obtain a region of interest. Obtaining the phase difference values during shooting includes: obtaining the phase difference values in the region of interest during shooting. Performing focusing based on the target phase difference value includes: performing focusing in the region of interest based on the target phase difference value to obtain a second image.

Subject detection (salient object detection) refers to automatically processing the region of interest in a scene while selectively ignoring the regions that are not of interest; the region of interest is called the subject region. A visible light image refers to an RGB (Red, Green, Blue) image; a color image, that is, an RGB image, can be obtained by shooting any scene with a color camera. The visible light image may be stored locally on the electronic device, stored on another device, stored on a network, or captured by the electronic device in real time, without limitation.

Specifically, the phase difference values of the region of interest in the first image are obtained, likewise including a phase difference value in the first direction and a phase difference value in the second direction, the first and second directions forming a preset included angle; gyroscope data is obtained through the gyroscope; the target phase difference value is determined from the phase difference values in the first and second directions according to the gyroscope data; and focusing is performed in the region of interest based on the target phase difference value to obtain the second image.

The above focusing method, combined with subject detection, obtains the region of interest and performs focusing within it, which avoids focusing on the background region and improves the accuracy of focusing.

In one embodiment, the electronic device includes an image sensor, the image sensor includes a plurality of pixel point groups arranged in an array, each pixel point group includes M*N pixel points arranged in an array, and each pixel point corresponds to one photosensitive unit, where M and N are both natural numbers greater than or equal to 2.

As shown in FIG. 9, obtaining the phase difference values includes:

Operation 902: obtain a target brightness map according to the brightness values of the pixel points included in each pixel point group.

Generally, the brightness value of a pixel point of the image sensor can be characterized by the brightness values of the sub-pixel points it includes. The imaging device may obtain the target brightness map according to the brightness values of the sub-pixel points of the pixel points included in each pixel point group, where the brightness value of a sub-pixel point refers to the brightness value of the light signal received by the photosensitive element corresponding to that sub-pixel point.

As mentioned above, a sub-pixel point of the image sensor is a photosensitive element that converts a light signal into an electrical signal. Therefore, the intensity of the light signal received by a sub-pixel point can be obtained from the electrical signal the sub-pixel point outputs, and the brightness value of the sub-pixel point can then be derived from that intensity.

The target brightness map in the embodiments of the present application is used to reflect the brightness values of the sub-pixel points in the image sensor. The target brightness map may include multiple pixels, where the pixel value of each pixel in the target brightness map is obtained from the brightness value of a sub-pixel point in the image sensor.

Operation 904: perform segmentation processing on the target brightness map to obtain a first segmented brightness map and a second segmented brightness map, and determine the phase difference values of mutually matched pixels according to the position differences of those pixels between the first segmented brightness map and the second segmented brightness map.

In one embodiment, the imaging device may segment the target brightness map along the column direction (the y-axis direction in the image coordinate system); in this process, each dividing line of the segmentation is perpendicular to the column direction.

In another embodiment, the imaging device may segment the target brightness map along the row direction (the x-axis direction in the image coordinate system); in this process, each dividing line of the segmentation is perpendicular to the row direction.

The first and second segmented brightness maps obtained by segmenting the target brightness map along the column direction may be called the upper map and the lower map, respectively. The first and second segmented brightness maps obtained by segmenting along the row direction may be called the left map and the right map, respectively.
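As an illustrative sketch (not part of the application; all function names are our own), the segmentation can be expressed as interleaving rows or columns of the target brightness map, consistent with the even-row/odd-row description in operation 906:

```python
def split_rows(target_map):
    """Segment a target brightness map along the column direction:
    even-indexed rows form the first (upper) map, odd-indexed rows
    form the second (lower) map."""
    first = [row[:] for row in target_map[0::2]]
    second = [row[:] for row in target_map[1::2]]
    return first, second

def split_cols(target_map):
    """Segment along the row direction: even-indexed columns form the
    first (left) map, odd-indexed columns form the second (right) map."""
    first = [row[0::2] for row in target_map]
    second = [row[1::2] for row in target_map]
    return first, second
```

The row split supports the first-direction phase difference, and the column split supports the second-direction phase difference, as described below.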

Here, "mutually matched pixels" means that the pixel matrices formed by each pixel and its surrounding pixels are similar to each other. For example, pixel a and its surrounding pixels in the first segmented brightness map form a pixel matrix of 3 rows and 3 columns whose pixel values are:

Figure PCTCN2020122301-appb-000001

Pixel b and its surrounding pixels in the second segmented brightness map also form a pixel matrix of 3 rows and 3 columns whose pixel values are:

Figure PCTCN2020122301-appb-000002

As can be seen above, the two matrices are similar, so pixel a and pixel b can be considered mutually matched. There are many ways to judge whether pixel matrices are similar. A common one is to take the difference of each pair of corresponding pixel values in the two matrices, sum the absolute values of those differences, and use the sum to judge similarity: if the sum is less than a preset threshold, the pixel matrices are considered similar; otherwise, they are considered dissimilar.

For example, for the two 3-row, 3-column pixel matrices above, one may take the difference of 1 and 2, of 15 and 15, of 70 and 70, and so on, and then sum the absolute values of the differences to obtain 3. Since this sum of 3 is less than the preset threshold, the two pixel matrices are considered similar.
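The sum-of-absolute-differences (SAD) test above can be sketched as follows. The actual matrices appear only as images (PCTCN2020122301-appb-000001/2) and are not reproduced here, so the matrices below are hypothetical: only the 1 vs. 2, 15 vs. 15, and 70 vs. 70 entries and the total of 3 come from the text.

```python
def sad(matrix_a, matrix_b):
    """Sum of absolute differences between two equally sized pixel matrices."""
    return sum(abs(a - b)
               for row_a, row_b in zip(matrix_a, matrix_b)
               for a, b in zip(row_a, row_b))

def is_match(matrix_a, matrix_b, threshold=10):
    """Two pixel matrices are considered similar when their SAD is below a
    preset threshold (the value 10 here is an arbitrary placeholder)."""
    return sad(matrix_a, matrix_b) < threshold

# Hypothetical stand-ins for the matrices around pixel a and pixel b.
MATRIX_A = [[1, 15, 70], [35, 60, 170], [100, 220, 30]]
MATRIX_B = [[2, 15, 70], [36, 60, 170], [100, 220, 29]]
```

With these values, `sad(MATRIX_A, MATRIX_B)` is 3, below the threshold, so pixel a and pixel b would be judged mutually matched.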

Another way to judge whether pixel matrices are similar is to extract their edge features, for example with a Sobel convolution kernel or a Laplacian calculation, and judge similarity by comparing the edge features.

In the embodiments of the present application, "the position difference of mutually matched pixels" refers to the difference between the position of the matched pixel located in the first segmented brightness map and the position of the matched pixel located in the second segmented brightness map. As in the example above, the position difference of the matched pixels a and b refers to the difference between the position of pixel a in the first segmented brightness map and the position of pixel b in the second segmented brightness map.

Mutually matched pixels correspond to different images formed on the image sensor by imaging light entering the lens from different directions. For example, pixel a in the first segmented brightness map and pixel b in the second segmented brightness map match each other; pixel a may correspond to the image formed at position A in Figure 1, and pixel b may correspond to the image formed at position B in Figure 1.

Since mutually matched pixels correspond to different images formed on the image sensor by imaging light entering the lens from different directions, the phase difference of the matched pixels can be determined from their position difference.
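A minimal sketch of turning the position difference into a phase difference value (names and window/search sizes are our own assumptions): for a small window in one segmented map, search along the same row of the other map for the minimum-SAD window; the column offset of the best match is the per-pixel phase difference along that direction.

```python
def find_phase_shift(left, right, row, col, half=1, max_shift=3):
    """Return the column offset of the window in `right` that best matches
    the (2*half+1)-sized window centred at (row, col) in `left`. That
    offset is the position difference, i.e. the phase difference in pixels
    along the row direction."""
    def window(img, r, c):
        return [img[r + dr][c + dc]
                for dr in range(-half, half + 1)
                for dc in range(-half, half + 1)]

    reference = window(left, row, col)
    best_shift, best_cost = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        c = col + shift
        if c - half < 0 or c + half >= len(right[0]):
            continue  # window would fall outside the map
        candidate = window(right, row, c)
        cost = sum(abs(a - b) for a, b in zip(reference, candidate))
        if cost < best_cost:
            best_cost, best_shift = cost, shift
    return best_shift
```

Running the same search down the columns of the upper/lower maps would give the phase difference in the other direction.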

Operation 906: determine the phase difference value in the first direction and the phase difference value in the second direction according to the phase difference values of the mutually matched pixels.

When the first segmented brightness map includes the even-numbered rows of pixels and the second segmented brightness map includes the odd-numbered rows, and pixel a in the first segmented brightness map matches pixel b in the second segmented brightness map, the phase difference value in the first direction can be determined from the phase difference of the matched pixels a and b.

When the first segmented brightness map includes the even-numbered columns of pixels and the second segmented brightness map includes the odd-numbered columns, and pixel a in the first segmented brightness map matches pixel b in the second segmented brightness map, the phase difference value in the second direction can be determined from the phase difference of the matched pixels a and b.

The target brightness map is obtained from the brightness values of the pixel points in the above pixel point groups. After the target brightness map is divided into two segmented brightness maps, pixel matching makes it possible to quickly determine the phase difference values of mutually matched pixels; at the same time, the rich set of phase difference values obtained can improve the precision of the phase difference value and the accuracy and stability of focusing.

In one embodiment, each pixel point includes a plurality of sub-pixel points arranged in an array, and obtaining the target brightness map according to the brightness values of the pixel points included in each pixel point group includes: for each pixel point group, obtaining the sub-brightness map corresponding to the pixel point group according to the brightness values of the sub-pixel points at the same position within each pixel point of the group; and generating the target brightness map according to the sub-brightness map corresponding to each pixel point group.

Here, the sub-pixel points at the same position of each pixel point refer to the sub-pixel points arranged at the same position within their respective pixel points.

Figure 10 is a schematic diagram of a pixel point group in one embodiment. As shown in Figure 10, the pixel point group includes 4 pixel points arranged in a two-row, two-column array, namely pixel points D1, D2, D3, and D4, where each pixel point includes 4 sub-pixel points arranged in a two-row, two-column array; the sub-pixel points are d11, d12, d13, d14, d21, d22, d23, d24, d31, d32, d33, d34, d41, d42, d43, and d44.

As shown in Figure 10, sub-pixel points d11, d21, d31, and d41 are arranged at the same position in their respective pixel points, namely the first row, first column; sub-pixel points d12, d22, d32, and d42 are at the same position, namely the first row, second column; sub-pixel points d13, d23, d33, and d43 are at the same position, namely the second row, first column; and sub-pixel points d14, d24, d34, and d44 are at the same position, namely the second row, second column.
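A sketch of the sub-brightness-map extraction for the 2×2 pixel group of Figure 10 (the function name and dict layout are our own; the application does not specify a data structure): the raw sub-pixel array of one group yields one sub-brightness map per sub-pixel position, built from the same-position sub-pixel of every pixel point.

```python
def sub_brightness_maps(group, sub_rows=2, sub_cols=2):
    """`group` is the raw sub-pixel brightness array of one pixel point
    group (here 4x4: 2x2 pixel points, each holding 2x2 sub-pixels).
    Returns a dict mapping each sub-pixel position (i, j) to the
    sub-brightness map collected from that position in every pixel point."""
    n_rows = len(group) // sub_rows
    n_cols = len(group[0]) // sub_cols
    maps = {}
    for i in range(sub_rows):
        for j in range(sub_cols):
            maps[(i, j)] = [[group[pr * sub_rows + i][pc * sub_cols + j]
                             for pc in range(n_cols)]
                            for pr in range(n_rows)]
    return maps
```

Tiling the per-group sub-brightness maps over all pixel point groups of the sensor would then generate the target brightness map.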

In one embodiment, as shown in Figure 11, operation 1102 captures light information with a 2*2 PDAF image sensor. Specifically, the target brightness map is obtained according to the brightness values of the pixel points included in each pixel point group on the image sensor, and segmentation processing is performed on the target brightness map to obtain a first segmented brightness map and a second segmented brightness map. Operation 1104 is then performed: the phase difference values of mutually matched pixels are determined according to the position differences of those pixels between the first and second segmented brightness maps, and the phase difference value in the first direction and the phase difference value in the second direction are determined according to the phase difference values of the mutually matched pixels.

Operation 1106 is performed to obtain gyroscope data, and operation 1108 is performed to determine the moving direction of the electronic device according to the gyroscope data. After the moving direction of the electronic device is determined, operation 1110 is performed to determine the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction. Specifically, when the moving direction of the electronic device is the first direction, the phase difference value in the second direction is determined to be the target phase difference value; when the moving direction of the electronic device is the second direction, the phase difference value in the first direction is determined to be the target phase difference value. Operation 1112 is then performed to focus based on the target phase difference value.
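Operations 1106-1112 can be sketched as follows. This is a deliberate simplification: which gyroscope axis corresponds to which phase-difference direction, and how the samples are filtered, are our assumptions, not details given in the application.

```python
def moving_direction(gyro_x, gyro_y):
    """Infer the dominant movement direction from two gyroscope angular
    velocities. Real implementations would filter and integrate the
    gyroscope samples; the axis-to-direction mapping is assumed."""
    return "first" if abs(gyro_x) >= abs(gyro_y) else "second"

def target_phase_difference(pd_first, pd_second, direction):
    """Select the phase difference orthogonal to the movement direction:
    texture smeared along the moving direction makes that direction's
    phase difference unreliable, so the other one is used."""
    return pd_second if direction == "first" else pd_first
```

For example, a device panning in the first direction would focus using the second-direction phase difference.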

It should be understood that although the operations in the flowcharts of Figures 6 to 9 are displayed in sequence as indicated by the arrows, they are not necessarily executed in that order. Unless explicitly stated herein, there is no strict ordering restriction on the execution of these operations, and they may be executed in other orders. Moreover, at least some of the operations in Figures 6 to 9 may include multiple sub-operations or stages, which are not necessarily completed at the same moment but may be executed at different moments; their execution order is likewise not necessarily sequential, and they may be executed in turn or alternately with at least part of other operations or of the sub-operations or stages of other operations.

Figure 12 is a structural block diagram of a focusing apparatus according to one embodiment. As shown in Figure 12, a focusing apparatus 1200 is provided, applied in an electronic device that includes a gyroscope, and comprising a phase difference value acquisition module 1202, a gyroscope data acquisition module 1204, a target phase difference value determination module 1206, and a focusing module 1208, where:

The phase difference value acquisition module 1202 is configured to acquire phase difference values during shooting; the phase difference values include a phase difference value in a first direction and a phase difference value in a second direction, the first direction and the second direction forming a preset angle.

The gyroscope data acquisition module 1204 is configured to acquire gyroscope data through the gyroscope.

The target phase difference value determination module 1206 is configured to determine the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data.

The focusing module 1208 is configured to focus based on the target phase difference value.

The above focusing apparatus acquires phase difference values during shooting, including a phase difference value in a first direction and a phase difference value in a second direction, the two directions forming a preset angle, and acquires gyroscope data through the gyroscope. The moving direction of the electronic device can be determined from the gyroscope data, so that a more accurate target phase difference value can be selected from the first-direction and second-direction phase difference values, and focusing based on the target phase difference value can be performed more accurately.

In one embodiment, the target phase difference value determination module 1206 is further configured to determine the moving direction of the electronic device according to the gyroscope data, and to determine the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the moving direction of the electronic device.

In one embodiment, the target phase difference value determination module 1206 is further configured to determine the phase difference value in the second direction as the target phase difference value when the moving direction of the electronic device is the first direction, and to determine the phase difference value in the first direction as the target phase difference value when the moving direction of the electronic device is the second direction.

In one embodiment, the focusing module 1208 is further configured to determine a defocus distance value and a movement direction according to the target phase difference value, and to control the lens to move for focusing according to the defocus distance value and the movement direction.

In one embodiment, the focusing module 1208 is further configured to obtain the confidence of the target phase difference value, and, when the confidence is greater than a confidence threshold, to determine the corresponding defocus distance value from the correspondence between phase difference values and defocus distance values according to the target phase difference value.
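A sketch of the confidence-gated lookup (the calibration table, the interpolation scheme, and all numbers are illustrative assumptions; the application only states that a correspondence between phase difference values and defocus distance values is consulted when the confidence exceeds a threshold):

```python
def defocus_distance(target_pd, confidence, table, confidence_threshold=0.5):
    """Return the defocus distance for `target_pd` by linear interpolation
    in a calibration `table` of (phase_difference, defocus_distance)
    pairs sorted by phase difference. Returns None when the confidence
    does not exceed the threshold or the value is out of range."""
    if confidence <= confidence_threshold:
        return None
    for (pd0, d0), (pd1, d1) in zip(table, table[1:]):
        if pd0 <= target_pd <= pd1:
            t = (target_pd - pd0) / (pd1 - pd0)
            return d0 + t * (d1 - d0)
    return None  # outside the calibrated range
```

The sign of the returned distance can encode the lens movement direction, matching the description of the focusing module above.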

In one embodiment, the focusing apparatus 1200 further includes a subject detection module configured to acquire a first image and perform subject detection on the first image to obtain a region of interest. Acquiring the phase difference values during shooting includes acquiring the phase difference values in the region of interest during shooting. Focusing based on the target phase difference value includes focusing in the region of interest based on the target phase difference value to obtain a second image.

In one embodiment, the electronic device includes an image sensor comprising a plurality of pixel point groups arranged in an array, each pixel point group including a plurality of pixel points arranged in an array, and each pixel point corresponding to one photosensitive unit. The phase difference value acquisition module 1202 is further configured to obtain a target brightness map according to the brightness values of the pixel points included in each pixel point group; to segment the target brightness map into a first segmented brightness map and a second segmented brightness map, determining the phase difference values of mutually matched pixels according to the position differences of those pixels between the two maps; and to determine the phase difference value in the first direction or the phase difference value in the second direction according to the phase difference values of the mutually matched pixels.

In one embodiment, the phase difference value acquisition module 1202 is further configured, for each pixel point group, to obtain the sub-brightness map corresponding to the pixel point group according to the brightness values of the sub-pixel points at the same position within each pixel point of the group, and to generate the target brightness map according to the sub-brightness map corresponding to each pixel point group.

The division of the modules in the above focusing apparatus is for illustration only. In other embodiments, the focusing apparatus may be divided into different modules as required to complete all or part of its functions.

Figure 13 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in Figure 13, the electronic device includes a processor and a memory connected through a system bus. The processor provides computing and control capabilities to support the operation of the entire electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program; the computer program can be executed by the processor to implement the focusing method provided in the foregoing embodiments. The internal memory provides a cached running environment for the operating system and the computer program in the non-volatile storage medium. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.

The modules in the focusing apparatus provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server, and the program modules it constitutes may be stored in the memory of the terminal or server. When the computer program is executed by a processor, the operations of the methods described in the embodiments of the present application are realized.

The embodiments of the present application also provide a computer-readable storage medium: one or more non-volatile computer-readable storage media containing computer-executable instructions which, when executed by one or more processors, cause the processors to perform the operations of the focusing method.

A computer program product containing instructions which, when run on a computer, causes the computer to execute the focusing method.

Any reference to memory, storage, a database, or another medium used in the embodiments of the present application may include non-volatile and/or volatile memory. Suitable non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM), which acts as an external cache. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

The above embodiments express only several implementations of the present application, and while their description is relatively specific and detailed, they should not therefore be understood as limiting the patent scope of the present application. It should be pointed out that those of ordinary skill in the art may make several modifications and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent application shall be subject to the appended claims.

Claims (18)

一种对焦方法,其特征在于,应用于包括陀螺仪的电子设备中,包括:A focusing method, characterized in that it is applied to an electronic device including a gyroscope, and includes: 拍摄时获取相位差值,所述相位差值包括第一方向的相位差值和第二方向的相位差值;所述第一方向与所述第二方向成预设夹角;Acquiring a phase difference value during shooting, where the phase difference value includes a phase difference value in a first direction and a phase difference value in a second direction; the first direction and the second direction form a preset angle; 通过所述陀螺仪获取陀螺仪数据;Obtain gyroscope data through the gyroscope; 根据所述陀螺仪数据从所述第一方向的相位差值和第二方向的相位差值中确定目标相位差值;及Determining a target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data; and 基于所述目标相位差值进行对焦。Focusing is performed based on the target phase difference value. 根据权利要求1所述的方法,其特征在于,所述根据所述陀螺仪数据从所述第一方向的相位差值和第二方向的相位差值中确定目标相位差值,包括:The method according to claim 1, wherein the determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data comprises: 根据所述陀螺仪数据确定所述电子设备的移动方向;及Determining the moving direction of the electronic device according to the gyroscope data; and 根据所述电子设备的移动方向,从所述第一方向的相位差值和第二方向的相位差值中确定目标相位差值。According to the moving direction of the electronic device, a target phase difference value is determined from the phase difference value in the first direction and the phase difference value in the second direction. 
根据权利要求2所述的方法,其特征在于,所述根据所述电子设备的移动方向,从所述第一方向的相位差值和第二方向的相位差值中确定目标相位差值,包括:The method according to claim 2, wherein the determining the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the moving direction of the electronic device comprises : 当所述电子设备的移动方向为第一方向时,确定第二方向的相位差值为目标相位差值;及When the moving direction of the electronic device is the first direction, determining that the phase difference in the second direction is the target phase difference value; and 当所述电子设备的移动方向为第二方向时,确定第一方向的相位差值为目标相位差值。When the moving direction of the electronic device is the second direction, it is determined that the phase difference in the first direction is the target phase difference value. 根据权利要求1所述的方法,其特征在于,所述基于所述目标相位差值进行对焦,包括:The method according to claim 1, wherein the focusing based on the target phase difference value comprises: 根据所述目标相位差值确定离焦距离值;及Determining the defocus distance value according to the target phase difference value; and 根据所述离焦距离值控制镜头移动以对焦。The lens is controlled to move to focus according to the defocus distance value. 根据权利要求4所述的方法,其特征在于,所述根据所述目标相位差值确定离焦距离值,包括:The method according to claim 4, wherein the determining a defocus distance value according to the target phase difference value comprises: 获取所述目标相位差值的置信度;及Obtaining the confidence level of the target phase difference value; and 当所述置信度大于置信度阈值时,根据所述目标相位差值从相位差值与离焦距离值的对应关系中确定对应的离焦距离值。When the confidence is greater than the confidence threshold, the corresponding defocus distance value is determined from the corresponding relationship between the phase difference value and the defocus distance value according to the target phase difference value. 
根据权利要求1所述的方法,其特征在于,所述方法还包括:The method according to claim 1, wherein the method further comprises: 获取第一图像;Get the first image; 对所述第一图像进行主体检测,得到感兴趣区域;Subject detection on the first image to obtain a region of interest; 所述拍摄时获取相位差值,包括:The obtaining of the phase difference value during shooting includes: 拍摄时获取所述感兴趣区域中的相位差值;及Acquiring the phase difference value in the region of interest during shooting; and 所述基于所述目标相位差值进行对焦,包括:The focusing based on the target phase difference value includes: 基于所述目标相位差值在所述感兴趣区域中进行对焦,得到第二图像。Focusing in the region of interest based on the target phase difference value to obtain a second image. 根据权利要求1所述的方法,其特征在于,所述电子设备包括图像传感器,所述图像传感器包括阵列排布的多个像素点组,每个所述像素点组包括阵列排布的M*N个像素点;每个像素点对应一个感光单元,其中,M和N均为大于或等于2的自然数;The method according to claim 1, wherein the electronic device comprises an image sensor, the image sensor comprises a plurality of pixel point groups arranged in an array, and each of the pixel point groups comprises M* arranged in an array. N pixels; each pixel corresponds to a photosensitive unit, where M and N are both natural numbers greater than or equal to 2; 所述获取相位差值,包括:The obtaining the phase difference value includes: 根据每个所述像素点组包括的像素点的亮度值获取目标亮度图;Acquiring a target brightness map according to the brightness value of the pixel points included in each pixel point group; 对所述目标亮度图进行切分处理,得到第一切分亮度图和第二切分亮度图,并根据所述第一切分亮度图和所述第二切分亮度图中相互匹配的像素的位置差异,确定所述相互匹配的像素的相位差值;及Perform segmentation processing on the target brightness map to obtain a first segmented brightness map and a second segmented brightness map, and pixels matching each other according to the first segmented brightness map and the second segmented brightness map Determine the phase difference value of the mutually matched pixels; and 根据所述相互匹配的像素的相位差值确定第一方向的相位差值和第二方向的相位差值。The phase difference value in the first direction and the phase difference value in the second direction are determined according to the phase difference values of the mutually matched pixels. 
根据权利要求7所述的方法,其特征在于,每个所述像素点包括阵列排布的多个子像素点,所述根据每个所述像素点组包括的像素点的亮度值获取目标亮度图,包括:The method according to claim 7, wherein each of the pixel points includes a plurality of sub-pixel points arranged in an array, and the target brightness map is obtained according to the brightness value of the pixel points included in each of the pixel point groups ,include: 对于每个所述像素点组,根据所述像素点组中每个像素点的相同位置处的子像素点的亮度值,获取所述像素点组对应的子亮度图;及For each pixel point group, obtain the sub-luminance map corresponding to the pixel point group according to the brightness value of the sub-pixel point at the same position of each pixel point in the pixel point group; and 根据每个所述像素点组对应的子亮度图生成所述目标亮度图。The target brightness map is generated according to the sub-brightness map corresponding to each pixel point group. 一种对焦装置,其特征在于,应用于包括陀螺仪的电子设备中,包括:A focusing device, characterized in that it is applied to an electronic device including a gyroscope, and includes: 相位差值获取模块,用于拍摄时获取相位差值,所述相位差值包括第一方向的相位差值和第二方向的相位差值;所述第一方向与所述第二方向成预设夹角;The phase difference value acquisition module is used to acquire the phase difference value during shooting. The phase difference value includes a phase difference value in a first direction and a phase difference value in a second direction; Set angle 陀螺仪数据获取模块,用于通过所述陀螺仪获取陀螺仪数据;A gyroscope data acquisition module, configured to acquire gyroscope data through the gyroscope; 目标相位差值确定模块,用于根据所述陀螺仪数据从所述第一方向的相位差值和第二方向的相位差值中确定目标相位差值;及A target phase difference value determination module, configured to determine a target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the gyroscope data; and 对焦模块,用于基于所述目标相位差值进行对焦。The focusing module is used for focusing based on the target phase difference value. 
The apparatus according to claim 9, wherein the target phase difference value determination module is further configured to determine a moving direction of the electronic device according to the gyroscope data; and
determine the target phase difference value from the phase difference value in the first direction and the phase difference value in the second direction according to the moving direction of the electronic device.

The apparatus according to claim 10, wherein the target phase difference value determination module is further configured to determine the phase difference value in the second direction as the target phase difference value when the moving direction of the electronic device is the first direction; and
determine the phase difference value in the first direction as the target phase difference value when the moving direction of the electronic device is the second direction.

The apparatus according to claim 9, wherein the focusing module is further configured to determine a defocus distance value according to the target phase difference value; and
control a lens to move for focusing according to the defocus distance value.

The apparatus according to claim 12, wherein the focusing module is further configured to acquire a confidence level of the target phase difference value; and
when the confidence level is greater than a confidence threshold, determine the corresponding defocus distance value from a correspondence between phase difference values and defocus distance values according to the target phase difference value.
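The selection and mapping logic of the four apparatus claims above can be sketched in a few lines. This is a hedged illustration: the linear slope standing in for the calibrated phase-difference-to-defocus correspondence, the threshold value, and the `None` fallback are assumptions, not values from the patent.

```python
def select_target_pd(pd_first, pd_second, moving_direction):
    # When the device moves along the first direction, the phase
    # difference in the second direction is taken as the target value,
    # and vice versa (per the direction-selection claims above).
    if moving_direction == 'first':
        return pd_second
    if moving_direction == 'second':
        return pd_first
    raise ValueError('unknown moving direction: %r' % (moving_direction,))

def defocus_distance(target_pd, confidence,
                     slope=10.0, confidence_threshold=0.5):
    # Map the target phase difference to a defocus distance only when
    # its confidence exceeds the threshold; the linear slope is a
    # stand-in for a calibrated correspondence table.
    if confidence <= confidence_threshold:
        return None  # low confidence: defer to another focusing method
    return slope * target_pd
```

The defocus distance value would then drive the lens motor: its sign indicates the drive direction and its magnitude the travel.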
The apparatus according to claim 9, further comprising a subject detection module configured to acquire a first image;
perform subject detection on the first image to obtain a region of interest;
acquire a phase difference value in the region of interest during shooting; and
focus in the region of interest based on the target phase difference value to obtain a second image.

The apparatus according to claim 9, wherein the electronic device comprises an image sensor, the image sensor comprises a plurality of pixel point groups arranged in an array, each of the pixel point groups comprises M*N pixel points arranged in an array, and each pixel point corresponds to one photosensitive unit, wherein M and N are both natural numbers greater than or equal to 2;
the phase difference value acquisition module is further configured to acquire a target brightness map according to the brightness values of the pixel points included in each of the pixel point groups;
perform segmentation processing on the target brightness map to obtain a first segmented brightness map and a second segmented brightness map, and determine a phase difference value of mutually matched pixels according to a position difference between the mutually matched pixels in the first segmented brightness map and the second segmented brightness map; and
determine a phase difference value in a first direction and a phase difference value in a second direction according to the phase difference values of the mutually matched pixels.
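A toy version of the region-of-interest flow in the subject-detection claim above, where a brightness threshold stands in for real subject detection. The detector, the threshold value, and the mean-pooling of phase differences inside the ROI are all illustrative assumptions.

```python
import numpy as np

def detect_roi(image, thresh=0.5):
    # Toy "subject detection": bounding box of pixels brighter than a
    # threshold, as a stand-in for a learned subject detector.
    ys, xs = np.nonzero(image > thresh)
    return int(ys.min()), int(ys.max()) + 1, int(xs.min()), int(xs.max()) + 1

def roi_phase_difference(image, pd_map):
    # Detect the ROI on the first image, then read the phase
    # difference only inside that region.
    y0, y1, x0, x1 = detect_roi(image)
    return float(np.mean(pd_map[y0:y1, x0:x1]))
```

Restricting the phase difference readout to the ROI keeps the focus on the detected subject rather than on background detail.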
The apparatus according to claim 15, wherein each of the pixel points comprises a plurality of sub-pixel points arranged in an array, and the phase difference value acquisition module is further configured to, for each of the pixel point groups, acquire a sub-brightness map corresponding to the pixel point group according to the brightness values of the sub-pixel points at the same position in each pixel point of the pixel point group; and
generate the target brightness map according to the sub-brightness map corresponding to each of the pixel point groups.

An electronic device, comprising a memory and a processor, wherein a computer program is stored in the memory, and when the computer program is executed by the processor, the processor is caused to perform the operations of the focusing method according to any one of claims 1 to 8.

A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the operations of the method according to any one of claims 1 to 8.
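The sub-brightness-map construction in the claim above might look like the following sketch, where each pixel point consists of s x s sub-pixel points and each pixel point group of m x n pixel points. The 2x2 group and sub-pixel sizes and the stitching order are assumptions for illustration.

```python
import numpy as np

def group_sub_map(group, s=2, pos=(0, 0)):
    # group: an (m*s) x (n*s) block of sub-pixel brightness values for
    # one pixel point group. Keep the sub-pixel at position `pos`
    # inside every pixel, giving the group's m x n sub-brightness map.
    i, j = pos
    return group[i::s, j::s]

def target_brightness_map(raw, m=2, n=2, s=2, pos=(0, 0)):
    # Tile the raw sub-pixel readout into pixel point groups, take each
    # group's sub-brightness map, and stitch them back together into
    # the target brightness map.
    gh, gw = m * s, n * s
    rows = []
    for r in range(0, raw.shape[0], gh):
        row = [group_sub_map(raw[r:r + gh, c:c + gw], s, pos)
               for c in range(0, raw.shape[1], gw)]
        rows.append(np.hstack(row))
    return np.vstack(rows)
```

For uniform group and sub-pixel sizes this is equivalent to the strided view `raw[pos[0]::s, pos[1]::s]`; the per-group formulation mirrors the claim's group-by-group wording.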
PCT/CN2020/122301 2019-11-12 2020-10-21 Focusing method and apparatus, and electronic device and computer readable storage medium Ceased WO2021093528A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911101409.1A CN112866546B (en) 2019-11-12 2019-11-12 Focusing method and device, electronic equipment and computer readable storage medium
CN201911101409.1 2019-11-12

Publications (1)

Publication Number Publication Date
WO2021093528A1 2021-05-20

Family

ID=75911851

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/122301 Ceased WO2021093528A1 (en) 2019-11-12 2020-10-21 Focusing method and apparatus, and electronic device and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN112866546B (en)
WO (1) WO2021093528A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114489078A * 2022-01-27 2022-05-13 Zhuhai Amicro Semiconductor Co., Ltd. Mobile robot obstacle avoidance method based on phase detection, chip and robot

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101750847A * 2008-12-15 2010-06-23 Sony Corporation Image pickup apparatus and focus control method
US20140048853A1 * 2009-10-22 2014-02-20 Samsung Electronics Co., Ltd. Image Sensors
CN103852954A * 2012-12-03 2014-06-11 Peking University Method for achieving phase focusing
CN104919352A * 2013-01-10 2015-09-16 Olympus Corporation Image pickup device, image correction method, image processing device and image processing method
CN108141549A * 2015-11-16 2018-06-08 Samsung Electronics Co., Ltd. Image sensor and electronic device having the image sensor
CN108632596A * 2017-03-22 2018-10-09 HTC Corporation Photographing apparatus and method of operating the same
CN110248097A * 2019-06-28 2019-09-17 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Focus tracking method and device, terminal equipment and computer readable storage medium

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5387856B2 * 2010-02-16 2014-01-15 Sony Corporation Image processing apparatus, image processing method, image processing program, and imaging apparatus
CN103430073B * 2011-03-31 2014-12-31 Fujifilm Corporation Imaging device, and method for controlling imaging device
CN104380166B * 2012-07-12 2016-06-08 Olympus Corporation Camera device
US10288431B2 * 2012-07-13 2019-05-14 National Institute Of Advanced Industrial Science And Technology Device for estimating moving object travel direction and method for estimating travel direction
US9769371B1 * 2014-09-09 2017-09-19 Amazon Technologies, Inc. Phase detect auto-focus
JP6508954B2 * 2015-01-28 2019-05-08 Canon Inc. Imaging device, lens unit, control method of imaging device, and program
JP6530610B2 * 2015-02-04 2019-06-12 Canon Inc. Focusing device, imaging device, control method of focusing device, and program
JP6553881B2 * 2015-02-05 2019-07-31 Canon Inc. Image processing device
CN106027905B * 2016-06-29 2019-05-21 Nubia Technology Co., Ltd. Method for sky focusing and mobile terminal
CN106454100B * 2016-10-24 2019-07-19 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Focusing method and device and mobile terminal
US10044926B2 * 2016-11-04 2018-08-07 Qualcomm Incorporated Optimized phase detection autofocus (PDAF) processing
US10070042B2 * 2016-12-19 2018-09-04 Intel Corporation Method and system of self-calibration for phase detection autofocus
CN106973206B * 2017-04-28 2020-06-05 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Camera module imaging processing method and apparatus, and terminal device
KR102375989B1 * 2017-08-10 2022-03-18 Samsung Electronics Co., Ltd. Image sensor for compensating signal difference between pixels
KR102545173B1 * 2018-03-09 2023-06-19 Samsung Electronics Co., Ltd. Image sensor with phase detection pixels and image pickup device
CN110233962B * 2019-04-26 2021-04-16 Nubia Technology Co., Ltd. Confidence optimization method and device and computer readable storage medium

Also Published As

Publication number Publication date
CN112866546A (en) 2021-05-28
CN112866546B (en) 2022-09-27

Similar Documents

Publication Publication Date Title
CN112866549B (en) Image processing method and apparatus, electronic device, computer-readable storage medium
CN105827922B (en) A camera device and its shooting method
CN112866552B (en) Focusing method and device, electronic device, computer-readable storage medium
US20130128068A1 (en) Methods and Apparatus for Rendering Focused Plenoptic Camera Data using Super-Resolved Demosaicing
WO2020259474A1 (en) Focus tracking method and apparatus, terminal device, and computer-readable storage medium
CN110536057A (en) Image processing method and device, electronic equipment and computer readable storage medium
WO2017120771A1 (en) Depth information acquisition method and apparatus, and image collection device
CN112866553B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866511B (en) Imaging assembly, focusing method and apparatus, electronic device
WO2023016144A1 (en) Focusing control method and apparatus, imaging device, electronic device, and computer readable storage medium
WO2021093502A1 (en) Phase difference obtaining method and apparatus, and electronic device
CN112866675B (en) Depth map generation method and apparatus, electronic device, and computer-readable storage medium
WO2021093637A1 (en) Focusing method and apparatus, electronic device, and computer readable storage medium
US20130169837A1 (en) Device having image reconstructing function, method, and recording medium
CN112866655B (en) Image processing method and device, electronic device, computer-readable storage medium
CN112866545B (en) Focus control method and device, electronic device, computer-readable storage medium
CN112866547B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866546B (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866550B (en) Phase difference acquisition method and apparatus, electronic device, and computer-readable storage medium
CN112866510A (en) Focusing method and device, electronic equipment and computer readable storage medium
CN112866543A (en) Focusing control method and device, electronic equipment and computer readable storage medium
CN112866674B (en) Depth map acquisition method and device, electronic equipment and computer readable storage medium
CN112862880B (en) Depth information acquisition method, device, electronic equipment and storage medium
CN112866554B (en) Focusing method and apparatus, electronic device, computer-readable storage medium
KR20240045876A (en) Imaging method and device for auto focusing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20886746

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20886746

Country of ref document: EP

Kind code of ref document: A1