
WO2018076529A1 - Method, device and terminal for calculating scene depth - Google Patents


Info

Publication number
WO2018076529A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
offset
lens
ois
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2016/112696
Other languages
English (en)
Chinese (zh)
Inventor
唐忠伟
徐荣跃
王运
李邢
李远友
敖欢欢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201680054264.2A (CN108260360B)
Publication of WO2018076529A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the embodiments of the present invention relate to the field of communications, and in particular, to a method, a device, and a terminal for calculating a scene depth of a target scene having a dual camera terminal device.
  • Optical Image Stabilization (OIS) is an important means of improving photograph quality in low light, and is used on more and more mobile phones. OIS compensates for hand shake by moving the lens to achieve image stabilization.
  • In depth calculation, the images are generally corrected by the calibration parameters of the dual camera so that the left and right images provided by the dual camera are aligned in one direction; the parallax is then calculated and converted into the scene depth.
  • However, OIS causes a lens shift, which changes the dual camera calibration parameters and leads to parallax problems (positive and negative parallax exist simultaneously, or the images cannot be aligned in one direction), so the calculated scene depth value is not accurate.
  • The embodiment of the invention provides a scene depth calculation method, which solves the problem of inaccurate scene depth values of the target scene caused by the parallax problem (positive and negative parallax existing simultaneously, or the images failing to align in one direction).
  • A scene depth calculation method comprises: acquiring a lens offset of a camera with an OIS system, wherein the first camera and/or the second camera has an OIS system, and the first camera and the second camera are arranged side by side on the body of the same terminal device; converting the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter; and calculating the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter, and the first image and the second image obtained by the first camera and the second camera respectively acquiring the target scene at the same time;
  • wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera, and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
  • The method may further include: acquiring angular velocity information of the terminal device shake detected by the gyro sensor; converting the angular velocity information into the shake amplitude of the terminal device; and driving the OIS motor to push the lens of the first camera and/or the lens of the second camera according to the shake amplitude, and acquiring the lens offset of the first camera and/or the second camera.
  • When the terminal device shake time and the exposure time are inconsistent and the shake time is greater than the exposure time, multiple lens offsets are acquired, one lens offset is determined from the multiple lens offsets according to a preset rule, and when the scene depth of the target scene is subsequently calculated, the determined lens offset is used for the calculation.
  • The OIS motor sensitivity calibration parameter is determined according to the following steps: pushing the OIS motor through the OIS controller to move the lens to a designated position; photographing after waiting for the OIS motor to stabilize; and, when the captured images reach a preset number, detecting the feature point coordinates of each image and determining the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image.
  • The OIS motor sensitivity calibration parameter is stored in the terminal device before it leaves the factory, so that after the terminal device is shipped, when the scene depth of the target scene is calculated, the stored OIS motor sensitivity calibration parameter is used to convert the lens offset into an image offset.
  • The lens offset is converted into an image offset according to the following formula: Δx = a · ΔC, where Δx is the image offset, a is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
  • The lens offset is converted into the image offset by the OIS motor sensitivity calibration parameter, keeping the units of the camera calibration parameters consistent.
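The conversion is a single multiply; as a minimal sketch (the sensitivity value used is a hypothetical example, not a value from the patent):

```python
# Convert a lens offset reported by the Hall sensor (in 'code' units) into
# an image offset in pixels, using the OIS motor sensitivity calibration
# parameter a (pixels/code). The example value of a is an assumption.

def lens_offset_to_image_offset(delta_c_code: float, a_px_per_code: float) -> float:
    """Delta_x = a * Delta_C : image offset in pixels."""
    return a_px_per_code * delta_c_code

# Hypothetical sensitivity of 0.5 px/code; a 12-code lens shift -> 6 px.
dx_px = lens_offset_to_image_offset(12.0, 0.5)
```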
  • According to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter, and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, the scene depth of the target scene is specifically calculated by:
  • Z = f · B₁ / ((x₁ − a₁ · Δ₁ · pixel pitch − u₁) − (x₂ − u₂))
  • where Z is the scene depth of the target scene; f is the focal length; a₁ is the OIS motor sensitivity calibration parameter of the first camera; Δ₁ is the lens offset of the first camera; pixel pitch is the size of a pixel; a₁ · Δ₁ = Δx₁ is the first image offset, and a₁ · Δ₁ · pixel pitch converts the first image offset from pixels to mm; after compensation the principal point of the first camera changes from u₁′ to u₁ and the baseline B′ becomes B₁; the principal point of the second camera is u₂; x₁ is the imaging point of the first image and x₂ is the imaging point of the second image.
  • When the first camera and the second camera both have an OIS system, calculating the scene depth of the target scene according to the compensated first camera calibration parameter and the compensated second camera calibration parameter, and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, includes:
  • Z = f · B₂ / ((x₁ − a₁ · Δ₁ · pixel pitch − u₁) − (x₂ − a₂ · Δ₂ · pixel pitch − u₂))
  • where a₁ is the OIS motor sensitivity calibration parameter of the first camera, Δ₁ is the lens offset of the first camera, a₁ · Δ₁ = Δx₁ is the first image offset, and a₁ · Δ₁ · pixel pitch converts the first image offset from pixels to mm; a₂ is the OIS motor sensitivity calibration parameter of the second camera, Δ₂ is the lens offset of the second camera, a₂ · Δ₂ = Δx₂ is the second image offset, and a₂ · Δ₂ · pixel pitch converts the second image offset from pixels to mm; after compensation the principal point of the first camera changes from u₁′ to u₁, the principal point of the second camera changes from u₂′ to u₂, and the baseline B′ becomes B₂; x₁ is the imaging point of the first image and x₂ is the imaging point of the second image.
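Put together, the compensated triangulation step can be sketched as follows; the placement of the offset terms in the disparity is an assumption consistent with the definitions above, and every numeric value is made up for illustration:

```python
# Compensated stereo depth: Z = f * B / d, where each imaging coordinate is
# corrected by its camera's OIS image offset converted to mm. All quantities
# are in mm; off1_mm / off2_mm stand for a1*Δ1*pixel_pitch and
# a2*Δ2*pixel_pitch (zero for a camera without OIS).

def scene_depth_mm(f_mm, baseline_mm, x1_mm, u1_mm, x2_mm, u2_mm,
                   off1_mm=0.0, off2_mm=0.0):
    disparity = (x1_mm - off1_mm - u1_mm) - (x2_mm - off2_mm - u2_mm)
    return f_mm * baseline_mm / disparity

# Example: f = 4 mm, baseline = 10 mm, corrected disparity = 0.02 mm -> ~2 m.
z = scene_depth_mm(4.0, 10.0, x1_mm=0.51, u1_mm=0.50, x2_mm=0.49, u2_mm=0.50)
```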
  • An embodiment of the present invention provides a scene depth calculation device, where the device includes a first acquisition unit, a processing unit, and a calculation unit. The first acquisition unit is configured to acquire the lens offset of a camera with an OIS system, wherein the first camera and/or the second camera is provided with an OIS system, and the first camera and the second camera are juxtaposed on the body of the same terminal device. The processing unit is configured to convert the lens offset into an image offset according to the preset OIS motor sensitivity calibration parameter. The calculation unit is configured to calculate the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter, and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time; wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera, and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
  • The device further includes a second acquisition unit, where the second acquisition unit is configured to: acquire the angular velocity information of the terminal device shake detected by the gyro sensor; convert the angular velocity information into the shake amplitude of the terminal device; drive the OIS motor to move the lens of the first camera and/or the second camera according to the shake amplitude; and acquire the lens offset of the first camera and/or the second camera.
  • When the terminal device shake time and the exposure time are inconsistent and the shake time is greater than the exposure time, multiple lens offsets are acquired, one lens offset is determined from the multiple lens offsets according to a preset rule, and when the scene depth of the target scene is subsequently calculated, the determined lens offset is used for the calculation.
  • The device further includes a determining unit, where the determining unit is specifically configured to: push the OIS motor through the OIS controller to move the lens to the designated position; photograph after waiting for the OIS motor to stabilize; and, when the captured images reach a preset number, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image.
  • The OIS motor sensitivity calibration parameter is stored in the terminal device before it leaves the factory, so that after the terminal device is shipped, when the scene depth of the target scene is calculated, the stored OIS motor sensitivity calibration parameter is used to convert the lens offset into an image offset.
  • The processing unit is specifically configured to convert the lens offset into the image offset according to the formula Δx = a · ΔC, where Δx is the image offset, a is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset; converting the lens offset into the image offset by the OIS motor sensitivity calibration parameter keeps the units of the camera calibration parameters consistent.
  • The calculating unit is specifically configured to calculate:
  • Z = f · B₁ / ((x₁ − a₁ · Δ₁ · pixel pitch − u₁) − (x₂ − u₂))
  • where Z is the scene depth of the target scene; f is the focal length; a₁ is the OIS motor sensitivity calibration parameter of the first camera; Δ₁ is the lens offset of the first camera; pixel pitch is the size of a pixel; a₁ · Δ₁ = Δx₁ is the first image offset, and a₁ · Δ₁ · pixel pitch converts the first image offset from pixels to mm; after compensation the principal point of the first camera changes from u₁′ to u₁ and the baseline B′ becomes B₁; the principal point of the second camera is u₂; x₁ is the imaging point of the first image and x₂ is the imaging point of the second image.
  • Alternatively, the calculating unit is specifically configured to calculate:
  • Z = f · B₂ / ((x₁ − a₁ · Δ₁ · pixel pitch − u₁) − (x₂ − a₂ · Δ₂ · pixel pitch − u₂))
  • where a₁ is the OIS motor sensitivity calibration parameter of the first camera, Δ₁ is the lens offset of the first camera, a₁ · Δ₁ = Δx₁ is the first image offset, and a₁ · Δ₁ · pixel pitch converts the first image offset from pixels to mm; a₂ is the OIS motor sensitivity calibration parameter of the second camera, Δ₂ is the lens offset of the second camera, a₂ · Δ₂ = Δx₂ is the second image offset, and a₂ · Δ₂ · pixel pitch converts the second image offset from pixels to mm; after compensation the principal point of the first camera changes from u₁′ to u₁ and the principal point of the second camera changes from u₂′ to u₂; x₁ is the imaging point of the first image and x₂ is the imaging point of the second image.
  • An embodiment of the present invention provides a terminal, where the terminal includes: a first camera and a second camera, used to acquire at least one target scene at the same time to obtain a first image and a second image respectively, wherein the first camera and/or the second camera is provided with an OIS system, and the first camera and the second camera are juxtaposed on the body of the same terminal device; a memory, configured to store the first image and the second image; and a processor, configured to acquire the lens offset of the camera with the OIS system, convert the lens offset into an image offset according to the preset OIS motor sensitivity calibration parameter, and calculate the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image acquired from the memory; wherein the calibration parameter of the first camera is compensated according to the lens offset of the first camera, and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
  • The lens offset is used to compensate for the change of the camera calibration parameters caused by the shake of the terminal device, which solves the parallax problem; the compensated camera calibration parameters are then used to calculate the scene depth of the target scene, so the calculated scene depth value is more accurate.
  • The OIS system is specifically configured to: acquire the angular velocity information of the terminal device shake detected by the gyro sensor; convert the angular velocity information into the shake amplitude of the terminal device; drive the OIS motor to move the lens of the first camera and/or the second camera according to the shake amplitude; and acquire the lens offset of the first camera and/or the second camera.
  • When the terminal device shake time and the exposure time are inconsistent and the shake time is greater than the exposure time, multiple lens offsets are acquired, one lens offset is determined from the multiple lens offsets according to a preset rule, and when the scene depth of the target scene is subsequently calculated, the determined lens offset is used for the calculation.
  • The processor is further configured to: push the OIS motor through the OIS controller to move the lens to the designated position; photograph after waiting for the OIS motor to stabilize; and, when the captured images reach a preset number, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the designated position of the lens and the feature point coordinates of each image;
  • the memory is further configured to store the OIS motor sensitivity calibration parameter.
  • The OIS motor sensitivity calibration parameter is stored in the terminal before it leaves the factory, so that after the terminal is shipped, when the scene depth of the target scene is calculated, the stored OIS motor sensitivity calibration parameter is used to convert the lens offset into an image offset.
  • The processor is specifically configured to convert the lens offset into an image offset according to the formula Δx = a · ΔC, where Δx is the image offset, a is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset; converting the lens offset into the image offset by the OIS motor sensitivity calibration parameter keeps the units of the camera calibration parameters consistent.
  • The processor is specifically configured to determine the scene depth of the target scene by:
  • Z = f · B₁ / ((x₁ − a₁ · Δ₁ · pixel pitch − u₁) − (x₂ − u₂))
  • where Z is the scene depth of the target scene; f is the focal length; a₁ is the OIS motor sensitivity calibration parameter of the first camera; Δ₁ is the lens offset of the first camera; pixel pitch is the size of a pixel; a₁ · Δ₁ = Δx₁ is the first image offset, and a₁ · Δ₁ · pixel pitch converts the first image offset from pixels to mm; after compensation the principal point of the first camera changes from u₁′ to u₁ and the baseline B′ becomes B₁; the principal point of the second camera is u₂; x₁ is the imaging point of the first image and x₂ is the imaging point of the second image.
  • Alternatively, the processor is specifically configured to determine the scene depth of the target scene by the following formula:
  • Z = f · B₂ / ((x₁ − a₁ · Δ₁ · pixel pitch − u₁) − (x₂ − a₂ · Δ₂ · pixel pitch − u₂))
  • where a₁ is the OIS motor sensitivity calibration parameter of the first camera, Δ₁ is the lens offset of the first camera, a₁ · Δ₁ = Δx₁ is the first image offset, and a₁ · Δ₁ · pixel pitch converts the first image offset from pixels to mm; a₂ is the OIS motor sensitivity calibration parameter of the second camera, Δ₂ is the lens offset of the second camera, a₂ · Δ₂ = Δx₂ is the second image offset, and a₂ · Δ₂ · pixel pitch converts the second image offset from pixels to mm; after compensation the principal point of the first camera changes from u₁′ to u₁ and the principal point of the second camera changes from u₂′ to u₂; x₁ is the imaging point of the first image and x₂ is the imaging point of the second image.
  • FIG. 1 is a block diagram showing the working principle of the OIS system
  • Figure 2 is a block diagram of a depth calculation system
  • FIG. 3 is a flowchart of a method for calculating a depth of a scene according to Embodiment 1 of the present invention
  • Figure 4a is a schematic diagram of a lens offset scene
  • Figure 4b is a schematic diagram of imaging changes before and after lens shift
  • Figure 4c is a flow chart for determining the OIS motor sensitivity calibration parameters
  • FIG. 5a is a schematic diagram of scene depth calculation according to an embodiment of the present invention.
  • FIG. 5b is still another schematic diagram of scene depth calculation according to an embodiment of the present invention.
  • Figure 6a is a schematic diagram of an image taken when compensating the calibration parameters of the dual camera
  • Figure 6b is a schematic diagram of an image taken after the calibration parameters of the dual camera are not compensated
  • Figure 6c is a partial enlarged view of Figure 6a
  • Figure 6d is a partial enlarged view of Figure 6b;
  • Figure 7a is a schematic diagram of the depth of the scene when the calibration parameters of the dual camera are not compensated
  • Figure 7b is a schematic diagram of the depth of the scene after compensating the calibration parameters of the dual camera
  • FIG. 8 is a schematic structural diagram of a scene depth calculation apparatus according to Embodiment 2 of the present invention.
  • FIG. 9 is a schematic structural diagram of still another scene depth computing apparatus according to Embodiment 2 of the present invention.
  • FIG. 10 is a schematic structural diagram of a terminal according to Embodiment 3 of the present invention.
  • The terminal device may be a device having a dual camera, including but not limited to a camera (such as a digital camera), a video camera, a mobile phone (such as a smartphone), a tablet (Pad), a personal digital assistant (PDA), a portable device (for example, a portable computer), a wearable device, and the like; this is not specifically limited in the embodiments of the present invention.
  • For example, the terminal device may be a mobile phone; the following uses a mobile phone as an example to set forth the embodiments of the present invention.
  • The dual camera simulates the human binocular vision principle to perceive distance: an object is observed from two points, and images are acquired at different viewing angles; according to the pixel matching relationship between the images, the offset between pixels is calculated by the principle of triangulation to obtain the scene depth of the object.
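As a minimal illustration of the triangulation principle (all numbers are made up):

```python
# Basic stereo triangulation for a rectified image pair: depth is inversely
# proportional to the pixel disparity between matched points.
# Units: focal length in pixels, baseline in mm, so depth comes out in mm.

def depth_from_disparity(f_px: float, baseline_mm: float, disparity_px: float) -> float:
    return f_px * baseline_mm / disparity_px

# 1000 px focal length, 10 mm baseline, 5 px disparity -> 2000 mm (2 m).
z_mm = depth_from_disparity(1000.0, 10.0, 5.0)
```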
  • OIS causes the lens to shift, which changes the dual camera calibration parameters and causes parallax problems, which in turn makes the scene depth calculation inaccurate. Therefore, it is necessary to compensate the dual camera calibration parameters so that the scene depth of the target scene is calculated accurately.
  • FIG. 1 is a block diagram of the working principle of the OIS system.
  • the terminal device includes an OIS system 100 and an Image Signal Processor (ISP) 110.
  • the OIS system 100 includes an OIS controller 120, a gyro sensor 130, a Hall sensor 140, a motor 150, and a camera 160.
  • the camera 160 includes a first camera and a second camera.
  • the first camera and the second camera may be juxtaposed in front of the terminal device, or may be juxtaposed on the back of the terminal device, and may be arranged in a horizontal arrangement or a vertical arrangement.
  • the first camera and/or the second camera are provided with an OIS system, and the first camera and the second camera respectively have lenses (not shown in FIG. 1).
  • the Hall sensor 140 is a magnetic field sensor that performs displacement measurement based on the Hall effect for acquiring the lens shift amount of the camera with the OIS system, that is, the lens shift amount of the first camera and/or the second camera.
  • The gyro sensor 130 tracks the movement of the terminal device in free space and acquires angular velocity information when the terminal device shakes.
  • the OIS controller 120 acquires angular velocity information from the gyro sensor 130, converts the angular velocity information into a jitter amplitude of the terminal device, and transmits the jitter amplitude as a reference signal to the motor 150.
  • The motor 150 may be an OIS motor for driving the lens of the camera with the OIS system to move according to the shake amplitude, to ensure the sharpness of the image; the movement refers to movement in the X and/or Y direction, where the X direction is perpendicular to the light passing through the lens and perpendicular to the Y direction.
  • the OIS controller 120 also acquires the first image and the second image obtained by acquiring the target scene at the same time from the first camera and the second camera.
  • the ISP 110 stores the lens shift amount, the first image, and the second image acquired from the OIS controller 120.
  • The terminal device performs initialization and, when ready, the OIS controller 120 controls the shutter to acquire an image.
  • The terminal device may shake; the OIS controller 120 reads the angular velocity information detected by the gyro sensor 130, converts it into the shake amplitude of the terminal device, and sends it as a reference signal to the OIS motor; the OIS motor moves the lens of the camera with the OIS system according to the shake amplitude, avoiding blurring of the captured image caused by the shake of the terminal device and ensuring the sharpness of the image.
  • the movement may be that the lens of the first camera moves in the X and/or Y direction and/or the lens of the second camera moves in the X and/or Y direction.
  • The OIS controller 120 reads the lens offset of the camera with the OIS system detected by the Hall sensor 140, that is, the lens offset of the first camera and/or the second camera, acquires the captured images from the cameras, that is, the first image and the second image obtained by the first camera and the second camera respectively acquiring the target scene at the same time, and sends the lens offset and the captured images to the ISP 110.
  • the ISP 110 stores the lens shift amount and the first image and the second image captured by the camera.
  • The terminal device shake time is generally greater than its exposure time; for example, the shake duration is 30 ms and the exposure time is 2 ms. During the shake, the Hall sensor 140 acquires 15 lens offsets; the OIS controller 120 reads the 15 lens offsets from the Hall sensor 140 and determines one lens offset from them according to a preset rule, and in the subsequent process the determined lens offset is used as the lens offset described in the context to perform the scene depth calculation of the target scene.
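The text does not spell out the preset rule for choosing among the sampled offsets; the median, which is robust to outlier samples, is one plausible choice sketched here as an assumption:

```python
# Choose one representative lens offset from the offsets sampled by the
# Hall sensor during a shake. The median rule is an assumption; the patent
# only says one value is picked "according to a preset rule".
import statistics

def pick_lens_offset(samples_code):
    return statistics.median(samples_code)

samples = [118, 120, 119, 500, 121]  # hypothetical codes, one outlier
chosen = pick_lens_offset(samples)   # 120
```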
  • FIG. 2 is a block diagram of the depth calculation system.
  • the depth calculation system includes an ISP 110 and a depth calculation module 210.
  • The depth calculation module 210 acquires the preset calibration information from the ISP 110, reads the stored OIS information and the first image and the second image obtained by the first camera and the second camera acquiring the target scene at the same time, calculates the scene depth of the target scene, and outputs a disparity map/depth map.
  • The calibration information comprises the camera calibration parameters determined at initialization, such as the focal length, baseline, optical center, and principal point;
  • the OIS information is the lens offset.
  • The depth calculation module acquires the lens offset. Since the unit of the lens offset is code and the unit of the scene depth is millimetres (mm), the two are inconsistent; the lens offset therefore needs to be converted into an image offset, in pixels, according to the OIS motor sensitivity calibration parameter. The camera calibration parameters are then compensated by the lens offset, and the scene depth value of the target scene is calculated from the compensated camera calibration parameters, making the calculated scene depth value more accurate.
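The unit chain described here (code → pixels → mm) can be sketched as follows; the sensitivity and pixel-pitch values are hypothetical:

```python
# Lens offset in Hall-sensor 'code' units -> image offset in pixels via the
# sensitivity parameter a -> millimetres via the pixel pitch.
# Assumed example values: a = 0.5 px/code, 2 µm (0.002 mm) pixels.

def image_offset_px(lens_offset_code: float, a_px_per_code: float) -> float:
    return a_px_per_code * lens_offset_code

def image_offset_mm(offset_px: float, pixel_pitch_mm: float) -> float:
    return offset_px * pixel_pitch_mm

px = image_offset_px(100.0, 0.5)   # 100 code * 0.5 px/code = 50 px
mm = image_offset_mm(px, 0.002)    # 50 px * 0.002 mm/px = 0.1 mm
```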
  • FIG. 3 is a flowchart of a method for calculating a scene depth according to Embodiment 1 of the present invention. As shown in FIG. 3, the method includes:
  • the lens shift amount can be obtained by Hall sensor detection.
  • If the lens offset is greater than a preset threshold, the lens offset is abnormal; if the lens offset is not greater than the preset threshold, the lens offset is not abnormal.
  • S340: converting the lens offset into an image offset (see Δx in Fig. 4b).
  • S330 needs to be executed in advance, that is, inputting the OIS motor sensitivity calibration parameter, i.e., the image offset caused by a unit lens offset; each camera with an OIS system has its corresponding OIS motor sensitivity calibration parameter, pre-stored before the terminal device leaves the factory. Using the OIS motor sensitivity calibration parameter, the lens offset can be converted into an image offset.
  • When the lens shifts, some calibration parameters, such as the optical center, the principal point, and the baseline, change. The image offset calculated from the lens offset is used to compensate the changed camera calibration parameters, that is, to determine their values after the change. Referring to Fig. 4b, the optical center changes from C′ to C and the principal point changes from u′ to u. Referring to Fig. 5a, the optical center of the first camera lens changes from C₁′ to C₁, the principal point changes from u₁′ to u₁, and the baseline changes from B′ to B₁. Referring to Fig. 5b, the optical center of the first camera lens changes from C₁′ to C₁ and the principal point from u₁′ to u₁; the optical center of the second camera lens changes from C₂′ to C₂ and the principal point from u₂′ to u₂; the baseline changes from B′ to B₂.
  • S380: Calculate the scene depth of the target scene; see Equation 2 and Equation 4 for the calculation formulas. Before executing S380 it is necessary to perform S360, that is, inputting the first image, and S370, that is, inputting the second image.
  • the scene depth of the target scene is determined according to the compensated camera calibration parameters, the first image, and the second image.
  • the first camera and the second camera acquire the first image and the second image respectively obtained by the target scene at the same time.
  • S390 Output a scene depth of the target scene.
  • Figure 4a is a schematic diagram of a lens offset scene.
  • The OIS motor pushes the lens from the position of the elliptical dotted line to the designated positions (xᵢ, yᵢ) in turn to capture images of a fixed chart, and the image sensor converts the optical image acquired by the camera into an electronic signal; the lens offset can be determined from the lens positions before and after the movement, and the image offset can be determined from the images of the two fixed charts.
  • Figure 4b is a schematic diagram of imaging changes before and after lens shift.
  • Take the case where the OIS motor pushes the lens of a camera in the X direction as an example. The camera calibration parameters are: focal length f, optical center C′, and principal point u′. After the lens moves, part of the camera calibration parameters change: the optical center changes from C′ to C and the principal point changes from u′ to u. The imaging points before and after the lens movement are x′ and x respectively; ΔC is the distance between optical center C′ and optical center C, that is, the lens offset, in code; Δx is the distance between imaging point x′ and imaging point x, that is, the image offset, in pixels.
  • the image offset caused by the unit lens offset can be measured, that is, the OIS motor sensitivity calibration parameter.
  • the actual image offset can be calculated based on the lens offset during subsequent shooting to compensate for the camera calibration parameters when the terminal is shaken.
  • When determining the OIS motor sensitivity calibration parameter, the relationship between the lens offset ΔC and the image offset Δx is assumed to be linear, so the calibration parameter can be obtained as η = Δx / ΔC, where the unit of η is pixels/code. In practice, ΔC and Δx are not strictly linear, so higher-order models, such as second-order, third-order or higher, can be used, together with more captured images, to obtain more accurate OIS motor sensitivity calibration parameters.
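The fitting described above can be sketched as follows; the measurement values, the choice of a third-order model, and the use of numpy are illustrative assumptions, not part of the patent.

```python
import numpy as np

# Hypothetical calibration measurements: OIS motor positions (code units)
# and the image offsets (pixels) of chart feature points at each position.
lens_offsets_code = np.array([0.0, 50.0, 100.0, 150.0, 200.0])
image_offsets_px = np.array([0.0, 6.1, 12.0, 18.2, 24.1])

# First-order model: eta is the slope of image offset vs. lens offset,
# i.e. the OIS motor sensitivity calibration parameter in pixels/code.
eta = np.polyfit(lens_offsets_code, image_offsets_px, 1)[0]

# Higher-order model: a third-order polynomial captures residual
# non-linearity between lens offset and image offset.
coeffs = np.polyfit(lens_offsets_code, image_offsets_px, 3)
predicted_px = np.polyval(coeffs, lens_offsets_code)
```

With more measurement points, the higher-order coefficients describe the deviation from the linear model; with exactly linear data they shrink toward zero and the two models coincide.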
  • Figure 4c is a flow chart for determining the OIS motor sensitivity calibration parameter. As shown in Figure 4c, the process includes the following steps.
  • With only a single pair of images, the determined OIS motor sensitivity calibration parameter may have a relatively large error, so multiple images may be taken to improve its accuracy.
  • The feature point coordinates in the images captured before and after the lens movement are detected to obtain the image offset. The lens offset is determined from the moving distance of the lens, and the OIS motor sensitivity calibration parameter is determined from the image offset and the lens offset.
  • S450: Store the OIS motor sensitivity calibration parameter in the terminal device.
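The image-offset measurement in the steps above can be sketched as follows. A real calibration detects chart feature points in each captured image; the simplified version here estimates the X-direction shift by cross-correlating one-dimensional intensity profiles of the two chart images, an approximation assumed for brevity.

```python
import numpy as np

def measure_shift_px(profile_before, profile_after):
    """Estimate the X-direction image offset (in pixels) between two shots
    of a fixed chart from their column-intensity profiles."""
    a = profile_before - profile_before.mean()
    b = profile_after - profile_after.mean()
    # The lag that maximizes the cross-correlation is the shift of the
    # second profile relative to the first.
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

# Example: a synthetic chart feature shifted right by 5 pixels.
before = np.zeros(100)
before[30:40] = 1.0
after = np.roll(before, 5)
shift = measure_shift_px(before, after)
```

The estimated shift, divided by the commanded lens movement in code units, gives one sample of the sensitivity parameter.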
  • The OIS motor sensitivity calibration parameter is stored in the terminal device so that, after the device leaves the factory, when the cameras capture the target scene, the lens offset is converted into an image offset according to the pre-stored OIS motor sensitivity calibration parameter and the camera calibration parameters are compensated by the lens offset, making the calculated scene depth of the target scene more accurate.
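A minimal sketch of this run-time use of the stored parameter, assuming the linear model Δx = η · ΔC and a sign convention in which a positive lens offset shifts the principal point in the positive pixel direction (both are assumptions for illustration):

```python
def image_offset_px(eta_px_per_code, lens_offset_code):
    """Convert a lens offset reported by the OIS driver (in code units)
    into an image offset (in pixels) using the stored sensitivity eta."""
    return eta_px_per_code * lens_offset_code

def compensate_principal_point(u_prime_px, eta_px_per_code, lens_offset_code):
    """Compensate the factory-calibrated principal point u' into u by the
    image offset caused by the lens movement (sign convention assumed)."""
    return u_prime_px + image_offset_px(eta_px_per_code, lens_offset_code)
```

For example, with η = 0.12 pixels/code, a lens offset of 100 code shifts the image by 12 pixels, and the principal point is compensated by the same amount before depth is computed.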
  • FIG. 5a is a schematic diagram of scene depth calculation according to an embodiment of the present invention, in which the first camera has an OIS system, the second camera does not, and the lens of the first camera moves in the X direction.
  • The lens of each camera is a convex lens; an incident ray parallel to the main optical axis and its conjugate (refracted) ray intersect the main optical axis at a point called the focal point of the convex lens. The distance from the lens to the focal plane or imaging plane (such as a CCD) is called the focal length, the point at the center of the lens is called the optical center, the intersection of the main optical axis with the imaging plane (such as film or a CCD) is called the principal point, and the distance between the first camera lens and the second camera lens is called the baseline.
  • Before the lens of the first camera moves, the focal length is f, the optical center of the first camera lens is C 1 ' and its principal point is u 1 ', the optical center of the second camera lens is C 2 and its principal point is u 2 , and the baseline is B'.
  • If the scene depth calculation is still based on the uncompensated camera calibration parameters, the principal point u 1 ' and the baseline B', the calculated scene depth Z' has a large error.
  • When the shutter is pressed, jitter of the terminal device can cause some of the camera calibration parameters to change: the OIS motor pushes the lens of the first camera to move according to the jitter amplitude, so the optical center of the first camera lens changes from C 1 ' to C 1 (the distance between them is the lens offset of the first camera, in code), the principal point changes from u 1 ' to u 1 , and the baseline changes from B' to B 1 ; u 1 and B 1 therefore need to be calculated.
  • The camera calibration parameters are compensated by the lens offset to determine the values of the compensated camera calibration parameters, and the calculated scene depth Z is:
  • where a 1 is the OIS motor sensitivity calibration parameter of the first camera, Δ 1 is the lens offset of the first camera, pixel pitch is the size of one pixel, Δx 1 = a 1 · Δ 1 is the first image offset (in pixels), and a 1 · Δ 1 · pixel pitch converts the first image offset from pixels to mm.
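The depth equation itself appears as an image in the original publication, so the following is only an assumed sketch of the computation for FIG. 5a: the principal point and the baseline of the first camera are shifted by the first image offset a 1 · Δ 1 before the usual depth-from-disparity relation Z = f · B / d is applied. The function name, argument layout, and sign conventions are assumptions.

```python
def depth_one_ois(f_mm, baseline_mm, pixel_pitch_mm,
                  x1_px, x2_px, u1_prime_px, u2_px, a1, delta1):
    """Scene depth (mm) when only the first camera has an OIS system.

    a1: OIS motor sensitivity of the first camera (pixels/code)
    delta1: lens offset of the first camera (code)
    """
    dx1 = a1 * delta1                              # first image offset, pixels
    u1 = u1_prime_px + dx1                         # compensated principal point u1
    b1 = baseline_mm + dx1 * pixel_pitch_mm        # compensated baseline B1 (sign assumed)
    disparity_px = (x1_px - u1) - (x2_px - u2_px)  # disparity with compensation
    return f_mm * b1 / (disparity_px * pixel_pitch_mm)
```

With a zero lens offset this reduces to the ordinary stereo relation Z = f · B' / d.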
  • FIG. 5b is another schematic diagram of scene depth calculation according to an embodiment of the present invention, in which the first camera and the second camera are both provided with an OIS system and both lenses move in the X direction.
  • Before the lenses move, the focal length is f, the optical center of the first camera lens is C 1 ' and its principal point is u 1 ', the optical center of the second camera lens is C 2 ' and its principal point is u 2 ', and the baseline is B'. The imaging point of the first image acquired by the first camera is x 1 , and the imaging point of the second image acquired by the second camera is x 2 .
  • If the depth calculation is still based on the uncompensated camera calibration parameters, the principal points u 1 ' and u 2 ' and the baseline B', the calculated scene depth Z' has a large error.
  • Terminal device shake causes some calibration parameters of the cameras to change: the optical center of the first camera lens changes from C 1 ' to C 1 and its principal point from u 1 ' to u 1 ; the optical center of the second camera lens changes from C 2 ' to C 2 (the distance between them is the lens offset of the second camera, in code) and its principal point from u 2 ' to u 2 ; and the baseline changes from B' to B 2 . Therefore u 1 , u 2 , and B 2 need to be calculated.
  • where a 1 is the OIS motor sensitivity calibration parameter of the first camera, Δ 1 is the lens offset of the first camera, Δx 1 = a 1 · Δ 1 is the first image offset, a 1 · Δ 1 · pixel pitch converts the first image offset from pixels to mm, a 2 is the OIS motor sensitivity calibration parameter of the second camera, Δ 2 is the lens offset of the second camera, Δx 2 = a 2 · Δ 2 is the second image offset, and a 2 · Δ 2 · pixel pitch converts the second image offset from pixels to mm.
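As with FIG. 5a, the equation for FIG. 5b is rendered as an image in the original, so the sketch below shows only an assumed form of the dual-OIS computation: each camera's principal point is shifted by its own image offset, and the baseline by the difference of the two image offsets expressed in mm. All names and sign conventions are assumptions.

```python
def depth_two_ois(f_mm, baseline_mm, pixel_pitch_mm,
                  x1_px, x2_px, u1_prime_px, u2_prime_px,
                  a1, delta1, a2, delta2):
    """Scene depth (mm) when both cameras have OIS systems."""
    dx1 = a1 * delta1                     # first image offset, pixels
    dx2 = a2 * delta2                     # second image offset, pixels
    u1 = u1_prime_px + dx1                # compensated principal point u1
    u2 = u2_prime_px + dx2                # compensated principal point u2
    b2 = baseline_mm + (dx1 - dx2) * pixel_pitch_mm  # compensated baseline B2
    disparity_px = (x1_px - u1) - (x2_px - u2)
    return f_mm * b2 / (disparity_px * pixel_pitch_mm)
```

When both lenses shift by the same amount, the compensated baseline equals B' and only the principal points move.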
  • Although FIG. 5a and FIG. 5b illustrate camera calibration parameter compensation by taking lens movement in one direction as an example, it should be understood that compensation of the calibration parameters of the two cameras can likewise be realized when the lenses move in two directions; details are not repeated here.
  • Figure 6a is a schematic diagram of an image taken with the dual-camera calibration parameters compensated, Figure 6b is a schematic diagram of an image taken without compensating the dual-camera calibration parameters, Figure 6c is a partial enlarged view of Figure 6a, and Figure 6d is a partial enlarged view of Figure 6b. It can be seen from Figures 6a-6d that the image alignment is poor when the dual-camera calibration parameters are not compensated and good after they are compensated.
  • Figure 7a is a schematic diagram of the scene depth when the dual-camera calibration parameters are not compensated, and Figure 7b is a schematic diagram of the scene depth after the calibration parameters are compensated. Different depth values are represented by different colors, and black indicates that the scene depth cannot be calculated. In Figure 7a, the depth measured for the scene at 1000 mm is 1915.8 mm and the depth measured at 300 mm is 344.6 mm; in Figure 7b, the depth measured at 1000 mm is 909.6 mm and the depth measured at 300 mm is 287.4 mm. It follows that the calculated scene depth values are more accurate after the dual-camera calibration parameters are compensated.
  • The scene depth calculation method for a terminal with dual cameras provided by the embodiments of the present invention thus solves the problem of inaccurate scene depth values caused by the parallax problem.
  • FIG. 8 is a schematic structural diagram of a scene depth calculation apparatus according to Embodiment 2 of the present invention.
  • the scene depth calculation apparatus 800 includes a first acquisition unit 810, a processing unit 820, and a calculation unit 830.
  • the first acquiring unit 810 is configured to acquire the lens offset of the camera with the OIS system, where the first camera and/or the second camera have an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device.
  • the processing unit 820 is configured to convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter.
  • the calculating unit 830 is configured to calculate the scene depth according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image obtained by the first camera and the second camera capturing the target scene at the same time, where the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
  • the processing unit 820 is specifically configured to convert the lens offset into an image offset according to the formula Δx = η · ΔC, where Δx is the image offset, η is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
  • the calculating unit 830 is specifically configured to determine the scene depth of the target scene by using the following formula, where Z is the scene depth of the target scene, f is the focal length, a 1 is the OIS motor sensitivity calibration parameter of the first camera, Δ 1 is the lens offset of the first camera, pixel pitch is the size of one pixel, Δx 1 = a 1 · Δ 1 is the first image offset, a 1 · Δ 1 · pixel pitch converts the first image offset from pixels to mm, the principal point of the first camera changes from u 1 ' to u 1 after compensation, the baseline changes from B' to B 1 , the principal point of the second camera is u 2 , x 1 is the imaging point of the first image, and x 2 is the imaging point of the second image.
  • the calculating unit 830 is further specifically configured to determine the scene depth when both cameras have OIS systems, where a 1 is the OIS motor sensitivity calibration parameter of the first camera, Δ 1 is the lens offset of the first camera, Δx 1 = a 1 · Δ 1 is the first image offset, a 1 · Δ 1 · pixel pitch converts the first image offset from pixels to mm, a 2 is the OIS motor sensitivity calibration parameter of the second camera, Δ 2 is the lens offset of the second camera, Δx 2 = a 2 · Δ 2 is the second image offset, a 2 · Δ 2 · pixel pitch converts the second image offset from pixels to mm, the principal point of the first camera changes from u 1 ' to u 1 after compensation, the principal point of the second camera changes from u 2 ' to u 2 after compensation, x 1 is the imaging point of the first image, and x 2 is the imaging point of the second image.
  • FIG. 9 is a schematic structural diagram of still another scene depth computing apparatus according to Embodiment 2 of the present invention.
  • On the basis of the apparatus 800 shown in FIG. 8, the scene depth computing apparatus 900 may further include a second acquiring unit 910 and a determining unit 920.
  • the second acquiring unit 910 is configured to acquire the angular velocity information of terminal device shake detected by a gyro sensor, convert the angular velocity information into a shake amplitude of the terminal device, drive the OIS motor to push the lens of the first camera and/or the lens of the second camera to move according to the shake amplitude, and acquire the lens offset of the first camera and/or the second camera.
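How the angular velocity becomes a lens displacement can be sketched as follows; the single-sample integration and the model shift = f · tan(θ) for a distant scene point are simplifying assumptions (a real OIS controller runs a closed loop at kilohertz rates and commands the motor in code units rather than mm).

```python
import math

def lens_shift_mm(omega_rad_s, dt_s, focal_mm):
    """Integrate one gyro sample into a shake angle, then into the lens
    shift that keeps the image of a distant point stationary."""
    theta_rad = omega_rad_s * dt_s       # shake angle over one sample interval
    return focal_mm * math.tan(theta_rad)
```

For example, an angular velocity of 0.1 rad/s sampled over 10 ms with a 4 mm focal length calls for a lens shift of about 4 µm.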
  • the determining unit 920 is configured to push the OIS motor through the OIS controller to move the lens to a designated position, wait for the OIS motor to stabilize and take a picture, and, when the number of captured images reaches a preset number, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the designated positions of the lens and the feature point coordinates of the images.
  • FIG. 10 is a schematic structural diagram of a terminal according to Embodiment 3 of the present invention.
  • the terminal 1000 includes a camera 1010 (the camera 1010 includes a first camera and a second camera), a processor 1020, a memory 1030, and a system bus; the camera 1010, the processor 1020, and the memory 1030 are connected through the system bus.
  • the first camera and the second camera are configured to capture the target scene at the same time to obtain the first image and the second image respectively, where the first camera and/or the second camera have an OIS system and the first camera and the second camera are arranged side by side on the body of the same terminal device.
  • the memory 1030 is configured to store the first image and the second image.
  • the processor 1020 is configured to acquire the lens offset of the camera with the OIS system, convert the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter, and calculate the scene depth of the target scene according to the compensated first camera calibration parameter and/or the compensated second camera calibration parameter and the first image and the second image acquired from the memory, where the calibration parameter of the first camera is compensated according to the lens offset of the first camera and the calibration parameter of the second camera is compensated according to the lens offset of the second camera.
  • the OIS system is specifically configured to acquire the angular velocity information of terminal device shake detected by the gyro sensor, convert the angular velocity information into a shake amplitude of the terminal device, drive the OIS motor to push the lens of the first camera and/or the lens of the second camera to move according to the shake amplitude, and acquire the lens offset of the first camera and/or the second camera.
  • the processor 1020 is further configured to push the OIS motor through the OIS controller to move the lens to a designated position, wait for the OIS motor to stabilize and take a picture, and, when the number of captured images reaches a preset number, detect the feature point coordinates of each image and determine the OIS motor sensitivity calibration parameter according to the designated positions of the lens and the feature point coordinates of the respective images.
  • the memory 1030 is further configured to store the OIS motor sensitivity calibration parameter.
  • the processor 1020 is specifically configured to convert the lens offset into an image offset according to the formula Δx = η · ΔC, where Δx is the image offset, η is the OIS motor sensitivity calibration parameter, and ΔC is the lens offset.
  • the processor 1020 is specifically configured to determine the scene depth of the target scene by using the following formula, where Z is the scene depth of the target scene, f is the focal length, a 1 is the OIS motor sensitivity calibration parameter of the first camera, Δ 1 is the lens offset of the first camera, pixel pitch is the size of one pixel, Δx 1 = a 1 · Δ 1 is the first image offset, a 1 · Δ 1 · pixel pitch converts the first image offset from pixels to mm, the principal point of the first camera changes from u 1 ' to u 1 after compensation, the baseline changes from B' to B 1 , the principal point of the second camera is u 2 , x 1 is the imaging point of the first image, and x 2 is the imaging point of the second image.
  • the processor 1020 is specifically configured to determine the scene depth of the target scene by using the following formula, where a 1 is the OIS motor sensitivity calibration parameter of the first camera, Δ 1 is the lens offset of the first camera, Δx 1 = a 1 · Δ 1 is the first image offset, a 1 · Δ 1 · pixel pitch converts the first image offset from pixels to mm, a 2 is the OIS motor sensitivity calibration parameter of the second camera, Δ 2 is the lens offset of the second camera, Δx 2 = a 2 · Δ 2 is the second image offset, a 2 · Δ 2 · pixel pitch converts the second image offset from pixels to mm, the principal point of the first camera changes from u 1 ' to u 1 after compensation, the principal point of the second camera changes from u 2 ' to u 2 after compensation, x 1 is the imaging point of the first image, and x 2 is the imaging point of the second image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

According to embodiments, the present invention relates to a scene depth calculation method, device, and terminal. The method comprises the steps of: obtaining the lens offset of a camera having an OIS system, a first camera and/or a second camera being provided with the OIS system; converting the lens offset into an image offset according to a preset OIS motor sensitivity calibration parameter; and calculating the scene depth of a target scene according to a compensated calibration parameter of the first camera and/or a compensated calibration parameter of the second camera, as well as a first image and a second image obtained by the first camera and the second camera photographing the target scene at the same time, the calibration parameter of the first camera being compensated according to the lens offset of the first camera and the calibration parameter of the second camera being compensated according to the lens offset of the second camera. The parallax problem is thereby solved, and the scene depth of the target scene is more accurate because it is calculated using the compensated camera calibration parameters.
PCT/CN2016/112696 2016-10-25 2016-12-28 Method, device and terminal for calculating scene depth Ceased WO2018076529A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201680054264.2A CN108260360B (zh) 2016-10-25 2016-12-28 Scene depth calculation method, device and terminal

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610941102.2 2016-10-25
CN201610941102 2016-10-25

Publications (1)

Publication Number Publication Date
WO2018076529A1 true WO2018076529A1 (fr) 2018-05-03

Family

ID=62024299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/112696 2016-10-25 2016-12-28 Method, device and terminal for calculating scene depth

Country Status (2)

Country Link
CN (1) CN108260360B (fr)
WO (1) WO2018076529A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3582487A1 * 2018-06-15 2019-12-18 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image compensation
CN112581538A (zh) * 2020-12-11 2021-03-30 昆山丘钛光电科技有限公司 Method and device for obtaining motor sensitivity
CN113873157A (zh) * 2021-09-28 2021-12-31 维沃移动通信有限公司 Photographing method and apparatus, electronic device, and readable storage medium
CN115908527A (zh) * 2021-08-03 2023-04-04 北京小米移动软件有限公司 Image processing method and apparatus
EP4392737A4 (fr) * 2021-08-24 2025-07-09 Moleculight Inc Systems, devices and methods for imaging and measurement

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111833394A (zh) * 2020-07-27 2020-10-27 深圳惠牛科技有限公司 Camera calibration method and measurement method based on a binocular measurement device
KR20230039351A (ko) 2021-09-14 2023-03-21 삼성전자주식회사 Electronic device for applying a bokeh effect to an image and operating method thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102867304A (zh) * 2012-09-04 2013-01-09 南京航空航天大学 Method for establishing the relationship between scene depth and disparity in a binocular stereo vision system
CN104954689A (zh) * 2015-06-30 2015-09-30 努比亚技术有限公司 Method and photographing device for obtaining photographs using dual cameras
CN105629427A (zh) * 2016-04-08 2016-06-01 东莞佩斯讯光电技术有限公司 Stereoscopic digital camera device based on two controllable lens-tilting voice coil motors

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5493942B2 (ja) * 2009-12-15 2014-05-14 ソニー株式会社 Imaging device and imaging method
JP6435265B2 (ja) * 2013-08-21 2018-12-05 オリンパス株式会社 Imaging device, imaging method, and program
CN103685950A (zh) * 2013-12-06 2014-03-26 华为技术有限公司 Video image stabilization method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102867304A (zh) * 2012-09-04 2013-01-09 南京航空航天大学 Method for establishing the relationship between scene depth and disparity in a binocular stereo vision system
CN104954689A (zh) * 2015-06-30 2015-09-30 努比亚技术有限公司 Method and photographing device for obtaining photographs using dual cameras
CN105629427A (zh) * 2016-04-08 2016-06-01 东莞佩斯讯光电技术有限公司 Stereoscopic digital camera device based on two controllable lens-tilting voice coil motors

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3582487A1 * 2018-06-15 2019-12-18 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image compensation
US10567659B2 2018-06-15 2020-02-18 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image compensation method, electronic device and computer-readable storage medium
CN112581538A (zh) * 2020-12-11 2021-03-30 昆山丘钛光电科技有限公司 Method and device for obtaining motor sensitivity
CN112581538B (zh) * 2020-12-11 2025-04-01 昆山丘钛光电科技有限公司 Method and device for obtaining motor sensitivity
CN115908527A (zh) * 2021-08-03 2023-04-04 北京小米移动软件有限公司 Image processing method and apparatus
EP4392737A4 (fr) * 2021-08-24 2025-07-09 Moleculight Inc Systems, devices and methods for imaging and measurement
CN113873157A (zh) * 2021-09-28 2021-12-31 维沃移动通信有限公司 Photographing method and apparatus, electronic device, and readable storage medium
CN113873157B (zh) * 2021-09-28 2024-04-16 维沃移动通信有限公司 Photographing method and apparatus, electronic device, and readable storage medium

Also Published As

Publication number Publication date
CN108260360B (zh) 2021-01-05
CN108260360A (zh) 2018-07-06

Similar Documents

Publication Publication Date Title
CN111147741B (zh) 基于对焦处理的防抖方法和装置、电子设备、存储介质
JP6663040B2 (ja) 奥行き情報取得方法および装置、ならびに画像取得デバイス
WO2018076529A1 (fr) Procédé, dispositif et terminal de calcul de profondeur de scène
US8264553B2 (en) Hardware assisted image deblurring
CN100587538C (zh) 成像设备、成像设备的控制方法
US9092875B2 (en) Motion estimation apparatus, depth estimation apparatus, and motion estimation method
CN106412426B (zh) 全聚焦摄影装置及方法
CN109712192B (zh) 摄像模组标定方法、装置、电子设备及计算机可读存储介质
JP6585006B2 (ja) 撮影装置および車両
WO2020088133A1 (fr) Procédé et appareil de traitement d'image, dispositif électronique et support de stockage lisible par ordinateur
CN109598764B (zh) 摄像头标定方法和装置、电子设备、计算机可读存储介质
WO2018228467A1 (fr) Procédé et dispositif d'exposition d'image, dispositif photographique, et support de stockage
WO2019105214A1 (fr) Procédé et appareil de floutage d'image, terminal mobile et support de stockage
CN109963080B (zh) 图像采集方法、装置、电子设备和计算机存储介质
WO2020259474A1 (fr) Procédé et appareil de suivi de mise au point, équipement terminal, et support d'enregistrement lisible par ordinateur
US20220286611A1 (en) Electrical image stabilization (eis)-assisted digital image stabilization (dis)
WO2018228466A1 (fr) Procédé et appareil d'affichage de région de mise au point, et dispositif terminal
CN110493522A (zh) 防抖方法和装置、电子设备、计算机可读存储介质
CN109660718B (zh) 图像处理方法和装置、电子设备、计算机可读存储介质
JP5857712B2 (ja) ステレオ画像生成装置、ステレオ画像生成方法及びステレオ画像生成用コンピュータプログラム
CN108090935B (zh) 混合相机系统及其时间标定方法及装置
US8179431B2 (en) Compound eye photographing apparatus, control method therefor, and program
JP5023750B2 (ja) 測距装置および撮像装置
WO2022147703A1 (fr) Procédé et appareil de suivi de mise au point, et dispositif photographique et support d'enregistrement lisible par ordinateur
WO2018161322A1 (fr) Procédé de traitement d'image basé sur la profondeur, dispositif de traitement et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16920386

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16920386

Country of ref document: EP

Kind code of ref document: A1