
WO2020181494A1 - Parameter synchronization method, image capture apparatus and movable platform - Google Patents


Info

Publication number
WO2020181494A1
WO2020181494A1 (PCT/CN2019/077859)
Authority
WO
WIPO (PCT)
Prior art keywords
exposure parameter
image frame
photographing device
frame number
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/077859
Other languages
English (en)
Chinese (zh)
Inventor
俞利富
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to PCT/CN2019/077859 priority Critical patent/WO2020181494A1/fr
Priority to CN201980005567.9A priority patent/CN111345033A/zh
Publication of WO2020181494A1 publication Critical patent/WO2020181494A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N5/00: Details of television systems
    • H04N5/04: Synchronising

Definitions

  • The embodiments of the present invention relate to the field of electronic technology, and in particular to a parameter synchronization method, a photographing device, and a movable platform.
  • Exposure parameters include, for example, electronic shutter values, analog gain values, digital gain values, and aperture values.
  • In practice, the exposure parameters need to be sent to at least one of the following devices: the image sensor, the lens, the image signal processor (ISP), and a field-programmable gate array (FPGA) chip.
  • The embodiments of the present invention provide a parameter synchronization method, a photographing device, and a movable platform, which are used to solve the exposure problems of captured images and to improve image quality.
  • In a first aspect, an embodiment of the present invention provides a parameter synchronization method applied to a photographing device. The method includes: sending a frame synchronization instruction to multiple elements in the photographing device, so that the multiple elements set the image frame numbers corresponding to the moments when the frame synchronization instruction is received to the same initial frame number and count image frame numbers from that initial frame number; acquiring at least one exposure parameter required for shooting; determining the sending time of each exposure parameter according to the effective delay time of each exposure parameter; and, when the sending time of each exposure parameter arrives, sending that exposure parameter to the N elements of the multiple elements that require it, so that the N elements use the at least one exposure parameter at the same image frame number, where N is an integer greater than or equal to 2.
  • In a second aspect, an embodiment of the present invention provides a photographing device, including a body and multiple elements other than the body, the body including a processor. The processor is configured to: send a frame synchronization instruction to the multiple elements, so that the multiple elements set the image frame numbers corresponding to the moments when the frame synchronization instruction is received to the same initial frame number and, when the frame synchronization signal is received, count image frame numbers from that same initial frame number; acquire at least one exposure parameter required for shooting; determine the sending time of each exposure parameter according to the effective delay time of each exposure parameter; and, when the sending time of each exposure parameter arrives, send that exposure parameter to the N elements of the multiple elements that require it, so that the N elements use the at least one exposure parameter at the same image frame number, where N is an integer greater than or equal to 2.
  • In a third aspect, an embodiment of the present invention provides a movable platform that includes a body of the movable platform and the photographing device according to the first aspect, the photographing device being mounted on the body of the movable platform.
  • an embodiment of the present invention provides a chip, including: a memory and a processor;
  • the memory is used to store program instructions; the processor is used to call the program instructions in the memory to execute the parameter synchronization method according to the embodiment of the present invention in the first aspect.
  • In a further aspect, an embodiment of the present invention provides a readable storage medium on which a computer program is stored; when the computer program is executed, it implements the parameter synchronization method described in the first aspect.
  • an embodiment of the present invention provides a computer program, when the computer program is executed by a computer, it is used to implement the parameter synchronization method described in the embodiment of the present invention in the first aspect.
  • In the embodiments of the present invention, a frame synchronization instruction is sent to multiple elements in the photographing device, so that the multiple elements set the image frame numbers corresponding to the moments when the frame synchronization instruction is received to the same initial frame number and count image frame numbers from that initial frame number when the frame synchronization signal is received. At least one exposure parameter required for shooting is acquired, the sending time of each exposure parameter is determined according to its effective delay time, and when the sending time of each exposure parameter arrives, that exposure parameter is sent to the N elements of the multiple elements that require it, so that the N elements use the at least one exposure parameter at the same image frame number. Exposure parameters generated at the same time therefore take effect in the same image frame in the N elements, which avoids exposure problems in the captured images and improves the quality of the captured images.
  • Fig. 1 is a schematic architecture diagram of an unmanned aerial system according to an embodiment of the present invention
  • FIG. 2 is a schematic structural diagram of a handheld pan/tilt head provided by an embodiment of the present invention
  • FIG. 3 is a flowchart of a parameter synchronization method provided by an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of components in a photographing device provided by an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of synchronization of exposure parameters provided by an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of synchronization of exposure parameters provided by another embodiment of the present invention.
  • FIG. 7 is a schematic diagram of parameter synchronization when switching between a preview mode and a photographing mode according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a photographing device provided by an embodiment of the present invention.
  • Figure 9 is a schematic structural diagram of a movable platform provided by an embodiment of the present invention.
  • When a component is said to be "fixed to" another component, it can be directly on the other component, or an intervening component may be present. When a component is considered to be "connected" to another component, it can be directly connected to the other component, or an intervening component may be present.
  • The embodiments of the present invention provide a parameter synchronization method, a photographing device, and a movable platform; the movable platform may include the photographing device.
  • the movable platform can be, for example, a drone, an unmanned ship, an unmanned car, a robot, a handheld electronic device, and the like.
  • Handheld electronic devices are, for example, terminal devices such as handheld gimbals (pan/tilt heads), mobile phones, tablet computers, notebook computers, and wearable devices.
  • The drone may be, for example, a rotorcraft, such as a multi-rotor aircraft propelled through the air by multiple propulsion devices; the embodiments of the present invention are not limited thereto.
  • Fig. 1 is a schematic architecture diagram of an unmanned aerial system according to an embodiment of the present invention.
  • a rotary wing drone is taken as an example for description.
  • the unmanned flying system 100 may include a drone 110, a display device 130, and a control terminal 140.
  • the UAV 110 may include a power system 150, a flight control system 160, a frame, and a pan/tilt 120 carried on the frame.
  • the drone 110 can wirelessly communicate with the control terminal 140 and the display device 130.
  • the frame may include a fuselage and a tripod (also called a landing gear).
  • the fuselage may include a center frame and one or more arms connected to the center frame, and the one or more arms extend radially from the center frame.
  • the tripod is connected with the fuselage, and is used for supporting the UAV 110 when landing.
  • The power system 150 may include one or more electronic speed controllers (ESCs) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153. The motor 152 is connected between the electronic speed controller 151 and the propeller 153, and the motor 152 and the propeller 153 are arranged on the arm of the UAV 110. The electronic speed controller 151 receives the driving signal generated by the flight control system 160 and, according to the driving signal, provides driving current to the motor 152 to control the speed of the motor 152.
  • the motor 152 is used to drive the propeller to rotate, thereby providing power for the flight of the drone 110, and the power enables the drone 110 to realize one or more degrees of freedom of movement.
  • the drone 110 may rotate about one or more rotation axes.
  • the aforementioned rotation axis may include a roll axis (Roll), a yaw axis (Yaw), and a pitch axis (pitch).
  • the motor 152 may be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brushed motor.
  • the flight control system 160 may include a flight controller 161 and a sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the drone, that is, the position information and state information of the drone 110 in space, such as three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity.
  • the sensing system 162 may include, for example, at least one of sensors such as a gyroscope, an ultrasonic sensor, an electronic compass, an inertial measurement unit (IMU), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system may be a global positioning system (Global Positioning System, GPS).
  • the flight controller 161 is used to control the flight of the drone 110, for example, it can control the flight of the drone 110 according to the attitude information measured by the sensor system 162. It should be understood that the flight controller 161 can control the drone 110 according to pre-programmed program instructions, and can also control the drone 110 by responding to one or more control instructions from the control terminal 140.
  • the pan/tilt head 120 may include a motor 122.
  • the pan/tilt is used to carry the camera 123.
  • the flight controller 161 can control the movement of the pan-tilt 120 through the motor 122.
  • the pan/tilt head 120 may further include a controller for controlling the movement of the pan/tilt head 120 by controlling the motor 122.
  • the pan-tilt 120 may be independent of the drone 110 or a part of the drone 110.
  • the motor 122 may be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brushed motor.
  • the pan-tilt may be located on the top of the drone or on the bottom of the drone.
  • the photographing device 123 may be, for example, a device for capturing images, such as a camera or a video camera, and the photographing device 123 may communicate with the flight controller and take pictures under the control of the flight controller.
  • the imaging device 123 of this embodiment at least includes a photosensitive element, and the photosensitive element is, for example, a Complementary Metal Oxide Semiconductor (CMOS) sensor or a Charge-coupled Device (CCD) sensor. It can be understood that the camera 123 can also be directly fixed to the drone 110, so the pan/tilt 120 can be omitted.
  • the display device 130 is located at the ground end of the unmanned aerial system 100, can communicate with the drone 110 in a wireless manner, and can be used to display the attitude information of the drone 110.
  • the image taken by the photographing device may also be displayed on the display device 130.
  • the display device 130 may be an independent device or integrated in the control terminal 140.
  • the control terminal 140 is located on the ground end of the unmanned aerial system 100, and can communicate with the drone 110 in a wireless manner for remote control of the drone 110.
  • FIG. 2 is a schematic structural diagram of a handheld pan/tilt provided by an embodiment of the present invention.
  • the handheld pan/tilt may be, for example, a single-axis handheld pan/tilt, a dual-axis handheld pan/tilt, or a three-axis handheld pan/tilt.
  • Fig. 2 takes a three-axis handheld pan/tilt as an example for illustrative description.
  • the handheld pan/tilt may include a handle mechanism 10 and a pan/tilt mechanism 20.
  • the handle mechanism 10 is connected to the pan-tilt mechanism 20.
  • the pan-tilt mechanism 20 can be used to carry the photographing device 9.
  • the photographing device 9 in this embodiment may be, for example, a camera, a video camera, a mobile phone, and the like.
  • The handle mechanism 10 can be provided with a control button 11, which is used to control the handheld pan/tilt. It should be noted that this embodiment does not limit the number and implementation of the control buttons 11.
  • the control button 11 may be a switch button, a mode switch button, or the like.
  • the handle mechanism 10 may be equipped with a battery for powering various components of the handheld pan/tilt.
  • the pan-tilt mechanism 20 may include a pitch axis (Pitch axis) mechanism 21, a translation axis (Yaw axis) mechanism 22, and a roll axis (Roll axis) mechanism 23.
  • the pitch axis mechanism 21 includes a pitch axis rotation axis and a pitch axis drive motor;
  • the translation axis mechanism 22 includes a translation axis rotation axis and a translation axis drive motor;
  • the roll axis mechanism 23 includes a roll axis rotation axis and a roll axis drive motor.
  • the handheld pan/tilt may include a clamping mechanism 6 for fixing the camera 9.
  • the embodiment of the present invention does not limit the shape and position of the clamping mechanism 6.
  • an inertial measurement element can be provided in the clamping mechanism 6.
  • the inertial measurement element may be a gyroscope, accelerometer, etc.
  • FIG. 2 shows that the camera 9 can be separated from the handheld pan/tilt as an example. In other embodiments, the camera 9 may be a part of the handheld pan/tilt.
  • FIG. 3 is a flowchart of a parameter synchronization method provided by an embodiment of the present invention. As shown in FIG. 3, the method of this embodiment can be applied to a photographing device, such as the photographing device 123 in FIG. 1 or the photographing device 9 in FIG. 2, although this embodiment is not limited thereto. The method of this embodiment may include:
  • In this embodiment, the photographing device includes M elements, where M is an integer greater than or equal to 2. This embodiment takes 5 elements in the photographing device as an example, namely: the body, the image sensor, the lens, the memory chip, and the ISP, as shown in FIG. 4. The solution of this embodiment can be executed by the body of the photographing device, for example.
  • The body can send a frame synchronization instruction to multiple elements in the photographing device, the multiple elements belonging to the M elements.
  • According to the frame synchronization instruction, the multiple elements each set the image frame number corresponding to the moment of receiving the frame synchronization instruction to the same initial frame number, for example 0, so that the image frame numbers subsequently counted by these elements are synchronized. Then, upon receiving the frame synchronization signal, the multiple elements count image frame numbers up from the same initial frame number, which ensures synchronization of the image frame numbers between the body and the multiple elements.
  • the aforementioned multiple elements include at least two of the following: image sensor, ISP, memory chip, and lens.
  • the frame synchronization signal is sent by an image sensor.
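The frame-number reset described above can be illustrated with a short sketch (Python; the class and method names are illustrative, not taken from the patent): each element keeps its own frame counter, the frame synchronization instruction resets every counter to the same initial frame number, and the counters then advance in lockstep with the frame signal.

```python
# Sketch (assumed names): components reset to a shared initial frame number
# on the sync instruction, then count frames together on each frame signal.

class Component:
    def __init__(self, name):
        self.name = name
        self.frame_number = None  # unknown until synchronized

    def on_frame_sync(self, initial_frame_number=0):
        # Set the frame number corresponding to the sync instant.
        self.frame_number = initial_frame_number

    def on_frame_signal(self):
        # Count up from the shared initial frame number on each frame signal.
        self.frame_number += 1

components = [Component(n) for n in ("image_sensor", "isp", "memory_chip", "lens")]
for c in components:
    c.on_frame_sync(0)        # frame synchronization instruction from the body
for _ in range(3):            # three frame signals (e.g. from the image sensor)
    for c in components:
        c.on_frame_signal()

# All counters agree, so a frame number names the same frame everywhere.
assert all(c.frame_number == 3 for c in components)
```

Without this shared reset, an exposure parameter tagged "apply at frame k" would mean a different physical frame to each element.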
  • the exposure parameters include at least one of the following: electronic shutter value, analog gain value, digital gain value, aperture value, mechanical shutter value, etc.
  • the embodiment is not limited to this.
  • the body of the shooting device can generate at least one exposure parameter required for shooting.
  • the body of the shooting device can generate at least one exposure parameter required for shooting according to its working mode.
  • The working mode is, for example, a video recording mode, a photographing mode, or a preview mode. Alternatively,
  • the body of the camera can acquire at least one exposure parameter set by the user.
  • the user can perform an exposure parameter setting operation on the camera, and the exposure parameter setting operation is used to set the at least one exposure parameter described above.
  • the user performs the exposure parameter setting operation on the control terminal of the drone.
  • For example, the user performs the exposure parameter setting operation on the touch screen of the control terminal; upon detecting the exposure parameter setting operation, the control terminal sends an exposure parameter setting instruction to the photographing device of the drone, the instruction including the at least one exposure parameter described above.
  • the body of the photographing device receives the exposure parameter setting instruction, so as to obtain at least one exposure parameter.
  • S303: Determine the sending time of each exposure parameter according to the effective delay time of each exposure parameter.
  • The acquired at least one exposure parameter needs to be sent to at least one of the multiple elements, and the effective delay time of each exposure parameter in an element may differ. The time at which the body sends each exposure parameter to the elements that require it may therefore also differ. Accordingly, the body determines the sending time of each exposure parameter according to that parameter's effective delay time.
  • the body when it is determined that the sending time of each exposure parameter arrives, the body sends the exposure parameter to N elements of the plurality of elements that require the exposure parameter, where N is an integer greater than or equal to 2.
  • the N elements may be at least two of the image sensor, lens, memory chip, and ISP.
  • After receiving an exposure parameter, each element uses it according to that parameter's effective delay time. Since the sending time of each exposure parameter is determined from its effective delay time, each element, after receiving the parameter, uses the exposure parameters generated at the same time (in S302) at the same image frame number, ensuring that exposure parameters generated at the same time take effect on the same image frame.
  • In this embodiment, a frame synchronization instruction is sent to multiple elements in the photographing device, so that the multiple elements set the image frame numbers corresponding to the moments when the frame synchronization instruction is received to the same initial frame number and count image frame numbers from that initial frame number. At least one exposure parameter required for shooting is acquired, and the sending time of each exposure parameter is determined according to its effective delay time. When the sending time arrives, each exposure parameter is sent to the N elements of the multiple elements that require it, so that the N elements use the at least one exposure parameter at the same image frame number. Exposure parameters generated at the same time therefore take effect in the same image frame in the N elements, which avoids exposure problems in the captured image and improves the quality of the captured image.
  • a possible implementation manner of the foregoing S303 may include S3031 and S3032:
  • S3031: According to the effective delay time of each exposure parameter, determine the maximum effective delay time among the effective delay times of the various exposure parameters.
  • the duration in this embodiment may be represented by the number of image frames, for example.
  • the effective delay time of the first exposure parameter is 2 image frames
  • the effective delay time of the second exposure parameter is 1 image frame
  • the effective delay time of the third exposure parameter is 0 image frames
  • the maximum effective delay duration among these effective delay durations is 2 image frames.
  • S3032: The sending time of each exposure parameter is determined according to the maximum effective delay time determined above and the effective delay time of that exposure parameter, so as to ensure that each element can use the exposure parameters it receives at the same image frame number.
  • Specifically, the time difference between the sending time of each type of exposure parameter and the generation time of that type of exposure parameter is equal to the difference between the maximum effective delay time and the effective delay time of that type of exposure parameter.
  • Take as an example that time is represented by image frames and there are three types of exposure parameters: the first type has an effective delay time of 2 image frames, the second type has an effective delay time of 1 image frame, and the third type has an effective delay time of 0 image frames, so the maximum effective delay time is 2 image frames.
  • For exposure parameters generated in the nth image frame: according to the maximum effective delay time of 2 image frames and the effective delay time of 2 image frames of the first type, the sending time of the first type of exposure parameter is determined to be the nth image frame; according to the maximum effective delay time of 2 image frames and the effective delay time of 1 image frame of the second type, the sending time of the second type of exposure parameter is determined to be the (n+1)th image frame; according to the maximum effective delay time of 2 image frames and the effective delay time of 0 image frames of the third type, the sending time of the third type of exposure parameter is determined to be the (n+2)th image frame.
  • Since an element receives the first type of exposure parameter in the nth image frame and its effective delay time is 2 image frames, the element uses the first type of exposure parameter in the (n+2)th image frame. Since the element receives the second type of exposure parameter in the (n+1)th image frame and its effective delay time is 1 image frame, it uses the second type in the (n+2)th image frame. Since the element receives the third type of exposure parameter in the (n+2)th image frame and its effective delay time is 0 image frames, it uses the third type in the (n+2)th image frame. It can be seen that exposure parameters generated at the same time, even with different effective delay times, are used at the same image frame number in each element, which ensures the synchronization of the exposure parameters.
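The sending-time rule above reduces to a one-line formula, sketched here in Python (names are illustrative): a parameter generated in frame n is sent at frame n + (max_delay - delay), so its effect frame, send frame plus delay, is n + max_delay for every parameter.

```python
# Sketch of S3031/S3032: send each parameter early enough that, after its
# own effective delay, it takes effect in the same frame as all the others.

def send_frame(generation_frame, delay, max_delay):
    # Difference between send time and generation time equals
    # (max_delay - delay), per the rule stated in the text.
    return generation_frame + (max_delay - delay)

delays = {"first": 2, "second": 1, "third": 0}   # effective delay in frames
max_delay = max(delays.values())                 # S3031: maximum delay = 2

n = 10                                           # generation frame (example)
sends = {p: send_frame(n, d, max_delay) for p, d in delays.items()}
effects = {p: sends[p] + delays[p] for p in delays}

assert sends == {"first": 10, "second": 11, "third": 12}
# All three parameters take effect at the same frame n + max_delay = 12.
assert effects == {"first": 12, "second": 12, "third": 12}
```

The staggered send times (n, n+1, n+2) exactly cancel the staggered delays (2, 1, 0), which is the whole synchronization mechanism.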
  • the at least one exposure parameter in S302 may include: an electronic shutter value, an analog gain value, and a digital gain value.
  • the N elements include: an image sensor and an ISP.
  • the exposure parameters required by the image sensor include: an electronic shutter value and an analog gain value; the exposure parameters required by the ISP include: a digital gain value.
  • The body can generate the electronic shutter value and the analog gain value required by the image sensor, and the digital gain value required by the ISP.
  • The body communicates with the image sensor via, for example, an Inter-Integrated Circuit (I2C) bus or a Serial Peripheral Interface (SPI) bus.
  • The body determines the sending time of the electronic shutter value, the sending time of the analog gain value, and the sending time of the digital gain value according to the effective delay time of the electronic shutter value, the effective delay time of the analog gain value, and the effective delay time of the digital gain value, respectively.
  • the specific implementation process can refer to the description of the foregoing embodiments, and details are not described herein again.
  • For example, the body determines that the electronic shutter value generated in the nth image frame is sent in the nth image frame, the analog gain value generated in the nth image frame is sent in the (n+1)th image frame, and the digital gain value generated in the nth image frame is sent in the (n+2)th image frame.
  • The electronic shutter value generated in the (n+1)th image frame is sent in the (n+1)th image frame, the analog gain value generated in the (n+1)th image frame is sent in the (n+2)th image frame, and the digital gain value generated in the (n+1)th image frame is sent in the (n+3)th image frame.
  • The electronic shutter value generated in the (n+2)th image frame is sent in the (n+2)th image frame, the analog gain value generated in the (n+2)th image frame is sent in the (n+3)th image frame, and the digital gain value generated in the (n+2)th image frame is sent in the (n+4)th image frame.
  • The electronic shutter value generated in the (n+3)th image frame is sent in the (n+3)th image frame, the analog gain value generated in the (n+3)th image frame is sent in the (n+4)th image frame, and the digital gain value generated in the (n+3)th image frame is sent in the (n+5)th image frame.
  • The electronic shutter value generated in the (n+4)th image frame is sent in the (n+4)th image frame, the analog gain value generated in the (n+4)th image frame is sent in the (n+5)th image frame, and the digital gain value generated in the (n+4)th image frame is sent in the (n+6)th image frame.
  • Correspondingly, in each of the nth through (n+4)th image frames, the image sensor receives the electronic shutter value generated in that image frame; and in each of the (n+1)th through (n+5)th image frames, it receives the analog gain value generated in the previous image frame.
  • The ISP receives in the (n+2)th image frame the digital gain value generated in the nth image frame; in the (n+3)th image frame the digital gain value generated in the (n+1)th image frame; in the (n+4)th image frame the digital gain value generated in the (n+2)th image frame; in the (n+5)th image frame the digital gain value generated in the (n+3)th image frame; and in the (n+6)th image frame the digital gain value generated in the (n+4)th image frame.
  • As a result, the image sensor uses in the (n+2)th image frame the electronic shutter value and analog gain value generated in the nth image frame, and the ISP uses in the (n+2)th image frame the digital gain value generated in the nth image frame, so the brightness of the resulting (n+2)th image frame corresponds to the exposure parameters of the nth frame.
  • the image sensor uses, in the (n+3)th image frame, the electronic shutter value and the analog gain value generated in the (n+1)th image frame, and the ISP uses, in the (n+3)th image frame, the digital gain value generated in the (n+1)th image frame; therefore, the brightness of the resulting (n+3)th image frame corresponds to the exposure parameters of the (n+1)th frame.
  • the image sensor uses, in the (n+4)th image frame, the electronic shutter value and the analog gain value generated in the (n+2)th image frame, and the ISP uses, in the (n+4)th image frame, the digital gain value generated in the (n+2)th image frame; therefore, the brightness of the resulting (n+4)th image frame corresponds to the exposure parameters of the (n+2)th frame.
  • the image sensor uses, in the (n+5)th image frame, the electronic shutter value and the analog gain value generated in the (n+3)th image frame, and the ISP uses, in the (n+5)th image frame, the digital gain value generated in the (n+3)th image frame; therefore, the brightness of the resulting (n+5)th image frame corresponds to the exposure parameters of the (n+3)th frame.
  • the image sensor uses, in the (n+6)th image frame, the electronic shutter value and the analog gain value generated in the (n+4)th image frame, and the ISP uses, in the (n+6)th image frame, the digital gain value generated in the (n+4)th image frame; therefore, the brightness of the resulting (n+6)th image frame corresponds to the exposure parameters of the (n+4)th frame.
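The frame-by-frame walkthrough above reduces to one rule: exposure parameters generated at frame n take effect at frame n+2 in both the image sensor and the ISP. The following is a minimal illustrative sketch of that two-frame pipeline; the function name and frame count are assumptions for illustration, not part of the patent.

```python
from collections import deque

# Model the two-frame pipeline in which the electronic shutter value,
# analog gain value, and digital gain value generated at frame n all
# take effect at frame n+2, so each output frame's brightness
# corresponds to exactly one generation frame.
PIPELINE_DELAY = 2  # frames between parameter generation and effect

def simulate(num_frames):
    """Return a map: output frame -> generation frame whose exposure
    parameters (shutter, analog gain, digital gain) it reflects."""
    in_flight = deque()          # parameters travelling through the pipeline
    effective = {}               # output frame -> generation frame
    for frame in range(num_frames):
        in_flight.append(frame)  # parameters generated at this frame
        if len(in_flight) > PIPELINE_DELAY:
            effective[frame] = in_flight.popleft()
    return effective

print(simulate(6))  # {2: 0, 3: 1, 4: 2, 5: 3}
```

Because all three parameter types land on the same output frame, no frame is exposed with a mixture of old and new parameters.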
  • when the working mode of the photographing device is the video recording mode, the N elements further include a storage chip; the exposure parameter required by the storage chip includes the digital gain value.
  • the memory chip may include an FPGA chip.
  • the body and the memory chip are communicatively connected via, for example, an I2C bus or an SPI bus.
  • the body sends, in the (n+2)th image frame, the digital gain value generated in the nth image frame; sends, in the (n+3)th image frame, the digital gain value generated in the (n+1)th image frame; sends, in the (n+4)th image frame, the digital gain value generated in the (n+2)th image frame; sends, in the (n+5)th image frame, the digital gain value generated in the (n+3)th image frame; and sends, in the (n+6)th image frame, the digital gain value generated in the (n+4)th image frame.
  • the memory chip receives and uses, in the (n+2)th image frame, the digital gain value generated in the nth image frame; receives and uses, in the (n+3)th image frame, the digital gain value generated in the (n+1)th image frame; receives and uses, in the (n+4)th image frame, the digital gain value generated in the (n+2)th image frame; receives and uses, in the (n+5)th image frame, the digital gain value generated in the (n+3)th image frame; and receives and uses, in the (n+6)th image frame, the digital gain value generated in the (n+4)th image frame.
  • the body of the photographing device determines the aperture value of the lens and sends an aperture setting instruction to the lens; the aperture setting instruction includes the aperture value and a preset time, and is used for the lens to start using the aperture value at the preset time. The preset time is a point in time during the photographing mode of the photographing device.
  • when the photographing device is in the photographing mode, the lens needs to use the aperture value. Therefore, during the process of switching the photographing device from the preview mode to the photographing mode, the body determines the aperture value of the lens and then sends the aperture setting instruction to the lens.
  • the aperture setting instruction includes the aperture value and a preset time; the body and the lens are communicatively connected via, for example, an I2C bus or an SPI bus.
  • the lens receives the aperture setting instruction, and according to the aperture setting instruction, starts to use the aperture value at the preset time.
  • the preset time may be, for example, the time for shooting the first image frame when the shooting device is in the shooting mode.
  • before the body sends the aperture setting instruction to the lens, the memory chip also stops its internal processing logic.
  • after the photographing device finishes taking a picture, it switches from the photographing mode back to the preview mode.
  • the aperture value used by the lens in the preview mode differs from the aperture value used in the photographing mode; therefore, after the photographing device exits the photographing mode, the lens restores the aperture value to the preview-mode aperture value.
  • the memory chip restores its internal processing logic.
  • the image sensor of the camera may also need to use a mechanical shutter value.
  • the body of the photographing device not only determines the aperture value of the lens but also determines the mechanical shutter value of the image sensor, and then sends a mechanical shutter value setting instruction to the image sensor; the mechanical shutter value setting instruction includes the mechanical shutter value and a preset time, and is used to instruct the image sensor to start using the mechanical shutter value at the preset time.
  • the image sensor receives the mechanical shutter value setting instruction during the switching process of the camera from the preview mode to the photographing mode, and starts using the mechanical shutter value at the preset time according to the mechanical shutter value setting instruction.
  • in this way, the image sensor starts using the mechanical shutter value at the same moment the lens starts using the aperture value, realizing time synchronization of the mechanical shutter value and the aperture value across components.
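The preset-time mechanism above can be sketched as follows: the body sends each component an instruction carrying a value and the same preset time, and each component applies its value only once that time arrives. The class and field names here are illustrative assumptions, not terminology from the patent.

```python
# Each component holds a pending (value, preset_time) pair and switches to
# the new value only when the shared preset time is reached, so the lens
# aperture and the sensor's mechanical shutter change simultaneously.
class TimedSetting:
    def __init__(self, value, preset_time):
        self.value = value
        self.preset_time = preset_time
        self.active_value = None   # not yet in effect

    def tick(self, now):
        # Apply the pending value exactly once, at or after the preset time.
        if now >= self.preset_time and self.active_value is None:
            self.active_value = self.value

aperture = TimedSetting(value=2.8, preset_time=100)      # lens
shutter = TimedSetting(value=1 / 250, preset_time=100)   # image sensor
for now in (98, 99, 100):
    aperture.tick(now)
    shutter.tick(now)
# Both values become active at the same time point (100).
assert aperture.active_value == 2.8 and shutter.active_value == 1 / 250
```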
  • the preset time may be, for example, the time for shooting the first image frame when the shooting device is in the shooting mode.
  • the aforementioned frame synchronization indication includes an operation mode switching indication.
  • a possible implementation of the above S301 is: when the photographing device switches between working modes, the body sends a working mode switching instruction to the multiple components, so that the multiple components set the image frame numbers corresponding to the moment the working mode switching instruction is received to the same initial frame number.
  • when the photographing device switches between the video recording mode and the preview mode, or between the video recording mode and the photographing mode, or between the preview mode and the photographing mode, the body sends the working mode switching instruction to the multiple components.
  • these multiple components set the image frame numbers corresponding to the moment the working mode switching instruction is received to the same initial frame number. Each of these components also receives the frame synchronization signal and, upon receiving it, counts image frame numbers starting from that same initial frame number, so that the image frame numbers counted in these components are synchronized.
  • because the image frame numbers are synchronized, the synchronization of the exposure parameters is more accurate.
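The reset-then-count behavior described above can be sketched in a few lines; the component names and the initial frame number of 0 are illustrative assumptions (0 is only given as an example later in the text).

```python
# Each component resets its counter to the same initial frame number on the
# mode-switch instruction, then increments it on every frame sync signal
# (vd/vsync pulse), so all counters stay in lockstep.
class Component:
    def __init__(self, name):
        self.name = name
        self.frame_number = None

    def on_mode_switch(self, initial=0):
        # Working mode switching instruction: reset to the shared initial number.
        self.frame_number = initial

    def on_frame_sync(self):
        # Frame synchronization signal: advance the counter by one.
        self.frame_number += 1

components = [Component(n) for n in ("body", "lens", "memory_chip")]
for c in components:
    c.on_mode_switch()
for _ in range(3):            # three vd/vsync pulses from the image sensor
    for c in components:
        c.on_frame_sync()
assert len({c.frame_number for c in components}) == 1  # counters agree
print(components[0].frame_number)  # 3
```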
  • the above-mentioned multiple components include a lens and a memory chip.
  • the body needs to synchronize image frames with the lens and the memory chip; therefore, while the photographing device switches between working modes, the body sends the working mode switching instruction to the lens and the memory chip respectively. Both the lens and the memory chip receive the instruction and then set their stored image frame numbers, at the image frame in which the instruction is received, to the same initial frame number, for example 0.
  • the image sensor is not only communicatively connected with the body, but also communicatively connected with the memory chip and the lens.
  • the vd pin of the image sensor can be connected to the body, the lens, and the memory chip, and the image sensor simultaneously sends frame synchronization signals (for example, a vd signal or a vsync signal) through the vd pin to the body, the memory chip, and the lens.
  • the body, the memory chip, and the lens receive the frame synchronization signal and then count image frame numbers starting from 0, ensuring image frame number synchronization among the body, the lens, and the memory chip.
  • when the photographing device receives a recording command, which is used to control the photographing device to enter the recording mode, the body sends a recording start instruction to the storage chip; the recording start instruction is used to instruct the storage chip to store image frames in a first memory outside the storage chip. The body also determines the image frame number of the first image frame to be encoded and sends this image frame number to the storage chip, so that the storage chip, according to that image frame number and starting from it, sequentially stores the corresponding image frames from the first memory into a second memory outside the storage chip.
  • when the user needs to record, the user performs a recording operation on the photographing device. Taking a photographing device mounted on a drone as an example, the user can perform the recording operation on the drone's control terminal, and the control terminal sends a recording command to the drone's photographing device.
  • the body of the camera will receive the video command, and the body will send a video start instruction to the memory chip.
  • the video start instruction is used to instruct the memory chip to store the image frame in the first memory.
  • the first memory is communicatively connected to the memory chip and is a memory independent of the memory chip; the first memory is, for example, a Double Data Rate (DDR) memory.
  • the storage chip stores the image frames held in the storage chip into the first memory.
  • the body determines the image frame number of the first image frame to be encoded, and the frame number can be an absolute image frame number or a relative image frame number.
  • the body then sends the image frame number to the storage chip; after receiving the image frame number, the storage chip stores the image frames, starting from the one corresponding to that image frame number, from the first memory into the second memory.
  • the second memory is a memory different from the first memory, and the second memory is in communication connection with the storage chip.
  • the second memory is, for example, a solid state drive (SSD). Therefore, this embodiment can ensure that the image frames stored in the second memory are aligned with the image frame numbers of the encoded image frames.
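As a minimal sketch of this alignment step, the storage chip buffers frames in the first memory (DDR) and, once given the frame number of the first frame to be encoded, copies only frames from that number onward into the second memory (SSD). The function name and data representation are illustrative assumptions.

```python
# Copy frames from the DDR buffer into the SSD, starting at the image frame
# number of the first frame to be encoded, so stored frames and encoded
# frames share the same frame numbering.
def align_and_store(ddr_buffer, first_encoded_frame):
    """ddr_buffer maps image frame number -> frame data."""
    ssd = {}
    for number in sorted(ddr_buffer):
        if number >= first_encoded_frame:
            ssd[number] = ddr_buffer[number]
    return ssd

ddr = {10: "f10", 11: "f11", 12: "f12", 13: "f13"}
print(align_and_store(ddr, 12))  # {12: 'f12', 13: 'f13'}
```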
  • the body and the memory chip may communicate through a general purpose input output (GPIO) interface or a universal asynchronous receiver/transmitter (UART) interface.
  • the above-mentioned image frame number may be sent by the body to the storage chip through the at least one GPIO interface.
  • the at least one GPIO interface is, for example, 4 GPIO interfaces.
  • the frame number may be a relative image frame number, for example.
  • the above-mentioned recording start instruction may be sent by the body to the storage chip via the I2C bus.
  • the aforementioned image frame number may be sent by the body to the storage chip through the UART interface, and the image frame number may be, for example, an absolute image frame number.
  • the above-mentioned recording start instruction may be sent by the body to the storage chip through SPI. Because this embodiment uses SPI to transmit the recording start instruction, no large increase in communication bandwidth is required.
  • transmitting the image frame number through the GPIO interface or the UART interface avoids the problem that would arise if it were transmitted through the I2C bus: because the amount of data transmitted on the I2C bus is very large, transmission of the frame number would be greatly delayed, the cache of the memory chip could overflow, and the corresponding image frames could be lost.
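Taking the "4 GPIO interfaces" example above, a relative image frame number carried on 4 GPIO lines can only encode 16 distinct values, so sender and receiver must work modulo 16. This is a hedged sketch; the pin count and LSB-first bit order are assumptions for illustration, not a normative encoding from the patent.

```python
# Encode/decode a relative image frame number on 4 GPIO lines.
GPIO_PINS = 4
MODULUS = 1 << GPIO_PINS  # 16 distinct values on 4 lines

def to_gpio(frame_number):
    """Encode a frame number as logic levels on the GPIO lines (LSB first)."""
    value = frame_number % MODULUS
    return [(value >> bit) & 1 for bit in range(GPIO_PINS)]

def from_gpio(levels):
    """Decode the GPIO logic levels back into a relative frame number."""
    return sum(level << bit for bit, level in enumerate(levels))

assert from_gpio(to_gpio(37)) == 37 % MODULUS  # 5
print(to_gpio(37))  # [1, 0, 1, 0]
```

The small modulus is why a relative frame number suits the GPIO path, while the wider UART path can carry an absolute frame number.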
  • the solution executed by the body of the above-mentioned camera may be executed by a processor in the body of the camera.
  • an embodiment of the present invention also provides a computer storage medium storing program instructions; when the program is executed, part or all of the steps of the parameter synchronization method of FIG. 3 and its corresponding embodiments may be performed.
  • FIG. 8 is a schematic structural diagram of a photographing device provided by an embodiment of the present invention.
  • the photographing device 800 of this embodiment may include a body 801 and multiple components other than the body; the body 801 can be communicatively connected with these multiple components. FIG. 8 shows, as an example, multiple components including an image sensor 802, an ISP 803, a storage chip 804, and a lens 805, but this embodiment is not limited thereto.
  • the body 801 includes a processor 8011.
  • the processor 8011 is configured to send a frame synchronization instruction to the multiple elements, so that the multiple elements set the image frame number corresponding to the moment when the frame synchronization instruction is received to the same initial frame number, and the When the multiple elements receive the frame synchronization signal, start counting the image frame numbers from the same initial frame number; obtain at least one exposure parameter required for shooting; determine each exposure according to the effective delay time of each exposure parameter The transmission time of the parameter; when the transmission time of each exposure parameter arrives, the exposure parameter is sent to the N elements of the plurality of elements that require the exposure parameter, so that the N elements are in the same image frame Use the at least one exposure parameter, and the N is an integer greater than or equal to 2.
  • the processor 8011 determines the sending time of each exposure parameter according to the effective delay time of each exposure parameter, it is specifically configured to:
  • the effective delay time of each exposure parameter determine the maximum effective delay time among the effective delay time of various exposure parameters
  • the sending time of each exposure parameter is determined according to the maximum effective delay time and the effective delay time of each exposure parameter.
  • the time difference between the transmission time of each type of exposure parameter and the generation time of that type of exposure parameter is equal to the difference between the maximum effective delay time and the effective delay time of this type of exposure parameter Difference.
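The send-time rule just stated can be written directly: each exposure parameter waits (maximum effective delay − its own effective delay) after generation before being sent, so all parameters take effect at the same image frame. The delay values below are illustrative assumptions, not figures from the patent.

```python
# For each exposure parameter, compute how many frames to wait after
# generation before sending it, per the rule: offset = max_delay - delay.
def send_offsets(effective_delays):
    """Map parameter name -> frames to wait after generation before sending."""
    max_delay = max(effective_delays.values())
    return {p: max_delay - d for p, d in effective_delays.items()}

delays = {"electronic_shutter": 2, "analog_gain": 2, "digital_gain": 1}
print(send_offsets(delays))
# {'electronic_shutter': 0, 'analog_gain': 0, 'digital_gain': 1}
```

With these offsets, a parameter with a shorter effective delay is deliberately held back, so every parameter becomes effective max_delay frames after generation.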
  • the plurality of components includes at least two of the following: image sensor, ISP, memory chip, lens.
  • the at least one exposure parameter includes: an electronic shutter value, an analog gain value, and a digital gain value.
  • when the working mode of the photographing device is the preview mode, the video recording mode, or the photographing mode, the N elements include: an image sensor 802 and an ISP 803;
  • the exposure parameters required by the image sensor 802 include: electronic shutter value and analog gain value;
  • the exposure parameters required by the ISP 803 include: digital gain value.
  • when the working mode of the photographing device is the video recording mode, the N elements further include: a storage chip 804;
  • the exposure parameters required by the memory chip 804 include digital gain values.
  • the memory chip 804 includes an FPGA chip.
  • the processor 8011 is further configured to:
  • the aperture setting instruction includes the aperture value and a preset time
  • the aperture setting instruction is used for the lens 805 to start using the aperture value at the preset time
  • the lens 805 is configured to start using the aperture value at the preset time
  • the preset time is a point in time when the photographing device 800 is in the photographing mode.
  • the processor 8011 is further configured to:
  • the mechanical shutter value setting instruction includes the mechanical shutter value and a preset time, and the mechanical shutter value setting instruction is used to instruct the image sensor 802 in the preset time. Set the time to start using the mechanical shutter value;
  • the image sensor 802 is configured to start using the mechanical shutter value at the preset time.
  • the frame synchronization instruction includes a work mode switching instruction; when the processor 8011 sends a frame synchronization instruction to multiple elements in the camera, it is specifically configured to:
  • when the photographing device 800 switches between working modes, the processor sends a working mode switching instruction to the multiple components.
  • the frame synchronization signal is sent by the image sensor 802.
  • the processor 8011 is further configured to: when a recording command is received, the recording command is used to control the shooting device 800 to enter the recording mode, and send a recording start instruction to the storage chip 804, The recording start instruction is used to instruct the storage chip 804 to store the image frame in the first memory 806 outside the storage chip;
  • the storage chip 804 is configured to store image frames in the first memory 806 outside the storage chip 804 and, according to the image frame number, to sequentially store the corresponding image frames in the first memory 806, starting from that image frame number, into the second memory 807 outside the storage chip 804.
  • the photographing device 800 includes the first memory 806 and the second memory 807 described above. In other embodiments, at least one of the first memory 806 and the second memory 807 is a memory external to the camera 800.
  • the first memory 806 is a DDR memory
  • the second memory 807 is an SSD.
  • the body 801 and the storage chip 804 are communicatively connected through at least one GPIO interface or a UART interface.
  • when the processor 8011 sends the image frame number to the storage chip 804, it is specifically configured to:
  • send the image frame number to the storage chip 804 through the at least one GPIO interface or the UART interface.
  • the body 801 and the storage chip 804 are connected through SPI communication;
  • the processor 8011 sends a video recording start instruction to the storage chip 804, it is specifically configured to send the video recording start instruction to the storage chip 804 via SPI.
  • the image frame number is an absolute image frame number or a relative image frame number.
  • the photographing device in this embodiment can be used to implement the technical solutions in the foregoing method embodiments of the present invention, and its implementation principles and technical effects are similar, and will not be repeated here.
  • FIG. 9 is a schematic structural diagram of a movable platform provided by an embodiment of the present invention.
  • the movable platform 900 of this embodiment may include: a body 901 of the movable platform 900 and a photographing device 902.
  • the photographing device 902 is mounted on the body 901 of the movable platform.
  • the movable platform 900 further includes a pan/tilt (not shown in the figure), the pan/tilt is connected to the body 901, and the camera 902 is mounted on the pan/tilt.
  • the movable platform 900 may also be a handheld gimbal.
  • the movable platform of this embodiment can be used to implement the technical solutions in the foregoing method embodiments of the present invention, and its implementation principles and technical effects are similar, and will not be repeated here.
  • a person of ordinary skill in the art can understand that all or part of the steps in the above method embodiments can be implemented by a program instructing relevant hardware.
  • the foregoing program can be stored in a computer-readable storage medium; when the program is executed, the steps of the foregoing method embodiments are performed. The foregoing storage medium includes media that can store program code, such as read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disks, or optical disks.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are a parameter synchronization method, an image capture apparatus, and a movable platform. The method comprises: sending a frame synchronization instruction to multiple elements in the image capture apparatus, so that the multiple elements set the image frame numbers corresponding to the moment the frame synchronization instruction is received to the same initial frame number, and count image frame numbers starting from that same initial frame number upon receiving a frame synchronization signal (S301); obtaining at least one exposure parameter required for image capture (S302); determining the sending time of each exposure parameter according to the effective delay time of each exposure parameter (S303); and, when the sending time of each exposure parameter arrives, sending the exposure parameter to the N elements among the multiple elements that require it, so that the N elements use the at least one exposure parameter at the same image frame number (S304). Exposure parameters generated at the same time can thus take effect on the same image frame in the N elements, which avoids image exposure problems and improves image quality.
PCT/CN2019/077859 2019-03-12 2019-03-12 Procédé de synchronisation de paramètres, appareil de capture d'images et plateforme mobile Ceased WO2020181494A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/077859 WO2020181494A1 (fr) 2019-03-12 2019-03-12 Procédé de synchronisation de paramètres, appareil de capture d'images et plateforme mobile
CN201980005567.9A CN111345033A (zh) 2019-03-12 2019-03-12 参数同步方法、拍摄装置和可移动平台

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/077859 WO2020181494A1 (fr) 2019-03-12 2019-03-12 Procédé de synchronisation de paramètres, appareil de capture d'images et plateforme mobile

Publications (1)

Publication Number Publication Date
WO2020181494A1 true WO2020181494A1 (fr) 2020-09-17

Family

ID=71187739

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/077859 Ceased WO2020181494A1 (fr) 2019-03-12 2019-03-12 Procédé de synchronisation de paramètres, appareil de capture d'images et plateforme mobile

Country Status (2)

Country Link
CN (1) CN111345033A (fr)
WO (1) WO2020181494A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113791639A (zh) * 2021-09-08 2021-12-14 沃飞长空科技(成都)有限公司 光电吊舱数据对齐方法、无人机及可读存储介质
CN115734066A (zh) * 2022-10-18 2023-03-03 深圳锐视智芯科技有限公司 一种传感器同步控制方法、图像传感器及可读存储介质
CN115988317A (zh) * 2022-12-12 2023-04-18 Oppo广东移动通信有限公司 图像处理方法、装置、芯片、通信设备、及存储介质
CN116783643A (zh) * 2021-01-27 2023-09-19 高通股份有限公司 显示器内摄像头激活

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112532890B (zh) * 2020-11-02 2022-06-07 浙江大华技术股份有限公司 曝光控制方法、摄像设备及计算机可读存储介质
WO2022120827A1 (fr) * 2020-12-11 2022-06-16 深圳市大疆创新科技有限公司 Procédé de réglage d'exposition, dispositif photographique, plateforme mobile et support de stockage
CN115484381A (zh) * 2021-05-31 2022-12-16 华为技术有限公司 同步方法、电子设备、计算机可读存储介质及程序产品
CN113676673B (zh) * 2021-08-10 2023-06-16 广州极飞科技股份有限公司 图像采集方法、图像采集系统及无人设备
CN114279370B (zh) * 2021-12-06 2024-04-30 珠海格力智能装备有限公司 一种电容外观检测方法、装置、系统及存储介质
CN114363478B (zh) * 2022-01-04 2023-11-10 平头哥(上海)半导体技术有限公司 信号处理单元、方法、加速单元、电子设备和片上系统
CN115278052B (zh) * 2022-06-22 2025-02-28 展讯通信(上海)有限公司 图像处理的方法、装置、电子设备及计算机可读存储介质
CN115167669B (zh) * 2022-06-30 2025-11-21 恒玄科技(上海)股份有限公司 一种智能眼镜、同步显示方法及介质
CN116058788A (zh) * 2023-01-17 2023-05-05 北京鹰瞳科技发展股份有限公司 用于眼底相机中图像帧采集的方法及其相关产品

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030133018A1 (en) * 2002-01-16 2003-07-17 Ted Ziemkowski System for near-simultaneous capture of multiple camera images
CN102427509A (zh) * 2011-09-16 2012-04-25 杭州海康威视数字技术股份有限公司 一种控制补光灯同步摄像机的装置及方法
CN104412585A (zh) * 2013-07-05 2015-03-11 联发科技股份有限公司 用于多传感器照相机装置的同步控制器和相关同步方法
CN108419017A (zh) * 2018-04-28 2018-08-17 Oppo广东移动通信有限公司 控制拍摄的方法、装置、电子设备及计算机可读存储介质

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100370354C (zh) * 2003-11-28 2008-02-20 北京中星微电子有限公司 一种消除照明灯下曝光闪烁的自动曝光控制电路
US7437063B2 (en) * 2006-04-07 2008-10-14 Lab Partners Associates, Inc. Wireless camera flash synchronizer system and method
CN1968230A (zh) * 2006-06-27 2007-05-23 华为技术有限公司 一种信号处理装置及同步方法
US8917350B2 (en) * 2009-02-12 2014-12-23 Lab Patners Associates, Inc. Early photographic synchronization system and method
US8768158B2 (en) * 2010-02-01 2014-07-01 Canon Kabushiki Kaisha Image pickup apparatus, flash apparatus, and camera system
KR101953614B1 (ko) * 2012-10-12 2019-05-22 삼성전자주식회사 카메라장치의 이미지처리장치 및 방법
TWI477123B (zh) * 2013-01-15 2015-03-11 Univ Nat Chiao Tung 多功能控制照明裝置
KR102124598B1 (ko) * 2013-09-30 2020-06-19 삼성전자주식회사 이미지 획득 방법 및 장치
US10721420B2 (en) * 2015-06-02 2020-07-21 Intel Corporation Method and system of adaptable exposure control and light projection for cameras
CN107277385B (zh) * 2017-06-12 2020-04-17 深圳市瑞立视多媒体科技有限公司 一种多相机系统同步曝光的控制方法、装置及终端设备
CN108781259B (zh) * 2017-07-31 2021-04-16 深圳市大疆创新科技有限公司 一种图像拍摄的控制方法、控制装置及控制系统
CN109089013A (zh) * 2018-09-21 2018-12-25 中兴新通讯有限公司 一种多光源检测图像获取方法以及机器视觉检测系统

Also Published As

Publication number Publication date
CN111345033A (zh) 2020-06-26

Similar Documents

Publication Publication Date Title
WO2020181494A1 (fr) Procédé de synchronisation de paramètres, appareil de capture d'images et plateforme mobile
CN110730287B (zh) 一种可拆换的云台相机、飞行器、系统及其云台拆换方法
US20190291864A1 (en) Transformable apparatus
JP6785412B2 (ja) 無人航空機システム
WO2021217371A1 (fr) Procédé et appareil de commande pour plateforme mobile
WO2018064831A1 (fr) Tête de trépied, véhicule aérien sans pilote et procédé de commande de celui-ci
WO2018035764A1 (fr) Procédé permettant de prendre des photos à grand angle, dispositif, têtes à berceau, véhicule aérien sans pilote et robot
WO2020172800A1 (fr) Procédé de commande de patrouille pour plate-forme mobile et plate-forme mobile
CN108924520B (zh) 传输控制方法、装置、控制器、拍摄设备及飞行器
CN108521814A (zh) 云台的控制方法、控制器和云台
EP3595286A1 (fr) Dispositif, procédé et programme de traitement d'informations
CN108780321B (zh) 用于设备姿态调整的方法、设备、系统和计算机可读存储介质
JP6910785B2 (ja) 移動撮像装置およびその制御方法、ならびに撮像装置およびその制御方法、無人機、プログラム、記憶媒体
WO2019227289A1 (fr) Procédé et dispositif de commande de chronophotographie
WO2018191971A1 (fr) Procédé de commande de tête de berceau et tête de berceau
WO2022061934A1 (fr) Procédé et dispositif de traitement d'image, système, plateforme et support de stockage lisible par ordinateur
CN204287973U (zh) 飞行相机
JP2017112438A (ja) 撮像システムおよびその制御方法、通信装置、移動撮像装置、プログラム
WO2020006650A1 (fr) Caméra d'image, tête de berceau portative et plateforme mobile
CN105807783A (zh) 飞行相机
WO2021168821A1 (fr) Procédé de commande de plateforme mobile et dispositif
WO2020237429A1 (fr) Procédé de commande pour dispositif de commande à distance et dispositif de commande à distance
CN112585936B (zh) 控制方法、拍摄装置、镜头、可移动平台和计算机可读介质
WO2022109860A1 (fr) Procédé de suivi d'objet cible et cardan
JP6700868B2 (ja) 撮影制御装置及びその制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19919307

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19919307

Country of ref document: EP

Kind code of ref document: A1