WO2018188086A1 - Unmanned aerial vehicle and control method therefor - Google Patents
- Publication number
- WO2018188086A1 (PCT/CN2017/080647)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- drone
- coordinate system
- coordinate
- control device
- view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C27/00—Rotorcraft; Rotors peculiar thereto
Definitions
- the invention relates to a drone, and in particular to a control method for a drone.
- existing UAV operation mostly uses remote-controller joysticks to control the attitude of the aircraft, which achieves precise control but requires two hands and is quite inconvenient; some small aircraft are instead operated through a mobile phone, using the touch screen to simulate a joystick, or mapping the phone's attitude to the aircraft's attitude (somatosensory control).
- the first approach is designed for more professional aerial-photography aircraft and achieves more precise control, but it requires two-handed operation and demands a lot of the operator;
- the second approach lacks feedback (precise control is difficult through a touch screen), and because of transmission delay, wireless interference, phone attitude-measurement error, and the like, somatosensory operation cannot achieve precise control either.
- the main feature of a small selfie drone is portability. It is aimed at the average consumer, so it is generally not equipped with a professional remote controller, and phone-based somatosensory control is difficult to operate precisely.
- self-portraits, meanwhile, place high demands on composition and framing, and therefore require fairly precise control of the aircraft.
- in summary, the existing methods are not suitable for small selfie drones.
- the invention, from the perspective of improving the user experience of small-aircraft selfies, proposes a way to precisely control the drone with one hand.
- composition and framing can be completed directly with a smartphone.
- they can also be completed with a device adapted to one-handed operation (such as the OSMO MOBILE); the approach is simple enough that the average consumer can quickly get started and focus on the photo itself.
- the technical problem to be solved by the present invention is to provide a control method for a drone that makes operations such as composition and framing easier, so that the user can concentrate more on photographing.
- An aspect of the present invention provides a drone control method, the method comprising: calculating a first coordinate of the drone in a first coordinate system, calculating a second coordinate of the drone in a second coordinate system according to the first coordinate, and controlling the drone's flight according to the second coordinate.
- An unmanned aerial vehicle comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the drone communicating with a control device that includes an imaging device, wherein the processor, when executing the computer program, performs the following steps: calculating a first coordinate of the drone in a first coordinate system, calculating a second coordinate of the drone in a second coordinate system according to the first coordinate, and controlling the drone's flight according to the second coordinate.
- the first coordinate system is a spherical coordinate system with the control device as an origin
- the second coordinate system is a navigation coordinate system
- the first coordinate comprises a distance of the drone from the control device, a zenith angle of the drone in the first coordinate system, and an azimuth of the drone in the first coordinate system.
- calculating the first coordinate of the drone in the first coordinate system comprises calculating the distance of the drone from the control device, and calculating the zenith angle and the azimuth.
- calculating the zenith angle and the azimuth includes acquiring the coordinates of the drone in an image captured by the imaging device, acquiring the field of view of the imaging device, acquiring the resolution of the imaging device, and calculating the zenith angle and the azimuth based on the coordinates, the field of view, and the resolution.
- the field of view includes a horizontal field of view and a vertical field of view, the resolution including horizontal resolution and vertical resolution.
- Figure 1 is a schematic structural view of a drone according to an embodiment of the present invention.
- FIG. 2 is a schematic structural view of a bottom of a drone provided by an embodiment of the present invention.
- FIG. 3 is a flow chart of calculating a target position of a drone according to an embodiment of the present invention
- FIG. 4 is a schematic diagram of a UAV calculating a target position provided by an embodiment of the present invention.
- FIG. 5 is a flowchart of a user controlled drone according to an embodiment of the present invention.
- FIG. 6 is a schematic diagram of a drone control according to an embodiment of the present invention.
- FIG. 7 is a schematic diagram of a module of a drone according to an embodiment of the present invention.
- FIG. 1 is a schematic structural diagram of a drone according to an embodiment of the present invention.
- the drone 100 can include a fuselage 110 that includes a central portion 111 and at least one outer portion 112.
- the fuselage 110 includes four outer portions 112 (such as the arms 113).
- the four outer portions 112 extend from the central portion 111, respectively.
- the body 110 can include any number of external portions 112 (eg, 6, 8, etc.).
- each of the outer portions 112 can carry a propulsion system 120 that can drive the drone 100 to move (e.g., climb, land, move horizontally, etc.).
- the arm 113 can carry a corresponding motor 121, and the motor 121 can drive the corresponding propeller to rotate.
- the drone 100 can control any set of motors 121 and their corresponding propellers 122 without being affected by the remaining motors 121 and their corresponding propellers.
- the body 110 can carry a load 130, such as an imaging device 131.
- the imaging device 131 can include a camera, which can, for example, capture images, video, and the like around the drone.
- the camera is sensitive to light of various wavelengths, including but not limited to visible light, ultraviolet light, infrared light, or any combination thereof.
- the load 130 can include other kinds of sensors.
- the load 130 is coupled to the fuselage 110 by a pan/tilt 150 such that the load 130 can move relative to the fuselage 110. For example, when the load 130 carries the imaging device 131, the imaging device 131 can move relative to the body 110 to capture images, videos, and the like around the drone 100.
- the landing gear 114 can support the drone 100 to protect the load 130 when the drone 100 is on the ground.
- the body 110 can carry two or more loads.
- the body 110 can carry two pan/tilt heads, each of which is connected to a camera.
- the drone 100 can include a control system 140; the control system 140 includes components disposed on the drone 100 and components that are separate from the drone 100.
- the control system 140 can include a first controller 141 disposed on the drone 100, and a second controller 142 remote from the drone 100 and connected to the first controller 141 via a communication link 160 (e.g., a wireless link).
- the first controller 141 can include at least one processor, a memory, and an onboard computer readable medium 143a; the onboard computer readable medium 143a can store program instructions for controlling the behavior of the drone 100.
- the behavior includes, but is not limited to, the operation of the propulsion system 120 and the imaging device 131, controlling the drone to perform automatic landing, and the like.
- the computer readable medium 143a can also be used to store state information of the drone 100, such as altitude, speed, location, preset reference height, and the like.
- the second controller 142 can include at least one processor, memory, off-board computer readable medium 143b, and at least one input and output device 148, such as display device 144 and control device 145.
- An operator of the drone 100 can remotely control the drone 100 through the control device 145 and receive feedback information from the drone 100 via the display device 144 and/or other devices.
- the drone 100 can operate autonomously, in which case the second controller 142 can be omitted, or the second controller 142 can be used only to let the drone operator override the drone's flight functions.
- the onboard computer readable medium 143a can be removed from the drone 100.
- the off-board computer readable medium 143b can be removed from the second controller 142.
- the drone 100 can include two forward-looking cameras 171 and 172 that are sensitive to light of various wavelengths (e.g., visible light, infrared light, ultraviolet light) for capturing images or video around the drone. In some embodiments, the drone 100 includes at least one sensor placed at the bottom.
- the drone 100 can include two lower looking cameras 173 and 174 placed at the bottom of the fuselage 110.
- the drone 100 further includes two ultrasonic sensors 177 and 178 placed at the bottom of the body 110.
- the ultrasonic sensors 177 and 178 can detect and/or monitor objects and the ground beneath the drone 100, and measure the distance to the object or the ground by transmitting and receiving ultrasonic waves.
- the drone 100 may include an inertial measurement unit (IMU), a GPS (Global Positioning System) module, an infrared sensor, a microwave sensor, a temperature sensor, a proximity sensor, a three-dimensional laser range finder, a 3D TOF sensor, and the like.
- the three-dimensional laser range finder and the 3D TOF sensor can detect the distance to an object or surface beneath the drone.
- the input and output device 148 is a smart phone.
- the user can control the drone to fly through the mobile phone.
- the drone 100 can receive input information from the input and output device 148, such as a user transmitting a target to the drone 100 through the input and output device 148.
- the drone 100 can identify a corresponding position of the target on the ground according to the target, and the first controller can control the drone 100 to fly above the corresponding position and hover.
- the drone 100 can receive input information from the input and output device 148, such as a user transmitting a target to the drone 100 through the input and output device 148.
- the drone 100 can identify a corresponding position of the target on the ground according to the target.
- the first controller may control the drone 100 to fly to a preset reference altitude and fly along the preset reference altitude.
- the drone may calculate the first coordinate in the first coordinate system, calculate the second coordinate of the drone in the second coordinate system according to the first coordinate to obtain the target position, and fly to the second coordinate.
- FIG. 3 is a flowchart of calculating a target position of a drone according to an embodiment of the present invention.
- the drone can calculate a target location in accordance with method 300 and fly in accordance with the target location.
- program instructions to perform the method 300 can be stored in the onboard computer readable medium 143a. In other embodiments, program instructions to perform the method 300 can be stored in the off-board computer readable medium 143b.
- Step 301 Calculate a first coordinate of the drone in the first coordinate system.
- FIG. 4 is a schematic diagram of a UAV calculating a target position according to an embodiment of the present invention.
- the first coordinate system is a spherical coordinate system with the control device O as the origin.
- the first coordinate includes a distance r of the drone 100 from the control device O, a zenith angle θ of the drone 100 in the spherical coordinate system, and an azimuth φ of the drone 100 in the spherical coordinate system. Therefore, the first coordinate (i.e., the coordinates of the drone 100 in the spherical coordinate system) can be expressed as (r, θ, φ).
- the drone 100 can acquire the horizontal distance and the vertical distance between the drone 100 and the control device O, and calculate the distance r from the horizontal distance and the vertical distance.
- for example, the drone 100 can detect the vertical distance with one or more onboard sensors (such as ultrasonic sensors, TOF sensors, barometers, etc.). As another example, the drone 100 can detect the horizontal distance through a GPS module.
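As a sketch of this step (function name hypothetical), the distance r follows from the measured horizontal and vertical distances by the Pythagorean relation:

```python
import math

def slant_distance(horizontal_m: float, vertical_m: float) -> float:
    """Distance r from the control device to the drone, assuming the
    horizontal distance (e.g., from GPS) and the vertical distance
    (e.g., from an ultrasonic/TOF sensor or barometer) share the same
    origin at the control device."""
    return math.hypot(horizontal_m, vertical_m)

# Example: 30 m away horizontally and 40 m up gives r = 50 m.
print(slant_distance(30.0, 40.0))  # 50.0
```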
- the drone 100 can receive location information from the control device O to calculate the zenith angle θ and the azimuth φ.
- the control device O includes a display device and an imaging device (such as a camera or the like), through which the user can photograph the drone to generate an image that can be displayed on the display device.
- the control device O can identify the position of the drone, such as the position A, on the image 101 captured by the imaging device, and determine the coordinates of the position A by calculating the pixel difference between the position A and the image center O'.
- the coordinates of the position A may be represented as (Δu, Δv), where Δu is the horizontal pixel difference between the position A and the image center O', and Δv is the vertical pixel difference between the position A and the image center O'.
- the coordinates of the position A may be (200px, 300px).
- the drone 100 can acquire the field of view (Fov) of the imaging device and the resolution of the imaging device (frame width w, frame height h), and calculate the zenith angle θ and the azimuth angle φ from the coordinates of the position A, the field of view Fov, and the resolution.
- the field of view Fov includes a horizontal field of view hFov and a vertical field of view vFov.
- the horizontal field of view hFov and the vertical field of view vFov can be calculated by the following formula:
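The formula itself did not survive in this text. A standard reconstruction, assuming Fov is the diagonal field of view of the photosensitive element (consistent with the definition given for Fov) and the frame is w pixels wide and h pixels high, is:

```latex
\tan\frac{hFov}{2}=\frac{w}{\sqrt{w^{2}+h^{2}}}\,\tan\frac{Fov}{2},
\qquad
\tan\frac{vFov}{2}=\frac{h}{\sqrt{w^{2}+h^{2}}}\,\tan\frac{Fov}{2}
```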
- the field of view Fov is a field of view of a diagonal of the photosensitive element of the imaging device.
- the field of view Fov is typically provided by the manufacturer and can be retrieved directly when needed.
- the control device O can transmit the field of view Fov to the drone 100.
- the drone 100 may calculate the horizontal field of view hFov and the vertical field of view vFov according to the field of view Fov.
- the control device O can also calculate the horizontal field of view hFov and the vertical field of view vFov locally according to the field of view Fov.
- the resolution includes a frame width w and a frame height h.
- the ratio of the width w of the frame to the height h of the frame may be 16:9 or 4:3.
- the frame width w may be 1920px, and the frame height h may be 1080px.
- the control device O can obtain the frame width w and the frame height h locally.
- the control device O may also send the frame width w and the frame height h to the drone 100.
- the drone 100 or the control device O can calculate the zenith angle θ and the azimuth φ according to the following formula.
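The formula referred to here was not preserved. The following sketch is one plausible reconstruction, not the patent's exact formula: it maps the pixel offsets (Δu, Δv) linearly onto the fields of view and combines them with the camera's attitude (pitch and yaw from the attitude sensor). All names and the linear mapping are assumptions.

```python
import math  # noqa: F401 (kept for symmetry with the other sketches)

def zenith_azimuth(du_px, dv_px, w_px, h_px, hfov_deg, vfov_deg,
                   cam_pitch_deg, cam_yaw_deg):
    """Estimate the drone's zenith angle theta and azimuth phi in the
    control-device-centered spherical frame.

    du_px, dv_px  : pixel offsets of the drone from the image center O'
    cam_pitch_deg : camera elevation above the horizon (attitude sensor)
    cam_yaw_deg   : camera heading (attitude sensor)

    A small-angle linear mapping is assumed: each pixel offset
    contributes a proportional share of the field of view.
    """
    elev = cam_pitch_deg + (-dv_px / h_px) * vfov_deg  # +dv points downward in images
    azim = cam_yaw_deg + (du_px / w_px) * hfov_deg
    theta = 90.0 - elev  # zenith angle is measured from straight up
    return theta, azim % 360.0

# Drone exactly at the image center of a camera pitched 30° up, heading 45°:
theta, phi = zenith_azimuth(0, 0, 1920, 1080, 81.0, 52.0, 30.0, 45.0)
print(theta, phi)  # 60.0 45.0
```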
- the control device O can be a handheld pan/tilt.
- the user can photograph the drone through the imaging device of the handheld pan/tilt (such as a built-in camera, a mobile phone, or a camera) to generate an image that can be displayed.
- the handheld pan/tilt can acquire the zenith angle θ and the azimuth angle φ through a built-in attitude sensor (such as an inertial measurement unit or the like).
- the handheld gimbal may send the zenith angle θ and the azimuth angle φ to the drone 100.
- Step 302 Calculate a second coordinate of the UAV in the second coordinate system according to the first coordinate.
- after the first coordinate (r, θ, φ) is calculated, the drone 100 or the control device O can calculate the second coordinate of the drone in the second coordinate system according to the first coordinate.
- the second coordinate system is a navigation coordinate system; specifically, the x-axis points north, the y-axis points east, and the z-axis points toward the center of the earth.
- the second coordinate system may be a fuselage coordinate system.
- the x-axis points to the front of the fuselage direction
- the y-axis points to the right of the fuselage direction
- the z-axis points to the lower side of the fuselage direction.
- the direction of the fuselage is the orientation of the nose of the drone.
- the drone 100 or the control device O can calculate the second coordinate (x, y, z) according to the following formula.
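The conversion formula was not preserved either. The standard spherical-to-Cartesian conversion is a reasonable reconstruction, with the caveat that the sign of z is an assumption: the navigation frame's z-axis points toward the earth's center, and the zenith angle is taken from straight up, so a drone above the device gets a negative (upward) z.

```python
import math

def spherical_to_ned(r, theta_deg, phi_deg):
    """Convert (r, theta, phi) -- distance, zenith angle, azimuth -- into
    navigation-frame coordinates (x north, y east, z toward earth center)."""
    t, p = math.radians(theta_deg), math.radians(phi_deg)
    x = r * math.sin(t) * math.cos(p)  # north component
    y = r * math.sin(t) * math.sin(p)  # east component
    z = -r * math.cos(t)               # down component (negative = above the device)
    return x, y, z

# Drone 100 m away, 60° from the zenith, due north of the device:
x, y, z = spherical_to_ned(100.0, 60.0, 0.0)
print(round(x, 1), round(y, 1), round(z, 1))  # 86.6 0.0 -50.0
```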
- Step 303 Control the drone to fly according to the second coordinate.
- the drone 100 can fly to the second coordinate (x, y, z).
- the user can control the position of the aircraft with one hand through a control device (such as a mobile phone) to complete composition, framing, and similar operations, as simply as with a selfie stick. This allows the average user to get started quickly and to focus more on the photo itself.
- the above description of the method 300 is only for ease of understanding of the present invention. It will be apparent to those skilled in the art that modifications and variations can be made without departing from the scope of the invention.
- the method 300 may be performed by the drone 100, by the control device O, or jointly by the drone 100 and the control device O.
- the control device O may alternatively be a tablet computer, a smart watch, a wristband, VR glasses, AR glasses, or the like.
- FIG. 5 is a flowchart of a user controlled drone according to an embodiment of the present invention.
- step 501 the distance between the drone and the control device is adjusted.
- the user can control the drone to take off through a control device (such as a mobile phone and/or a remote control) and adjust the distance between the control device and the drone, i.e., the distance r (see Figure 4).
- the user can adjust the distance r by sliding the screen of the control device (such as a mobile phone screen), tapping a specified position on the control device's screen (such as a specified position on a mobile phone screen), or using multi-touch (such as 3D Touch).
- step 502 the drone locks onto the user.
- the user can lock onto himself or herself through a screen of the control device, such as a mobile phone screen.
- the user can switch the display to the drone's shooting preview screen. Thereafter, the user can roughly adjust the orientation of the drone so that the user appears in the shooting preview screen.
- the user may lock onto himself or herself by drawing a box around himself or herself in the shooting preview screen.
- the user may enable the smart-follow function of the drone, by drawing a box around the user in the shooting preview screen, so that the drone continuously follows the user. After the smart-follow function is enabled, the drone finely adjusts its orientation and the position of the gimbal according to the user's habits, so that the user appears at a designated position in the shooting preview screen, thereby completing automatic composition.
- the designated position may be the center of the shooting preview screen, or any position on the shooting preview screen specified by the user.
- step 503 the control device locks onto the drone.
- the user may switch the display of the control device to the shooting preview screen of the imaging device of the control device.
- the user may then move the control device so that the drone appears in the shooting preview screen.
- the user may draw a box around the drone in the shooting preview screen to lock onto the drone.
- the user can enable the smart-follow function on the control device.
- once the drone is framed in the shooting preview screen, the control device continuously follows the drone.
- step 503 can be performed prior to step 502, or step 502 and step 503 can be performed simultaneously.
- Step 504 moving the control device to control the position of the drone.
- the user can move the control device to control the position of the drone.
- the drone may calculate a target position (such as the second coordinate) in real time according to the method 300 and fly according to the target position (e.g., fly to the target position).
- the user can rotate the control device in place, or move the control device up and down, so that composition, framing, and similar operations can be performed conveniently, allowing the user to focus more on photographing.
- the control device is a handheld pan/tilt.
- the user can move the handheld pan/tilt to control the location of the drone.
- the user can directly rotate the handheld pan/tilt, or rotate the handheld pan/tilt by operating the joystick of the handheld pan/tilt, so that the drone appears in the shooting preview screen of the handheld pan/tilt.
- the handheld pan/tilt can acquire the zenith angle θ and the azimuth angle φ through a built-in attitude sensor.
- the control device may alternatively be a tablet computer, a smart watch, a wristband, VR glasses, AR glasses, or the like.
- FIG. 6 is a schematic diagram of a drone control according to an embodiment of the present invention.
- the user can control the flight of the drone 100 by the control device 200 (such as a mobile phone or the like) with one hand.
- the user may use the control device 200 to control the flight of the drone 100 by the method 300 to complete operations such as composition, framing, etc., to facilitate the user to take a picture.
- the first coordinate system may be a cylindrical coordinate system with the control device 200 as the origin.
- the first coordinate can be expressed as (ρ, φ, z), where ρ represents the horizontal distance between the drone 100 and the control device 200, φ represents the azimuth of the drone 100 in the cylindrical coordinate system, and z represents the height of the drone 100.
- the calculation of ρ, φ, and z can refer to the description in FIG. 3 and is not repeated here.
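For comparison, under the navigation-frame convention given earlier (x north, y east), a cylindrical first coordinate (ρ, φ, z) converts even more directly. This is a sketch under that assumption, not a formula stated in the text:

```python
import math

def cylindrical_to_nav(rho, phi_deg, height):
    """Convert (rho, phi, z) -- horizontal distance, azimuth measured from
    north, and height -- into navigation-frame x (north), y (east), and
    altitude above the control device."""
    p = math.radians(phi_deg)
    return rho * math.cos(p), rho * math.sin(p), height

# Drone 50 m away horizontally, due east, 20 m up:
x, y, alt = cylindrical_to_nav(50.0, 90.0, 20.0)
print(round(x, 1), round(y, 1), alt)  # 0.0 50.0 20.0
```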
- FIG. 7 is a schematic diagram of a UAV module according to an embodiment of the present invention.
- the drone 100 can include at least one processor 701, a sensor 702, a memory 703, and an input/output module 704.
- the processor 701 may include at least one processor, including but not limited to a microcontroller, a reduced instruction set computer (RISC) processor, an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physics processing unit (PPU), a digital signal processor (DSP), or a field-programmable gate array (FPGA).
- the processor 701 can be configured to perform the method 300. That is, the processor 701 may be configured to calculate a first coordinate of the UAV 100 in the first coordinate system, and calculate a second coordinate of the UAV 100 in the second coordinate system according to the first coordinate. And controlling the drone to fly according to the second coordinate.
- the first coordinate system is a spherical coordinate system with the control device as an origin
- the second coordinate system is a navigation coordinate system
- the first coordinate comprises a distance of the drone from the control device, a zenith angle of the drone in the first coordinate system, and an azimuth of the drone in the first coordinate system.
- the processor 701 can be configured to calculate a distance of the drone 100 from the control device O, and calculate the zenith angle and the azimuth.
- the processor 701 can be configured to acquire the coordinates of the drone in an image captured by the imaging device, acquire the field of view of the imaging device, acquire the resolution of the imaging device, and calculate the zenith angle and the azimuth based on the coordinates, the field of view, and the resolution.
- the field of view includes a horizontal field of view and a vertical field of view, the resolution including horizontal resolution and vertical resolution.
- the sensor 702 may include at least one sensor, including but not limited to a temperature sensor, an inertial measurement unit, an accelerometer, an image sensor (such as a camera), an ultrasonic sensor, a TOF sensor, a microwave sensor, a proximity sensor, a three-dimensional laser range finder, an infrared sensor, and the like.
- the inertial measurement unit can be used to measure attitude information (eg, pitch angle, roll angle, yaw angle, etc.) of the drone.
- the inertial measurement unit may include, but is not limited to, at least one accelerometer, gyroscope, magnetometer, or any combination thereof.
- the accelerometer can be used to measure the acceleration of the drone to calculate the speed of the drone.
- the memory 703 can include, but is not limited to, a read-only memory (ROM), a random access memory (RAM), a programmable read-only memory (PROM), an electrically erasable programmable read-only memory (EEPROM), and the like.
- the memory 703 can include a non-transitory computer readable medium that can store code, logic, or instructions for performing at least one of the steps described elsewhere herein.
- the processor 701 can perform at least one step, individually or collectively, in accordance with the code, logic, or instructions of the non-transitory computer readable medium described herein.
- the memory 703 can be used to store state information of the drone 100, such as altitude, speed, position, preset reference height, and the like.
- the memory 703 can store program instructions for performing the method 300.
- the input/output module 704 is configured to exchange information or instructions with an external device, for example receiving an instruction sent by the input/output device 148 (see FIG. 1), or transmitting an image captured by the imaging device 131 (see FIG. 1) to the input/output device 148.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Aviation & Aerospace Engineering (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
本发明涉及一种无人机,且特别涉及一种无人机的控制方法。The invention relates to a drone, and in particular to a control method of a drone.
现有的无人机操作多使用遥控器摇杆控制飞行器的姿态,可以实现精准控制,但需要双手操控,十分不方便;还有一些小型的飞行器通过手机来操作,利用触屏模拟摇杆,或者是通过手机姿态来映射飞行器姿态(体感控制)。The existing UAV operation uses the remote control joystick to control the attitude of the aircraft, which can achieve precise control, but requires two-hand control, which is very inconvenient; there are also some small aircrafts that operate through the mobile phone and use the touch screen to simulate the joystick. Or map the attitude of the aircraft (sense control) by the gesture of the mobile phone.
现有的操作方式:Existing methods of operation:
第一种针对于较为专业的航拍飞行器设计,能够实现更加精准的控制,但是需要双手操作,并且对操作者要求较高;The first one is designed for more professional aerial vehicle design, which can achieve more precise control, but requires two-hand operation and high requirements for operators;
第二种由于缺少反馈(触摸的时候很难精准控制),由于存在传输延时,无线干扰,手机姿态测量误差等原因,体感操作也并不能实现精准操作。The second type lacks feedback (it is difficult to accurately control when touched), and due to transmission delay, wireless interference, mobile phone attitude measurement error, etc., the somatosensory operation cannot achieve precise operation.
小型自拍无人机主要特点是便携,针对于一般消费者,所以一般不配有专业的遥控器,使用手机体感又难以精准的操作,而自拍又是对构图,选景要求较高,需要较为精准的操控飞行器。The main feature of the small self-timer UAV is portable. It is aimed at the average consumer. Therefore, it is generally not equipped with a professional remote control. It is difficult to operate accurately with the body of the mobile phone. Self-timer is a composition, and the selection of the scene is relatively high. Manipulating the aircraft.
综上所述,现有的方法均不适合于小型自拍无人机。In summary, the existing methods are not suitable for small self-timer drones.
本发明就从改善小型飞行器自拍的用户体验角度,提出一种能够单手精准控制无人机的方式。可以直接使用智能手机完成构图、选景的操作,也可以借助适配单手操作的设备(OSMO MOBILE)完成,使用方式十分简单易用,让一般消费者也能快速上手使用,从而更加专注于拍照本身。 The invention proposes a way to accurately control the drone with one hand from the perspective of improving the user experience of the small aircraft self-timer. You can use the smart phone to complete the composition and scene selection. You can also use the one-handed device (OSMO MOBILE) to make it easy to use, so that the average consumer can quickly get started and focus on it. Take a photo of itself.
发明内容Summary of the invention
本发明主要解决的技术问题是提供一种无人机的控制方法,可以更方便地完成构图、取景等操作,从而使用户可以更加专注于拍照。The technical problem to be solved by the present invention is to provide a control method for a drone, which can more easily perform operations such as composition and framing, so that the user can concentrate more on photographing.
本发明一方面提供了一种无人机控制方法,所述方法包括,计算所述无人机在第一坐标系统中的第一坐标,根据所述第一坐标计算所述无人机在第二坐标系统中的第二坐标,以及根据所述第二坐标控制所述无人机飞行。An aspect of the present invention provides a drone control method, the method comprising: calculating a first coordinate of the drone in a first coordinate system, and calculating the drone according to the first coordinate a second coordinate in the two coordinate system, and controlling the drone flight according to the second coordinate.
一种无人机,所述无人机包括存储器、处理器及存储在所述存储器上并可在所述处理器上运行的计算机程序,所述无人机与控制装置通信,所述控制装置包括成像装置,其特征在于,所述处理器执行所计算机述程序时实现以下步骤,计算所述无人机在第一坐标系统中的第一坐标,根据所述第一坐标计算所述无人机在第二坐标系统中的第二坐标,以及根据所述第二坐标控制所述无人机飞行。An unmanned aerial vehicle comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, the drone communicating with a control device, the control device Including an imaging device, wherein the processor performs the following steps when executing the computer program, calculating a first coordinate of the drone in the first coordinate system, and calculating the unmanned according to the first coordinate The second coordinate of the machine in the second coordinate system, and controlling the drone flight according to the second coordinate.
在一些实施例中,所述第一坐标系为以所述控制装置为原点的球坐标系,所述第二坐标系为导航坐标系。In some embodiments, the first coordinate system is a spherical coordinate system with the control device as an origin, and the second coordinate system is a navigation coordinate system.
在一些实施例中,所述第一坐标包括所述无人机与所述控制装置的距离,所述无人机在所述第一坐标系统中的天顶角、以及所述无人机在所述第一坐标系统中的方位角。In some embodiments, the first coordinate comprises the distance of the drone from the control device, the zenith angle of the drone in the first coordinate system, and the azimuth angle of the drone in the first coordinate system.
在一些实施例中,所述计算所述无人机在第一坐标系统中的第一坐标包括,计算所述无人机与所述控制装置的距离,以及计算所述天顶角以及所述方位角。In some embodiments, the calculating the first coordinates of the drone in the first coordinate system comprises calculating a distance of the drone from the control device, and calculating the zenith angle and the Azimuth.
在一些实施例中,所述计算所述天顶角以及所述方位角包括,获取所述无人机在由所述成像装置拍摄的图像中的坐标,获取所述成像装置的视野,获取所述成像装置的分辨率,以及根据所述坐标、所述视野、以及所述分辨率计算所述天顶角以及所述方位角。In some embodiments, calculating the zenith angle and the azimuth angle includes acquiring coordinates of the drone in an image captured by the imaging device, acquiring the field of view of the imaging device, acquiring the resolution of the imaging device, and calculating the zenith angle and the azimuth angle according to the coordinates, the field of view, and the resolution.
在一些实施例中,所述视野包括水平视野以及竖直视野,所述分辨率包括水平分辨率以及竖直分辨率。 In some embodiments, the field of view includes a horizontal field of view and a vertical field of view, the resolution including horizontal resolution and vertical resolution.
为了更清楚地说明本披露实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作一简单地介绍,显而易见地,下面描述中的附图是本披露的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动性的前提下,还可以根据这些附图获得其他的附图。In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are some embodiments of the present disclosure, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
图1为本发明实施例提供的无人机的结构示意图;Figure 1 is a schematic structural view of a drone according to an embodiment of the present invention;
图2为本发明实施例提供的无人机底部的结构示意图;Figure 2 is a schematic structural view of the bottom of a drone according to an embodiment of the present invention;
图3为本发明实施例提供的无人机计算目标位置的流程图;Figure 3 is a flowchart of a drone calculating a target position according to an embodiment of the present invention;
图4为本发明实施例提供的无人机计算目标位置的示意图;Figure 4 is a schematic diagram of a drone calculating a target position according to an embodiment of the present invention;
图5为本发明实施例提供的用户控制无人机的流程图;Figure 5 is a flowchart of a user controlling a drone according to an embodiment of the present invention;
图6为本发明实施例提供的无人机控制的示意图;Figure 6 is a schematic diagram of drone control according to an embodiment of the present invention;
图7为本发明实施例提供的无人机的模块示意图。Figure 7 is a schematic diagram of the modules of a drone according to an embodiment of the present invention.
下面将结合本发明实施例中的附图,对本发明实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅是本发明的一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其它实施例,都属于本发明保护的范围。The technical solutions in the embodiments of the present invention are clearly and completely described in the following with reference to the accompanying drawings in the embodiments of the present invention. It is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts are within the scope of the present invention.
本发明的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的术语在适当情况下可以互换,这仅仅是描述本发明的实施例中对相同属性的对象在描述时所采用的区分方式。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,以便包含一系列单元的过程、方法、系统、产品或设备不必限于那些单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它单元。The terms "first", "second", and the like in the specification and claims of the present invention and the above drawings are used to distinguish similar objects, and are not necessarily used to describe a particular order or sequence. It should be understood that the terms so used are interchangeable where appropriate; this is merely the way objects of the same attribute are distinguished when describing embodiments of the present invention. In addition, the terms "comprising" and "having", and any variations thereof, are intended to cover a non-exclusive inclusion, so that a process, method, system, product, or device comprising a series of units is not necessarily limited to those units, but may include other units not expressly listed or inherent to such processes, methods, products, or devices.
下面结合附图,对本发明的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。Some embodiments of the present invention are described in detail below with reference to the accompanying drawings. The features of the embodiments and examples described below can be combined with each other without conflict.
下面结合附图和实施例对本发明进行详细说明。The invention will now be described in detail in conjunction with the drawings and embodiments.
参阅图1,图1为本发明实施例提供的无人机结构示意图。无人机100可以包括机身110,所述机身110包括中央部分111以及至少一个外部部分112。在图1所示的实施例中,所述机身110包括四个外部部分112(如机臂113)。所述四个外部部分112分别从所述中央部分111延伸出来。在其他实施例中,所述机身110可以包含任意数量的外部部分112(如6个、8个等)。在任何上述实施例中,每个所述外部部分112可以承载一个推进系统120,所述推进系统120可以驱动所述无人机100运动(如爬升、降落、水平移动等)。例如:所述机臂113可以承载对应的电机121,所述电机121可以驱动对应的螺旋桨转动。所述无人机100可以控制任意一组电机121及其对应的螺旋桨122,而不受其余的电机121及其对应的螺旋桨影响。Referring to FIG. 1, FIG. 1 is a schematic structural diagram of a drone according to an embodiment of the present invention. The drone 100 may include a fuselage 110 comprising a central portion 111 and at least one outer portion 112. In the embodiment shown in FIG. 1, the fuselage 110 includes four outer portions 112 (e.g., arms 113), each extending from the central portion 111. In other embodiments, the fuselage 110 may include any number of outer portions 112 (e.g., six, eight, etc.). In any of the above embodiments, each outer portion 112 may carry a propulsion system 120, which can drive the motion of the drone 100 (e.g., climbing, landing, horizontal movement, etc.). For example, an arm 113 may carry a corresponding motor 121, which drives a corresponding propeller. The drone 100 can control any set of motors 121 and their corresponding propellers 122 independently of the remaining motors 121 and their propellers.
所述机身110可以携带一个负载130,例如:成像装置131。在一些实施例中,所述成像装置131可以包括一个摄像头,例如:可以拍摄所述无人机周围的图像、视频等。所述摄像头光敏于各种波长的光线,包括但不限于可见光、紫外线、红外线或其中的任意组合。在一些实施例中,所述负载130可以包括其他种类的传感器。在一些实施例中,所述负载130通过云台150与所述机身110连接在一起,使得所述负载130可以相对于所述机身110运动。例如:当所述负载130携带成像装置131时,所述成像装置131可以相对于机身110运动以拍摄所述无人机100周围的图像、视频等。如图所示,当无人机100位于地面时,起落架114可以支撑无人机100以保护所述负载130。值得注意的是,在其他实施例中,所述机身110可以携带两个或两个以上负载。例如,所述机身110可以携带两个云台,每个云台分别连接一个相机。The fuselage 110 may carry a payload 130, for example an imaging device 131. In some embodiments, the imaging device 131 may include a camera, e.g., one that can capture images and video around the drone. The camera may be sensitive to light of various wavelengths, including but not limited to visible light, ultraviolet light, infrared light, or any combination thereof. In some embodiments, the payload 130 may include other kinds of sensors. In some embodiments, the payload 130 is connected to the fuselage 110 via a gimbal 150, so that the payload 130 can move relative to the fuselage 110. For example, when the payload 130 carries the imaging device 131, the imaging device 131 can move relative to the fuselage 110 to capture images and video around the drone 100. As shown, when the drone 100 is on the ground, the landing gear 114 can support the drone 100 to protect the payload 130. Notably, in other embodiments the fuselage 110 may carry two or more payloads; for example, it may carry two gimbals, each connected to a camera.
在一些实施例中,所述无人机100可以包括控制系统140,所述控制系统140包括置于所述无人机100的组件以及与所述无人机100分离的组件。例如,所述控制系统140可以包括一个置于所述无人机100上的第一控制器141,以及一个远离所述无人机100并通过通信链路160(如无线链路)与所述第一控制器141连接的第二控制器142。所述第一控制器141可以包括至少一个处理器、存储器、以及机载计算机可读介质143a,所述机载计算机可读介质143a可以存储用于控制无人机100行为的程序指令,所述行为包括但不限于所述推进系统120及所述成像装置131的操作,控制所述无人机进行自动降落等。所述计算机可读介质143a也可用于存储所述无人机100的状态信息,如高度、速度、位置、预置的参考高度等。所述第二控制器142可以包括至少一个处理器、存储器、机外计算机可读介质143b,以及至少一个输入输出装置148,例如:显示装置144及控制装置145。所述无人机100的操作者可以通过所述控制装置145远程控制所述无人机100,并通过所述显示装置144和/或其他装置接收来自所述无人机100的反馈信息。在其他实施例中,所述无人机100可以自主运作,此时所述第二控制器142可以被省去,或者所述第二控制器142可以仅被用来使无人机操作者重写用于无人机飞行的函数。所述机载计算机可读介质143a可以被移出于所述无人机100。所述机外计算机可读介质143b可以被移出于所述第二控制器142。In some embodiments, the drone 100 may include a control system 140 comprising components placed on the drone 100 and components separate from the drone 100. For example, the control system 140 may include a first controller 141 placed on the drone 100, and a second controller 142 remote from the drone 100 and connected to the first controller 141 via a communication link 160 (e.g., a wireless link). The first controller 141 may include at least one processor, a memory, and an onboard computer-readable medium 143a, which may store program instructions for controlling the behavior of the drone 100, including but not limited to the operation of the propulsion system 120 and the imaging device 131, and controlling the drone to land automatically. The computer-readable medium 143a may also store state information of the drone 100, such as altitude, speed, position, a preset reference altitude, and so on. The second controller 142 may include at least one processor, a memory, an off-board computer-readable medium 143b, and at least one input/output device 148, for example a display device 144 and a control device 145. An operator of the drone 100 can remotely control the drone 100 via the control device 145, and receive feedback from the drone 100 via the display device 144 and/or other devices. In other embodiments, the drone 100 may operate autonomously; in that case the second controller 142 may be omitted, or may be used only to let the operator override functions used for the drone's flight. The onboard computer-readable medium 143a may be removable from the drone 100, and the off-board computer-readable medium 143b may be removable from the second controller 142.
在一些实施例中,所述无人机100可以包括两个前视摄像头171和172,所述前视摄像头171和172光敏于各种波长的光线(如可见光、红外光、紫外线)用于拍摄所述无人机周围的图像或视频。在一些实施例中,所述无人机100包括置于底部的至少一个传感器。In some embodiments, the drone 100 may include two forward-looking cameras 171 and 172, which are sensitive to light of various wavelengths (e.g., visible, infrared, ultraviolet) and are used to capture images or video around the drone. In some embodiments, the drone 100 includes at least one sensor placed at its bottom.
图2是本发明实施例提供的无人机底部的结构示意图。所述无人机100可以包括两个置于所述机身110底部的下视摄像头173和174。此外,所述无人机100还包括两个置于所述机身110底部的超声传感器177和178。所述超声传感器177和178可以检测和/或监测所述无人机100底部的物体及地面,并通过发送及接受超声波来测量离该物体或地面的距离。FIG. 2 is a schematic structural diagram of the bottom of a drone according to an embodiment of the present invention. The drone 100 may include two downward-looking cameras 173 and 174 placed at the bottom of the fuselage 110. In addition, the drone 100 includes two ultrasonic sensors 177 and 178 placed at the bottom of the fuselage 110. The ultrasonic sensors 177 and 178 can detect and/or monitor objects and the ground below the drone 100, and measure the distance to the object or ground by transmitting and receiving ultrasonic waves.
在其他实施例中,所述无人机100可以包括惯性测量单元(英文:inertial measurement unit,缩写:IMU)、GPS(英文:Global Positioning System)模块、红外传感器、微波传感器、温度传感器、近距离传感器(英文:proximity sensor)、三维激光测距仪、3D TOF等。所述三维激光测距仪及所述3D TOF可以检测无人机下方物体或地面的距离。In other embodiments, the drone 100 may include an inertial measurement unit (IMU), a GPS (Global Positioning System) module, an infrared sensor, a microwave sensor, a temperature sensor, a proximity sensor, a three-dimensional laser rangefinder, a 3D TOF sensor, and the like. The three-dimensional laser rangefinder and the 3D TOF sensor can measure the distance to objects or the ground below the drone.
在一些实施例中,所述输入输出装置148为一智能手机。用户可以通过所述手机控制所述无人机飞行。In some embodiments, the input and
在一些实施例中,所述无人机100可以从所述输入输出装置148接收输入信息,如用户在通过所述输入输出装置148向所述无人机100发送一目标。所述无人机100可以根据所述目标,识别出所述目标在地面上的对应位置,所述第一控制器可以控制所述无人机100飞至所述对应位置的上方并悬停。In some embodiments, the drone 100 may receive input information from the input/output device 148, e.g., the user sends a target to the drone 100 via the input/output device 148. The drone 100 can identify the position on the ground corresponding to the target, and the first controller can control the drone 100 to fly above the corresponding position and hover there.
在一些实例中,所述无人机100可以从所述输入输出装置148接收输入信息,如用户在通过所述输入输出装置148向所述无人机100发送一目标。所述无人机100可以根据所述目标,识别出所述目标在地面上的对应位置。所述第一控制器可以控制所述无人机100飞行至一预置的参考高度,并沿所述预置的参考高度飞行。In some examples, the drone 100 may receive input information from the input/output device 148, e.g., the user sends a target to the drone 100 via the input/output device 148. The drone 100 can identify the position on the ground corresponding to the target. The first controller can control the drone 100 to fly to a preset reference altitude and fly along it.
在一些实施例中,所述无人机可以通过计算在第一坐标系统中的第一坐标,以及根据所述第一坐标计算所述无人机在第二坐标系统中的第二坐标,来获取目标位置。所述无人机可以飞行至第二坐标处。In some embodiments, the drone may obtain the target position by calculating its first coordinate in a first coordinate system and calculating its second coordinate in a second coordinate system according to the first coordinate. The drone can then fly to the second coordinate.
参阅图3,图3为本发明实施例提供的无人机计算目标位置的流程图。所述无人机可以根据方法300计算出目标位置,并根据所述目标位置飞行。Referring to FIG. 3, FIG. 3 is a flowchart of a drone calculating a target position according to an embodiment of the present invention. The drone can calculate the target position according to the method 300 and fly according to the target position.
在一些实施例中,执行所述方法300的程序指令可以被存储于所述机载计算机可读介质143a中。在其他实施例中,执行所述方法300的程序指令可以被存储于所述机外计算机可读介质143b中。In some embodiments, program instructions for performing the method 300 may be stored in the onboard computer-readable medium 143a. In other embodiments, program instructions for performing the method 300 may be stored in the off-board computer-readable medium 143b.
步骤301,计算无人机在第一坐标系统中的第一坐标。Step 301: Calculate a first coordinate of the drone in the first coordinate system.
参阅图4,图4为本发明实施例提供的无人机计算目标位置的示意图。在图4中,所述第一坐标系统为以控制装置O为原点的极坐标系。所述第一坐标包括所述无人机100离所述控制装置O的距离r,所述无人机100在所述极坐标系中的天顶角θ,以及所述无人机100在所述极坐标系中的方位角φ。因此,所述第一坐标(即所述无人机100在所述极坐标系中的坐标)可以被表示为(r, θ, φ)。Referring to FIG. 4, FIG. 4 is a schematic diagram of a drone calculating a target position according to an embodiment of the present invention. In FIG. 4, the first coordinate system is a polar (spherical) coordinate system with the control device O as the origin. The first coordinate includes the distance r of the drone 100 from the control device O, the zenith angle θ of the drone 100 in the coordinate system, and the azimuth angle φ of the drone 100 in the coordinate system. Therefore, the first coordinate (i.e., the coordinate of the drone 100 in this coordinate system) can be expressed as (r, θ, φ).
在一些实施例中,所述无人机100可以获取所述无人机100距离所述控制装置O的水平距离以及竖直距离,并通过所述水平距离以及所述竖直距离计算出所述距离r。例如,所述无人机100可以通过机载的一个或多个传感器(如超声传感器、TOF传感器、气压计等),检测出所述竖直距离。再如,所述无人机100可以通过GPS模块检测出所述水平距离。In some embodiments, the drone 100 can obtain the horizontal distance and the vertical distance between the drone 100 and the control device O, and calculate the distance r from them. For example, the drone 100 can detect the vertical distance via one or more onboard sensors (e.g., an ultrasonic sensor, a TOF sensor, a barometer). As another example, the drone 100 can detect the horizontal distance via a GPS module.
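The text does not spell out how r follows from the horizontal and vertical distances; assuming the straightforward Pythagorean relation, a minimal sketch (the function name `slant_range` is illustrative, not from the patent) could be:

```python
import math

def slant_range(horizontal_m: float, vertical_m: float) -> float:
    """Distance r from the control device O to the drone, given the
    horizontal distance (e.g. from the GPS module) and the vertical
    distance (e.g. from an ultrasonic sensor or barometer), in meters."""
    return math.hypot(horizontal_m, vertical_m)
```

For example, with a 3 m horizontal offset and a 4 m altitude, the slant range is 5 m.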
在一些实施例中,在用户通过所述控制装置O控制所述无人机100起飞后,所述无人机100可以从所述控制装置O接收位置信息,以计算所述天顶角θ以及所述方位角φ。In some embodiments, after the user controls the drone 100 to take off via the control device O, the drone 100 can receive position information from the control device O to calculate the zenith angle θ and the azimuth angle φ.
在一些实施例中,所述控制装置O包括一显示装置以及一成像装置(如摄像头等),所述用户可以通过所述成像装置拍摄所述无人机,以生成可以显示在所述显示装置上的图像。参阅图4,所述控制装置O可以在所述成像装置拍摄的图像101上识别出所述无人机的位置信息,如位置A,并通过计算所述位置A距离所述图像中心O’的像素差来确定所述位置A的坐标。In some embodiments, the control device O includes a display device and an imaging device (e.g., a camera). The user can photograph the drone with the imaging device to generate an image that can be displayed on the display device. Referring to FIG. 4, the control device O can identify position information of the drone, such as position A, in the image 101 captured by the imaging device, and determine the coordinates of position A by calculating the pixel offset of position A from the image center O'.
在一些实施例中,所述位置A的坐标可以被表示为(Δu,Δv),Δu为所述位置A与所述图像中心O’的水平像素差,Δv为所述位置A与所述图像中心O’的竖直像素差。例如,所述位置A的坐标可以为(200px,300px)。In some embodiments, the coordinates of position A may be expressed as (Δu, Δv), where Δu is the horizontal pixel offset between position A and the image center O', and Δv is the vertical pixel offset between position A and the image center O'. For example, the coordinates of position A may be (200 px, 300 px).
在一些实施例中,所述无人机100可以获取所述成像装置的视野(Fov),所述成像装置的分辨率(画幅宽:w,画幅高:h),并根据所述位置A的坐标、所述视野Fov、以及所述分辨率计算所述天顶角θ以及所述方位角φ。In some embodiments, the drone 100 can obtain the field of view (Fov) of the imaging device and the resolution of the imaging device (frame width w, frame height h), and calculate the zenith angle θ and the azimuth angle φ according to the coordinates of position A, the field of view Fov, and the resolution.
在一些实施例中,所述视野Fov包括水平视野hFov以及竖直视野vFov。所述水平视野hFov以及所述竖直视野vFov可以由以下公式计算:其中,所述视野Fov为所述成像装置的感光元件对角线的视野。所述视野Fov可以由厂商给出,在需要时可以直接调用。例如,所述控制装置O可以将所述视野Fov发送给所述无人机100。所述无人机100可以根据所述视野Fov计算所述水平视野hFov以及所述竖直视野vFov。可选地,所述控制装置O也可在本地根据所述视野Fov计算所述水平视野hFov以及所述竖直视野vFov。In some embodiments, the field of view Fov includes a horizontal field of view hFov and a vertical field of view vFov, which can be calculated by the following formula, where Fov is the diagonal field of view of the imaging device's photosensitive element. The field of view Fov can be given by the manufacturer and retrieved directly when needed. For example, the control device O can send the field of view Fov to the drone 100, and the drone 100 can calculate the horizontal field of view hFov and the vertical field of view vFov from it. Alternatively, the control device O can also calculate hFov and vFov locally from the field of view Fov.
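The formula referenced above appears to have been an image that did not survive extraction. A common way to split a diagonal field of view into horizontal and vertical components for a rectilinear lens is sketched below; this is an assumption, not necessarily the patent's own formula:

```python
import math

def split_fov(diag_fov_deg: float, w: int, h: int):
    """Derive horizontal and vertical FOV (degrees) from the diagonal
    FOV of the sensor, assuming a rectilinear projection; w and h are
    the frame width and height in pixels (their ratio fixes the aspect
    ratio of the sensor)."""
    d = math.hypot(w, h)                          # diagonal in pixels
    t = math.tan(math.radians(diag_fov_deg) / 2)  # half-diagonal tangent
    h_fov = 2.0 * math.degrees(math.atan(t * w / d))
    v_fov = 2.0 * math.degrees(math.atan(t * h / d))
    return h_fov, v_fov
```

For a 16:9 sensor (1920 x 1080) the horizontal FOV comes out larger than the vertical one, and both are smaller than the diagonal FOV, as expected.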
在一些实施例中,所述分辨率包括画幅宽w以及画幅高h。例如,所述画幅宽w与所述画幅高h的比可以为16:9或者4:3。所述画幅宽w可以为1920px,所述画幅高h可以为1080px。所述控制装置O可以从本地获取所述画幅宽w以及所述画幅高h。可选地,所述控制装置O也可以将所述画幅宽w以及所述画幅高h发送给所述无人机100。In some embodiments, the resolution includes a frame width w and a frame height h. For example, the ratio of the frame width w to the frame height h may be 16:9 or 4:3; the frame width w may be 1920 px and the frame height h may be 1080 px. The control device O can obtain the frame width w and the frame height h locally. Optionally, the control device O can also send the frame width w and the frame height h to the drone 100.
在一些实施例中,所述无人机100或者所述控制装置O可以根据以下公式计算所述天顶角θ以及所述方位角φ。In some embodiments, the drone 100 or the control device O can calculate the zenith angle θ and the azimuth angle φ according to the following formula.
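The zenith/azimuth formula referenced above is likewise missing from the extracted text. One plausible sketch maps the pixel offsets (Δu, Δv) linearly onto the camera's field of view and adds the imaging device's own attitude; the linear per-pixel mapping is a small-angle assumption, and the function and parameter names are illustrative rather than the patent's:

```python
def pixel_offset_to_angles(du_px, dv_px, h_fov_deg, v_fov_deg,
                           w_px, h_px, cam_pitch_deg, cam_yaw_deg):
    """Map the drone's pixel offset (du_px, dv_px) from the image
    centre to zenith and azimuth angles (degrees).  The per-pixel
    angular resolution is approximated as FOV / resolution, which is
    a linear approximation valid near the optical axis.
    cam_pitch_deg / cam_yaw_deg are the imaging device's own attitude
    (pitch above the horizon, yaw from north)."""
    d_yaw = du_px * h_fov_deg / w_px      # horizontal angular offset
    d_pitch = dv_px * v_fov_deg / h_px    # vertical angular offset
    zenith = 90.0 - (cam_pitch_deg + d_pitch)  # zenith from the vertical
    azimuth = cam_yaw_deg + d_yaw
    return zenith, azimuth
```

With the drone at the image centre and the camera pitched 30 degrees up, the sketch yields a zenith angle of 60 degrees, matching the complement relation between elevation and zenith.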
在一些实施例中,所述控制装置O可以为一手持云台。用户可以通过所述手持云台的成像装置(如自带的摄像头、手机或相机等)拍摄所述无人机,生成可以在所述成像装置上显示的图像。所述手持云台可以通过内置的姿态传感器(如惯性测量单元等)获取所述天顶角θ以及所述方位角φ。可选地,所述手持云台可以将所述天顶角θ以及所述方位角φ发送给所述无人机100。In some embodiments, the control device O can be a handheld gimbal. The user can photograph the drone with the imaging device of the handheld gimbal (such as a built-in camera, a mobile phone, or a camera) to generate an image that can be displayed on the imaging device. The handheld gimbal can obtain the zenith angle θ and the azimuth angle φ through a built-in attitude sensor (such as an inertial measurement unit). Optionally, the handheld gimbal can send the zenith angle θ and the azimuth angle φ to the drone 100.
步骤302,根据所述第一坐标计算所述无人机在第二坐标系统的第二坐标。Step 302: Calculate a second coordinate of the UAV in the second coordinate system according to the first coordinate.
在一些实施例中,在计算出所述第一坐标(r, θ, φ)之后,所述无人机100或者所述控制装置O可以根据所述第一坐标计算所述无人机在第二坐标系统中的第二坐标。In some embodiments, after the first coordinate (r, θ, φ) is calculated, the drone 100 or the control device O can calculate the second coordinate of the drone in the second coordinate system according to the first coordinate.
参阅图4,所述第二坐标系统为导航坐标系。具体地,x轴指向北,y轴指向东,z轴指向地心。可选地,所述第二坐标系统可以为机身坐标系。例如,x轴指向机身方向的前方,y轴指向机身方向的右方,z轴指向机身方向的下方。所述机身方向为所述无人机机头的朝向。Referring to FIG. 4, the second coordinate system is a navigation coordinate system. Specifically, the x-axis points to the north, the y-axis points to the east, and the z-axis points to the center of the earth. Optionally, the second coordinate system may be a fuselage coordinate system. For example, the x-axis points to the front of the fuselage direction, the y-axis points to the right of the fuselage direction, and the z-axis points to the lower side of the fuselage direction. The direction of the fuselage is the orientation of the nose of the drone.
在一些实施例中,所述无人机100或者所述控制装置O可以根据以下公式计算所述第二坐标(x,y,z)。In some embodiments, the
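The conversion formula referenced above is also missing from the extracted text. The standard spherical-to-Cartesian conversion into a north-east-down navigation frame would look like the following; the axis and sign conventions (x north, y east, z toward the earth's center, zenith angle measured from the upward vertical) are assumptions consistent with the description of the navigation coordinate system, not the patent's reproduced formula:

```python
import math

def spherical_to_ned(r, zenith_deg, azimuth_deg):
    """Convert the first coordinate (r, theta, phi) into an assumed
    north-east-down frame centred on the control device O.  The zenith
    angle is taken from the upward vertical and the azimuth from north,
    so a drone above the operator gets a negative z (NED z points down)."""
    th = math.radians(zenith_deg)
    az = math.radians(azimuth_deg)
    x = r * math.sin(th) * math.cos(az)   # north component
    y = r * math.sin(th) * math.sin(az)   # east component
    z = -r * math.cos(th)                 # down component (negative: above O)
    return x, y, z
```

A drone 10 m due north at the operator's height (zenith 90 degrees) maps to (10, 0, 0); one hovering 10 m directly overhead (zenith 0) maps to (0, 0, -10).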
步骤303,根据所述第二坐标控制所述无人机飞行。Step 303: Control the drone to fly according to the second coordinate.
在一些实施例中,在计算出所述第二坐标(x,y,z)之后,所述无人机100可以飞至所述第二坐标(x,y,z)处。In some embodiments, after calculating the second coordinate (x, y, z), the
在一些实施例中,通过所述方法300,用户可以通过控制装置(如手机等),单手精确地控制飞机的位置,来完成构图、选景等操作,如同自拍杆一样简单易用。使一般用户也能快速上手使用,从而更加专注于拍照本身。In some embodiments, by the method 300, the user can precisely control the position of the aircraft with one hand through a control device (such as a mobile phone) to complete operations such as composition and scene selection, as simple and easy to use as a selfie stick, so that ordinary users can quickly get started and concentrate more on the photograph itself.
值得注意的是,上述对所述方法300的描述仅为了便于理解本发明。对本领域的普通技术人员来说,可以在理解本发明的基础上对本发明做出一些修改与变换,但所述修改与变换仍在本发明的保护范围之内。例如,所述方法300可以由所述无人机100执行,也可以由所述控制装置O执行,或者由所述无人机100与所述控制装置O共同执行。再如,所述控制装置O还可以包括平板电脑、智能手表、手环、VR眼镜、AR眼镜等。It should be noted that the above description of the method 300 is only for ease of understanding the present invention. Those of ordinary skill in the art may make modifications and variations to the present invention on the basis of understanding it, and such modifications and variations remain within the scope of protection of the present invention. For example, the method 300 may be performed by the drone 100, by the control device O, or jointly by the drone 100 and the control device O. As another example, the control device O may also be a tablet computer, a smart watch, a wristband, VR glasses, AR glasses, or the like.
参阅图5,图5为本发明实施例提供的用户控制无人机的流程图。Referring to FIG. 5, FIG. 5 is a flowchart of a user controlling a drone according to an embodiment of the present invention.
步骤501,调整无人机与控制装置之间的距离。In
在一些实施例中,用户可以通过控制装置(如手机和/或遥控器)控制无人机起飞,并调整所述控制装置与所述无人机之间的距离,即所述距离r(见图4)。例如,所述用户可以通过滑动所述控制装置的屏幕(如手机屏幕)、点击所述控制装置屏幕的指定位置(如手机屏幕的指定位置)、或多点触控(如3D Touch等),来调整所述距离r。In some embodiments, the user can control the drone to take off through a control device (such as a mobile phone and/or a remote controller) and adjust the distance between the control device and the drone, i.e., the distance r (see FIG. 4). For example, the user can adjust the distance r by sliding the screen of the control device (e.g., a mobile phone screen), tapping a specified position on the screen, or using multi-touch (e.g., 3D Touch).
步骤502,通过无人机锁定用户。In
在一些实施例中,所述用户可以通过所述控制装置的屏幕(如手机屏幕),锁定所述用户。例如,所述用户可以将所述屏幕上的画面切换为所述无人机的拍摄预览画面。之后,所述用户可以粗略调整所述无人机的朝向,使所述用户出现在所述拍摄预览画面中。可选地,所述用户可以通过在所述拍摄预览画面中框选所述用户本人,来锁定所述用户。具体地,所述用户可以启用所述无人机的智能跟随功能,通过在所述拍摄预览画面中框选所述用户,来持续跟随所述用户。启用智能跟随功能之后,所述无人机会根据用户习惯,更加精细调整所述无人机的朝向,以及云台位置,使所述用户出现在所述拍摄预览画面中指定的位置,从而完成自动构图。可选地,所述指定的位置可以为所述拍摄预览画面的中心,也可以为任何用户在所述拍摄预览画面上指定的位置。In some embodiments, the user can lock onto the user through the screen of the control device (e.g., a mobile phone screen). For example, the user can switch the picture on the screen to the shooting preview of the drone, then roughly adjust the orientation of the drone so that the user appears in the shooting preview. Optionally, the user can lock onto himself or herself by drawing a selection box around himself or herself in the shooting preview. Specifically, the user can enable the smart-follow function of the drone, which continuously follows the user once framed in the shooting preview. After the smart-follow function is enabled, the drone will, according to the user's habits, more finely adjust its orientation and the gimbal position so that the user appears at a specified position in the shooting preview, thereby completing automatic composition. Optionally, the specified position may be the center of the shooting preview, or any position specified by the user on the shooting preview.
步骤503,通过控制装置锁定无人机。In
在一些实施例中,所述用户可以将所述控制装置的屏幕上的画面切换为所述控制装置的成像装置的拍摄预览画面。然后通过移动所述控制装置,使所述无人机出现在所述拍摄预览画面中。可选地,所述用户可以在所述拍摄预览画面中框选所述无人机,以锁定所述无人机。具体地,所述用户可以在所述控制装置上启用智能跟随功能,通过在所述拍摄预览画面中框选所述无人机,来持续跟随所述无人机。In some embodiments, the user can switch the picture on the screen of the control device to the shooting preview of the control device's imaging device, and then move the control device so that the drone appears in the shooting preview. Optionally, the user can draw a selection box around the drone in the shooting preview to lock onto it. Specifically, the user can enable the smart-follow function on the control device and continuously follow the drone by framing it in the shooting preview.
通过步骤502和步骤503,完成了所述无人机锁定所述用户,所述控制装置锁定所述无人机。值得注意的是,以上的描述仅为了便于理解本发明,不应被视为是本发明唯一的实施方式。在其他实施例中,步骤503可以先于步骤502执行,或者步骤502和步骤503可以同时执行。Through steps 502 and 503, the drone has locked onto the user and the control device has locked onto the drone. It should be noted that the above description is only for ease of understanding the present invention and should not be regarded as its only embodiment. In other embodiments, step 503 may be performed before step 502, or steps 502 and 503 may be performed simultaneously.
步骤504,移动控制装置,以控制无人机的位置。
在一些实施例中,所述用户可以移动所述控制装置,以控制所述无人机的位置。可选地,当所述用户移动所述控制装置,使所述无人机在所述控制装置的屏幕中的位置发生改变时,所述无人机可以根据所述方法300,实时地计算出目标位置(如所述第二坐标),并根据所述目标位置飞行(如飞至所述目标位置处)。可选地,所述用户可以原地转动所述控制装置,或者上下移动所述控制装置,则可以方便地完成构图、取景等操作,使用户可以更加专注于拍照。In some embodiments, the user can move the control device to control the position of the drone. Optionally, when the user moves the control device so that the position of the drone on the screen of the control device changes, the drone can, according to the method 300, calculate the target position (e.g., the second coordinate) in real time and fly according to the target position (e.g., fly to the target position). Optionally, the user can rotate the control device in place, or move it up and down, and thus conveniently complete operations such as composition and framing, so that the user can concentrate more on photographing.
在一些实施例中,所述控制装置为一手持云台。所述用户可以移动所述手持云台,以控制所述无人机的位置。例如,所述用户可以直接手动转动手持云台,或者通过操作所述手持云台的摇杆转动所述手持云台,使所述无人机出现在所述手持云台的拍摄预览画面中。根据所述方法300,所述手持云台可以通过内置的姿态传感器获取所述天顶角θ及所述方位角φ。In some embodiments, the control device is a handheld gimbal. The user can move the handheld gimbal to control the position of the drone. For example, the user can rotate the handheld gimbal directly by hand, or rotate it by operating its joystick, so that the drone appears in the shooting preview of the handheld gimbal. According to the method 300, the handheld gimbal can obtain the zenith angle θ and the azimuth angle φ through its built-in attitude sensor.
值得注意的是,上述对所述方法500的描述仅为了便于理解本发明,不应被视为是本发明唯一的实施例。对本领域的普通技术人员来说,可以在理解本发明的基础上对本发明做出一些修改与变换,但所述修改与变换仍在本发明的保护范围之内。例如,所述控制装置还可以包括平板电脑、智能手表、手环、VR眼镜、AR眼镜等。It should be noted that the above description of the method 500 is only for ease of understanding the present invention and should not be regarded as its only embodiment. Those of ordinary skill in the art may make modifications and variations to the present invention on the basis of understanding it, and such modifications and variations remain within the scope of protection of the present invention. For example, the control device may also be a tablet computer, a smart watch, a wristband, VR glasses, AR glasses, or the like.
参阅图6,图6为本发明实施例提供的无人机控制的示意图。用户可以单手通过控制装置200(如手机等)控制无人机100的飞行。Referring to FIG. 6, FIG. 6 is a schematic diagram of drone control according to an embodiment of the present invention. The user can control the flight of the drone 100 with one hand through the control device 200 (such as a mobile phone).
在一些实施例中,所述用户可以利用所述控制装置200通过所述方法300控制所述无人机100的飞行,以完成构图、取景等操作,便于用户拍照。参阅图3及图6,所述第一坐标系统可以为以所述控制装置200为原点的圆柱坐标系。所述第一坐标可以被表示为(ρ, φ, z)。其中,ρ表示所述无人机100与所述控制装置200的水平距离,φ表示所述无人机100在所述圆柱坐标系中的方位角,z表示所述无人机100的高度。具体计算所述第一坐标(ρ, φ, z)可以参考图3中的描述,在此不赘述。In some embodiments, the user can use the control device 200 to control the flight of the drone 100 by the method 300 to complete operations such as composition and framing, making it easier for the user to take pictures. Referring to FIG. 3 and FIG. 6, the first coordinate system may be a cylindrical coordinate system with the control device 200 as the origin. The first coordinate can be expressed as (ρ, φ, z), where ρ is the horizontal distance between the drone 100 and the control device 200, φ is the azimuth angle of the drone 100 in the cylindrical coordinate system, and z is the altitude of the drone 100. For the calculation of the first coordinate (ρ, φ, z), reference may be made to the description of FIG. 3, which is not repeated here.
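For the cylindrical variant of Fig. 6, the corresponding conversion into the navigation frame is equally short. The sign convention for z (NED z pointing toward the earth, so altitude above the operator becomes negative z) is an assumption, as is the function name:

```python
import math

def cylindrical_to_ned(rho, azimuth_deg, height):
    """Convert the cylindrical first coordinate (rho, phi, z) into an
    assumed north-east-down frame centred on the control device 200:
    rho is the horizontal distance, phi the azimuth from north, and
    height the drone's altitude above the operator."""
    az = math.radians(azimuth_deg)
    return rho * math.cos(az), rho * math.sin(az), -height
```

A drone 5 m due east at 2 m altitude (ρ = 5, φ = 90, z = 2) maps to approximately (0, 5, -2) in this convention.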
图7为本发明实施例提供的无人机模块示意图。参阅图7,无人机100可以包括至少一个处理器701、传感器702、存储器703以及输入输出模块704。FIG. 7 is a schematic diagram of the modules of a drone according to an embodiment of the present invention. Referring to FIG. 7, the drone 100 may include at least one processor 701, a sensor 702, a memory 703, and an input/output module 704.
所述处理器701可以包括至少一个处理器,所述处理器包括但不限于微处理器(英文:microcontroller)、精简指令集计算机(英文:reduced instruction set computer,简称:RISC)、专用集成电路(英文:application specific integrated circuits,简称:ASIC)、专用指令集处理器(英文:application-specific instruction-set processor,简称:ASIP)、中央处理单元(英文:central processing unit,简称:CPU)、物理处理器(英文:physics processing unit,简称:PPU)、数字信号处理器(英文:digital signal processor,简称:DSP)、现场可编程门阵列(英文:field programmable gate array,简称:FPGA)等。The processor 701 may include at least one processor, including but not limited to a microcontroller, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), and the like.
在一些实施例中,所述处理器701可以用于执行所述方法300。即所述处理器701可以用于计算所述无人机100在第一坐标系统中的第一坐标,根据所述第一坐标计算所述无人机100在第二坐标系统中的第二坐标,以及根据所述第二坐标控制所述无人机飞行。In some embodiments, the processor 701 can be configured to perform the
在一些实施例中,所述第一坐标系为以所述控制装置为原点的球坐标系,所述第二坐标系为导航坐标系。In some embodiments, the first coordinate system is a spherical coordinate system with the control device as an origin, and the second coordinate system is a navigation coordinate system.
在一些实施例中,所述第一坐标包括所述无人机与所述控制装置的距离,所述无人机在所述第一坐标系统中的天顶角、以及所述无人机在所述第一坐标系统中的方位角。In some embodiments, the first coordinate comprises the distance of the drone from the control device, the zenith angle of the drone in the first coordinate system, and the azimuth angle of the drone in the first coordinate system.
在一些实施例中,所述处理器701可以用于计算所述无人机100与所述控制装置O的距离,以及计算所述天顶角以及所述方位角。In some embodiments, the processor 701 can be configured to calculate a distance of the
在一些实施例中,所述处理器701可以用于,获取所述无人机100在由所述成像装置拍摄的图像中的坐标,获取所述成像装置的视野,获取所述成像装置的分辨率,以及根据所述坐标、所述视野、以及所述分辨率计算所述天顶角以及所述方位角。In some embodiments, the processor 701 can be configured to acquire the coordinates of the drone 100 in an image captured by the imaging device, acquire the field of view of the imaging device, acquire the resolution of the imaging device, and calculate the zenith angle and the azimuth angle according to the coordinates, the field of view, and the resolution.
在一些实施例中,所述视野包括水平视野以及竖直视野,所述分辨率包括水平分辨率以及竖直分辨率。In some embodiments, the field of view includes a horizontal field of view and a vertical field of view, the resolution including horizontal resolution and vertical resolution.
所述传感器702可以包括至少一个传感器,所述传感器包括但不限于温度传感器、惯性测量单元、加速度计、图像传感器(如摄像头)、超声传感器、TOF传感器、微波传感器、近距离传感器、三维激光测距仪、红外传感器等。The sensor 702 may include at least one sensor, including but not limited to a temperature sensor, an inertial measurement unit, an accelerometer, an image sensor (e.g., a camera), an ultrasonic sensor, a TOF sensor, a microwave sensor, a proximity sensor, a three-dimensional laser rangefinder, an infrared sensor, and the like.
在一些实施例中,所述惯性测量单元可以用于测量所述无人机的姿态信息(如俯仰角、横滚角、偏航角等)。所述惯性测量单元可以包括但不限于,至少一个加速度计、陀螺仪、磁力仪或其中的任意组合。所述加速度计可以用于测量所述无人机的加速度,以计算所述无人机的速度。In some embodiments, the inertial measurement unit can be used to measure attitude information (eg, pitch angle, roll angle, yaw angle, etc.) of the drone. The inertial measurement unit may include, but is not limited to, at least one accelerometer, gyroscope, magnetometer, or any combination thereof. The accelerometer can be used to measure the acceleration of the drone to calculate the speed of the drone.
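As a toy illustration of how accelerometer measurements can be used to estimate speed, a simple rectangular integration over time might look like the following; a real flight controller would fuse the IMU with GPS and compensate for gravity and sensor bias, so this is a sketch, not the patent's method:

```python
def integrate_speed(accels_mps2, dt_s, v0=0.0):
    """Estimate speed (m/s) by integrating accelerometer samples
    taken at a fixed interval dt_s, starting from initial speed v0.
    Uses plain rectangular integration for clarity."""
    v = v0
    for a in accels_mps2:
        v += a * dt_s
    return v
```

Ten samples of 1 m/s^2 at 0.1 s spacing accumulate to roughly 1 m/s of speed gain.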
所述存储器703可以包括但不限于只读存储器(ROM)、随机存储器(RAM)、可编程只读存储器(PROM)、电子抹除式可编程只读存储器(EEPROM)等。所述存储器703可以包括非暂时性计算机可读介质,其可以存储用于执行本文其他各处所描述的至少一个步骤的代码、逻辑或指令。所述处理器701可以根据所述非暂时性计算机可读介质的代码、逻辑或指令而单独地或共同地执行至少一个步骤。所述存储器703可用于存储所述无人机100的状态信息,如高度、速度、位置、预置的参考高度等。The memory 703 may include, but is not limited to, a read-only memory (ROM), a random access memory (RAM), a programmable read-only memory (PROM), an electrically erasable programmable read-only memory (EEPROM), and the like. The memory 703 may include a non-transitory computer-readable medium that can store code, logic, or instructions for performing at least one step described elsewhere herein. The processor 701 may, individually or collectively, perform at least one step according to the code, logic, or instructions of the non-transitory computer-readable medium. The memory 703 can be used to store state information of the drone 100, such as altitude, speed, position, a preset reference altitude, and the like.
在一些实施例中,所述存储器703可以存储用于执行所述方法300的程序指令。In some embodiments, the memory 703 can store program instructions for performing the method 300.
所述输入输出模块704用于与外部设备交换信息或指令,如接收所述输入输出装置148(见图1)发送的指令,或将所述成像装置131(见图1)拍摄的图像发送给所述输入输出装置148。The input/output module 704 is configured to exchange information or instructions with external devices, for example receiving instructions sent by the input/output device 148 (see FIG. 1), or sending images captured by the imaging device 131 (see FIG. 1) to the input/output device 148.
以上所述仅为本发明的实施例,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。The above are only embodiments of the present invention and do not thereby limit its patent scope. Any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.
本专利文件披露的内容包含受版权保护的材料。该版权为版权所有人所有。版权所有人不反对任何人复制专利与商标局的官方记录和档案中所存在的该专利文件或者该专利披露。The disclosure of this patent document contains material that is subject to copyright protection. This copyright is the property of the copyright holder. The copyright owner has no objection to the reproduction of the patent document or the patent disclosure in the official records and files of the Patent and Trademark Office.
最后应说明的是:以上各实施例仅用以说明本披露的技术方案,而非对其限制;尽管参照前述各实施例对本披露进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本披露各实施例技术方案的范围。Finally, it should be noted that the above embodiments are merely illustrative of the technical solutions of the present disclosure and are not limiting. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be equivalently replaced; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the scope of the technical solutions of the embodiments of the present disclosure.
Claims (12)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/080647 WO2018188086A1 (en) | 2017-04-14 | 2017-04-14 | Unmanned aerial vehicle and control method therefor |
| CN201780083166.6A CN110177997A (en) | 2017-04-14 | 2017-04-14 | Unmanned plane and its control method |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/080647 WO2018188086A1 (en) | 2017-04-14 | 2017-04-14 | Unmanned aerial vehicle and control method therefor |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018188086A1 true WO2018188086A1 (en) | 2018-10-18 |
Family
ID=63792216
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/080647 Ceased WO2018188086A1 (en) | 2017-04-14 | 2017-04-14 | Unmanned aerial vehicle and control method therefor |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN110177997A (en) |
| WO (1) | WO2018188086A1 (en) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1901153A1 (en) * | 2006-09-12 | 2008-03-19 | OFFIS e.V. | Control system for unmanned 4-rotor-helicopter |
| JP2009149157A (en) * | 2007-12-19 | 2009-07-09 | Toshiba Corp | Rotary wing aircraft mounting apparatus, operation adjustment method, and computer program |
| CN103604427A (en) * | 2013-12-10 | 2014-02-26 | 中国航天空气动力技术研究院 | Unmanned aerial vehicle system and method for dynamically positioning ground moving target |
| CN104729497A (en) * | 2015-01-16 | 2015-06-24 | 上海大学 | Ultra-small dual-duct unmanned plane combined navigation system and dual-mode navigation method |
| CN104760695A (en) * | 2015-03-23 | 2015-07-08 | 松翰科技(深圳)有限公司 | Method for controlling quadrotor aircraft by vector rotation method |
| CN105865439A (en) * | 2016-02-24 | 2016-08-17 | 深圳高科新农技术有限公司 | Unmanned aerial vehicle navigation method |
| US20160266577A1 (en) * | 2015-03-12 | 2016-09-15 | Alarm.Com Incorporated | Robotic assistance in security monitoring |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6279097B2 (en) * | 2015-05-18 | 2018-02-14 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | Control method and device for drone based on headless mode |
| CN104898699B (en) * | 2015-05-28 | 2020-03-17 | 小米科技有限责任公司 | Flight control method and device and electronic equipment |
| CN104913776B (en) * | 2015-06-19 | 2018-06-01 | 广州快飞计算机科技有限公司 | Unmanned plane localization method and device |
| CN105045281A (en) * | 2015-08-13 | 2015-11-11 | 深圳一电科技有限公司 | Unmanned aerial vehicle flight control method and device |
-
2017
- 2017-04-14 WO PCT/CN2017/080647 patent/WO2018188086A1/en not_active Ceased
- 2017-04-14 CN CN201780083166.6A patent/CN110177997A/en active Pending
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1901153A1 (en) * | 2006-09-12 | 2008-03-19 | OFFIS e.V. | Control system for unmanned 4-rotor-helicopter |
| JP2009149157A (en) * | 2007-12-19 | 2009-07-09 | Toshiba Corp | Rotary wing aircraft mounting apparatus, operation adjustment method, and computer program |
| CN103604427A (en) * | 2013-12-10 | 2014-02-26 | 中国航天空气动力技术研究院 | Unmanned aerial vehicle system and method for dynamically positioning ground moving target |
| CN104729497A (en) * | 2015-01-16 | 2015-06-24 | 上海大学 | Ultra-small dual-duct unmanned plane combined navigation system and dual-mode navigation method |
| US20160266577A1 (en) * | 2015-03-12 | 2016-09-15 | Alarm.Com Incorporated | Robotic assistance in security monitoring |
| CN104760695A (en) * | 2015-03-23 | 2015-07-08 | 松翰科技(深圳)有限公司 | Method for controlling quadrotor aircraft by vector rotation method |
| CN105865439A (en) * | 2016-02-24 | 2016-08-17 | 深圳高科新农技术有限公司 | Unmanned aerial vehicle navigation method |
Also Published As
| Publication number | Publication date |
|---|---|
| CN110177997A (en) | 2019-08-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11649052B2 (en) | System and method for providing autonomous photography and videography | |
| US11377211B2 (en) | Flight path generation method, flight path generation system, flight vehicle, program, and storage medium | |
| US11263761B2 (en) | Systems and methods for visual target tracking | |
| CN107108023B (en) | Unmanned aerial vehicle and its control method | |
| CN205263655U (en) | A system, Unmanned vehicles and ground satellite station for automatic generation panoramic photograph | |
| JP6765512B2 (en) | Flight path generation method, information processing device, flight path generation system, program and recording medium | |
| WO2018098704A1 (en) | Control method, apparatus, and system, unmanned aerial vehicle, and mobile platform | |
| US20180275659A1 (en) | Route generation apparatus, route control system and route generation method | |
| US20210185235A1 (en) | Information processing device, imaging control method, program and recording medium | |
| CN105391988A (en) | Multi-view unmanned aerial vehicle and multi-view display method thereof | |
| US12148205B2 (en) | Contour scanning with an unmanned aerial vehicle | |
| US20220390940A1 (en) | Interfaces And Control Of Aerial Vehicle For Automated Multidimensional Volume Scanning | |
| US20200221056A1 (en) | Systems and methods for processing and displaying image data based on attitude information | |
| WO2020048365A1 (en) | Flight control method and device for aircraft, and terminal device and flight control system | |
| JPWO2018073878A1 (en) | Three-dimensional shape estimation method, three-dimensional shape estimation system, flying object, program, and recording medium | |
| JP6329219B2 (en) | Operation terminal and moving body | |
| WO2022188151A1 (en) | Image photographing method, control apparatus, movable platform, and computer storage medium | |
| WO2018188086A1 (en) | Unmanned aerial vehicle and control method therefor | |
| US12481074B2 (en) | Motion-based calibration of an aerial device | |
| JP7707439B2 (en) | Contour scanning using unmanned aerial vehicles | |
| US20240348751A1 (en) | Autonomous monitoring by unmanned aerial vehicle systems and methods |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17905348 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17905348 Country of ref document: EP Kind code of ref document: A1 |