WO2018103689A1 - Method and apparatus for relative azimuth control of an unmanned aerial vehicle - Google Patents
Method and apparatus for relative azimuth control of an unmanned aerial vehicle
- Publication number
- WO2018103689A1 (PCT/CN2017/114974)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- drone
- relative orientation
- information
- gesture
- height difference
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
Definitions
- The present invention relates to the field of aviation science and technology, and more particularly to a method and apparatus for controlling the relative orientation of a drone.
- A drone, also referred to as an unmanned aerial vehicle (UAV), is an unmanned aircraft operated by remote control and by an onboard program control device.
- In order to maintain the balance of the airframe and complete tasks, more and more sensors are being installed on unmanned aircraft.
- With advances in microelectronics technology, integrating multiple high-precision sensors on small unmanned aerial vehicles has become a reality.
- The functions that UAVs can achieve are also increasing, and UAVs have been widely used in aerial reconnaissance, surveillance, communication, anti-submarine operations, and electronic jamming. In some usage scenarios, the drone must be controlled to maintain a certain relative orientation with the user, for example to complete operations such as aerial photography.
- The flight control of a drone is greatly affected by the environment. If the user performs relative azimuth control completely manually, not only is a high level of flight control skill required, but the user must also concentrate on the relative position control of the drone, making it difficult to complete other operations at the same time. Therefore, in general, one of the following two methods is used to control the relative orientation of a drone:
- Method 1: sensors on the drone and on the remote control device detect the orientation information of each device, and relative orientation control is performed according to that orientation information. Method 2: the drone recognizes the user in captured images and performs relative orientation control based on the image recognition result.
- Method 1 relies on the positioning sensor of the remote control device, has low precision, and works against keeping the remote control device light, degrading the human-computer interaction experience during relative orientation control. Method 2 uses image algorithms of high complexity that not only occupy a large share of the drone's computing resources but also limit the application scenarios of relative position control, and its effect is not satisfactory.
- The object of the present invention is to provide a method and device for controlling the relative orientation of a drone, and a corresponding relative orientation assist control method and device for a wearable device, which together improve the efficiency of the relative orientation control of the drone.
- The present invention also provides a drone control device and a wearable device control device that are compatible with the aforementioned control methods.
- In a first aspect, an embodiment of the present invention provides a method for controlling the relative orientation of a drone, comprising the steps of: acquiring an infrared image formed after infrared light emitted by a wearable device acts on a gesture area; determining, according to the infrared image, the gesture area whose contour is described by the infrared light and a gesture instruction type characterizing a relative orientation preset value; detecting first relative orientation information between the drone and the gesture area; and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
- In a second aspect, an embodiment of the present invention provides a drone relative orientation assist control method for a wearable device, comprising the steps of: receiving, based on a trusted connection, a driving instruction of the drone for driving the wearable device to emit infrared light; in response to the driving instruction, driving an infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines the gesture area and a gesture instruction type characterizing a relative orientation preset value based on infrared imaging, to apply them to the relative orientation control of the drone; receiving an alarm instruction of the drone based on the trusted connection; and, in response to the alarm instruction, controlling the wearable device to activate a vibration motor and/or turn on an indicator light to prompt the user that the drone is currently in an azimuth deviation state.
- In a third aspect, an embodiment of the present invention provides a drone relative orientation control apparatus, comprising: at least one processor; and at least one memory communicably coupled to the at least one processor.
- The memory includes processor-executable instructions that, when executed by the at least one processor, cause the apparatus to perform at least the following operations: acquiring an infrared image formed after infrared light emitted by the wearable device acts on the gesture area; determining, according to the infrared image, the gesture area whose contour is described by the infrared light and a gesture instruction type characterizing a relative orientation preset value; detecting first relative orientation information between the drone and the gesture area; and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
- In a fourth aspect, an embodiment of the present invention provides a drone relative orientation assist control apparatus for a wearable device, comprising: at least one processor; and at least one memory communicably coupled to the at least one processor.
- The at least one memory includes processor-executable instructions that, when executed by the at least one processor, cause the apparatus to perform at least the following operations: receiving, based on a trusted connection, a driving instruction for driving the wearable device to emit infrared light; in response to the driving instruction, driving the infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines the gesture area and a gesture instruction type characterizing a relative orientation preset value based on infrared imaging, to apply them to the relative orientation control of the drone; receiving an alarm instruction of the drone based on the trusted connection; and, in response to the alarm instruction, controlling the wearable device to activate the vibration motor and/or turn on the indicator light to prompt the user that the drone is currently in an azimuth deviation state.
- In a fifth aspect, an embodiment of the present invention provides a drone control device having the function of implementing the method for controlling the relative orientation of a drone in the above first aspect.
- The functions may be implemented by hardware, or by corresponding software executed by hardware.
- The hardware or software includes one or more units corresponding to the functions described above.
- In one possible implementation, the structure of the drone control device includes: one or more cameras, at least one of which has an infrared imaging function; one or more sensors for detecting the relative orientation information; a memory for storing a program that supports the drone in performing the above drone relative orientation control method; a communication interface for communicating with the wearable device or another device or communication network; one or more processors for executing the program stored in the memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs driving the one or more processors to construct units for performing the drone relative orientation control method described in the first aspect or any one of its implementations.
- In a sixth aspect, an embodiment of the present invention provides a wearable device control device having the function of implementing the drone relative orientation assist control method for the wearable device in the above second aspect.
- The functions may be implemented by hardware, or by corresponding software executed by hardware.
- The hardware or software includes one or more units corresponding to the functions described above.
- In one possible implementation, the structure of the wearable device control device includes: a memory for storing a program that supports the wearable device in performing the above drone relative orientation assist control method for the wearable device; a communication interface for the wearable device to communicate with a drone or another device or communication network; a vibration motor and/or indicator light for prompting the user of the current state of the drone; one or more processors for executing the program stored in the memory; an infrared lighting assembly comprising one or more infrared light sources for emitting infrared light; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs driving the one or more processors to construct units for performing the drone relative orientation assist control method for the wearable device described in the second aspect or any one of its implementations.
- In a seventh aspect, an embodiment of the present invention provides a computer program comprising computer-readable code which, when run by a drone, causes the method described in the first aspect to be performed.
- In an eighth aspect, an embodiment of the present invention provides a computer program comprising computer-readable code which, when run by a wearable device, causes the method described in the second aspect to be performed.
- In a ninth aspect, an embodiment of the present invention provides a computer-readable medium in which the computer program according to the seventh aspect or the eighth aspect is stored.
- The technical solutions provided by the present invention have at least the following advantages:
- The present invention determines, from an infrared image formed by infrared light emitted by the wearable device, a gesture area and a gesture instruction type that characterizes a relative orientation preset value; detects first relative orientation information between the drone and the gesture area; and controls the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
- The gesture area and the gesture can thus be determined from infrared imaging, and the relative orientation is determined by combining this with the drone's own position information. This realizes relative azimuth control of the drone while reducing the image-recognition computation load on the drone, improving both the efficiency and the accuracy of the relative orientation control.
- Using a wearable device capable of emitting infrared light, the user can adjust the relative orientation between the drone and himself by gesture control, without the wearable device having to provide orientation information. This reduces the cost of the wearable device, makes it lighter, and improves the user experience in drone relative orientation control.
- In addition, the drone can maintain the preset relative orientation with the user when the user moves uphill or downhill, reducing the impact of environmental changes on the relative orientation control. When the drone detects the impact of such an environmental change, it sends an alarm command to the wearable device to prompt the user that the drone is currently in an azimuth deviation state, helping the user make corresponding adjustments in time and preventing drone loss and safety accidents.
- FIG. 1 is a block diagram showing the device structure used by a drone relative azimuth control method according to an embodiment of the present invention;
- FIG. 2 is a schematic flow chart of a method for controlling the relative orientation of a drone according to an embodiment of the present invention;
- FIG. 3 is a schematic diagram of a scenario of a drone relative azimuth control process according to an embodiment of the present invention;
- FIG. 4 is a schematic diagram of a scenario of a drone relative azimuth control process according to an embodiment of the present invention;
- FIG. 5 is a schematic flow chart of a drone relative orientation assist control method for a wearable device according to an embodiment of the present invention;
- FIG. 6 is a structural block diagram of a drone relative azimuth control device according to an embodiment of the present invention;
- FIG. 7 is a structural block diagram of a relative azimuth assist control device for a wearable device according to an embodiment of the present invention;
- FIG. 8 is a schematic structural diagram of a drone control device according to an embodiment of the present invention;
- FIG. 9 is a schematic structural diagram of a wearable device control device according to an embodiment of the present invention;
- FIG. 10 shows a block diagram of a drone or wearable device for performing the method according to the invention; and
- FIG. 11 shows a schematic diagram of a memory unit for holding or carrying program code implementing a method according to the present invention.
- "Control device" or "drone control device" as used herein includes a device having a wireless signal receiver, a device having only a wireless signal receiver without transmitting capability, and a device having receiving and transmitting hardware capable of two-way communication over a two-way communication link.
- Such devices may include cellular or other communication devices having a single-line or multi-line display, or without a multi-line display; and portable, transportable, or vehicle-mounted (air, sea and/or land) mobile smart devices, such as drones and unmanned airships.
- "Wearable device" or "wearable device control device" as used herein includes a device having a wireless signal transmitter, a device having only a wireless signal transmitter without receiving capability, and a device having both receiving and transmitting hardware.
- Such a device can be designed to be worn on a person, especially on an arm, and includes smart bracelets, smart watches, wristbands, and the like.
- The method of the present invention is mainly applicable to terminals with communication functions, such as drones and wearable devices, and is not limited to any particular operating system; the operating system may be Android, iOS, WP, Symbian, or another operating system, or an embedded operating system.
- Referring to FIG. 1, a block diagram of the device structure used by the drone relative position control method is shown.
- The overall structure includes a processor 704, a sensor module, a controller, an execution control terminal, and the like. The sensor module includes an inertial measurement unit (IMU, including an acceleration sensor and a gyro sensor), a magnetometer, a direction sensor, a ranging sensor, a satellite positioning sensor (such as a GPS sensor or Beidou sensor), an image sensor, and the like, used to generate the various sensor data from which azimuth information, heading information, image information, positioning information, distance information, etc. for drone control are derived; these reflect the various parameters of the drone's flight and allow the drone to make its own adjustments.
- The inertial sensors detect changes in the attitude data of the drone, and the drone adjusts its posture after acquiring the attitude data to ensure flight according to the control command.
- The ranging sensor can be used to detect the distance to an obstacle, so that when the drone has obstacle avoidance measures, the obstacle avoidance action can be performed quickly, ensuring the airframe is not damaged.
- The image sensor is used to acquire the infrared image formed after the infrared light emitted by the wearable device acts on the gesture area. From the infrared image, the drone determines the gesture area and the gesture instruction type characterizing the relative orientation preset value; detects, using the gyro sensor, satellite positioning sensor, ranging sensor, and direction sensor, the first relative orientation information between the drone and the gesture area; and controls the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
- The processor 704 is the core component that performs data integration, transmission control, and execution of operations.
- Specific information is identified from the data through a series of algorithms, thereby determining the operations to be executed according to that information.
- Those skilled in the art can understand that the processor 704 can not only complete the integration and sending of sensor data but also perform other operations.
- In particular, the processor 704 is capable of performing any of the drone relative orientation control methods described herein.
- The controller is a control device for controlling the drone. Generally, when a remote control device (such as a wearable device) is used as a controller, the control frequency between the drone and the controller needs to be set to ensure effective control while the drone flies.
- The execution control terminal is used by the drone to execute operation instructions, and it communicates with the processor 704 to ensure that the drone acts according to those instructions.
- Referring to FIG. 2, the drone relative orientation control method includes the following steps:
- Step S11: acquiring an infrared image formed after the infrared light emitted by the wearable device acts on the gesture area.
- A UAV usually includes a camera unit, a processor, memory, and the like, and performs gesture recognition based on computer vision.
- The drone includes at least one camera having an infrared imaging function; it acquires the infrared image formed by the infrared light emitted by the wearable device, and separates the gesture area from the background area according to the infrared image, completing the gesture segmentation and achieving infrared gesture recognition.
- The camera unit includes at least one camera.
- The camera unit can acquire an infrared image by any one or more of IR-CUT dual-filter technology, IR lens technology, and infrared-sensitive CCD technology.
- The wearable device emits infrared light that illuminates the back of the user's hand, so that the outline of the user's gesture area is "lit up" by the infrared light, forming an infrared image of the gesture area.
- The user's gesture area and the background area are thus distinguished by the infrared light in the infrared image.
- Step S12: determining, according to the infrared image, the gesture area whose contour is described by the infrared light, and a gesture instruction type that characterizes the relative orientation preset value.
- The drone separates the gesture area from the background area according to the infrared image to complete the gesture segmentation, whereupon the gesture area can be determined.
- An image algorithm may also be used to make the camera unit of the drone lock onto the gesture area.
- Specifically, determining the gesture area and the gesture instruction type from the infrared image comprises: acquiring one or more frames of images from the preview infrared images captured by the camera unit; determining the gesture area whose contour is described by the infrared light in those frames; and extracting gesture feature data based on the gesture area and matching it with preset gesture instruction type description data to determine the corresponding gesture instruction type, as sketched below.
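- The following sketch illustrates those two stages — thresholding the infrared image to segment the gesture area, then matching contour features against preset gesture instruction type description data. It is a minimal illustration only: the class and method names, the fixed threshold, and the two-value feature format are assumptions, not the patent's implementation.

```java
/**
 * Minimal sketch of infrared gesture segmentation and instruction-type
 * matching. The fixed threshold, the feature format, and all names are
 * illustrative assumptions, not the patent's actual implementation.
 */
public class IrGestureMatcher {

    /** The pixels lit by the wearable's infrared light are far brighter than
     *  the background, so a simple threshold completes the gesture segmentation. */
    static boolean[][] segmentGesture(int[][] irFrame, int threshold) {
        boolean[][] mask = new boolean[irFrame.length][irFrame[0].length];
        for (int y = 0; y < irFrame.length; y++)
            for (int x = 0; x < irFrame[0].length; x++)
                mask[y][x] = irFrame[y][x] > threshold;
        return mask;
    }

    /** Coarse contour feature: aspect ratio and fill ratio of the gesture
     *  area's bounding box. */
    static double[] extractFeatures(boolean[][] mask) {
        int minX = Integer.MAX_VALUE, minY = Integer.MAX_VALUE;
        int maxX = -1, maxY = -1, count = 0;
        for (int y = 0; y < mask.length; y++)
            for (int x = 0; x < mask[0].length; x++)
                if (mask[y][x]) {
                    if (x < minX) minX = x;
                    if (x > maxX) maxX = x;
                    if (y < minY) minY = y;
                    if (y > maxY) maxY = y;
                    count++;
                }
        if (maxX < 0) return new double[] {0.0, 0.0}; // no gesture area found
        double w = maxX - minX + 1.0, h = maxY - minY + 1.0;
        return new double[] {w / h, count / (w * h)};
    }

    /** Template matching against preset gesture instruction type description
     *  data: the nearest template (squared Euclidean distance) wins. */
    static int matchInstructionType(double[] features, double[][] templates) {
        int best = -1;
        double bestDist = Double.MAX_VALUE;
        for (int i = 0; i < templates.length; i++) {
            double d = 0.0;
            for (int j = 0; j < features.length; j++) {
                double diff = features[j] - templates[i][j];
                d += diff * diff;
            }
            if (d < bestDist) { bestDist = d; best = i; }
        }
        return best; // index of the matched gesture instruction type
    }
}
```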
- The gesture instruction type can characterize a relative orientation preset value for the drone relative orientation control.
- The video acquired by the UAV through the camera unit can be regarded as being composed of multiple frames of images.
- For static gesture recognition, only one or a few frames containing the gesture are analyzed to extract gesture feature data, which may include gesture contour data and/or gesture depth data.
- For dynamic gesture recognition, it is also necessary to acquire the spatio-temporal features of the gesture.
- The common methods of dynamic gesture spatio-temporal trajectory analysis fall mainly into two categories: trajectory matching and state-space modeling. It is therefore necessary to analyze multiple frames to obtain the spatio-temporal trajectory generated by the gesture during its movement.
- After acquiring the infrared image, the drone separates the gesture area from the background area to complete the gesture segmentation, determines the gesture area whose contour is described by the infrared light, then acquires the gesture features through the gesture area and estimates the gesture model parameters for gesture analysis, and finally classifies the gesture according to the model parameters to determine the corresponding gesture instruction type, realizing infrared gesture recognition.
- The recognition method may be based on template matching, on a hidden Markov model (HMM), or on a neural network.
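- For the trajectory-matching category mentioned above, one common concrete choice is dynamic time warping (DTW) over the gesture area's centroid track across frames. The sketch below is a generic DTW distance under that assumption; the patent does not prescribe a specific matcher, and the names are illustrative.

```java
import java.util.Arrays;

/**
 * Sketch of the trajectory-matching category: dynamic time warping (DTW)
 * between the observed centroid track of the gesture area across frames and
 * a stored template track. The point format and names are assumptions.
 */
public class TrajectoryMatcher {

    /** DTW distance between two 2-D trajectories given as arrays of {x, y}. */
    static double dtw(double[][] observed, double[][] template) {
        int n = observed.length, m = template.length;
        double[][] cost = new double[n + 1][m + 1];
        for (double[] row : cost) Arrays.fill(row, Double.POSITIVE_INFINITY);
        cost[0][0] = 0.0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= m; j++) {
                double d = Math.hypot(observed[i - 1][0] - template[j - 1][0],
                                      observed[i - 1][1] - template[j - 1][1]);
                cost[i][j] = d + Math.min(cost[i - 1][j - 1],
                                 Math.min(cost[i - 1][j], cost[i][j - 1]));
            }
        return cost[n][m]; // smaller = closer match to the template gesture
    }
}
```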
- Therefore, the drone can determine the corresponding instruction type from gestures captured by infrared imaging. This reduces the occupation of computing resources, shortens the response time required for the user to control the relative position of the drone by gesture recognition in complex or dimly lit backgrounds, and improves the efficiency and accuracy of human-computer interaction, especially when the drone and/or the user is moving.
- Step S13: detecting first relative orientation information between the drone and the gesture area.
- The drone determines the relative position according to the relative orientation information between itself and the gesture area. It should be pointed out that the first relative orientation information includes any one or more of: the distance information, azimuth information, height-angle information, and height-difference information between the drone and the gesture area, and the positioning information of the drone. The first relative orientation information is therefore a general term; in a specific application, the specific data it contains can be determined as needed.
- In this embodiment, the detection of the first relative orientation information between the drone and the gesture area comprises: detecting the positioning information of the drone by its satellite positioning sensor; detecting the distance information between the drone and the gesture area by its ranging sensor; detecting the azimuth information between the drone and the gesture area by its direction sensor; detecting the height-angle information between the drone and the gesture area by its gyro sensor, and calculating the height-difference information between the drone and the gesture area from the distance information and the height-angle information; and calculating the relative orientation information between the drone and the gesture area from the positioning information, the height-difference information, the azimuth information, and the distance information, as sketched below.
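- The height-difference step above reduces to simple trigonometry: with straight-line distance l and height angle θ, the height difference is l·sin θ and the horizontal separation is l·cos θ. A minimal sketch, with sign and angle conventions assumed:

```java
/**
 * Minimal sketch of the height-difference calculation described above: given
 * the straight-line distance l to the gesture area and the height angle theta
 * below the drone's horizontal plane, the height difference and horizontal
 * distance follow from basic trigonometry. Conventions are assumptions.
 */
public class RelativeOrientation {

    /** Height difference between drone and gesture area, in meters. */
    static double heightDifference(double distanceMeters, double heightAngleRad) {
        return distanceMeters * Math.sin(heightAngleRad);
    }

    /** Horizontal (ground-plane) distance to the gesture area, in meters. */
    static double horizontalDistance(double distanceMeters, double heightAngleRad) {
        return distanceMeters * Math.cos(heightAngleRad);
    }

    public static void main(String[] args) {
        // Example: 20 m line-of-sight distance, gesture area 30 degrees below horizontal.
        double l = 20.0, theta = Math.toRadians(30);
        System.out.printf("dh = %.2f m, d = %.2f m%n",
                heightDifference(l, theta), horizontalDistance(l, theta));
    }
}
```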
- According to the distance information, the horizontal distance between the drone and the gesture area can be adjusted, achieving horizontal movement between the drone and the user.
- According to the azimuth information, the nose orientation of the drone can be adjusted, or the shooting direction of the camera unit can be controlled, so that when the position of the gesture area changes, the drone adjusts its nose or camera orientation accordingly, enabling the camera unit to lock onto the user at all times.
- According to the height-difference information, the relative height between the drone and the operator can be adjusted, so that the drone maintains the preset relative orientation when the user moves uphill or downhill.
- The positioning information in the first relative orientation information may represent the latitude and longitude coordinates A(x1, y1) of the drone. These coordinates may be acquired by a satellite positioning sensor installed on the drone, whose positioning function is implemented through the satellite positioning system it connects to; that system includes, but is not limited to, a GPS positioning system, Beidou positioning system, GLONASS positioning system, or Galileo positioning system.
- The distance is the straight-line distance l between the drone and the gesture area; the ranging sensor is a laser ranging sensor and/or an infrared ranging sensor.
- The azimuth angle αAB, also called the azimuth (Az), denotes the horizontal angle measured clockwise from the north direction of the drone to the direction line toward the gesture area; in the Android system, for example, it can be obtained via public static float[] getOrientation(float[] R, float[] values).
- From these quantities, the positioning coordinates B(x2, y2) of the current gesture area can be obtained, as sketched below.
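- A minimal sketch of that last step: deriving B(x2, y2) from A(x1, y1), the azimuth, and the horizontal distance, using a flat-earth approximation that is adequate over the short ranges involved. The constant and the method names are assumptions.

```java
/**
 * Sketch of deriving the gesture area's positioning coordinates B from the
 * drone's coordinates A, the azimuth (clockwise from north), and the
 * horizontal distance, under a flat-earth small-distance approximation.
 */
public class PositionProjection {

    static final double METERS_PER_DEG_LAT = 111_320.0; // approximate

    /** Returns {lat2, lon2} of the gesture area, in degrees. */
    static double[] project(double lat1Deg, double lon1Deg,
                            double azimuthRad, double horizDistMeters) {
        double dLat = horizDistMeters * Math.cos(azimuthRad) / METERS_PER_DEG_LAT;
        double dLon = horizDistMeters * Math.sin(azimuthRad)
                / (METERS_PER_DEG_LAT * Math.cos(Math.toRadians(lat1Deg)));
        return new double[] {lat1Deg + dLat, lon1Deg + dLon};
    }
}
```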
- In other embodiments, a ranging sensor or ultrasonic ranging sensor may also be used to detect the relative height of the drone above the ground, which may substitute for the height-difference information in some usage scenarios (such as open, flat terrain).
- According to the distance information, the horizontal distance between the drone and the gesture area can be adjusted to achieve horizontal movement between the drone and the user.
- According to the azimuth information, the nose orientation of the drone can be adjusted, or the shooting direction of the camera unit can be controlled, so that when the position of the gesture area changes, the drone adjusts its nose or camera orientation accordingly, enabling the camera unit to lock onto the user at all times.
- According to the height-difference information, the height between the drone and the user can be adjusted to maintain the relative orientation of the drone when the user moves uphill or downhill.
- In this way, the first relative orientation information of the drone is obtained, representing the relative orientation between the drone and the gesture area.
- The first relative orientation information may be obtained from the above quantities using an information fusion algorithm, such as a Kalman filter algorithm, to improve accuracy; a minimal sketch follows below.
- The first relative orientation information may be decomposed into the positioning information, the height-difference information, the azimuth information, and the distance information.
- The first relative orientation information may also be a data packet including the positioning information, the height-difference information, the azimuth information, and the distance information.
- The first relative orientation information characterizes the relative orientation between the drone and the gesture area and is used for the relative orientation control of the drone.
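- As an illustration of the fusion step mentioned above, a one-dimensional Kalman filter smoothing the ranging-sensor distance stream might look as follows. The noise parameters are illustrative assumptions, and a real fusion filter would track the full relative-orientation state rather than a single scalar.

```java
/**
 * One-dimensional Kalman filter sketch of the information-fusion step,
 * smoothing a noisy distance-measurement stream. Parameters are assumptions.
 */
public class DistanceKalman {
    private double estimate = 0.0;    // current state estimate (m)
    private double variance = 1e3;    // estimate variance (large = uninitialized)
    private final double processNoise = 0.05;     // motion-model noise
    private final double measurementNoise = 0.5;  // ranging-sensor noise

    /** Fuse one new ranging reading and return the filtered distance. */
    double update(double measured) {
        variance += processNoise;                        // predict
        double gain = variance / (variance + measurementNoise);
        estimate += gain * (measured - estimate);        // correct
        variance *= (1.0 - gain);
        return estimate;
    }
}
```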
- Step S14: controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
- Specifically, second relative orientation information may be calculated from the first relative orientation information and the relative orientation preset value represented by the gesture instruction type, and the drone is then controlled according to the second relative orientation information to adjust its flight state accordingly, adjusting its relative orientation with the gesture area.
- The relative orientation preset value may include the desired distance, azimuth angle, and height difference between the drone and the gesture area after adjustment.
- For example, the drone is controlled to adjust its heading angle according to the azimuth information in the second relative orientation information, so that its nose faces the gesture area, as sketched below.
- The height difference may be the preset value, or may remain the same as the height difference characterized by the first relative orientation information.
- The coordinate position C(x3, y3) represented by the position information in the second relative orientation information may be obtained from the drone's coordinate position A(x1, y1) represented by the position information in the first relative orientation information and the preset value, according to the foregoing formula.
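- The heading adjustment can be sketched as computing the smallest signed angle between the drone's current heading and the azimuth toward the gesture area; the normalization convention here is an assumption, not the patent's formula.

```java
/**
 * Sketch of the heading adjustment in step S14: turn the nose toward the
 * gesture area by the smallest signed angle between the current heading and
 * the azimuth to the gesture area. Names and conventions are assumptions.
 */
public class HeadingControl {

    /** Signed yaw correction in radians, normalized into (-pi, pi]. */
    static double yawCorrection(double currentHeadingRad, double azimuthToGestureRad) {
        double delta = azimuthToGestureRad - currentHeadingRad;
        // Normalize so the drone always turns the short way.
        while (delta > Math.PI)   delta -= 2 * Math.PI;
        while (delta <= -Math.PI) delta += 2 * Math.PI;
        return delta;
    }
}
```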
- According to the distance, the horizontal distance between the drone and the operator can be adjusted to achieve horizontal movement between them.
- According to the azimuth, the nose orientation of the drone can be adjusted, which in effect controls the shooting direction of the camera unit in the drone.
- When the position of the gesture area changes, the drone can adjust the orientation of its nose accordingly, so that the camera always locks onto the operator.
- According to the height difference, the height between the drone and the operator can be adjusted, enabling the drone to follow the operator's uphill and downhill movement.
- For example, according to the first relative orientation information and the preset value, only the distance between the drone and the gesture area is changed while the azimuth and the height difference are kept unchanged, so that the drone and the gesture area remain at the preset distance; or the azimuth is changed while the distance and the height difference are kept constant, so that the drone performs operations such as shooting while circling the gesture area.
- In this way, the user can control the relative orientation of the drone simply and quickly.
- In some embodiments, after the relative orientation between the drone and the gesture area reaches the preset value, the method further includes the step of detecting third relative orientation information between the drone and the gesture area.
- The principle and procedure for detecting the third relative orientation information are the same as for the first relative orientation information and are not described again here. Further, according to actual needs, the steps of at least one of the following solutions may be included:
- When the rate of change of the distance represented by the third relative orientation information is greater than a preset speed range, the drone is controlled to adjust its flight speed so that the distance between the drone and the gesture area stays within a predetermined distance range.
- When the height difference represented by the third relative orientation information is less than a first preset height difference, the flight state of the drone is controlled so that the height difference between the drone and the gesture area is raised to the first preset height difference; when it is greater than a second preset height difference, the flight state of the drone is controlled so that the height difference between the drone and the gesture area is reduced to the second preset height difference.
- The rate of change of the distance represented by the third relative orientation information can represent the relative speed between the user and the drone; by adjusting the flight speed of the drone, the distance between the drone and the gesture area is kept within the predetermined distance range.
- The preset speed range and the predetermined distance range may be set in advance according to actual needs and/or set correspondingly according to the relative orientation preset value represented by the gesture instruction type, so as to ensure that the drone follows the user well and, for example, can capture clearer pictures while shooting.
- Because the update frequency of the ranging sensor is generally low, when the gesture area moves at a relatively fast speed, that is, when the user moves quickly, the response of the drone is often not fast enough and lags.
- Therefore, the ranging sensor can be used to measure the distance to the gesture area periodically, the motion speed of the gesture area can be calculated from those measurements, and the following speed of the drone can be adjusted in real time according to that motion speed, as sketched below.
- The drone can thus adjust its flight speed according to the moving speed of the gesture area, so that the drone and the user maintain the relative orientation within the preset range, achieving a good following effect and improving the user's interaction experience with the drone during fast motion.
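- A minimal sketch of this follow-speed adjustment: the gesture area's motion speed is estimated from successive ranging samples, and the drone's target speed combines that estimate with a proportional pull back toward the predetermined distance. The gain and limits are illustrative assumptions.

```java
/**
 * Sketch of the follow-speed adjustment: estimate the gesture area's motion
 * speed from periodic ranging samples and bias the drone's speed back toward
 * the predetermined distance band. Gains and limits are assumptions.
 */
public class FollowSpeedController {
    private double lastDistance = Double.NaN;
    private final double kP = 0.8;          // proportional gain on distance error
    private final double targetDistance;    // center of the predetermined range (m)
    private final double maxSpeed;          // drone speed limit (m/s)

    FollowSpeedController(double targetDistance, double maxSpeed) {
        this.targetDistance = targetDistance;
        this.maxSpeed = maxSpeed;
    }

    /** Called once per ranging update; dt is the sample interval in seconds. */
    double targetSpeed(double distance, double dt) {
        double separationRate = Double.isNaN(lastDistance)
                ? 0.0 : (distance - lastDistance) / dt;   // user's estimated speed
        lastDistance = distance;
        // Match the user's speed, plus a correction toward the band center.
        double speed = separationRate + kP * (distance - targetDistance);
        return Math.max(-maxSpeed, Math.min(maxSpeed, speed));
    }
}
```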
- For another example, when the height difference between the user and the drone is less than the first preset height difference, the flight state of the drone is controlled so that the height difference is raised to the first preset height difference.
- In this way, the drone maintains the preset relative height above the operator by adjusting its own height while the user climbs, avoiding a collision accident.
- In one embodiment, the first preset height difference is 0.5 meters.
- In one embodiment, the second preset height difference is 0.5 meters. A sketch of this height guard follows below.
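- The height-difference guard can be sketched as follows, using the 0.5 m presets of this embodiment (with both presets equal, the drone holds that exact separation); the return convention (positive meaning climb) is an assumption.

```java
/**
 * Sketch of the height-difference guard: climb when the separation falls
 * below the first preset height difference, descend when it exceeds the
 * second. Presets are the 0.5 m values from this embodiment.
 */
public class HeightGuard {
    static final double FIRST_PRESET_M = 0.5;   // minimum height difference
    static final double SECOND_PRESET_M = 0.5;  // maximum height difference

    /** Vertical correction in meters (positive = climb, 0 = hold). */
    static double correction(double heightDifference) {
        if (heightDifference < FIRST_PRESET_M)  return FIRST_PRESET_M - heightDifference;
        if (heightDifference > SECOND_PRESET_M) return SECOND_PRESET_M - heightDifference;
        return 0.0;
    }
}
```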
- With the drone relative orientation control method shown in this embodiment of the present invention, the flight state of the drone is adjusted in real time according to the relative position between the drone and the user, thereby achieving relative position control between them.
- This reduces the complexity of drone control operations and also reduces the hidden risk of operating errors.
- The user in this embodiment need not be a real person; it may also be another drone, or a device such as a car on the ground.
- Another embodiment is an improvement based on the previous embodiment.
- In this embodiment, the drone sends an alarm instruction to the wearable device over the trusted connection when any of the following conditions holds: the rate of change of the distance represented by the third relative orientation information is greater than the preset speed range; the height difference represented by the third relative orientation information is less than the first preset height difference; or the height difference represented by the third relative orientation information is greater than the second preset height difference. A minimal sketch of this check follows below.
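- A minimal sketch of this alarm decision; the trusted-link interface is hypothetical, standing in for whatever transport the trusted connection provides.

```java
/**
 * Sketch of the azimuth-deviation alarm decision: when any of the three
 * conditions above holds, an alarm instruction is sent to the wearable
 * device over the trusted connection. The link interface is hypothetical.
 */
public class DeviationAlarm {

    interface TrustedLink { void sendAlarm(String reason); } // hypothetical transport

    static void checkAndAlarm(double distanceRate, double maxRate,
                              double heightDiff, double firstPreset,
                              double secondPreset, TrustedLink link) {
        if (distanceRate > maxRate) {
            link.sendAlarm("relative speed out of range");
        } else if (heightDiff < firstPreset) {
            link.sendAlarm("height difference below first preset");
        } else if (heightDiff > secondPreset) {
            link.sendAlarm("height difference above second preset");
        }
    }
}
```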
- When the drone detects the impact of the above environmental changes, it sends an alarm command to the wearable device to prompt the user that the drone is currently in an azimuth deviation state, helping the user make corresponding adjustments in time and reducing the hidden risks of drone loss and safety accidents.
- In another embodiment, driving control of the infrared light emitted by the wearable device is implemented, reducing the power consumption of the wearable device.
- In this embodiment, the method further includes the following pre-step: sending, based on the trusted connection, a driving instruction to the wearable device to make it emit infrared light.
- UAVs and wearable devices are typically connected by a communication link to transfer data and instructions.
- Generally, wireless communication is used.
- When the distance between the devices is relatively long, or the electromagnetic environment is complicated, a signal amplifying device such as a signal repeater can also be interposed.
- Preferably, a trusted connection is adopted, so that only a drone and a remote control device that have passed identity (ID) verification can interact.
- On this basis, another embodiment of the present invention further includes the following pre-steps: authenticating the wearable device through the communication connection; and, when the identity verification succeeds, establishing a trust connection between the drone and the wearable device.
- Through this pre-step, only a wearable device that has passed identity (ID) verification can establish a trust connection with the drone and thereby interact with it, preventing misjudgment by the identification device or malicious interference and improving system accuracy and security. A minimal sketch follows below.
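- As an illustration of this pre-step, a whitelist check on the wearable device's ID might look as follows; the whitelist scheme is an assumption, since a real implementation would rely on the pairing mechanism of the chosen link (Bluetooth, NFC, ZigBee, and so on).

```java
import java.util.Set;

/**
 * Sketch of the identity-verification pre-step: only a wearable device whose
 * ID passes verification is granted a trust connection. The whitelist scheme
 * is an illustrative assumption.
 */
public class TrustAuthenticator {
    private final Set<String> authorizedIds;

    TrustAuthenticator(Set<String> authorizedIds) {
        this.authorizedIds = authorizedIds;
    }

    /** Returns true when the wearable device's ID passes verification,
     *  after which the trust connection may be established. */
    boolean authenticate(String deviceId) {
        return deviceId != null && authorizedIds.contains(deviceId);
    }
}
```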
- The implementation of the present invention can determine the gesture area and recognize the gesture by infrared imaging, and determine the relative orientation by combining this with the drone's own position information, thereby improving the drone's relative orientation control efficiency and the user experience.
- Referring to FIG. 5, the drone relative orientation assist control method for the wearable device includes the following steps:
- Step S21: receiving, based on the trusted connection, a driving instruction of the drone for driving the wearable device to emit infrared light.
- The wearable device interacts with the drone through gesture recognition and a communication connection.
- Generally, the communication connection is a wireless communication connection.
- When the distance between the wearable device and the drone is relatively long, or the electromagnetic environment is complicated, a signal amplifying device such as a signal repeater can also be interposed.
- Preferably, a trusted connection is adopted, so that only a drone and a wearable device that have passed identity (ID) verification can interact.
- The trusted connection between the drone and the wearable device may be any one or more of a Bluetooth trusted connection, a near-field communication connection, a UWB trusted connection, a ZigBee trusted connection, or an Internet trusted connection.
- The wearable device receives, based on this connection, a driving instruction of the drone, which is used to drive the wearable device to emit infrared light.
- Step S22: in response to the driving instruction, driving the infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines the gesture area and the gesture instruction type characterizing the relative orientation preset value based on infrared imaging, to apply them to the relative orientation control of the drone.
- The wearable device includes an infrared illumination assembly; in one embodiment, the infrared light-emitting diodes of the assembly are arranged linearly along the sides of the wearable device.
- The infrared illuminating component is preset with one or more infrared point light sources, such as infrared light-emitting diodes, for emitting infrared light.
- Another embodiment is an improvement made on the basis of the previous embodiment.
- In this embodiment, the wearable device is adapted to be worn on the arm, so that the gesture implementation area is located between the wearable device and the identification device.
- The wearable device, in response to the driving instruction of the drone, drives its preset infrared light-emitting component to emit infrared light and form an infrared ring, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type characterizing the relative orientation preset value, to apply them to the relative orientation control of the drone.
- The drone can generate a corresponding gesture interaction event based on gestures captured by infrared imaging.
- The infrared light emitted by the wearable device is diffusely reflected by the back of the user's hand, "lighting up" the outline of the hand, so that the user's gesture area and the background area are distinguished by the infrared light in the infrared image.
- Part of the infrared light may also be absorbed by the hand, making the infrared image of the hand more distinct.
- The UAV performs gesture segmentation based on infrared imaging, which reduces the computation load on the processor, shortens the response time, and improves the efficiency and accuracy of gesture recognition and relative position control, the effect being particularly significant when the drone or the user is moving.
- In one embodiment, the wearable device, in response to the driving instruction, drives only one of the one or more preset infrared point sources in the infrared illumination assembly to emit infrared light.
- In this way, the energy consumption of the wearable device can be reduced and its operating time prolonged while the intended effect is maintained.
- Preferably, the infrared light emitted by the infrared light-emitting device is controlled to lie in the wavelength range of 0.76 to 2.5 µm.
- In this way, the hand contour in the infrared image acquired by the identification device is formed mainly by the infrared light reflected by the hand, which is safer for the human body and gives a better recognition effect.
- Step S23: receiving an alarm instruction of the drone based on the trusted connection.
- The wearable device and the drone can interact through wireless communication.
- When the distance between the wearable device and the drone is relatively long, or the electromagnetic environment is complicated, a signal amplifying device such as a signal repeater can also be interposed.
- Preferably, a trusted connection is adopted, so that only a drone and a wearable device that have passed identity (ID) verification can interact.
- Based on the trusted connection, the wearable device receives the alarm instruction that the drone sends when a determined azimuth-deviation condition is satisfied, to prompt the user that the drone is currently in an azimuth deviation state.
- Step S24: in response to the alarm instruction, controlling the wearable device to activate the vibration motor and/or turn on the indicator light to prompt the user that the drone is currently in an azimuth deviation state.
- After receiving the alarm instruction, the wearable device activates the vibration motor and/or turns on the indicator light accordingly in response to it.
- The vibration pattern of the vibration motor and/or the flashing pattern of the indicator light may be a preset pattern, or a pattern set to correspond to the different drone azimuth-deviation states characterized by the alarm instruction; it may be set at the factory or by the user, as sketched below.
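- One way such a mapping could be realized is sketched below; the deviation-state names and the actuator interface are hypothetical.

```java
/**
 * Sketch of the wearable-side alarm response: each deviation state named in
 * the alarm instruction maps to a vibration pattern and/or indicator flash
 * mode, which may be factory-set or user-set. Everything here is illustrative.
 */
public class AlarmResponder {

    interface Actuators {                       // hypothetical hardware interface
        void vibrate(long[] patternMs);
        void flashIndicator(int times);
    }

    static void handleAlarm(String deviationState, Actuators hw) {
        switch (deviationState) {
            case "SPEED_OUT_OF_RANGE":   hw.vibrate(new long[] {200, 100, 200}); break;
            case "HEIGHT_DIFF_TOO_LOW":  hw.flashIndicator(3); break;
            case "HEIGHT_DIFF_TOO_HIGH": hw.vibrate(new long[] {500}); hw.flashIndicator(1); break;
            default:                     hw.vibrate(new long[] {500});
        }
    }
}
```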
- In one embodiment, the method further includes the following concurrent step: counting the working duration of the infrared light-emitting component, and controlling the component to stop emitting infrared light when the working duration exceeds a predetermined length of time. This allows the wearable device to turn off the infrared component automatically after the predetermined time has elapsed, effectively preventing waste of electric energy due to user negligence and the like; the user can also set the predetermined length of time to control the illumination duration of the infrared illuminating component, improving work efficiency.
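- A minimal sketch of this auto-off step, using a daemon timer; the component interface is hypothetical.

```java
import java.util.Timer;
import java.util.TimerTask;

/**
 * Sketch of the concurrent auto-off step: the infrared illuminating
 * component is switched off once a (possibly user-set) predetermined
 * length of time has elapsed. The component interface is hypothetical.
 */
public class IrAutoOff {

    interface IrComponent { void stopEmitting(); } // hypothetical driver

    static Timer scheduleAutoOff(IrComponent ir, long predeterminedMillis) {
        Timer timer = new Timer("ir-auto-off", true); // daemon thread
        timer.schedule(new TimerTask() {
            @Override public void run() { ir.stopEmitting(); }
        }, predeterminedMillis);
        return timer; // cancel() this if the user turns the IR off manually
    }
}
```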
- On the basis of any of the above embodiments, the drone relative orientation assist control method for the wearable device of the present invention further includes the following pre-steps: sending an authentication request to the drone through the communication connection; and, when the identity verification succeeds, establishing a trust connection between the wearable device and the drone.
- The implementation of the present invention can improve the efficiency of the relative orientation control of the drone through infrared gestures and communication interaction with the drone, improving the user experience.
- Based on the above drone relative orientation control method, the present invention further proposes a drone relative orientation control device.
- The device includes an imaging unit 11, an identification unit 12, a detecting unit 13, and a control unit 14, which are specifically disclosed as follows:
- The imaging unit 11 is configured to acquire the infrared image formed after the infrared light emitted by the wearable device acts on the gesture area.
- The drone usually includes the camera unit 11, a processor, a memory, and the like, and performs gesture recognition based on computer vision.
- The drone includes at least one camera having an infrared imaging function; it acquires the infrared image formed by the infrared light emitted by the wearable device and separates the gesture area from the background area according to the infrared image, completing the gesture segmentation and achieving infrared gesture recognition.
- The camera unit 11 includes at least one camera.
- The camera unit 11 can acquire an infrared image by any one or more of IR-CUT dual-filter technology, IR lens technology, and infrared-sensitive CCD technology.
- The wearable device emits infrared light that illuminates the back of the user's hand, so that the outline of the user's gesture area is "lit up" by the infrared light, forming an infrared image of the gesture area.
- The user's gesture area and the background area are thus distinguished by the infrared light in the infrared image.
- The identification unit 12 is configured to determine, according to the infrared image, the gesture area whose contour is described by the infrared light and a gesture instruction type that characterizes the relative orientation preset value.
- The drone separates the gesture area from the background area according to the infrared image to complete the gesture segmentation, whereupon the gesture area can be determined.
- An image algorithm may also be used to make the camera unit 11 of the drone lock onto the gesture area.
- Specifically, the identification unit 12 determines the gesture area and the gesture instruction type from the infrared image by: acquiring one or more frames of images from the preview infrared images captured by the camera unit 11; determining the gesture area whose contour is described by the infrared light in those frames; and extracting gesture feature data based on the gesture area and matching it with the preset gesture instruction type description data to determine the corresponding gesture instruction type.
- The gesture instruction type can characterize a relative orientation preset value for the drone relative orientation control.
- The video acquired by the UAV through the camera unit 11 can be regarded as being composed of multiple frames of images.
- For static gesture recognition, only one or a few frames containing the gesture are analyzed to extract gesture feature data, which may include gesture contour data and/or gesture depth data.
- For dynamic gesture recognition, it is also necessary to acquire the spatio-temporal features of the gesture.
- The common methods of dynamic gesture spatio-temporal trajectory analysis fall mainly into two categories: trajectory matching and state-space modeling. It is therefore necessary to analyze multiple frames to obtain the spatio-temporal trajectory generated by the gesture during its movement.
- After acquiring the infrared image, the identification unit 12 separates the gesture area from the background area to complete the gesture segmentation, determines the gesture area whose contour is described by the infrared light, then acquires the gesture features through the gesture area and estimates the gesture model parameters for gesture analysis, and finally classifies the gesture according to the model parameters to determine the corresponding gesture instruction type, realizing infrared gesture recognition.
- The recognition method may be based on template matching, on a hidden Markov model (HMM), or on a neural network.
- Therefore, the drone can determine the corresponding instruction type from gestures captured by infrared imaging. This reduces the occupation of computing resources, shortens the response time required for the user to control the relative position of the drone by gesture recognition in complex or dimly lit backgrounds, and improves the efficiency and accuracy of human-computer interaction, especially when the drone and/or the user is moving.
- The detecting unit 13 is configured to detect first relative orientation information between the drone and the gesture area.
- The drone determines the relative position according to the relative orientation information between itself and the gesture area. It should be pointed out that the first relative orientation information includes any one or more of: the distance information, azimuth information, height-angle information, and height-difference information between the drone and the gesture area, and the positioning information of the drone. The first relative orientation information is therefore a general term; in a specific application, the specific data it contains can be determined as needed.
- In this embodiment, the detecting unit 13 detects the first relative orientation information between the drone and the gesture area by: detecting the positioning information of the drone through its satellite positioning sensor; detecting the distance information between the drone and the gesture area with its ranging sensor; detecting the azimuth information between the drone and the gesture area with its direction sensor; detecting the height-angle information between the drone and the gesture area with its gyro sensor, and calculating the height-difference information from the distance information and the height-angle information; and calculating the relative orientation information between the drone and the gesture area from the positioning information, the height-difference information, the azimuth information, and the distance information.
- According to the distance information, the horizontal distance between the drone and the gesture area can be adjusted, achieving horizontal movement between the drone and the user.
- According to the azimuth information, the nose orientation of the drone can be adjusted, or the shooting direction of the camera unit 11 can be controlled, so that when the position of the gesture area changes, the drone adjusts its nose or the orientation of the camera unit 11 accordingly, enabling the camera unit 11 to lock onto the user at all times.
- According to the height-difference information, the relative height between the drone and the operator can be adjusted, so that the drone maintains the preset relative orientation when the user moves uphill or downhill.
- The positioning information in the first relative orientation information may represent the latitude and longitude coordinates A(x1, y1) of the drone. These coordinates may be acquired by a satellite positioning sensor installed on the drone, whose positioning function is implemented through the satellite positioning system it connects to; that system includes, but is not limited to, a GPS positioning system, Beidou positioning system, GLONASS positioning system, or Galileo positioning system.
- The distance is the straight-line distance l between the drone and the gesture area; the ranging sensor is a laser ranging sensor and/or an infrared ranging sensor.
- The azimuth angle αAB, also called the azimuth (Az), denotes the horizontal angle measured clockwise from the north direction of the drone to the direction line toward the gesture area; in the Android system, for example, it can be obtained via public static float[] getOrientation(float[] R, float[] values).
- From these quantities, the positioning coordinates B(x2, y2) of the current gesture area can be obtained.
- In other embodiments, a ranging sensor or ultrasonic ranging sensor may also be used to detect the relative height of the drone above the ground, which may substitute for the height-difference information in some usage scenarios (such as open, flat terrain).
- According to the distance information, the horizontal distance between the drone and the gesture area can be adjusted to achieve horizontal movement between the drone and the user.
- According to the azimuth information, the nose orientation of the drone can be adjusted, or the shooting direction of the camera unit 11 can be controlled, so that when the position of the gesture area changes, the drone adjusts its nose or the orientation of the camera unit 11 accordingly, enabling the camera unit 11 to lock onto the user at all times.
- According to the height-difference information, the height between the drone and the user can be adjusted to maintain the relative orientation of the drone when the user moves uphill or downhill.
- In this way, the first relative orientation information of the drone is obtained, representing the relative orientation between the drone and the gesture area.
- The first relative orientation information may be obtained from the above quantities using an information fusion algorithm, such as a Kalman filter algorithm, to improve accuracy.
- The first relative orientation information may be decomposed into the positioning information, the height-difference information, the azimuth information, and the distance information.
- The first relative orientation information may also be a data packet including the positioning information, the height-difference information, the azimuth information, and the distance information.
- The first relative orientation information characterizes the relative orientation between the drone and the gesture area and is used for the relative orientation control of the drone.
- The control unit 14 is configured to control the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
- Specifically, second relative orientation information may be calculated from the first relative orientation information and the relative orientation preset value represented by the gesture instruction type, and the control unit 14 then controls the drone according to the second relative orientation information to adjust its flight state accordingly, adjusting the relative orientation between the drone and the gesture area.
- The relative orientation preset value may include the desired distance, azimuth angle, and height difference between the drone and the gesture area after adjustment.
- The control unit 14 can flexibly select at least one of the following according to actual requirements, so that the relative orientation between the drone and the gesture area equals the preset value:
- The drone is controlled to adjust its heading angle according to the azimuth information in the second relative orientation information, so that its nose faces the gesture area.
- The height difference may be the preset value, or may remain the same as the height difference characterized by the first relative orientation information.
- The coordinate position C(x3, y3) represented by the position information in the second relative orientation information may be obtained from the drone's coordinate position A(x1, y1) represented by the position information in the first relative orientation information and the preset value, according to the foregoing formula.
- the horizontal distance between the drone and the operator can be adjusted to achieve horizontal movement between the drone and the operator.
- the head orientation of the drone can be adjusted, which in practice controls the shooting direction of the camera unit 11 in the drone; when the position of the gesture area changes, the drone can adjust the orientation of its head accordingly, so that the camera can always lock onto the operator.
- according to the height difference, the height between the drone and the operator can be adjusted, enabling the drone to follow the operator uphill and downhill.
- for example, according to the first relative orientation information and the preset value, the control unit 14 may change only the distance between the drone and the gesture area while maintaining the azimuth and height difference, keeping the drone at a preset distance from the gesture area; or it may change only the azimuth while keeping the distance and height difference constant, making the drone perform operations such as shooting around the gesture area.
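As a concrete illustration of the adjustments above, here is a small sketch: given the gesture-area coordinates B, a preset distance, and a preset azimuth, it computes a target drone position C; stepping only the azimuth while holding distance and height constant then traces an orbit around the gesture area. The formula is an assumption standing in for the patent's elided "foregoing formula", and all names are hypothetical.

```python
import math

def target_position(bx, by, preset_distance, preset_azimuth_deg):
    """Assumed stand-in for the elided formula: place the drone at the
    preset distance and azimuth relative to gesture-area position B."""
    theta = math.radians(preset_azimuth_deg)
    cx = bx + preset_distance * math.cos(theta)
    cy = by + preset_distance * math.sin(theta)
    return cx, cy

# Keep distance (and height) fixed, step the azimuth: the drone orbits
# the gesture area, e.g. to shoot around the user.
bx, by = 10.0, 20.0
for az in range(0, 360, 90):
    cx, cy = target_position(bx, by, preset_distance=5.0, preset_azimuth_deg=az)
    print(f"azimuth {az:3d} deg: fly to C({cx:.1f}, {cy:.1f})")
```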
- the user can realize the control of the relative orientation of the drone simply and quickly.
- in an embodiment, after the relative orientation between the drone and the gesture area reaches the preset value, the detecting unit 13 is further configured to detect third relative orientation information between the drone and the gesture area.
- the principle and the step of detecting the third relative orientation information are the same as the detection of the first relative orientation information, and details are not described herein again.
- a determining unit is further included; the determining unit and the control unit 14 can be configured according to at least one of the following schemes, depending on actual requirements:
- the determining unit is configured to determine whether the rate of change of the distance represented by the third relative orientation information is greater than a preset speed range; the control unit 14 is configured to, when it is greater than the preset speed range, control the drone to adjust its flight speed such that the distance between the drone and the gesture area is within a predetermined distance range.
- the determining unit is configured to determine whether the height difference represented by the third relative orientation information is less than a first preset height difference; the control unit 14 is configured to, when it is less than the first preset height difference, control the flight state of the drone so that the height difference between the drone and the gesture area is raised to the first preset height difference.
- the determining unit is configured to determine whether the height difference represented by the third relative orientation information is greater than a second preset height difference; the control unit 14 is configured to, when it is greater than the second preset height difference, control the flight state of the drone to reduce the height difference between the drone and the gesture area to the second preset height difference.
- the rate of change of the distance represented by the third relative orientation information may represent the relative speed between the user and the drone; by adjusting the flight speed of the drone, the distance between the drone and the gesture area is kept within the predetermined distance range.
- the preset speed range and the predetermined distance range may be set in advance according to actual needs and/or set correspondingly according to the relative orientation preset value represented by the gesture instruction type, so as to ensure that the drone follows the user well without losing track, and/or that the drone can take clearer pictures when shooting.
- since the update frequency of the ranging sensor is generally low, when the gesture area moves at a relatively fast speed, that is, when the user moves quickly, the response of the drone is often not fast enough and lags. For this reason, the distance sensor can be used to measure the distance to the gesture area, the motion speed of the gesture area can be calculated periodically, and the following speed of the drone can be adjusted in real time according to that motion speed. The drone can thus adjust its flight speed according to the moving speed of the gesture area, so that the drone and the user maintain a relative orientation within the preset range, achieving a good following effect and improving the user's interaction experience with the drone during fast motion.
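The follow-speed adjustment described above can be sketched as below: two periodic ranging samples give an estimate of the gesture area's radial speed, which is added to a base follow speed and clamped. The function name, the base speed, and the clamping limit are illustrative assumptions.

```python
def follow_speed(prev_distance, curr_distance, dt, base_speed, max_speed=10.0):
    """Estimate the relative speed from two periodic ranging samples and
    add it to the base follow speed, clamped to an assumed safe maximum."""
    relative_speed = (curr_distance - prev_distance) / dt  # > 0: target pulling away
    commanded = base_speed + relative_speed
    return max(0.0, min(commanded, max_speed))

# Hypothetical readings 0.5 s apart: the user is accelerating away.
print(follow_speed(5.0, 6.2, dt=0.5, base_speed=2.0))  # -> 4.4 (m/s)
```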
- for example, when the height difference between the user and the drone is less than the first preset height difference, the flight state of the drone is controlled so that the height difference is raised to the first preset height difference.
- the first preset height difference is 0.5 meters.
- the second preset height difference is 0.5 meters.
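A minimal sketch of the height check, assuming the 0.5 m presets quoted above as defaults; the function and its sign convention are illustrative, not the patent's implementation.

```python
def height_correction(height_diff, min_diff=0.5, max_diff=0.5):
    """Return the climb (+) or descent (-) needed to bring the
    drone-to-gesture-area height difference back to the preset band
    [min_diff, max_diff]; defaults echo the 0.5 m presets above."""
    if height_diff < min_diff:
        return min_diff - height_diff   # climb to the first preset
    if height_diff > max_diff:
        return max_diff - height_diff   # descend to the second preset
    return 0.0                          # already within the band

print(height_correction(0.2))   # 0.3  -> climb 0.3 m
print(height_correction(1.4))   # -0.9 -> descend 0.9 m
```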
- the relative azimuth control method of the unmanned aerial vehicle shown in the embodiment of the present invention adjusts the flight state of the drone in real time according to the relative orientation between the drone and the user, thereby realizing relative azimuth control between the drone and the user. The complexity of the drone's control operation is reduced, and the hidden risk of operation errors is also reduced.
- the user in this embodiment may be not only a real person but also another drone, or a device such as a car on the ground.
- in another embodiment, a transmitting unit is further included, configured to send an alarm command to the wearable device based on the trusted connection when the drone determines that any of the following azimuth deviation state conditions is satisfied: the rate of change of the distance represented by the third relative orientation information is greater than the preset speed range; the height difference represented by the third relative orientation information is less than the first preset height difference; or the height difference represented by the third relative orientation information is greater than the second preset height difference.
- the sending unit sends an alarm instruction to the wearable device to prompt the user that the current drone is in an azimuth deviation state. This helps the user make corresponding adjustments in time, reducing the risk of losing the drone and of safety accidents.
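The three azimuth-deviation conditions can be sketched as a single check, with the alarm pushed through a send() callback standing in for the trusted connection; thresholds and all names here are assumptions for illustration.

```python
def check_and_alarm(distance_rate, height_diff, send,
                    max_rate=3.0, min_h=0.5, max_h=0.5):
    """Evaluate the three azimuth-deviation conditions listed above and,
    if any holds, push an alarm over the (assumed) trusted connection."""
    reasons = []
    if abs(distance_rate) > max_rate:
        reasons.append("distance changing faster than preset speed range")
    if height_diff < min_h:
        reasons.append("height difference below first preset")
    if height_diff > max_h:
        reasons.append("height difference above second preset")
    if reasons:
        send({"type": "alarm", "reasons": reasons})
    return reasons

# Hypothetical trusted-connection sender: just print the packet.
check_and_alarm(distance_rate=4.2, height_diff=0.3, send=print)
```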
- in another embodiment, the drone controls the emission of infrared light by the wearable device so as to reduce the power consumption of the wearable device; the transmitting unit is further configured to send, based on the trusted connection, a drive command for driving the wearable device to emit infrared light.
- UAVs and wearable devices are typically connected by communication to effect the transmission of data and instructions.
- generally, wireless communication is used. When the distance is relatively long, or the environmental electromagnetic conditions are complicated, a signal amplifying device such as a signal repeater can be added. A trust connection is adopted, so that only the drone and the remote control device that have been authenticated by identity (ID) can perform interaction operations.
- to this end, another embodiment of the present invention further includes a first communication unit configured to: authenticate the wearable device through the communication connection; and, when the verification succeeds, establish a trust connection between the drone and the wearable device.
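A toy sketch of the ID-gated pairing idea: only whitelisted, verified device IDs get a trust connection. Real pairing over Bluetooth, NFC, UWB, or ZigBee involves proper cryptographic handshakes; this only illustrates the gating described above, and the names are hypothetical.

```python
TRUSTED_IDS = {"WRISTBAND-01A", "WRISTBAND-02B"}  # assumed pairing whitelist

def establish_trust(device_id, challenge_ok=True):
    """Accept only whitelisted IDs whose (assumed) challenge passed;
    return a connection handle, or None on rejection."""
    if device_id in TRUSTED_IDS and challenge_ok:
        return {"peer": device_id, "trusted": True}
    return None

conn = establish_trust("WRISTBAND-01A")
print("trusted connection established" if conn else "rejected")
```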
- it can be seen that the implementation of the present invention can determine the gesture area and recognize gestures by infrared imaging, and determine the relative orientation by combining the position information of the drone itself, thereby improving the efficiency of the relative orientation control of the drone and the user experience.
- the present invention further provides a drone relative orientation assisting control device for a wearable device based on the above-described unmanned aerial vehicle relative orientation assisting control method for a wearable device.
- it includes a first receiving unit 21, a driving unit 22, a second receiving unit 23, and an alarm unit 24; the functions implemented by each unit are disclosed as follows:
- the first receiving unit 21 is configured to receive, based on the trusted connection, a driving instruction of the drone for driving the wearable device to emit infrared light.
- the wearable device interacts with the drone through gesture recognition and communication connections.
- the communication connection uses a wireless communication connection.
- when the distance between the wearable device and the drone is relatively long, or the environmental electromagnetic conditions are complicated, a signal amplifying device such as a signal repeater can be added.
- a trust connection manner may also be adopted, so that only the drone and the wearable device that have been authenticated by identity (ID) can perform the interaction operation.
- the trusted connection between the drone and the wearable device may be any one or more of a Bluetooth trusted connection, a near field communication (NFC) connection, a UWB trusted connection, a ZigBee trusted connection, or an Internet trusted connection.
- the first receiving unit 21 of the wearable device receives a driving instruction of the drone based on the trusted connection, and the driving instruction is for driving the wearable device to emit infrared light.
- the driving unit 22 is configured to, in response to the driving instruction, drive the infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type characterizing the relative orientation preset value, which are applied to the relative orientation control of the drone.
- the wearable device includes an infrared illumination assembly; in one embodiment, the infrared light-emitting diodes of the infrared illumination assembly are arranged linearly along the sides of the wearable device.
- the infrared illuminating component presets one or more infrared point light sources, such as infrared light emitting diodes, for emitting infrared light.
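A sketch of the receive-and-drive path on the wearable side, assuming a simple dictionary command format and a hypothetical LED interface; neither is specified by the patent.

```python
class InfraredComponent:
    """Stand-in for the wearable's infrared illuminating component;
    the LED interface is hypothetical, not a real hardware API."""
    def __init__(self, n_leds=8):
        self.leds = [False] * n_leds

    def drive(self, count):
        # Light the requested number of infrared point sources.
        for i in range(len(self.leds)):
            self.leds[i] = i < count

def on_drive_instruction(instruction, component):
    """First receiving unit and driving unit collapsed into one
    illustrative handler for the 'emit infrared' drive command."""
    if instruction.get("cmd") == "emit_ir":
        component.drive(instruction.get("count", len(component.leds)))

ir = InfraredComponent()
on_drive_instruction({"cmd": "emit_ir", "count": 4}, ir)
print(ir.leds)  # [True, True, True, True, False, False, False, False]
```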
- Another embodiment is an improvement made on the basis of the previous embodiment.
- the wearable device is adapted to be disposed on the arm such that the gesture implementation area is located between the wearable device and the identification device.
- the driving unit 22 drives the infrared illuminating component preset in the wearable device to emit infrared light and form an infrared ring in response to the driving instruction of the drone, so that the drone determines the gesture area based on infrared imaging.
- after the infrared illuminating component of the wearable device emits infrared light, the drone captures the gesture based on infrared imaging to generate a corresponding gesture interaction event.
- the infrared light emitted by the wearable device is diffusely reflected on the back of the user's hand, "illuminating" the outline of the hand, so that the user's gesture area and the background area are distinguished by infrared light in infrared imaging.
- the infrared light may also be partially absorbed by the hand, making the infrared imaging of the hand more visible.
- the UAV performs gesture segmentation based on infrared imaging, which can reduce the computational load on the processor, shorten the response time, and improve the efficiency and accuracy of gesture recognition and relative orientation control of the drone; the effect is particularly significant when the drone or the user is moving.
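Because the IR-lit hand is much brighter than the background, the segmentation can be sketched with a plain intensity threshold; the threshold value and return convention are assumptions, and a production system would add denoising and connected-component analysis.

```python
import numpy as np

def segment_gesture_area(ir_frame, threshold=200):
    """Threshold an 8-bit infrared frame and return the centroid (x, y)
    of the bright gesture area, or None if nothing exceeds the cut."""
    mask = ir_frame >= threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Synthetic 64x64 frame with a bright 'hand' blob.
frame = np.zeros((64, 64), dtype=np.uint8)
frame[20:30, 40:50] = 255
print(segment_gesture_area(frame))  # approximately (44.5, 24.5)
```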
- in another embodiment, in response to the driving instruction, the driving unit 22 drives only a subset of the preset one or more infrared point light sources in the infrared illuminating component to emit infrared light: when a stronger illumination effect is required, more infrared point sources are driven to emit light; otherwise, fewer infrared point sources are driven to emit light. Therefore, the energy consumption of the wearable device can be reduced and its use time prolonged while ensuring the effect of use.
- in one embodiment, the infrared light emitted by the infrared illuminating component is controlled within a wavelength range of 0.76 to 2.5 μm. In this way, the hand contour in the infrared image acquired by the identification device is mainly formed by the infrared light reflected by the hand, which is safer for the human body and gives a better recognition effect.
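One plausible power-saving rule, sketched under the assumption that more point sources are lit the farther away the drone is; the linear scaling and ranges are invented for illustration.

```python
def leds_for_distance(distance_m, n_leds=8, max_range_m=20.0):
    """Assumed rule: scale the number of lit infrared point sources
    with distance, always keeping at least one on."""
    fraction = min(max(distance_m / max_range_m, 0.0), 1.0)
    return max(1, round(fraction * n_leds))

for d in (1.0, 5.0, 12.0, 25.0):
    print(f"{d:4.1f} m -> {leds_for_distance(d)} LEDs")
```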
- the second receiving unit 23 is configured to receive an alert instruction of the drone based on the trusted connection.
- the wearable device and the drone can interact through wireless communication.
- when the distance between the wearable device and the drone is relatively long, or the environmental electromagnetic conditions are complicated, a signal amplifying device such as a signal repeater can also be added.
- a trust connection manner may also be adopted, so that only the drone and the wearable device that have been authenticated by identity (ID) can perform the interaction operation.
- in specific implementation, the second receiving unit 23 receives, based on the trusted connection, the alarm command sent by the drone when a determined azimuth deviation state condition is satisfied; the alarm command is used to prompt the user that the current drone is in an azimuth deviation state.
- the alarm unit 24 is configured to control the wearable device to activate the vibration motor and/or turn on the indicator light in response to the alarm instruction to prompt the user that the current drone is in an azimuth deviation state.
- the alert unit 24 activates the vibration motor and/or turns on the indicator light in response to the alert command.
- the vibration mode of the vibration motor and/or the flashing mode of the indicator light may be a preset mode, or a mode set correspondingly for the different UAV azimuth deviation states characterized by the alarm instruction; these may be set at the factory or by the user.
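A sketch of the alarm unit's prompt selection, with an assumed mapping from deviation state to vibration/indicator pattern; the states and patterns are invented for illustration.

```python
# Assumed mapping from azimuth-deviation state to a prompt pattern.
PATTERNS = {
    "too_fast": {"vibration": "long",  "led": "fast_blink"},
    "too_low":  {"vibration": "short", "led": "slow_blink"},
    "too_high": {"vibration": "short", "led": "solid"},
}
DEFAULT = {"vibration": "short", "led": "slow_blink"}

def on_alarm(alarm):
    """Pick the vibration/indicator pattern for the deviation state
    carried in the alarm instruction (sketch only)."""
    pattern = PATTERNS.get(alarm.get("state"), DEFAULT)
    print(f"vibrate: {pattern['vibration']}, indicator: {pattern['led']}")

on_alarm({"type": "alarm", "state": "too_fast"})
```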
- in another embodiment, the device further includes a shutdown unit configured to: time the working duration of the infrared illuminating component, and when the working duration exceeds a predetermined length of time, control the infrared illuminating component to stop emitting infrared light.
- thus, the wearable device can automatically turn off the infrared illuminating component after it has been lit for more than the predetermined length of time. This effectively prevents waste of electric energy due to user negligence and the like, and the user can also set the predetermined length of time to control the illumination duration of the infrared illuminating component, improving efficiency of use.
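The shutdown unit can be sketched as a monotonic-clock timer; the component hook and the default limit are assumptions.

```python
import time

class IrAutoShutoff:
    """Shutdown-unit sketch: stop infrared emission once the component
    has been lit longer than a user-settable limit."""
    def __init__(self, limit_s=60.0):
        self.limit_s = limit_s
        self.lit_since = None

    def on_ir_on(self):
        self.lit_since = time.monotonic()

    def tick(self, turn_off):
        # Called periodically; turns the component off past the limit.
        if self.lit_since is not None:
            if time.monotonic() - self.lit_since > self.limit_s:
                turn_off()
                self.lit_since = None

guard = IrAutoShutoff(limit_s=0.01)   # tiny limit just for the demo
guard.on_ir_on()
time.sleep(0.02)
guard.tick(turn_off=lambda: print("infrared light turned off"))
```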
- the unmanned aerial vehicle relative orientation assist control device for the wearable device of the present invention further includes a second communication unit configured to: send an identity verification request to the drone through the communication connection; and, when the identity verification succeeds, establish a trust connection between the wearable device and the drone.
- it can be seen that the implementation of the present invention can improve the efficiency of the relative orientation control of the drone by performing infrared gesture and communication interaction with the drone, and improve the user experience.
- another embodiment of the present invention further provides a UAV control device having a function of implementing the above-described UAV relative azimuth control method.
- the functions may be implemented by hardware or by corresponding software implemented by hardware.
- the hardware or software includes one or more units corresponding to the functions described above.
- the structure of the drone control device includes:
- one or more cameras 707, at least one of which has an infrared imaging function;
- a memory 702, configured to store a program for supporting the drone in performing the foregoing method for controlling the relative orientation of the drone;
- a communication interface 703, configured to communicate with the wearable device or other device or communication network;
- one or more processors 704, for executing the programs stored in the memory;
- the one or more programs 705 are configured to drive the one or more processors 704 to construct a unit for performing any of the above-described drone relative orientation control methods.
- FIG. 8 shows a block diagram of a partial structure related to the unmanned aerial vehicle relative orientation control device provided by the embodiment of the present invention, including: a memory 702, a communication interface 703, one or more processors 704, one or more applications 705, a power source 706, one or more cameras 707, and one or more sensors 708, and the like.
- the memory 702 can be used to store software programs and modules, and the processor 704 executes various functional applications and data processing of the drone by running software programs and modules stored in the memory 702.
- the memory 702 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program 705 required for at least one function, and the like; the storage data area may store data created according to usage of the drone, and the like.
- memory 702 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
- the communication interface 703 is used in the above control process to communicate with the wearable device and other devices or communication networks.
- the communication interface 703 is an interface between the processor 704 and the external subsystem for transmitting information between the processor 704 and the external system to achieve the purpose of the control subsystem.
- the processor 704 is the control center of the drone; it connects the various parts of the entire UAV relative orientation control device using various communication interfaces 703 and lines, and, by running or executing the software programs and/or modules stored in the memory 702 and calling the data stored in the memory 702, executes the various functions of the drone and processes data, thereby monitoring the drone as a whole.
- the processor 704 may include one or more processing units; preferably, the processor 704 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, the applications 705, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 704.
- one or more applications 705 are preferably stored in the memory 702 and configured to be executed by the one or more processors 704, the one or more programs being configured to perform the functions implemented by any embodiment of the drone relative orientation control method.
- a power source 706 (such as a battery) that supplies power to the various components.
- the power source 706 can be logically coupled to the processor 704 via a power management system, so that functions such as charging, discharging, and power consumption management are managed through the power management system.
- the drone may further include one or more cameras 707, including at least one camera having infrared imaging functions; the cameras 707 are connected to the processor 704 and controlled by the processor 704, and images acquired by the cameras 707 may be stored in the memory 702.
- the drone may further include one or more sensors 708, such as an IMU (inertial sensors, including accelerometers and gyroscope sensors), magnetometers, direction sensors, ranging sensors, satellite positioning sensors (e.g., GPS sensors, Beidou sensors, etc.), image sensors, and the like.
- the drone may further include a Bluetooth module or the like, which will not be described herein.
- the processor 704 included in the drone has the following functions: acquiring an infrared image formed after infrared light emitted by the wearable device acts on a gesture area; determining, according to the infrared image, the gesture area outlined by the infrared light and the gesture instruction type characterizing the relative orientation preset value; detecting first relative orientation information between the drone and the gesture area; and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area is the preset value.
- the embodiment of the invention further provides a computer storage medium for storing computer software instructions for use in the above-mentioned drone relative orientation control device, which includes a program designed for the drone to perform the above-described drone relative orientation control method.
- another embodiment of the present invention further provides a wearable device control device having a function of implementing the above-described drone relative orientation assist control method for a wearable device.
- the functions may be implemented by hardware or by corresponding software implemented by hardware.
- the hardware or software includes one or more units corresponding to the functions described above.
- the structure of the wearable device control device includes:
- a memory 702 configured to store a program for supporting the wearable device to perform the above-described drone relative orientation assist control method for the wearable device;
- a communication interface 703, configured to communicate with the drone or other devices or communication networks;
- a vibration motor and/or indicator light 710, for prompting the user of the current state of the drone;
- one or more processors 704, for executing the programs stored in the memory;
- the infrared illuminating component 709 includes one or more infrared light sources for emitting infrared light;
- the one or more programs 705 are configured to drive the one or more processors 704 to construct a unit for performing any of the above-described drone relative orientation assist control methods for the wearable device.
- FIG. 9 is a block diagram showing a partial structure of a smart wristband associated with the wearable device control device provided by the embodiment of the present invention, including: a memory 702, a communication interface 703, one or more processors 704, one or more applications 705, a power source 706, an infrared illumination component 709, an indicator light 710, and the like.
- the structure illustrated in Figure 9 does not constitute a limitation on the smart wristband, which may include more or fewer components than illustrated, combine some components, or have a different arrangement of components.
- the memory 702 can be used to store software programs and modules, and the processor 704 executes various functional applications and data processing of the wearable device by running software programs and modules stored in the memory 702.
- the memory 702 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application 705 required for at least one function, and the like; the storage data area may store data created according to usage of the wearable device, and the like.
- memory 702 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
- the communication interface 703 is used for communication between the smart bracelet and the unmanned aerial vehicle relative orientation control device and other devices or communication networks in the above control process.
- the communication interface 703 is an interface between the processor 704 and the external subsystem for transmitting information between the processor 704 and the external system to achieve the purpose of the control subsystem.
- the processor 704 is the control center of the smart wristband; it connects the various parts of the entire wearable device using various communication interfaces 703 and lines, and, by running or executing the software programs and/or modules stored in the memory 702 and calling the data stored in the memory 702, performs the various functions of the wearable device and processes data, thereby monitoring the wearable device as a whole.
- the processor 704 may include one or more processing units; preferably, the processor 704 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, the applications 705, and the like, and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 704.
- one or more applications 705 are preferably stored in the memory 702 and configured to be executed by the one or more processors 704, the one or more programs being configured to perform the functions implemented by any embodiment of the drone relative orientation assist control method for a wearable device.
- the infrared illuminating component 709 presets one or more infrared point light sources, such as infrared light emitting diodes, for emitting infrared light.
- the indicator light 710 may be in a preset mode or a corresponding mode set for different UAV azimuth deviation states according to the alarm instruction; and is used to prompt the user that the current drone is in an azimuth deviation state.
- the processor 704 included in the wearable device further has the following functions:
- driving the infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type characterizing the relative orientation preset value, to be applied to the relative orientation control of the drone;
- in response to the alarm command, controlling the wearable device to activate the vibration motor and/or turn on the indicator light to prompt the user that the current drone is in an azimuth deviation state.
- Figure 10 illustrates a device (referring collectively to the drone and the wearable device) that can implement relative orientation control of a drone, or relative orientation assist control for a drone, in accordance with the present invention.
- the device conventionally includes a processor 1010 and a computer program product or computer readable medium in the form of a memory 1020.
- the memory 1020 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), an EPROM, a hard disk, or a ROM.
- the memory 1020 has a memory space 1030 for program code 1031 for executing any of the above method steps.
- storage space 1030 for program code may include various program code 1031 for implementing various steps in the above methods, respectively.
- the program code can be read from or written to one or more computer program products.
- These computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
- Such a computer program product is typically a portable or fixed storage unit as described with reference to FIG.
- the storage unit may have a storage section or a storage space or the like arranged similarly to the storage 1020 in FIG.
- the program code can be compressed, for example, in an appropriate form.
- the storage unit comprises program code 1031' for performing the steps of the method according to the invention, i.e., code that can be read by a processor such as 1010, which, when executed by the device, causes the device to perform each of the steps in the methods described above.
- the disclosed system, apparatus and method can be implemented in other manners.
- the device embodiments described above are merely illustrative.
- the division of units is only a logical functional division; in actual implementation, there may be other division manners. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
- the coupling or direct coupling or communication connection between the two may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
- the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
- each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
- the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
- Selective Calling Equipment (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention relates to a relative azimuth control method and apparatus for an unmanned aerial vehicle. The relative azimuth control method for the unmanned aerial vehicle comprises the following steps: acquiring an infrared image formed after infrared light emitted by a wearable device has acted on a gesture region (S11); determining, according to the infrared image, the gesture region outlined by the infrared light and a gesture instruction type characterizing a preset relative azimuth value (S12); detecting first relative azimuth information between the present unmanned aerial vehicle and said gesture region (S13); and controlling a flight state of the present unmanned aerial vehicle according to the first relative azimuth information and the gesture instruction type, so that the relative azimuth between the present unmanned aerial vehicle and the gesture region corresponds to the preset value (S14). As a result, the control efficiency of the relative azimuth of the unmanned aerial vehicle can be improved, and the user experience of controlling the relative azimuth of the unmanned aerial vehicle is better.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201611114763.4 | 2016-12-07 | ||
| CN201611114763.4A CN106444843B (zh) | 2016-12-07 | 2016-12-07 | 无人机相对方位控制方法及装置 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018103689A1 true WO2018103689A1 (fr) | 2018-06-14 |
Family
ID=58216143
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/114974 Ceased WO2018103689A1 (fr) | 2016-12-07 | 2017-12-07 | Procédé et appareil de commande d'azimut relatif pour véhicule aérien sans pilote |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN106444843B (fr) |
| WO (1) | WO2018103689A1 (fr) |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109857260A (zh) * | 2019-02-27 | 2019-06-07 | 百度在线网络技术(北京)有限公司 | 三维互动影像的控制方法、装置和系统 |
| CN114585985A (zh) * | 2020-11-05 | 2022-06-03 | 深圳市大疆创新科技有限公司 | 无人机控制方法、装置、无人机及计算机可读存储介质 |
| CN114730193A (zh) * | 2020-11-06 | 2022-07-08 | 深圳市大疆创新科技有限公司 | 无人机的控制方法、系统、无人机及存储介质 |
| CN115841487A (zh) * | 2023-02-20 | 2023-03-24 | 深圳金三立视频科技股份有限公司 | 一种沿输电线路的隐患定位方法及终端 |
Families Citing this family (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106444843B (zh) * | 2016-12-07 | 2019-02-15 | 北京奇虎科技有限公司 | 无人机相对方位控制方法及装置 |
| WO2018176426A1 (fr) * | 2017-03-31 | 2018-10-04 | 深圳市大疆创新科技有限公司 | Procédé de commande de vol d'un véhicule aérien sans pilote et véhicule aérien sans pilote |
| WO2018191840A1 (fr) * | 2017-04-17 | 2018-10-25 | 英华达(上海)科技有限公司 | Système et procédé de photographie interactive destinés à un véhicule aérien sans pilote |
| CN114879720A (zh) * | 2017-04-28 | 2022-08-09 | 深圳市大疆创新科技有限公司 | 无人飞行器的控制方法、设备及无人飞行器 |
| CN107024725B (zh) * | 2017-05-31 | 2023-09-22 | 湖南傲英创视信息科技有限公司 | 一种大视场微光低空无人机探测装置 |
| CN107643074B (zh) * | 2017-09-07 | 2019-12-03 | 天津津航技术物理研究所 | 一种航空扫描仪摆扫成像方位预置方法 |
| CN114879715A (zh) * | 2018-01-23 | 2022-08-09 | 深圳市大疆创新科技有限公司 | 无人机的控制方法、设备和无人机 |
| WO2020019193A1 (fr) * | 2018-07-25 | 2020-01-30 | 深圳市大疆创新科技有限公司 | Procédé et système de commande de véhicule aérien sans pilote, et véhicule aérien sans pilote |
| CN109270954A (zh) * | 2018-10-30 | 2019-01-25 | 西南科技大学 | 一种基于姿态识别的无人机交互系统及其控制方法 |
| CN109725637B (zh) * | 2018-12-04 | 2021-10-15 | 广东嘉腾机器人自动化有限公司 | 一种agv防丢包调度方法、存储装置及agv交管系统 |
| CN112189330A (zh) * | 2019-08-13 | 2021-01-05 | 深圳市大疆创新科技有限公司 | 拍摄控制方法、终端、云台、系统及存储介质 |
| CN113568596B (zh) * | 2020-04-29 | 2025-09-12 | 阿里巴巴集团控股有限公司 | 电子设备 |
| CN112051856B (zh) * | 2020-07-31 | 2024-01-19 | 深圳市贝贝特科技实业有限公司 | 用于无人机动态回收的复合传感系统 |
| CN114442305A (zh) * | 2020-11-02 | 2022-05-06 | 上海迈利船舶科技有限公司 | 一种视觉增强ais船舶望远镜 |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105138126A (zh) * | 2015-08-26 | 2015-12-09 | 小米科技有限责任公司 | 无人机的拍摄控制方法及装置、电子设备 |
| CN105518576A (zh) * | 2013-06-28 | 2016-04-20 | 陈家铭 | 根据手势的控制装置操作 |
| WO2016078742A1 (fr) * | 2014-11-20 | 2016-05-26 | Audi Ag | Procédé permettant de faire fonctionner un système de navigation d'un véhicule automobile par un geste de commande |
| CN105677300A (zh) * | 2016-02-04 | 2016-06-15 | 普宙飞行器科技(深圳)有限公司 | 基于手势识别操控无人机的方法、无人机及系统 |
| CN105676860A (zh) * | 2016-03-17 | 2016-06-15 | 歌尔声学股份有限公司 | 一种可穿戴设备、无人机控制装置和控制实现方法 |
| CN106054914A (zh) * | 2016-08-17 | 2016-10-26 | 腾讯科技(深圳)有限公司 | 一种飞行器的控制方法及飞行器控制装置 |
| CN106094846A (zh) * | 2016-05-31 | 2016-11-09 | 中国航空工业集团公司西安飞机设计研究所 | 一种飞机飞行控制方法 |
| CN106444843A (zh) * | 2016-12-07 | 2017-02-22 | 北京奇虎科技有限公司 | 无人机相对方位控制方法及装置 |
-
2016
- 2016-12-07 CN CN201611114763.4A patent/CN106444843B/zh not_active Expired - Fee Related
-
2017
- 2017-12-07 WO PCT/CN2017/114974 patent/WO2018103689A1/fr not_active Ceased
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105518576A (zh) * | 2013-06-28 | 2016-04-20 | 陈家铭 | 根据手势的控制装置操作 |
| WO2016078742A1 (fr) * | 2014-11-20 | 2016-05-26 | Audi Ag | Procédé permettant de faire fonctionner un système de navigation d'un véhicule automobile par un geste de commande |
| CN105138126A (zh) * | 2015-08-26 | 2015-12-09 | 小米科技有限责任公司 | 无人机的拍摄控制方法及装置、电子设备 |
| CN105677300A (zh) * | 2016-02-04 | 2016-06-15 | 普宙飞行器科技(深圳)有限公司 | 基于手势识别操控无人机的方法、无人机及系统 |
| CN105676860A (zh) * | 2016-03-17 | 2016-06-15 | 歌尔声学股份有限公司 | 一种可穿戴设备、无人机控制装置和控制实现方法 |
| CN106094846A (zh) * | 2016-05-31 | 2016-11-09 | 中国航空工业集团公司西安飞机设计研究所 | 一种飞机飞行控制方法 |
| CN106054914A (zh) * | 2016-08-17 | 2016-10-26 | 腾讯科技(深圳)有限公司 | 一种飞行器的控制方法及飞行器控制装置 |
| CN106444843A (zh) * | 2016-12-07 | 2017-02-22 | 北京奇虎科技有限公司 | 无人机相对方位控制方法及装置 |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109857260A (zh) * | 2019-02-27 | 2019-06-07 | 百度在线网络技术(北京)有限公司 | 三维互动影像的控制方法、装置和系统 |
| CN114585985A (zh) * | 2020-11-05 | 2022-06-03 | 深圳市大疆创新科技有限公司 | 无人机控制方法、装置、无人机及计算机可读存储介质 |
| CN114730193A (zh) * | 2020-11-06 | 2022-07-08 | 深圳市大疆创新科技有限公司 | 无人机的控制方法、系统、无人机及存储介质 |
| CN115841487A (zh) * | 2023-02-20 | 2023-03-24 | 深圳金三立视频科技股份有限公司 | 一种沿输电线路的隐患定位方法及终端 |
| CN115841487B (zh) * | 2023-02-20 | 2023-06-09 | 深圳金三立视频科技股份有限公司 | 一种沿输电线路的隐患定位方法及终端 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN106444843A (zh) | 2017-02-22 |
| CN106444843B (zh) | 2019-02-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018103689A1 (fr) | Procédé et appareil de commande d'azimut relatif pour véhicule aérien sans pilote | |
| US9977434B2 (en) | Automatic tracking mode for controlling an unmanned aerial vehicle | |
| US12416918B2 (en) | Unmanned aerial image capture platform | |
| US11649052B2 (en) | System and method for providing autonomous photography and videography | |
| US11914370B2 (en) | System and method for providing easy-to-use release and auto-positioning for drone applications | |
| US11604479B2 (en) | Methods and system for vision-based landing | |
| US11531340B2 (en) | Flying body, living body detection system, living body detection method, program and recording medium | |
| CN106292799B (zh) | 无人机、遥控装置及其控制方法 | |
| JPWO2017170148A1 (ja) | 飛行装置、電子機器およびプログラム | |
| TW201706970A (zh) | 無人飛機導航系統及方法 | |
| US20220342428A1 (en) | Unmanned aerial vehicles | |
| KR102486769B1 (ko) | 탐지 상황에 따라 자동으로 이동 경로를 설정하는 무인 항공기, 및 운용 방법 | |
| CN106647788B (zh) | 无人机飞行控制方法及装置 | |
| US10557718B2 (en) | Auxiliary control method and system for unmanned aerial vehicle | |
| US11354897B2 (en) | Output control apparatus for estimating recognition level for a plurality of taget objects, display control system, and output control method for operating output control apparatus |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17879044 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17879044 Country of ref document: EP Kind code of ref document: A1 |