WO2017181511A1 - Terminal device and control system for an unmanned aerial vehicle - Google Patents
Terminal device and control system for an unmanned aerial vehicle
- Publication number
- WO2017181511A1 (PCT/CN2016/086313, priority CN2016086313W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- aerial vehicle
- unmanned aerial
- control command
- state
- display state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0016—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0038—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Definitions
- the invention relates to the technical field of unmanned aerial vehicles, in particular to a terminal device, a control system of an unmanned aerial vehicle and an unmanned aerial vehicle control system based on the terminal device.
- FPV is the abbreviation of "First Person View". It refers to a method of piloting a model by means of wireless video-downlink equipment mounted on a drone (or unmanned aerial vehicle) or model vehicle. This method is widely used in consumer drones and combines video codec technology, wireless transmission technology and flight control technology. Its main idea is to collect video data, transmit it remotely and then display it: an unmanned aerial vehicle operator on the ground can view the images captured by the unmanned aerial vehicle on a display and control the aircraft based on those images.
- during flight, the target object may drift in the picture, be lost from it entirely, or go out of focus.
- to keep the target object in a preset display state (for example, always at the center of the screen, always at a certain size, always at a certain sharpness), the user usually has to adjust the camera's shooting parameters remotely, which greatly increases the difficulty of operation.
- the technical problem to be solved by the present invention is how to simplify the user's control operation on the unmanned aerial vehicle to achieve flight and shooting control centered on the target object.
- a terminal device, characterized in that the terminal device comprises: a display for displaying a picture captured by an image acquisition device on an unmanned aerial vehicle; an acquisition component for collecting and identifying a gesture signal of a user; an instruction generating component, configured to generate a control command according to the gesture signal, the control command being used to control a flight state of the unmanned aerial vehicle and a shooting state of the image acquisition device on the unmanned aerial vehicle; and a communication component, which sends the control command to the unmanned aerial vehicle to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device, such that the photographed target object is displayed on the display in a preset display state.
- a control system for an unmanned aerial vehicle, comprising: a receiving component that receives a control command from the terminal device, the control command corresponding to a gesture signal of the user and being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device on the unmanned aerial vehicle; a command conversion component that converts the received control command into a flight control command for controlling the flight state of the unmanned aerial vehicle and a shooting control command for controlling the shooting state of the image acquisition device; a driving component that changes the driving state of the unmanned aerial vehicle according to the flight control command, thereby changing its flight state; the image acquisition device, which changes its shooting state according to the shooting control command, so that the captured target object is displayed in the preset display state in the captured picture; and a transmitting component that transmits the captured picture to the terminal device.
- a terminal-device-based unmanned aerial vehicle control system, comprising: the aforementioned terminal device and the aforementioned control system for an unmanned aerial vehicle, wherein the control system is mounted on the unmanned aerial vehicle.
- the target object is displayed on the display in the preset display state, thereby realizing simultaneous control of the flight state and shooting state of the aircraft with a simple one-handed operation, achieving target-object-centered flight and shooting control and reducing operational complexity.
- FIG. 1 is a block diagram showing a terminal device according to an embodiment of the present invention.
- Figure 2a shows a gesture signal with a single contact moving down.
- Figure 2b shows a gesture signal with a single contact moving up.
- Figure 2c shows a gesture signal with a single contact moving to the right.
- Figure 2d shows the gesture signal with a single contact moving to the left.
- Figure 2e shows a gesture signal with two contacts moving apart.
- Figure 2f shows a gesture signal with two contacts moving toward each other.
- Figure 2g shows a gesture signal with two contacts moving down in parallel.
- Figure 2h shows a gesture signal with two contacts moving up in parallel.
- Figure 2i shows a gesture signal with two contacts rotating clockwise.
- Figure 2j shows a gesture signal for two contacts rotating counterclockwise.
- FIG. 3 is a block diagram showing a control system of an unmanned aerial vehicle according to an embodiment of the present invention.
- FIG. 4a is a flow chart showing a control method for a control command, received by an unmanned aerial vehicle, corresponding to a single-contact downward movement gesture signal, in accordance with an embodiment of the present invention.
- FIG. 4b is a flow chart showing a control method for a control command corresponding to a single-contact upward movement gesture signal, according to another example of an embodiment of the present invention.
- FIG. 4c is a flow chart showing a control method for a control command corresponding to a single-contact rightward movement gesture signal, according to still another example of an embodiment of the present invention.
- FIG. 4d is a flow chart showing a control method for a control command corresponding to a single-contact leftward movement gesture signal, according to an example of an embodiment of the present invention.
- FIG. 4e is a flow chart showing a control method for a control command corresponding to a gesture signal in which two contacts move apart, according to another example of an embodiment of the present invention.
- FIG. 4f is a flow chart showing a control method for a control command corresponding to a gesture signal in which two contacts move toward each other, according to still another example of an embodiment of the present invention.
- FIG. 4g is a flow chart showing a control method for a control command corresponding to a gesture signal in which two contacts move downward in parallel, in accordance with an embodiment of the present invention.
- FIG. 4h is a flow chart showing a control method for a control command corresponding to a gesture signal in which two contacts move upward in parallel, according to another example of an embodiment of the present invention.
- FIG. 4i is a flow chart showing a control method for a control command corresponding to a gesture signal in which two contacts rotate clockwise, according to still another example of an embodiment of the present invention.
- FIG. 4j is a flow chart showing a control method for a control command corresponding to a gesture signal in which two contacts rotate counterclockwise, in accordance with an embodiment of the present invention.
- FIG. 5 is a block diagram showing the structure of an unmanned aerial vehicle control system based on a terminal device according to an embodiment of the present invention.
- FIG. 6 is a block diagram showing the structure of a terminal device according to another embodiment of the present invention.
- FIG. 1 is a block diagram showing a terminal device according to an embodiment of the present invention.
- the terminal device 1 mainly includes: a display 101 for displaying a picture captured by an image acquisition device on an unmanned aerial vehicle; an acquisition component 102 for acquiring and recognizing a gesture signal of the user; an instruction generating component 103, which generates a control command according to the gesture signal, the control command being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device on the unmanned aerial vehicle; and a communication component 104, which transmits the control command to the unmanned aerial vehicle to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device, such that the photographed target object is displayed on the display 101 in a preset display state.
- the terminal device identifies the gesture signal of the user and generates a control command according to the gesture signal to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device on the unmanned aerial vehicle, so that the photographed target object is displayed on the display in the preset display state. Simultaneous control of the flight state and shooting state of the aircraft is thereby achieved with a simple one-handed operation, realizing target-object-centered flight and shooting control and reducing operational complexity.
- the display state referred to herein may include one or more of the following: the display position of the target object in the picture shown on the display (for example, the relative center position), the percentage of the picture area occupied by the target object (for example, one-half, one-quarter, or another percentage of the entire picture), and the display sharpness of the target object (for example, whether only the overall outline of the target object needs to be resolved, or certain details of the target object need to be resolved).
- a desired display state for the target object can be set, for example, to "the target object is always at the relative center of the picture, occupies 1/4 of the total picture area, and reaches a preset sharpness".
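As an illustrative sketch (the class name, field names and tolerance values are hypothetical, not taken from the patent), such a preset display state could be represented and checked as follows:

```python
from dataclasses import dataclass

@dataclass
class PresetDisplayState:
    """Hypothetical representation of the preset display state described above."""
    center: tuple = (0.5, 0.5)   # target at the relative center of the picture
    area_fraction: float = 0.25  # target occupies 1/4 of the picture area
    min_sharpness: float = 0.8   # preset sharpness threshold, on a 0..1 scale

    def is_satisfied(self, pos, area, sharpness, tol=0.05):
        # The target meets the preset state when its position, size and
        # sharpness are all within tolerance of the configured values.
        dx = abs(pos[0] - self.center[0])
        dy = abs(pos[1] - self.center[1])
        return (dx <= tol and dy <= tol
                and abs(area - self.area_fraction) <= tol
                and sharpness >= self.min_sharpness)

state = PresetDisplayState()
print(state.is_satisfied((0.5, 0.52), 0.26, 0.9))  # True: within tolerance
```

The command conversion logic described later would compare the current state against such a structure to decide whether a shooting control command is still needed.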
- the control command will adjust the flight state of the aircraft.
- shooting parameters such as the shooting angle and the shooting focal length of the image acquisition device are adjusted so that the target object is always displayed in the display state set as described above, regardless of the flight state.
- the display 101 can be used to display a picture captured by an image acquisition device on an unmanned aerial vehicle.
- the picture can be transmitted to the terminal device via the communication device on the unmanned aerial vehicle, received via the communication device on the terminal device and provided to the display 101 for display.
- the display 101 can be any device that can be used to display an image, for example a display with a touch screen, such as a mobile phone screen or an iPad screen, and the present invention is not limited in this respect.
- the acquisition component 102 can be a component for acquiring and recognizing a gesture signal of the user.
- the acquisition component 102 captures the user's gestures and identifies the different gesture signals.
- the gesture signal may include one or more of the following: a single contact moving downward (as shown in FIG. 2a; the aircraft can be controlled to descend), a single contact moving upward (as shown in FIG. 2b; the aircraft can be controlled to ascend), a single contact moving to the right (as shown in FIG. 2c; the aircraft can be controlled to move to the right), a single contact moving to the left (as shown in FIG. 2d; the aircraft can be controlled to move to the left), two contacts moving apart (as shown in FIG. 2e; the aircraft can be controlled to approach the target object), two contacts moving toward each other (as shown in FIG. 2f; the aircraft can be controlled to move away from the target object), two contacts moving downward in parallel (as shown in FIG. 2g; the aircraft can be controlled to move in a head-down direction, such as bowing or diving flight), two contacts moving upward in parallel (as shown in FIG. 2h; the aircraft can be controlled to move in a head-up direction, such as pitching up or climbing flight), two contacts rotating clockwise (as shown in FIG. 2i; the aircraft can be controlled to rotate to the right), and two contacts rotating counterclockwise (as shown in FIG. 2j; the aircraft can be controlled to rotate to the left).
- the above gesture signals can be realized by one hand, and the operation is simple and convenient.
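The gesture-to-command correspondence described above (Figures 2a–2j) can be sketched as a simple lookup; all identifiers here are illustrative, not from the patent:

```python
# Hypothetical mapping from recognized gesture signals to the
# flight-state changes listed above (one entry per Figure 2a-2j).
GESTURE_TO_COMMAND = {
    "single_down":          "descend",              # Fig. 2a
    "single_up":            "ascend",               # Fig. 2b
    "single_right":         "move_right",           # Fig. 2c
    "single_left":          "move_left",            # Fig. 2d
    "two_apart":            "approach_target",      # Fig. 2e (pinch out)
    "two_together":         "retreat_from_target",  # Fig. 2f (pinch in)
    "two_parallel_down":    "pitch_down",           # Fig. 2g (bow/dive)
    "two_parallel_up":      "pitch_up",             # Fig. 2h (head-up climb)
    "two_clockwise":        "rotate_right",         # Fig. 2i
    "two_counterclockwise": "rotate_left",          # Fig. 2j
}

def generate_control_command(gesture: str) -> str:
    """Sketch of the instruction generating component: one command per gesture."""
    try:
        return GESTURE_TO_COMMAND[gesture]
    except KeyError:
        raise ValueError(f"unrecognized gesture signal: {gesture}")
```

A real instruction generating component would also attach an amplitude (proportional to the gesture magnitude, or a fixed step), as discussed later in the description.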
- the acquisition component 102 can be any component known to those skilled in the art that can implement acquisition and recognition of gesture signals; for example, it can be a touch-screen gesture acquisition and recognition system associated with a touch screen, which can be implemented by a general-purpose processor in combination with dedicated logic instructions, or by dedicated hardware.
- the instruction generating component 103 can receive the gesture signal from the acquisition component 102 and generate a control command according to the gesture signal, which can be used to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device on the unmanned aerial vehicle.
- control instructions may control the flight status of the unmanned aerial vehicle and initiate adjustments to the photographing state of the image capture device.
- when the flight state changes, the current display state of the target object usually deviates from the preset display state (for example, it moves off center, becomes too large or too small, or goes out of focus), and the adjustment can be made according to the picture captured by the image acquisition device.
- for the specific manner of adjustment, refer to Embodiment 2.
- the instruction generating component 103 may include a first instruction generating unit, a second instruction generating unit, a third instruction generating unit, a fourth instruction generating unit, a fifth instruction generating unit, a sixth instruction generating unit, and so on, up to a tenth instruction generating unit.
- these units generate the first to tenth control commands from the gesture signals of, respectively, a single contact moving downward, upward, rightward or leftward, two contacts moving apart or toward each other, two contacts moving downward or upward in parallel, and two contacts rotating clockwise or counterclockwise, so as to cause the unmanned aerial vehicle to descend, ascend, move to the right, move to the left, approach the target object, move away from the target object, pitch down, pitch up, rotate to the right, or rotate to the left, and correspondingly to initiate adjustment of the shooting state of the image acquisition device; the shooting state is adjusted according to the difference between the current display state of the target object in the captured picture and the preset display state, so as to adapt to the flight state of the unmanned aerial vehicle.
- the instruction generating component 103 can be any component known to those skilled in the art that can implement instruction generation.
- the component can be implemented by a general-purpose processor in combination with logic instructions, or by a dedicated hardware circuit.
- the communication component 104 implements signal transmission between the terminal device and the unmanned aerial vehicle: the control command generated by the instruction generating component 103 is transmitted to the unmanned aerial vehicle through the communication component 104, and the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device are controlled according to the received control command, such that the photographed target object is displayed on the display 101 in the preset display state.
- Communication component 104 can be any component known to those skilled in the art that can transmit communication signals.
- the communication component 104 can be, for example, a fifth-generation Wi-Fi (5G Wi-Fi) communication module.
- FIG. 3 is a block diagram showing a control system of an unmanned aerial vehicle according to another embodiment of the present invention.
- the control system 3 of the unmanned aerial vehicle mainly includes: a receiving component 301, which receives a control command from the terminal device, the control command corresponding to a gesture signal of the user and being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device on the unmanned aerial vehicle; a command conversion component 302, which converts the received control command into a flight control command for controlling the flight state of the unmanned aerial vehicle and a shooting control command for controlling the shooting state of the image acquisition device; a driving component 303, which changes the driving state of the unmanned aerial vehicle according to the flight control command, thereby changing its flight state; the image acquisition device 304, which changes its shooting state according to the shooting control command, so that the captured target object is displayed in the preset display state in the captured picture; and a transmitting component 305, which transmits the captured picture to the terminal device.
- in this embodiment, the control system receives the control command corresponding to the gesture signal and controls the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device on the unmanned aerial vehicle, so that the photographed target object is displayed in the preset display state. The flight state and shooting state of the aircraft can thus be controlled simultaneously with a simple one-handed operation, realizing target-object-centered flight and shooting control and reducing operational complexity.
- the receiving component 301 may be a component for receiving a control command from the terminal device, wherein the control command may correspond to a gesture signal of the user touching the touch display screen of the display; after being converted by the command conversion component 302, the control command can be used to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device 304 on the unmanned aerial vehicle.
- the receiving component 301 can be any component known to those skilled in the art that can implement the functionality of receiving control commands, such as a communications module that can receive control commands.
- the receiving component 301 passes the received control command to the command conversion component 302, and the command conversion component 302 converts it into a corresponding flight control command for controlling the flight state of the unmanned aerial vehicle and a shooting control command for controlling the shooting state of the image acquisition device 304 on the unmanned aerial vehicle.
- the instruction conversion component 302 can be implemented by a flight control system composed of a processor on the drone in conjunction with dedicated logic instructions.
- the flight control command for controlling the flight state of the unmanned aerial vehicle may indicate both the manner and the amplitude of the flight state change (e.g., descend N meters), where the amplitude may be proportional to the magnitude of the gesture.
- the flight control command may also indicate only the manner in which the flight state changes, the change amplitude being a preset fixed amplitude per command. For example, from a control command corresponding to a single-contact upward gesture, the command conversion component 302 can generate a flight control command to ascend 10 meters, which can be provided to the flight system of the aircraft (e.g., a drive motor) to make the aircraft change its flight state.
- the control system may further comprise a judging component that determines, according to the indication of the flight control command, whether the aircraft has completed the change of flight state, for example whether the aircraft has moved into place. If the determination is yes, it notifies the command conversion component 302 to stop outputting the flight control command, so that the aircraft stops changing its flight state. For example, if the flight control command is to raise the aircraft by 10 meters, the judging component can determine, from the ascent speed and ascent time of the aircraft, whether the ascent distance has reached 10 meters, and if so, notify the command conversion component 302 so that the aircraft stops rising.
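The distance check performed by the judging component can be sketched as follows, assuming a constant ascent speed (a simplification; the function name and signature are illustrative, not from the patent):

```python
def flight_change_complete(speed_m_s: float, elapsed_s: float,
                           target_distance_m: float) -> bool:
    """Judge whether the commanded flight-state change has finished.

    Mirrors the 10 m example above: the distance travelled is estimated
    from speed and elapsed time and compared with the commanded distance.
    """
    return speed_m_s * elapsed_s >= target_distance_m

# Ascending at 2 m/s: after 4 s the 10 m climb is not yet complete,
# after 5 s it is, and the conversion component can stop issuing the command.
print(flight_change_complete(2.0, 4.0, 10.0))  # False
print(flight_change_complete(2.0, 5.0, 10.0))  # True
```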
- the command conversion component 302 can generate a shooting control command according to the difference between the current display state of the target object in the picture captured by the image acquisition device 304 and the preset display state, so as to adjust the shooting state of the image acquisition device 304 (e.g., shooting angle, focal length, etc.) to adapt to the flight state of the unmanned aerial vehicle.
- the command conversion component 302 can identify the target object in the picture (which can be done using video recognition techniques selectable by those skilled in the art) and calculate the difference between the current display state of the target object and the preset display state (for example, the distance and direction of the target object from the preset position (such as the center of the picture), the deviation from the preset size, and the deviation from the preset sharpness); this can be done by means of video tracking technology selectable by a person skilled in the art. The shooting control command may specify the change mode and change amount of the shooting state of the image acquisition device 304, calculated based on the difference described above.
- according to the distance and direction of the target object from the preset position, the rotation angle and direction of the image acquisition device can be calculated; according to the degree to which the target object deviates from the preset sharpness, the focus adjustment direction and amplitude of the image acquisition device can be calculated.
- a person skilled in the art can establish, according to actual conditions (for example, image acquisition device parameters, aircraft parameters, etc.), a correspondence between the above difference and the manner and amount of change of the shooting state of the image acquisition device, thereby generating the shooting control command.
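As a hypothetical sketch of such a correspondence, the pixel offset of the target from the picture centre can be converted into a pan/tilt rotation under a simple linear (small-angle) camera model; the field-of-view values here are assumptions, not taken from the patent:

```python
def gimbal_correction(target_px, frame_size, fov_deg=(84.0, 53.0)):
    """Estimate the pan/tilt rotation (degrees) that re-centres the target.

    target_px:  (x, y) pixel position of the target in the frame
    frame_size: (width, height) of the frame in pixels
    fov_deg:    assumed horizontal/vertical field of view of the camera

    The pixel offset from the frame centre is mapped linearly onto the
    field of view -- a small-angle approximation, sufficient to show the
    kind of correspondence the description refers to.
    """
    nx = (target_px[0] - frame_size[0] / 2) / frame_size[0]
    ny = (target_px[1] - frame_size[1] / 2) / frame_size[1]
    pan = nx * fov_deg[0]        # positive: rotate gimbal right
    tilt = 0.0 - ny * fov_deg[1]  # positive: rotate gimbal up (image y grows down)
    return pan, tilt

# Target below the picture centre: no pan, tilt the gimbal down 13.25 degrees.
print(gimbal_correction((960, 810), (1920, 1080)))  # (0.0, -13.25)
```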
- the command conversion component 302 can start the process of generating the shooting control command when receiving a control command from the terminal device, monitor the current display state in real time, generate the shooting control command in real time according to the difference between the current display state and the preset display state, and stop the process after the flight state control is completed (i.e., after the flight state change of the aircraft is completed, or after the aircraft has moved to the designated position).
- alternatively, the command conversion component 302 can continue to monitor the current display state in real time after the flight state control is completed, and generate the shooting control command in real time according to the difference between the current display state and the preset display state, so as to track the movement of the target object.
- control system can also include a carrier component for carrying the image acquisition device 304, such as a pan/tilt.
- the carrier component can be moved according to the shooting control command (for example, moved up (e.g., rotated upward around a fixed point), moved down (e.g., rotated downward around a fixed point), moved left (e.g., rotated to the left around a fixed point), or moved right (e.g., rotated to the right around a fixed point)) to change the shooting angle of the image acquisition device 304.
- the image capture device 304 can also change the shooting focal length according to the shooting control command.
- the driving component 303 can change the driving state of the unmanned aerial vehicle according to the flight control command from the command conversion component 302, thereby changing the flight state of the unmanned aerial vehicle; for example, the unmanned aerial vehicle can be made to descend, ascend, move to the right, move to the left, approach the target object, move away from the target object, pitch down, pitch up, rotate to the right, or rotate to the left.
- Drive component 303 can be any component known to those skilled in the art that can change the flight state of an unmanned aerial vehicle, such as a drive motor.
- the image acquisition device 304 can change its shooting state according to the shooting control command from the command conversion component 302, so that the photographed target object is displayed in the captured picture in the display state preset by the user.
- changing the shooting state of the image capturing device 304 may be lengthening or shortening the shooting focal length, changing the shooting angle of the image capturing device in the horizontal or vertical direction, and the like.
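For the focal-length part, one plausible correspondence (an assumption for illustration, not stated in the patent) follows from the fact that the on-screen area of the target scales with the square of the focal length:

```python
import math

def zoom_factor(current_area_fraction: float, preset_area_fraction: float) -> float:
    """Illustrative focal-length scaling to restore the preset target size.

    On-screen linear size is proportional to focal length, so on-screen
    area is proportional to its square; the focal length should therefore
    be scaled by the square root of the area ratio.
    """
    return math.sqrt(preset_area_fraction / current_area_fraction)

# Target occupies 1/16 of the picture but should occupy 1/4:
# double the focal length (2x zoom in).
print(zoom_factor(1 / 16, 1 / 4))  # 2.0
```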
- the image capture device 304 can be any device or component known to those skilled in the art that can implement image acquisition, such as a CCD digital camera, an optical camera, an infrared scanner, a laser scanner, etc., and the present invention does not impose any limitation.
- the transmitting component 305 can receive the picture captured by the image acquisition device 304 and send it to the terminal device for real-time display and updating, so that the user can operate the unmanned aerial vehicle based on the picture displayed by the terminal device to achieve target-object-centered flight and shooting control.
- Transmitting component 305 can be any component known to those skilled in the art that can transmit a picture taken by image acquisition device 304, such as a 5G WIFI communication module.
- the following describes, by way of example, how the control system of the unmanned aerial vehicle receives different control commands to control the flight state of the unmanned aerial vehicle and the shooting state of the image acquisition device 304 on the unmanned aerial vehicle.
- the examples are only for ease of understanding and are not intended to limit the invention in any way.
- FIG. 4a is a flow chart showing a control method corresponding to a single-contact downward movement gesture signal for a control command received by an unmanned aerial vehicle, in accordance with an embodiment of the present invention.
- as shown in FIG. 4a, the command conversion component 302 can generate a flight control command for making the unmanned aerial vehicle descend; the command may control the flight state of the unmanned aerial vehicle to descend (e.g., the command may include a descent distance, or the aircraft may descend a fixed distance each time the command is received).
- the command conversion component 302 can determine whether the current display state of the target object in the picture captured by the image acquisition device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier component up; for example, the shooting control command may include an angle by which the pan/tilt is rotated upward, or the pan/tilt may rotate upward by a fixed angle each time the command is received.
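The Figure 4a flow described above could be sketched as follows (fixed per-command steps; all names and step values are illustrative, not from the patent):

```python
def handle_single_contact_down(at_preset_state: bool,
                               descend_step_m: float = 1.0,
                               tilt_step_deg: float = 2.0):
    """Sketch of the FIG. 4a control method for one received command.

    A single-contact downward gesture descends the aircraft by a fixed
    step; while the target has not reached the preset display state, the
    carrier (pan/tilt) is additionally rotated upward by a fixed angle.
    """
    commands = [("descend", descend_step_m)]
    if not at_preset_state:
        # Target not yet in the preset display state: compensate the
        # descent by tilting the gimbal up so the target stays framed.
        commands.append(("gimbal_up", tilt_step_deg))
    return commands

print(handle_single_contact_down(False))
# [('descend', 1.0), ('gimbal_up', 2.0)]
```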
- FIG. 4b is a flow chart showing a control method corresponding to a single-contact upward movement gesture signal for a control command received by an unmanned aerial vehicle according to another example of an embodiment of the present invention.
- as shown in FIG. 4b, the command conversion component 302 can generate a flight control command for making the unmanned aerial vehicle ascend; the command may control the flight state of the unmanned aerial vehicle to ascend (e.g., the command may include an ascent distance, or the aircraft may ascend a fixed distance each time the command is received).
- the command conversion component 302 can determine whether the current display state of the target object in the picture captured by the image acquisition device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier component downward.
- the shooting control command may include an angle by which the pan/tilt is rotated downward, or the pan/tilt may rotate downward by a fixed angle each time the command is received.
- FIG. 4c is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving rightward, according to yet another example of an embodiment of the present invention.
- As shown in FIG. 4c, when the received control command corresponds to this gesture, the instruction conversion component 302 can generate a flight control command for making the unmanned aerial vehicle move to the right; the command may control the flight state of the unmanned aerial vehicle to be moving right (e.g., the command may include a rightward distance, or the vehicle may move right a fixed distance each time a command is received).
- Meanwhile, the instruction conversion component 302 can determine whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier member to the left (for example, the shooting control command may include an angle by which the pan/tilt rotates to the left, or the pan/tilt may rotate left by a fixed angle each time a command is received).
- FIG. 4d is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving leftward, according to an example of an embodiment of the present invention.
- The control method is similar to that shown in FIG. 4c, except that the flight state of the unmanned aerial vehicle is controlled to be moving left, and the shooting control command moves the carrier member to the right.
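The single-contact gesture handling of FIGS. 4a-4d pairs each flight command with the opposite pan/tilt compensation. A minimal Python sketch of that pairing follows; all names here (`GESTURE_TO_COMMANDS`, `commands_for`, the command strings) are illustrative, not taken from the patent:

```python
# Illustrative mapping for FIGS. 4a-4d: each single-contact gesture yields a
# flight command plus the opposite pan/tilt compensation. The compensation is
# only issued while the target object has not reached the preset display state.
GESTURE_TO_COMMANDS = {
    "single_down":  ("descend",    "gimbal_up"),
    "single_up":    ("ascend",     "gimbal_down"),
    "single_right": ("move_right", "gimbal_left"),
    "single_left":  ("move_left",  "gimbal_right"),
}

def commands_for(gesture, at_preset_display_state):
    """Return (flight_command, shooting_command or None) for a gesture."""
    flight, gimbal = GESTURE_TO_COMMANDS[gesture]
    return flight, (None if at_preset_display_state else gimbal)
```

Once the target object is back in the preset display state, only the flight command remains active, matching the flow charts' "reached preset display state?" branch.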
- FIG. 4e is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving apart, according to another example of an embodiment of the present invention.
- As shown in FIG. 4e, when the received control command corresponds to this gesture, the instruction conversion component 302 can generate a flight control command for making the unmanned aerial vehicle approach the target object; the command may control the flight state of the unmanned aerial vehicle to be approaching the target object (e.g., the command may include a distance to move toward the target object, or the vehicle may move a fixed distance closer to the target object each time a command is received).
- Meanwhile, the instruction conversion component 302 can determine whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for making the image capture device 304 lengthen its focal length (for example, the shooting control command may include a specific value of the focal-length change, or the focal length may be lengthened by a fixed amount each time a command is received).
- FIG. 4f is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving toward each other, according to yet another example of an embodiment of the present invention.
- The control method is similar to that shown in FIG. 4e, except that the flight state of the unmanned aerial vehicle is controlled to be moving away from the target object, and the shooting control command makes the image capture device 304 shorten its focal length.
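The two-contact pinch gestures of FIGS. 4e-4f can be sketched the same way: contacts moving apart command an approach toward the target plus, while the preset display state is not yet reached, a focal-length increase; contacts moving together command the reverse. A hedched sketch with hypothetical names:

```python
def pinch_commands(contacts_move_apart, at_preset_display_state):
    """FIGS. 4e-4f sketch: spread gesture -> approach target / lengthen focal
    length; pinch gesture -> retreat from target / shorten focal length.
    The zoom command is suppressed once the preset display state is reached."""
    if contacts_move_apart:
        flight, zoom = "approach_target", "lengthen_focal_length"
    else:
        flight, zoom = "retreat_from_target", "shorten_focal_length"
    return flight, (None if at_preset_display_state else zoom)
```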
- FIG. 4g is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving downward in parallel, according to an example of an embodiment of the present invention.
- As shown in FIG. 4g, when the received control command corresponds to this gesture, the instruction conversion component 302 can generate a flight control command for making the unmanned aerial vehicle pitch down; the command may control the flight state of the unmanned aerial vehicle to be a pitch-down movement (e.g., the command may include an angle by which the aircraft noses down, or the aircraft may nose down by a fixed angle each time a command is received).
- Meanwhile, the instruction conversion component 302 can determine whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state. If not, it determines whether the current display position of the target object is biased upward or downward relative to the preset display position. If biased upward, it generates, according to the difference between the current display state and the preset display state, a shooting control command that moves the carrier member down to compensate (for example, the command may include an angle by which the pan/tilt rotates downward, or a fixed downward rotation each time a command is received); if biased downward, it generates a shooting control command that moves the carrier member up to compensate (for example, the command may include an angle by which the pan/tilt rotates upward, or a fixed upward rotation each time a command is received).
- FIG. 4h is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving upward in parallel, according to another example of an embodiment of the present invention.
- The control method is similar to that shown in FIG. 4g, except that the flight state of the unmanned aerial vehicle is controlled to be a pitch-up movement.
- FIG. 4i is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts rotating clockwise, according to yet another example of an embodiment of the present invention.
- As shown in FIG. 4i, when the received control command corresponds to this gesture, the instruction conversion component 302 can generate a flight control command for making the unmanned aerial vehicle rotate to the right; the command may control the flight state of the unmanned aerial vehicle to be a right-rotation movement (e.g., the command may include an angle by which the aircraft rotates right, or the aircraft may rotate right by a fixed angle each time a command is received).
- Meanwhile, the instruction conversion component 302 can determine whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state. If not, it determines whether the current display position of the target object is biased leftward or rightward relative to the preset display position. If biased leftward, it generates, according to the difference between the current display state and the preset display state, a shooting control command that moves the carrier member right to compensate (for example, the command may include an angle by which the pan/tilt rotates to the right, or a fixed rightward rotation each time a command is received); if biased rightward, it generates a shooting control command that moves the carrier member left to compensate (for example, the command may include an angle by which the pan/tilt rotates to the left, or a fixed leftward rotation each time a command is received).
- FIG. 4j is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts rotating counterclockwise, according to an example of an embodiment of the present invention.
- The control method is similar to that shown in FIG. 4i, except that the flight state of the unmanned aerial vehicle is controlled to be a left-rotation movement.
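For the pitch and rotation gestures of FIGS. 4g-4j, the compensation depends on which way the target has drifted from its preset display position: the carrier is rotated away from the drift direction (biased up → carrier down, biased left → carrier right, and so on), as the flow charts describe. One way to sketch this, with an illustrative pixel-offset convention (+dx = target right of the preset position, +dy = target below it):

```python
def drift_compensation(dx, dy):
    """Return the carrier moves compensating the target object's drift from
    its preset display position, per the direction rules of FIGS. 4g-4j.
    dx, dy: pixel offset of the current position from the preset position."""
    cmds = []
    if dy < 0:        # target biased upward -> move carrier down
        cmds.append("gimbal_down")
    elif dy > 0:      # target biased downward -> move carrier up
        cmds.append("gimbal_up")
    if dx < 0:        # target biased leftward -> move carrier right
        cmds.append("gimbal_right")
    elif dx > 0:      # target biased rightward -> move carrier left
        cmds.append("gimbal_left")
    return cmds
```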
- Another embodiment of the present invention further provides a terminal-device-based unmanned aerial vehicle control system, comprising the terminal device described in Embodiment 1 and the control system of the unmanned aerial vehicle described in Embodiment 2, where the control system is mounted on the unmanned aerial vehicle.
- As shown in FIG. 5, the system may include the terminal device 1 and the control system 3 of the unmanned aerial vehicle. With this system, the user can control the unmanned aerial vehicle by applying one or more gesture signals with one hand on the display of the terminal device, according to the picture shown on the display, so as to perform target-object-centered flight and shooting. This reduces the user's operational complexity, makes operation more intelligent and simple, and improves the user experience.
- FIG. 6 is a structural block diagram of a terminal device according to another embodiment of the present invention.
- The terminal device 1100 may be a host server with computing capability, a personal computer (PC), or a portable computer or terminal.
- The specific embodiments of the present invention do not limit the specific implementation of the computing node.
- The terminal device 1100 includes a processor 1110, a communications interface 1120, a memory 1130 and a bus 1140.
- The processor 1110, the communications interface 1120 and the memory 1130 communicate with one another through the bus 1140.
- The communications interface 1120 is used to communicate with network devices, including, for example, a virtual machine management center, shared storage, and the like.
- The processor 1110 is configured to execute programs.
- The processor 1110 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
- The memory 1130 is used to store files.
- The memory 1130 may include high-speed RAM and may also include non-volatile memory, such as at least one disk memory.
- The memory 1130 may also be a memory array.
- The memory 1130 may also be partitioned into blocks that can be combined into virtual volumes according to certain rules.
- If the functions are implemented in the form of computer software and sold or used as an independent product, it can be considered to some extent that all or part of the technical solution of the present invention (for example, the part contributing over the prior art) is embodied in the form of a computer software product.
- The computer software product is typically stored in a computer-readable non-volatile storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present invention.
- The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
- By recognizing the user's gesture signals with the terminal device and controlling the flight state of the aircraft and the shooting state of the image capture device accordingly, the photographed target object is displayed on the display in the preset display state, so that the flight state and shooting state of the aircraft are controlled simultaneously with a simple one-hand operation, realizing target-object-centered flight and shooting control with simple operations and reducing operational complexity.
Abstract
A terminal device (1, 1100) and a control system (3) for an unmanned aerial vehicle. The terminal device (1, 1100) comprises: a display (101), displaying the picture captured by an image capture device (304) on the unmanned aerial vehicle; a capture component (102), capturing and recognizing gesture signals of a user; an instruction generation component (103), generating a control command according to the gesture signal, the control command being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device (304) on the unmanned aerial vehicle; and a communication component (104), sending the control command to the unmanned aerial vehicle so as to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device (304) on it, such that the photographed target object is displayed on the display (101) in a preset display state. By recognizing the user's gesture signals with the terminal device (1, 1100) to control the flight state and shooting state of the aircraft, target-object-centered flight and shooting control is achieved with simple operations, reducing operational complexity.
Description
Cross-Reference
This application claims priority to Chinese Patent Application No. 201610249037.7 filed on April 20, 2016, the entire contents of which are incorporated herein by reference.
The present invention relates to the technical field of unmanned aerial vehicles, and in particular to a terminal device, a control system for an unmanned aerial vehicle, and a terminal-device-based unmanned aerial vehicle control system.
FPV is the abbreviation of First Person View. It is a method in which a wireless camera and downlink equipment are mounted on a drone (also called an unmanned aerial vehicle) or a vehicle model, and the model is controlled while watching a display on the ground. This method is widely used in consumer drones and is a combination of video codec technology, wireless transmission technology and flight control technology. Its main idea is to capture video data, transmit it remotely, and display it on a terminal. An operator of the unmanned aerial vehicle on the ground can watch the images captured by the unmanned aerial vehicle on a display and control the aircraft based on the images.
In some applications, it may be desirable to photograph a specific target object from different angles, or to track and photograph a moving target object, i.e., to perform target-object-centered flight and shooting, which requires changing the flight state of the aircraft. However, once the flight state of the aircraft changes (for example a positional movement or a change in pitch angle), the target object may drift in the picture, be lost altogether, or go out of focus. If the photographed target object is to remain in a preset display state throughout the control of the aircraft's flight state (for example always in the central region of the picture, always at a certain size, always at a certain sharpness), the user usually also has to remotely adjust the shooting state of the camera, which greatly increases the difficulty of operation.
Summary of the Invention
In view of this, the technical problem to be solved by the present invention is how to simplify the user's control operations on the unmanned aerial vehicle, so as to realize target-object-centered flight and shooting control.
Solution
To solve the above technical problem, in one aspect, a terminal device is proposed, characterized in that the terminal device comprises: a display, displaying the picture captured by an image capture device on an unmanned aerial vehicle; a capture component, capturing and recognizing gesture signals of a user; an instruction generation component, generating a control command according to the gesture signal, the control command being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on the unmanned aerial vehicle; and a communication component, sending the control command to the unmanned aerial vehicle so as to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it, such that the photographed target object is displayed on the display in a preset display state.
In another aspect, a control system for an unmanned aerial vehicle is proposed, characterized in that the control system comprises: a receiving component, receiving a control command from a terminal device, the control command corresponding to a gesture signal of a user and being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it; an instruction conversion component, converting the received control command into a flight control command for controlling the flight state of the unmanned aerial vehicle and a shooting control command for controlling the shooting state of the image capture device on it; a driving component, changing the driving state of the unmanned aerial vehicle according to the flight control command so as to change its flight state; an image capture device, changing its shooting state according to the shooting control command so that the photographed target object is displayed in the captured picture in a preset display state; and a sending component, sending the captured picture to the terminal device.
In yet another aspect, a terminal-device-based unmanned aerial vehicle control system is proposed, comprising the terminal device proposed above and the control system of the unmanned aerial vehicle proposed above, where the control system is mounted on the unmanned aerial vehicle.
The terminal device recognizes the user's gesture signals and generates control commands according to the corresponding gesture signals, to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it, so that the photographed target object is displayed on the display in the preset display state. The flight state and the shooting state of the aircraft can thus be controlled simultaneously with a simple one-hand operation, realizing target-object-centered flight and shooting control with simple operations and reducing operational complexity.
Other features and aspects of the present invention will become clear from the following detailed description of exemplary embodiments with reference to the accompanying drawings.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features and aspects of the present invention together with the specification, and serve to explain the principles of the present invention.
FIG. 1 is a structural diagram of a terminal device according to an embodiment of the present invention.
FIG. 2a shows a gesture signal of a single contact moving downward.
FIG. 2b shows a gesture signal of a single contact moving upward.
FIG. 2c shows a gesture signal of a single contact moving rightward.
FIG. 2d shows a gesture signal of a single contact moving leftward.
FIG. 2e shows a gesture signal of two contacts moving apart.
FIG. 2f shows a gesture signal of two contacts moving toward each other.
FIG. 2g shows a gesture signal of two contacts moving downward in parallel.
FIG. 2h shows a gesture signal of two contacts moving upward in parallel.
FIG. 2i shows a gesture signal of two contacts rotating clockwise.
FIG. 2j shows a gesture signal of two contacts rotating counterclockwise.
FIG. 3 is a structural diagram of a control system for an unmanned aerial vehicle according to an embodiment of the present invention.
FIG. 4a is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving downward, according to an example of an embodiment of the present invention.
FIG. 4b is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving upward, according to another example of an embodiment of the present invention.
FIG. 4c is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving rightward, according to yet another example of an embodiment of the present invention.
FIG. 4d is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving leftward, according to an example of an embodiment of the present invention.
FIG. 4e is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving apart, according to another example of an embodiment of the present invention.
FIG. 4f is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving toward each other, according to yet another example of an embodiment of the present invention.
FIG. 4g is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving downward in parallel, according to an example of an embodiment of the present invention.
FIG. 4h is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving upward in parallel, according to another example of an embodiment of the present invention.
FIG. 4i is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts rotating clockwise, according to yet another example of an embodiment of the present invention.
FIG. 4j is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts rotating counterclockwise, according to an example of an embodiment of the present invention.
FIG. 5 is a structural block diagram of a terminal-device-based unmanned aerial vehicle control system according to an embodiment of the present invention.
FIG. 6 is a structural block diagram of a terminal device according to another embodiment of the present invention.
Various exemplary embodiments, features and aspects of the present invention will be described in detail below with reference to the accompanying drawings. The same reference numerals in the drawings denote elements with the same or similar functions. Although various aspects of the embodiments are shown in the drawings, the drawings are not necessarily drawn to scale unless otherwise indicated.
The word "exemplary" is used here exclusively to mean "serving as an example, embodiment or illustration". Any embodiment described here as "exemplary" need not be construed as superior or better than other embodiments.
In addition, numerous specific details are given in the following detailed description in order to better illustrate the present invention. Those skilled in the art should understand that the present invention can also be practiced without certain specific details. In other instances, methods, means, elements and circuits well known to those skilled in the art are not described in detail, so as to highlight the gist of the present invention.
Embodiment 1
FIG. 1 is a structural diagram of a terminal device according to an embodiment of the present invention. As shown in FIG. 1, the terminal device 1 mainly includes: a display 101, which displays the picture captured by the image capture device on the unmanned aerial vehicle; a capture component 102, which captures and recognizes the user's gesture signals; an instruction generation component 103, which generates a control command according to the gesture signal, the control command being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on the unmanned aerial vehicle; and a communication component 104, which sends the control command to the unmanned aerial vehicle so as to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it, such that the photographed target object is displayed on the display 101 in a preset display state.
In this embodiment, the terminal device recognizes the user's gesture signals and generates control commands accordingly, to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it, so that the photographed target object is displayed on the display in the preset display state. The flight state and the shooting state of the aircraft can thus be controlled simultaneously with a simple one-hand operation, realizing target-object-centered flight and shooting control with simple operations and reducing operational complexity.
The display state referred to herein may include one or more of the following: the display position of the target object in the picture shown on the display (for example, relative to the center), the percentage of the picture area occupied by the target object (for example one half, one quarter, or another percentage of the whole picture), and the display sharpness of the target object (for example, only the overall outline of the target object needs to be distinguishable, or certain details of the target object need to be distinguishable). Those skilled in the art can set a desired display state for the target object as needed. For example, it may be set that "the target object is always at the relative center of the picture, occupies 1/4 of the total picture area, and reaches a preset sharpness"; in this case, while adjusting the flight state of the aircraft, the control command correspondingly adjusts shooting states such as the shooting angle and the focal length of the image capture device, so that no matter how the flight state changes, the target object is always displayed in the picture in the display state set above.
Display 101
In this embodiment, the display 101 can be used to display the picture captured by the image capture device on the unmanned aerial vehicle. The picture can be sent to the terminal device by a communication apparatus on the unmanned aerial vehicle, received via a communication apparatus on the terminal device, and provided to the display 101 for display.
The display 101 may be any device capable of displaying images known to those skilled in the art, and may be a display with a touch screen, such as a mobile phone screen or an iPad screen; the present invention is not limited in this respect.
Capture component 102
The capture component 102 may be a component for capturing and recognizing the user's gesture signals. When the user makes a gesture, for example touching the touch screen of the display with a certain gesture, the capture component 102 can capture the gesture made by the user and recognize the different gesture signals. The gesture signals may include one or more of: a single contact moving downward (as shown in FIG. 2a, which can control the aircraft to descend), a single contact moving upward (as shown in FIG. 2b, which can control the aircraft to ascend), a single contact moving rightward (as shown in FIG. 2c, which can control the aircraft to move right), a single contact moving leftward (as shown in FIG. 2d, which can control the aircraft to move left), two contacts moving apart (as shown in FIG. 2e, which can control the aircraft to approach the target object), two contacts moving toward each other (as shown in FIG. 2f, which can control the aircraft to move away from the target object), two contacts moving downward in parallel (as shown in FIG. 2g, which can control the aircraft to pitch down, e.g., nosing down or diving), two contacts moving upward in parallel (as shown in FIG. 2h, which can control the aircraft to pitch up, e.g., nosing up or climbing), two contacts rotating clockwise (as shown in FIG. 2i, which can control the aircraft to rotate right) and two contacts rotating counterclockwise (as shown in FIG. 2j, which can control the aircraft to rotate left). All of the above gesture signals can be performed with one hand, making operation simple and convenient.
The capture component 102 may be any component known to those skilled in the art that can capture and recognize gesture signals, for example a touch-screen gesture capture and recognition system associated with the touch screen, which may be implemented by dedicated hardware combined with a general-purpose processor and logic instructions.
Instruction generation component 103
The instruction generation component 103 can receive the gesture signals from the capture component 102 and generate control commands according to the gesture signals. The control command can be used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it, so that a single gesture simultaneously realizes the two corresponding controls, and the photographed target object is displayed on the display 101 in the preset display state.
In one example, the control command can control the flight state of the unmanned aerial vehicle and start an adjustment of the shooting state of the image capture device. When the flight state of the aircraft changes, the current display state of the target object usually deviates from the preset display state (for example off-center, too large or too small, out of focus). The adjustment can change the shooting state of the image capture device according to the difference between the current display state of the target object in the captured picture and the preset display state, so as to adapt to the flight state of the unmanned aerial vehicle. For specific ways of performing the adjustment, see Embodiment 2.
In a possible implementation, the instruction generation component 103 may include one or more of a first through a tenth instruction generation unit which, according to one or more of the gesture signals captured by the capture component 102 — a single contact moving downward, a single contact moving upward, a single contact moving rightward, a single contact moving leftward, two contacts moving apart, two contacts moving toward each other, two contacts moving downward in parallel, two contacts moving upward in parallel, two contacts rotating clockwise, and two contacts rotating counterclockwise — respectively generate one or more of first through tenth control commands, to make the unmanned aerial vehicle respectively descend, ascend, move right, move left, approach the target object, move away from the target object, pitch down, pitch up, rotate right or rotate left, and correspondingly start the adjustment of the shooting state of the image capture device, adjusting the shooting state according to the difference between the current display state of the target object in the captured picture and the preset display state so as to adapt to the above flight state of the unmanned aerial vehicle.
The instruction generation component 103 may be any component known to those skilled in the art that can generate instructions, for example a component implemented by a general-purpose processor combined with logic instructions, or by a dedicated hardware circuit.
Communication component 104
The communication component 104 can realize signal transmission between the terminal device and the unmanned aerial vehicle. The control command generated by the instruction generation component 103 is sent to the unmanned aerial vehicle through the communication component 104, and the unmanned aerial vehicle then controls, according to the received control command, its flight state and the shooting state of the image capture device on it, such that the photographed target object is displayed on the display 101 in the preset display state.
The communication component 104 may be any component known to those skilled in the art that can transmit communication signals. For example, the communication component 104 may be a fifth-generation Wi-Fi (5G WIFI) communication module.
Embodiment 2
FIG. 3 is a structural diagram of a control system for an unmanned aerial vehicle according to another embodiment of the present invention. As shown in FIG. 3, the control system 3 of the unmanned aerial vehicle mainly includes: a receiving component 301, which receives a control command from a terminal device, the control command corresponding to a gesture signal of the user and being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it; an instruction conversion component 302, which converts the received control command into a flight control command for controlling the flight state of the unmanned aerial vehicle and a shooting control command for controlling the shooting state of the image capture device on it; a driving component 303, which changes the driving state of the unmanned aerial vehicle according to the flight control command so as to change its flight state; an image capture device 304, which changes its shooting state according to the shooting control command, so that the photographed target object is displayed in the captured picture in a preset display state; and a sending component 305, which sends the captured picture to the terminal device.
In this embodiment, a control command corresponding to a gesture signal is received to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it, so that the photographed target object is displayed in the preset display state. The flight state and the shooting state of the aircraft can thus be controlled simultaneously with a simple one-hand operation, realizing target-object-centered flight and shooting control with simple operations and reducing operational complexity.
Receiving component 301
The receiving component 301 may be a component for receiving control commands from the terminal device, where a control command from the terminal device may correspond to a gesture signal made by the user touching the touch screen of the display; after being converted by the instruction conversion component 302, the control command can be used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device 304 on it.
The receiving component 301 may be any component known to those skilled in the art that can receive control commands, for example a communication module capable of receiving control commands.
Instruction conversion component 302
The receiving component 301 passes the received control command to the instruction conversion component 302, which can convert the received control command into a corresponding flight control command for controlling the flight state of the unmanned aerial vehicle and a shooting control command for controlling the shooting state of the image capture device 304 on it.
In one example, the instruction conversion component 302 may be implemented by a processor-based flight control system on the drone combined with dedicated logic instructions.
In a possible implementation, the flight control command for controlling the flight state of the unmanned aerial vehicle may indicate the manner and magnitude of the flight-state change (for example, descend N meters), where the magnitude may be proportional to the magnitude of the gesture. In another possible implementation, the flight control command may indicate only the manner of the change, with the magnitude being a preset fixed magnitude corresponding to one command. For example, from the control command corresponding to one single-contact upward gesture, the instruction conversion component 302 may generate a flight control command to ascend 10 meters, and the command can be provided to the driving system of the aircraft (for example a drive motor) to control the aircraft to change its flight state.
In a possible implementation, the control system may further include a judging component that judges whether the aircraft has completed the flight-state change indicated by the flight control command. The judging component may, for example, judge whether the aircraft has moved into place as indicated by the flight control command and, if so, notify the instruction conversion component 302 to stop outputting the flight control command, so that the aircraft stops changing its flight state. For example, if the flight control command is to make the aircraft ascend 10 meters, the judging component can judge, from the ascent speed and the ascent time, whether the ascent distance has reached 10 meters, and if so, notify the instruction conversion component 302 so that the aircraft stops ascending.
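The completion check in the example above (a commanded 10 m ascent, judged from ascent speed and elapsed time) can be sketched as follows; the function name and the fixed default target are illustrative assumptions, not part of the patent:

```python
def ascent_complete(speed_m_per_s, elapsed_s, target_m=10.0):
    """Judge whether the commanded ascent is finished, estimating the
    distance climbed as speed x time, as in the Embodiment-2 example."""
    return speed_m_per_s * elapsed_s >= target_m
```

When this returns true, the judging component would notify the instruction conversion component 302 to stop outputting the flight control command.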
In a possible implementation, the instruction conversion component 302 can generate the shooting control command according to the difference between the current display state of the target object in the picture captured by the image capture device 304 and the preset display state, so as to adjust the shooting state of the image capture device 304 (for example the shooting angle, the focal length) to adapt to the flight state of the unmanned aerial vehicle. For example, the instruction conversion component 302 can identify the target object in the picture (which can be done with video recognition techniques selectable by those skilled in the art) and calculate the difference between the current display state of the target object and the preset display state (for example, the distance and direction by which the target object deviates from the preset position (such as the picture center), the deviation from the preset size, the degree of deviation from the preset sharpness, etc., which can be done with the aid of video tracking techniques selectable by those skilled in the art). The shooting control command may be the manner and amount of change of the shooting state of the image capture device 304 calculated from the above difference, for example the distance of an upward, downward, leftward or rightward movement, the direction and angle of a rotation about a fixed point, or the value by which the focal length is lengthened or shortened. For instance, from the distance and direction of the target object's deviation from the preset position, the rotation angle and direction of the image capture device can be calculated; from the degree of deviation from the preset sharpness, the direction and amount of the focal-length adjustment can be calculated. Those skilled in the art can establish, according to the actual situation (for example, parameters of the image capture device and of the aircraft), a correspondence between the above difference and the manner and amount of change of the shooting state of the image capture device, so as to generate the shooting control command.
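One concrete way to map the positional part of that difference to a rotation command is a pinhole-camera model: the pixel offset of the target from the preset position, together with the focal length expressed in pixels, gives pan and tilt correction angles. This is only an illustrative choice — the patent requires some mapping from the difference to a rotation direction and angle, not this particular formula:

```python
import math

def pan_tilt_correction(dx_px, dy_px, focal_px):
    """Convert the target's pixel offset from the preset position into
    (yaw, pitch) correction angles in degrees, assuming a pinhole camera
    with focal length focal_px in pixels."""
    yaw = math.degrees(math.atan2(dx_px, focal_px))
    pitch = math.degrees(math.atan2(dy_px, focal_px))
    return yaw, pitch
```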
In a possible implementation, the instruction conversion component 302 may start the above process of generating shooting control commands upon receiving a control command from the terminal device, monitor the current display state in real time, generate shooting control commands in real time according to the difference between the current display state and the preset display state, and stop the process after the flight-state control is completed (i.e., after the change of the aircraft's flight state is completed, or after the aircraft moves to the specified position). In another possible implementation, the instruction conversion component 302 may also continue to monitor the current display state in real time after the flight-state control is completed, generating shooting control commands in real time according to the difference between the current display state and the preset display state, so as to track the movement of the target object.
In one example, the control system may further include a carrier member for carrying the image capture device 304, for example a pan/tilt (gimbal). The carrier member can move according to the shooting control command (for example, moving up (e.g., rotating upward about a fixed point), moving down (e.g., rotating downward about a fixed point), moving left (e.g., rotating leftward about a fixed point) or moving right (e.g., rotating rightward about a fixed point)), so as to change the shooting angle of the image capture device 304. The image capture device 304 can also change its shooting focal length according to the shooting control command.
Driving component 303
The driving component 303 can change the driving state of the unmanned aerial vehicle according to the flight control command from the instruction conversion component 302, thereby changing the flight state of the unmanned aerial vehicle, for example making the unmanned aerial vehicle descend, ascend, move right, move left, approach the target object, move away from the target object, pitch down, pitch up, rotate right or rotate left.
The driving component 303 may be any component known to those skilled in the art that can change the flight state of the unmanned aerial vehicle, for example a drive motor.
Image capture device 304
The image capture device 304 can change its shooting state according to the shooting control command from the instruction conversion component 302, so that the photographed target object is displayed in the captured picture in the display state preset by the user. Changing the shooting state of the image capture device 304 may include lengthening or shortening the shooting focal length, changing the shooting angle in the horizontal or vertical direction, and so on.
The image capture device 304 may be any device or component known to those skilled in the art that can capture images, for example a CCD digital camera, an optical camera, an infrared scanner or a laser scanner; the present invention places no limitation on this.
Sending component 305
The sending component 305 can receive the picture captured by the image capture device 304 and send the captured picture to the terminal device to be displayed and updated in real time, so that the user can operate the unmanned aerial vehicle based on the picture displayed on the terminal device, realizing target-object-centered flight and shooting control by the user.
The sending component 305 may be any component known to those skilled in the art that can send the picture captured by the image capture device 304, for example a 5G WIFI communication module.
Different examples of this embodiment are described below, taking as examples the control system of the unmanned aerial vehicle receiving different control commands to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device 304 on it. These examples are only for ease of understanding and do not limit the present invention in any way.
FIG. 4a is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving downward, according to an example of an embodiment of the present invention. As shown in FIG. 4a, in the case where the control command received by the instruction conversion component 302 corresponds to a gesture signal of a single contact moving downward, then according to the control command, on the one hand, the instruction conversion component 302 can generate a flight control command for controlling the unmanned aerial vehicle to descend; the command can control the flight state of the unmanned aerial vehicle to be descending (for example, the command may include a descent distance, or the vehicle may descend a fixed distance each time a command is received). During the descent, it can be judged whether the unmanned aerial vehicle has reached a set minimum altitude above the ground; if not, control returns to continue the descent, and if so, the descent is stopped. On the other hand, the instruction conversion component 302 can judge whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier member up, for example including an angle by which the pan/tilt rotates upward, or a fixed upward rotation angle each time a command is received.
FIG. 4b is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving upward, according to another example of an embodiment of the present invention. As shown in FIG. 4b, in the case where the control command received by the instruction conversion component 302 corresponds to a gesture signal of a single contact moving upward, then according to the control command, on the one hand, the instruction conversion component 302 can generate a flight control command for controlling the unmanned aerial vehicle to ascend; the command can control the flight state of the unmanned aerial vehicle to be ascending (for example, the command may include an ascent distance, or the vehicle may ascend a fixed distance each time a command is received). On the other hand, the instruction conversion component 302 can judge whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier member down, for example including an angle by which the pan/tilt rotates downward, or a fixed downward rotation angle each time a command is received.
FIG. 4c is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving rightward, according to yet another example of an embodiment of the present invention. As shown in FIG. 4c, in the case where the control command received by the instruction conversion component 302 corresponds to a gesture signal of a single contact moving rightward, then according to the control command, on the one hand, the instruction conversion component 302 can generate a flight control command for controlling the unmanned aerial vehicle to move right; the command can control the flight state of the unmanned aerial vehicle to be moving right (for example, the command may include a rightward distance, or the vehicle may move right a fixed distance each time a command is received). On the other hand, the instruction conversion component 302 can judge whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier member to the left, for example including an angle by which the pan/tilt rotates to the left, or a fixed leftward rotation angle each time a command is received.
FIG. 4d is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of a single contact moving leftward, according to an example of an embodiment of the present invention. The control method is similar to that shown in FIG. 4c, except that the flight state of the unmanned aerial vehicle is controlled to be moving left, and the shooting control command moves the carrier member to the right.
FIG. 4e is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving apart, according to another example of an embodiment of the present invention. As shown in FIG. 4e, in the case where the control command received by the instruction conversion component 302 corresponds to a gesture signal of two contacts moving apart, then according to the control command, on the one hand, the instruction conversion component 302 can generate a flight control command for controlling the unmanned aerial vehicle to approach the target object; the command can control the flight state of the unmanned aerial vehicle to be approaching the target object (for example, the command may include a distance to move toward the target object, or the vehicle may move a fixed distance closer to the target object each time a command is received). On the other hand, the instruction conversion component 302 can judge whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it generates, according to the difference between the current display state and the preset display state, a shooting control command for making the image capture device 304 lengthen its focal length, for example including a specific value of the focal-length change, or a fixed focal-length increase each time a command is received.
FIG. 4f is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving toward each other, according to yet another example of an embodiment of the present invention. The control method is similar to that shown in FIG. 4e, except that the flight state of the unmanned aerial vehicle is controlled to be moving away from the target object, and the shooting control command makes the image capture device 304 shorten its focal length.
FIG. 4g is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving downward in parallel, according to an example of an embodiment of the present invention. As shown in FIG. 4g, in the case where the control command received by the instruction conversion component 302 corresponds to a gesture signal of two contacts moving downward in parallel, then according to the control command, on the one hand, the instruction conversion component 302 can generate a flight control command for controlling the unmanned aerial vehicle to pitch down; the command can control the flight state of the unmanned aerial vehicle to be a pitch-down movement (for example, the command may include an angle by which the aircraft noses down, or the aircraft may nose down a fixed angle each time a command is received). On the other hand, the instruction conversion component 302 can judge whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it judges whether the current display position of the target object is biased upward or downward relative to the preset display position. If biased upward, it generates, according to the difference between the current display state and the preset display state, a shooting control command that moves the carrier member down to compensate (for example, the shooting control command may include an angle by which the pan/tilt rotates downward, or a fixed downward rotation each time a command is received); if biased downward, it generates, according to the difference between the current display state and the preset display state, a shooting control command that moves the carrier member up to compensate (for example, the shooting control command may include an angle by which the pan/tilt rotates upward, or a fixed upward rotation each time a command is received).
FIG. 4h is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts moving upward in parallel, according to another example of an embodiment of the present invention. The control method is similar to that shown in FIG. 4g, except that the flight state of the unmanned aerial vehicle is controlled to be a pitch-up movement.
FIG. 4i is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts rotating clockwise, according to yet another example of an embodiment of the present invention. As shown in FIG. 4i, in the case where the control command received by the instruction conversion component 302 corresponds to a gesture signal of two contacts rotating clockwise, then according to the control command, on the one hand, the instruction conversion component 302 can generate a flight control command for controlling the unmanned aerial vehicle to rotate right; the command can control the flight state of the unmanned aerial vehicle to be a right-rotation movement (for example, the command may include an angle by which the aircraft rotates right, or the aircraft may rotate right a fixed angle each time a command is received). On the other hand, the instruction conversion component 302 can judge whether the current display state of the target object in the picture captured by the image capture device 304 has reached the preset display state; if not, it judges whether the current display position of the target object is biased leftward or rightward relative to the preset display position. If biased leftward, it generates, according to the difference between the current display state and the preset display state, a shooting control command that moves the carrier member right to compensate (for example, the shooting control command may include an angle by which the pan/tilt rotates to the right, or a fixed rightward rotation each time a command is received); if biased rightward, it generates, according to the difference between the current display state and the preset display state, a shooting control command that moves the carrier member left to compensate (for example, the shooting control command may include an angle by which the pan/tilt rotates to the left, or a fixed leftward rotation each time a command is received).
FIG. 4j is a flow chart of a control method in which the control command received by the unmanned aerial vehicle corresponds to a gesture signal of two contacts rotating counterclockwise, according to an example of an embodiment of the present invention. The control method is similar to that shown in FIG. 4i, except that the flight state of the unmanned aerial vehicle is controlled to be a left-rotation movement.
Embodiment 3
Another embodiment of the present invention further proposes a terminal-device-based unmanned aerial vehicle control system, which includes the terminal device described in Embodiment 1 and the control system of the unmanned aerial vehicle described in Embodiment 2, where the control system is mounted on the unmanned aerial vehicle.
As shown in FIG. 5, the system may include the terminal device 1 and the control system 3 of the unmanned aerial vehicle. With this system, the user can control the unmanned aerial vehicle by applying one or more gesture signals with one hand on the display of the terminal device, according to the picture shown on the display, so as to perform target-object-centered flight and shooting. This reduces the user's operational complexity, makes operation more intelligent and simple, and improves the user experience.
Embodiment 4
FIG. 6 is a structural block diagram of a terminal device according to another embodiment of the present invention. The terminal device 1100 may be a host server with computing capability, a personal computer (PC), or a portable computer or terminal. The specific embodiments of the present invention do not limit the specific implementation of the computing node.
The terminal device 1100 includes a processor 1110, a communications interface 1120, a memory 1130 and a bus 1140, where the processor 1110, the communications interface 1120 and the memory 1130 communicate with one another through the bus 1140.
The communications interface 1120 is used to communicate with network devices, including for example a virtual machine management center, shared storage, and the like.
The processor 1110 is used to execute programs. The processor 1110 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention.
The memory 1130 is used to store files. The memory 1130 may include high-speed RAM and may also include non-volatile memory, such as at least one disk memory. The memory 1130 may also be a memory array. The memory 1130 may also be partitioned into blocks that can be combined into virtual volumes according to certain rules.
Those of ordinary skill in the art will appreciate that the exemplary units and algorithm steps in the embodiments described herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are implemented in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application, but such implementations should not be considered to go beyond the scope of the present invention.
If the functions are implemented in the form of computer software and sold or used as an independent product, it can be considered to some extent that all or part of the technical solution of the present invention (for example, the part contributing over the prior art) is embodied in the form of a computer software product. The computer software product is typically stored in a computer-readable non-volatile storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement that can readily occur to those skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Industrial Applicability
The terminal device recognizes the user's gesture signals and generates control commands according to the corresponding gesture signals, to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on it, so that the photographed target object is displayed on the display in the preset display state. The flight state and the shooting state of the aircraft can thus be controlled simultaneously with a simple one-hand operation, realizing target-object-centered flight and shooting control with simple operations and reducing operational complexity.
Claims (12)
- A terminal device, characterized in that the terminal device comprises: a display, displaying the picture captured by an image capture device on an unmanned aerial vehicle; a capture component, capturing and recognizing gesture signals of a user; an instruction generation component, generating a control command according to the gesture signal, the control command being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on the unmanned aerial vehicle; and a communication component, sending the control command to the unmanned aerial vehicle so as to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on the unmanned aerial vehicle, such that the photographed target object is displayed on the display in a preset display state.
- The terminal device according to claim 1, characterized in that the control command controls the flight state of the unmanned aerial vehicle and starts an adjustment of the shooting state of the image capture device, the adjustment changing the shooting state of the image capture device according to the difference between the current display state of the target object in the picture captured by the image capture device and the preset display state, so as to adapt to the flight state of the unmanned aerial vehicle.
- The terminal device according to claim 1, characterized in that the gesture signals of the user comprise gesture signals of the user touching a touch screen of the display, the gesture signals comprising one or more of the following: a single contact moving downward, a single contact moving upward, a single contact moving rightward, a single contact moving leftward, two contacts moving apart, two contacts moving toward each other, two contacts moving downward in parallel, two contacts moving upward in parallel, two contacts rotating clockwise, and two contacts rotating counterclockwise.
- The terminal device according to claim 2, characterized in that the instruction generation component comprises one or more of the following: a first instruction generation unit, generating, according to a gesture signal of a single contact moving downward, a first control command that makes the unmanned aerial vehicle descend and starts the adjustment of the shooting state of the image capture device to adapt to said descent; a second instruction generation unit, generating, according to a gesture signal of a single contact moving upward, a second control command that makes the unmanned aerial vehicle ascend and starts the adjustment of the shooting state of the image capture device to adapt to said ascent; a third instruction generation unit, generating, according to a gesture signal of a single contact moving rightward, a third control command that makes the unmanned aerial vehicle move right and starts the adjustment of the shooting state of the image capture device to adapt to said rightward movement; a fourth instruction generation unit, generating, according to a gesture signal of a single contact moving leftward, a fourth control command that makes the unmanned aerial vehicle move left and starts the adjustment of the shooting state of the image capture device to adapt to said leftward movement; a fifth instruction generation unit, generating, according to a gesture signal of two contacts moving apart, a fifth control command that makes the unmanned aerial vehicle approach the target object and starts the adjustment of the shooting state of the image capture device to adapt to said approaching of the target object; a sixth instruction generation unit, generating, according to a gesture signal of two contacts moving toward each other, a sixth control command that makes the unmanned aerial vehicle move away from the target object and starts the adjustment of the shooting state of the image capture device to adapt to said moving away from the target object; a seventh instruction generation unit, generating, according to a gesture signal of two contacts moving downward in parallel, a seventh control command that makes the unmanned aerial vehicle pitch down and starts the adjustment of the shooting state of the image capture device to adapt to said pitch-down movement; an eighth instruction generation unit, generating, according to a gesture signal of two contacts moving upward in parallel, an eighth control command that makes the unmanned aerial vehicle pitch up and starts the adjustment of the shooting state of the image capture device to adapt to said pitch-up movement; a ninth instruction generation unit, generating, according to a gesture signal of two contacts rotating clockwise, a ninth control command that makes the unmanned aerial vehicle rotate right and starts the adjustment of the shooting state of the image capture device to adapt to said right rotation; a tenth instruction generation unit, generating, according to a gesture signal of two contacts rotating counterclockwise, a tenth control command that makes the unmanned aerial vehicle rotate left and starts the adjustment of the shooting state of the image capture device to adapt to said left rotation.
- The terminal device according to claim 1, characterized in that the display state comprises one or more of: the display position of the target object in the picture, the percentage of the picture area occupied by the target object, and the display sharpness of the target object.
- A control system for an unmanned aerial vehicle, characterized in that the control system comprises: a receiving component, receiving a control command from a terminal device, the control command corresponding to a gesture signal of a user and being used to control the flight state of the unmanned aerial vehicle and the shooting state of the image capture device on the unmanned aerial vehicle; an instruction conversion component, converting the received control command into a flight control command for controlling the flight state of the unmanned aerial vehicle and a shooting control command for controlling the shooting state of the image capture device on the unmanned aerial vehicle; a driving component, changing the driving state of the unmanned aerial vehicle according to the flight control command so as to change the flight state of the unmanned aerial vehicle; an image capture device, changing its shooting state according to the shooting control command so that the photographed target object is displayed in the captured picture in a preset display state; and a sending component, sending the captured picture to the terminal device.
- The control system of an unmanned aerial vehicle according to claim 6, characterized in that the instruction conversion component generates the shooting control command according to the difference between the current display state of the target object in the picture captured by the image capture device and the preset display state, so as to adjust the shooting state of the image capture device to adapt to the flight state of the unmanned aerial vehicle.
- The control system of an unmanned aerial vehicle according to claim 7, characterized in that the control system further comprises: a carrier member, carrying the image capture device, wherein the carrier member moves according to the shooting control command so as to change the shooting angle of the image capture device.
- The control system of an unmanned aerial vehicle according to claim 7, characterized in that the image capture device changes its shooting focal length according to the shooting control command.
- The control system of an unmanned aerial vehicle according to claim 7, characterized in that the display state comprises one or more of: the display position of the target object in the picture, the percentage of the picture area occupied by the target object, and the display sharpness of the target object.
- The control system of an unmanned aerial vehicle according to claim 8, characterized in that the instruction conversion component is specifically configured for one or more of the following: in the case where the received control command corresponds to a gesture signal of a single contact moving downward, generating, according to the control command, a flight control command for controlling the unmanned aerial vehicle to descend, and, in the case where the current display state of the target object in the picture captured by the image capture device has not reached the preset display state, generating, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier member up; in the case where the received control command corresponds to a gesture signal of a single contact moving upward, generating, according to the control command, a flight control command for controlling the unmanned aerial vehicle to ascend, and, in the case where the current display state of the target object in the captured picture has not reached the preset display state, generating, according to the difference between the current display state and the preset display state, a shooting control command for moving the carrier member down; in the case where the received control command corresponds to a gesture signal of a single contact moving rightward, generating a flight control command for controlling the unmanned aerial vehicle to move right, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for moving the carrier member left; in the case where the received control command corresponds to a gesture signal of a single contact moving leftward, generating a flight control command for controlling the unmanned aerial vehicle to move left, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for moving the carrier member right; in the case where the received control command corresponds to a gesture signal of two contacts moving apart, generating a flight control command for controlling the unmanned aerial vehicle to approach the target object, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for making the image capture device lengthen its shooting focal length; in the case where the received control command corresponds to a gesture signal of two contacts moving toward each other, generating a flight control command for controlling the unmanned aerial vehicle to move away from the target object, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for making the image capture device shorten its shooting focal length; in the case where the received control command corresponds to a gesture signal of two contacts moving downward in parallel, generating a flight control command for controlling the unmanned aerial vehicle to pitch down, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for moving the carrier member up or down; in the case where the received control command corresponds to a gesture signal of two contacts moving upward in parallel, generating a flight control command for controlling the unmanned aerial vehicle to pitch up, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for moving the carrier member up or down; in the case where the received control command corresponds to a gesture signal of two contacts rotating clockwise, generating a flight control command for controlling the unmanned aerial vehicle to rotate right, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for moving the carrier member left or right; in the case where the received control command corresponds to a gesture signal of two contacts rotating counterclockwise, generating a flight control command for controlling the unmanned aerial vehicle to rotate left, and, in the case where the current display state has not reached the preset display state, generating, according to the difference, a shooting control command for moving the carrier member left or right.
- A terminal-device-based unmanned aerial vehicle control system, comprising: the terminal device according to any one of claims 1-5 and the control system of an unmanned aerial vehicle according to any one of claims 6-11, where the control system is mounted on the unmanned aerial vehicle.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610249037.7A CN105867362A (zh) | 2016-04-20 | 2016-04-20 | 终端设备和无人驾驶飞行器的控制系统 |
| CN201610249037.7 | 2016-04-20 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017181511A1 true WO2017181511A1 (zh) | 2017-10-26 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20080085048A1 (en) * | 2006-10-05 | 2008-04-10 | Department Of The Navy | Robotic gesture recognition system |
| TW201339903A (zh) * | 2012-03-26 | 2013-10-01 | Hon Hai Precision Industry Co., Ltd. | Unmanned aerial vehicle control system and method |
| CN103426282A (zh) * | 2013-07-31 | 2013-12-04 | SZ DJI Technology Co., Ltd. | Remote control method and terminal |
| CN105100728A (zh) * | 2015-08-18 | 2015-11-25 | Zerotech (Beijing) Intelligence Technology Co., Ltd. | Unmanned aerial vehicle video tracking and photographing system and method |
| CN105425952A (zh) * | 2015-11-04 | 2016-03-23 | Tencent Technology (Shenzhen) Co., Ltd. | Interaction method and device for unmanned aerial vehicle control interface |
| CN205139708U (zh) * | 2015-10-28 | 2016-04-06 | Shanghai Shunli Intelligent Technology Co., Ltd. | Motion recognition remote control device for unmanned aerial vehicle |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8903568B1 (en) * | 2013-07-31 | 2014-12-02 | SZ DJI Technology Co., Ltd | Remote control method and terminal |
| CN103587708B (zh) * | 2013-11-14 | 2016-05-25 | Shanghai University | Zero-blind-zone autonomous soft-landing method for pinpoint outdoor landing of an ultra-small unmanned rotorcraft |
| CN104159031A (zh) * | 2014-08-19 | 2014-11-19 | Hubei Ewatt Technology Co., Ltd. | Method and device for locating and tracking a target object |
| CN104598108B (zh) * | 2015-01-02 | 2020-12-22 | Beijing Times Wolin Technology Development Co., Ltd. | Method for proportional remote control of a remote-controlled device via touch on an intelligent terminal |
| CN104618658B (zh) * | 2015-02-26 | 2017-11-03 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Mobile terminal and rotating-camera control method thereof |
| CN104808799A (zh) * | 2015-05-20 | 2015-07-29 | Chengdu Tongjia Youbo Technology Co., Ltd. | Gesture-recognizing unmanned aerial vehicle and recognition method thereof |
| CN104853104B (zh) * | 2015-06-01 | 2018-08-28 | Shenzhen Weidui Information Technology Co., Ltd. | Method and system for automatically tracking and photographing a moving target |
| CN105391939B (zh) * | 2015-11-04 | 2017-09-29 | Tencent Technology (Shenzhen) Co., Ltd. | Unmanned aerial vehicle photography control method and device, unmanned aerial vehicle photography method, and unmanned aerial vehicle |
| CN105487552B (zh) * | 2016-01-07 | 2019-02-19 | Shenzhen AEE Aviation Technology Co., Ltd. | Method and device for unmanned aerial vehicle tracking photography |
Application events:
- 2016-04-20: CN application CN201610249037.7A filed, published as CN105867362A (status: active, pending)
- 2016-06-17: PCT application PCT/CN2016/086313 filed, published as WO2017181511A1 (status: not active, ceased)
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114137995A (zh) * | 2021-11-24 | 2022-03-04 | Guangdong Power Grid Co., Ltd. | Unmanned aerial vehicle control system and control method thereof |
Also Published As
| Publication number | Publication date |
|---|---|
| CN105867362A (zh) | 2016-08-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2017181511A1 (zh) | Terminal device and control system for unmanned aerial vehicle | |
| US11914370B2 (en) | System and method for providing easy-to-use release and auto-positioning for drone applications | |
| CN107000839A (zh) | Unmanned aerial vehicle control method, apparatus and device, and unmanned aerial vehicle control system | |
| US10587790B2 (en) | Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle | |
| EP3182202B1 (en) | Selfie-drone system and performing method thereof | |
| CN108476288A (zh) | Photography control method and device | |
| WO2021016897A1 (zh) | Aerial survey method, photography control method, aircraft, terminal, system, and storage medium | |
| WO2020107372A1 (zh) | Control method, apparatus and device for a photographing device, and storage medium | |
| US12132993B2 (en) | Display method, imaging method and related devices | |
| CN105959625A (zh) | Method and device for controlling unmanned aerial vehicle tracking photography | |
| CN110651466A (zh) | Photography control method and device for a movable platform | |
| CN107450573B (zh) | Flight photography control system and method, intelligent mobile communication terminal, and aircraft | |
| WO2018036040A1 (zh) | Photographing method and system for an intelligent device mounted on an unmanned aerial vehicle gimbal | |
| CN106227230A (zh) | Unmanned aerial vehicle control method | |
| CN107040716A (zh) | Method for controlling device movement and control system thereof | |
| CN112154391A (zh) | Method for determining an orbit route, aerial photography method, terminal, unmanned aerial vehicle and system | |
| JP5200800B2 (ja) | Imaging device and imaging system | |
| WO2022056683A1 (zh) | Field-of-view determination method, apparatus, system, and medium | |
| WO2017173502A1 (en) | Aerial devices, rotor assemblies for aerial devices, and device frameworks and methodologies configured to enable control of aerial devices | |
| CN110083174B (zh) | Unmanned aerial vehicle control method, device and system | |
| WO2022109860A1 (zh) | Method for tracking a target object, and gimbal | |
| JP2016058986A (ja) | Automatic tracking camera system | |
| WO2022000211A1 (zh) | Control method and device for a photographing system, movable platform, and storage medium | |
| CN119277200A (zh) | Method, apparatus, device and storage medium for switching the camera view angle of an unmanned device based on the type of remote control device | |
| WO2018214075A1 (zh) | Video image generation method and device | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 16899098; Country of ref document: EP; Kind code of ref document: A1 |
| | 32PN | Ep: public notification in the EP bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.04.2019) |
| | 122 | Ep: PCT application non-entry in European phase | Ref document number: 16899098; Country of ref document: EP; Kind code of ref document: A1 |