
WO2018184232A1 - Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle - Google Patents


Info

Publication number
WO2018184232A1
Authority
WO
WIPO (PCT)
Prior art keywords
remote control
somatosensory
information
command
remote
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/079790
Other languages
French (fr)
Chinese (zh)
Inventor
缪宝杰
刘怀宇
吴一凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to CN201780013879.5A priority Critical patent/CN108700893A/en
Priority to PCT/CN2017/079790 priority patent/WO2018184232A1/en
Publication of WO2018184232A1 publication Critical patent/WO2018184232A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C19/00Electric signal transmission systems

Definitions

  • The invention belongs to the technical field of remote control, in particular to the field of controlling unmanned aerial vehicles, and specifically relates to a somatosensory remote control method and a corresponding somatosensory control device, as well as a corresponding gimbal and unmanned aerial vehicle.
  • the invention can also be applied to other devices that require remote control, such as various unmanned vehicles.
  • Unmanned aerial vehicles have the advantages of good maneuverability, low cost and convenient use, and have been applied in many industries, such as aerial photography, agricultural plant protection, surveying and so on.
  • A virtual reality head-mounted display device, i.e. a VR (Virtual Reality) headset, includes VR glasses, VR eye masks, VR helmets, and the like.
  • The UAV can be connected with the virtual reality head-mounted display device to realize a first-person perspective: the images captured by the camera on the UAV are transmitted back to the head-mounted display in real time. Through a mobile terminal, the operator can directly control the throttle, attitude angle and flight speed of the aircraft, achieving very precise control.
  • At present, however, there is no VR glasses product in the field specifically designed for maneuvering unmanned aerial vehicles; most can only display the picture and cannot control the attitude of the drone.
  • One aspect of the present invention provides a somatosensory remote control method for remotely controlling an execution device by somatosensory means, comprising: acquiring somatosensory information; generating, according to the somatosensory information, a remote control command for controlling the execution device; and the execution device performing an action according to the remote control command.
  • a somatosensory control apparatus comprising: an acquisition module for acquiring somatosensory information; and an instruction generation module for generating a remote control instruction for controlling the execution device based on the somatosensory information.
  • Another aspect provides a gimbal (pan/tilt head), comprising: a receiving module configured to receive a remote control command generated from somatosensory information; and an execution module configured to perform an action according to the remote control command.
  • an unmanned aerial vehicle includes a receiving module for receiving a remote control command, the remote control command being generated by the somatosensory information, and an execution module for performing an action according to the remote control command.
  • Another aspect of the present invention provides a VR body sensing device including the somatosensory control device.
  • Figure 1 shows a block diagram of one embodiment of a somatosensory control device of the present invention.
  • Fig. 2 is a block diagram showing another embodiment of the somatosensory control device of the present invention.
  • Fig. 3 is a diagram showing an embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • Fig. 4 is a block diagram showing the module of the somatosensory control device and the executing device in the embodiment of Fig. 3.
  • Figures 5 to 8 are schematic diagrams of the control areas of the somatosensory control device and the corresponding postures of the VR somatosensory glasses.
  • Fig. 9 is a diagram showing another embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • Figures 10 and 11 are block diagrams of the somatosensory control device, the execution device, and the remote control device of the execution device in the embodiment of Figure 9.
  • the present invention provides a somatosensory remote control method in which the device is remotely controlled by a somatosensory method.
  • The term "somatosensory" as used herein refers to body perception, including the perception of various body state information of a person.
  • The body state information in turn includes body motion state information, such as motion duration, speed, acceleration, angle, angular velocity, posture, and the like.
  • An execution device (actuator) refers to any device or apparatus that can perform an action on command, such as a drone, or a gimbal mounted on a drone.
  • A drone is usually remotely controlled by a remote control device with a control handle, and the operator manipulates the drone through the handle. If the operator's own somatosensory information could be well utilized, the operator's operating load could be greatly reduced.
  • the starting point of the present invention is to utilize the operator's somatosensory information, that is, first, the somatosensory information needs to be acquired.
  • Any technique capable of acquiring somatosensory information can be used with the present invention; that is, the invention is not limited as to how the somatosensory information is acquired or what type of somatosensory information is used.
  • the present invention entails generating a remote command for controlling the execution device based on the somatosensory information; the executing device can thereby perform an action in accordance with the remote command.
  • a drone can perform an action based on a remote command of a remote controller, which is well known.
  • The somatosensory information cannot be used directly as a remote control command. The present invention therefore proposes converting the somatosensory information into a predetermined command format that the drone can read and recognize, so that the execution device can perform the corresponding action according to the remote control command in the usual manner.
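The patent leaves this "predetermined command format" unspecified. As a purely illustrative sketch — the frame header byte, the field layout, and all names below are assumptions, not anything defined in the document — a yaw/pitch command pair might be serialized into a fixed binary frame like this:

```python
import struct
from dataclasses import dataclass

@dataclass
class RemoteCommand:
    yaw: float    # desired yaw offset, in degrees (hypothetical field)
    pitch: float  # desired pitch offset, in degrees (hypothetical field)

def encode_command(cmd: RemoteCommand) -> bytes:
    # Pack into a fixed little-endian frame: 1 header byte + two float32 values.
    # The 0xA5 header byte is an arbitrary illustrative choice.
    return struct.pack("<Bff", 0xA5, cmd.yaw, cmd.pitch)

def decode_command(frame: bytes) -> RemoteCommand:
    header, yaw, pitch = struct.unpack("<Bff", frame)
    if header != 0xA5:
        raise ValueError("unexpected frame header")
    return RemoteCommand(yaw, pitch)
```

Any fixed frame layout that both the glasses and the drone agree on would serve the same purpose.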
  • Although the present invention is not limited to a specific device or element for acquiring somatosensory information, in view of convenience of handling and sense of presence, a wearable device is preferably used to collect the somatosensory information.
  • the wearable device is, for example, a smart bracelet, a smart watch, a virtual helmet, smart glasses, smart sports shoes, etc.
  • the wearable device can collect motion information or posture information of at least one part of the user as the body feeling information. That is, the present invention can embed a somatosensory control device in a wearable device.
  • The present invention more preferably employs wearable devices with virtual reality (VR) functions, such as VR helmets, VR eye masks, VR glasses, and the like. A VR device can display the captured picture so that the operator experiences a first-person immersive feeling.
  • However, existing VR devices do not have the function of obtaining the operator's somatosensory information.
  • The present invention therefore proposes giving the VR device the function of obtaining somatosensory information, thereby converting the movement of the operator, or of a part of the operator's body, into manipulation information for the unmanned vehicle, so that the operator obtains an immersive manipulation feeling.
  • In the following, VR glasses are described as an example carrier of the somatosensory control device.
  • FIG. 1 shows a block diagram of one embodiment of a somatosensory control device of the present invention.
  • The somatosensory control device 10 can be included in VR somatosensory glasses 1 and includes an acquisition module 11 and an instruction generation module 12.
  • the acquisition module 11 is configured to acquire the somatosensory information
  • the instruction generation module 12 is configured to generate a remote control instruction for controlling the execution device (not shown) according to the somatosensory information.
  • The somatosensory information acquired by the acquisition module 11, or the remote control command generated by the command generation module 12, can be directly transferred to the VR somatosensory glasses 1, and the VR somatosensory glasses 1 complete the transmission of the somatosensory information or remote control command and the interaction with the execution device.
  • Fig. 2 is a block diagram showing another embodiment of the somatosensory control device of the present invention.
  • The somatosensory control device 10 further includes a transmitting module 13, used to send the somatosensory information acquired by the acquisition module 11, or the remote control command generated by the instruction generation module 12, to the execution device or to the remote control device of the execution device. That is, the somatosensory control device 10 of this embodiment itself performs the transmission of the somatosensory information or remote control command and the interaction with the execution device.
  • Fig. 3 is a diagram showing an embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • The somatosensory control device 10 is included in the VR somatosensory glasses 1, which generate a remote control command based on the somatosensory information acquired by the somatosensory control device 10; the remote control command is transmitted from the VR somatosensory glasses 1 to the drone.
  • the flight control module or the pan/tilt control module of the drone 2 serves as the execution device (not shown).
  • Fig. 4 is a block diagram showing the module of the somatosensory control device and the executing device in the embodiment of Fig. 3.
  • The somatosensory control device 10 includes an acquisition module 11, an instruction generation module 12, and a transmission module 13.
  • the acquisition module 11 is configured to acquire the somatosensory information
  • the instruction generation module 12 is configured to generate a remote control instruction for controlling the execution device 20 according to the somatosensory information.
  • the sending module 13 is configured to send the somatosensory information acquired by the acquiring module 11 or the remote control command generated by the command generating module 12 to the executing device 20.
  • the execution device 20 includes an execution module 21, a main control module 22, and a receiving module 23.
  • the execution module is configured to perform a corresponding action in accordance with an instruction of the control module 22.
  • the execution module may be a component that controls the flight state of the drone, such as an engine, a rotor, a rudder, and the like.
  • the execution module may be an actuating mechanism that controls the attitude of the gimbal.
  • the invention is not limited to the specific configuration of the specific execution device and execution module.
  • the receiving module 23 of the executing device is configured to receive the somatosensory information or the remote control command from the transmitting module 13 of the somatosensory control device 10 by wire or wirelessly, and forward it to the control module 22.
  • the control module 22 generates a remote control command for controlling the execution module 21 according to the somatosensory information or a remote control command.
  • When the control module 22 receives the somatosensory information, it first generates a remote control command from the somatosensory information, and then generates an instruction for the execution module 21 according to that remote control command.
  • Before acquiring the somatosensory information, the acquisition module 11 further acquires the posture information of the somatosensory control device 10 at the moment the VR somatosensory glasses 1 are turned on.
  • The posture includes a direction, an angle, and the like, and this posture information of the somatosensory control device 10 at the moment the glasses are turned on is stored as reference information.
  • The step of acquiring the somatosensory information may then consist of acquiring the current posture information of the somatosensory control device 10 and generating the somatosensory information from the difference between the current posture information and the reference information. Since the somatosensory control device 10 is built into the VR somatosensory glasses 1, its posture information can also be regarded as the posture information of the glasses.
  • The reference information and the current attitude information of the VR somatosensory glasses 1 may be acquired by an inertial measurement element; that is, the acquisition module may be implemented by an inertial measurement element, which may include a gyroscope and an accelerometer.
  • When the VR somatosensory glasses 1 are turned on, the command generation module generates a return-to-center command, which can be sent to the execution device 20 through the sending module, so that the control module 22 of the execution device 20 controls the execution module to bring the execution device into an initial posture.
  • The initial posture may be, for example, that the three axes of the drone or the gimbal are perpendicular to each other, with the heading (yaw) axis vertical.
  • The control area of the somatosensory control device 10 includes a control dead zone and a control effective area. When the difference between the current posture and the reference information falls within the dead zone, the command generation module 12 does not generate a remote control command.
  • The control area of the somatosensory control device 10 further includes a control saturation zone: when the difference between the current posture and the reference information is greater than or equal to a preset threshold, the remote control command remains unchanged.
  • Figures 5 to 8 are schematic diagrams of the control areas of the somatosensory control device and the corresponding postures of the VR somatosensory glasses.
  • the angle of the heading axis (yaw axis) is taken as an example of the attitude information, but it should be understood that the attitude information may be other angles.
  • When the VR somatosensory glasses 1 are turned on, the acquisition module acquires the current posture information of the somatosensory control device 10 and uses it as the reference information, with the current yaw axis angle as the reference yaw axis angle; at that moment the yaw axis angle difference is 0.
  • The present invention sets an angle range as a dead zone: when the yaw axis angle difference is within the dead zone, the command generation module 12 of the somatosensory control device 10 does not generate a remote control command, so no remote control command is sent to the execution device. Thus no manipulation command reaches the execution device such as the drone, so that the drone and its gimbal do not overreact to the user's minute movements, ensuring the stability of the drone's operating state. The angular range of the dead zone is, for example, between -5 degrees and +5 degrees.
  • The posture balls in Figures 5 to 8 can be displayed on the screen of the VR somatosensory glasses 1 so that the user can intuitively feel the magnitude of the motion range and the accuracy of the operation.
  • When the yaw axis angle difference exceeds the dead zone, the "control effective area" shown in the figures is entered, and the command generation module 12 generates remote control commands to manipulate the execution device based on the magnitude of the yaw axis deflection. Specifically, the difference between the current yaw axis direction and the dead zone boundary is taken as the control angle α, a remote control command is generated based on α, and the execution device is instructed to act accordingly. For example, the yaw axis of the drone can be controlled to deflect by the angle α, or by an angle associated with α. When the execution device is the flight control module of the drone, a remote control command corresponding to a certain stick amount for deflecting the drone's yaw axis may be generated according to α, the stick amount having a correspondence with the angle α.
  • When the yaw axis angle difference exceeds the maximum boundary of the control effective area, the "control saturation zone" shown in the figures is entered. The command generation module 12 no longer generates remote control commands according to the actual yaw axis deflection angle, but continues to generate the remote control command corresponding to the yaw deflection at the maximum boundary of the effective area. Thus, however large the operator's motion, no instruction is issued that the execution device cannot execute, or that would be dangerous or cause malfunction during execution; this ensures that the drone and its gimbal operate within their design tolerances, maintaining the stability of flight and posture.
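The three control regions — dead zone, effective area, and saturation zone — amount to a piecewise mapping from the yaw-angle difference to the control angle α. A minimal sketch, assuming the ±5-degree dead zone the text gives as an example and an arbitrarily chosen 30-degree saturation boundary:

```python
DEAD_ZONE_DEG = 5.0    # example dead-zone boundary from the text (±5 degrees)
SATURATION_DEG = 30.0  # assumed maximum boundary of the control effective area

def control_angle(current_yaw_deg: float, reference_yaw_deg: float) -> float:
    """Map the yaw difference from the reference pose to a control angle.

    Dead zone       -> 0 (no command is generated)
    Effective area  -> difference minus the dead-zone boundary (the α of the text)
    Saturation zone -> clamped to the value at the effective-area boundary
    """
    diff = current_yaw_deg - reference_yaw_deg
    magnitude = abs(diff)
    if magnitude < DEAD_ZONE_DEG:
        return 0.0
    alpha = min(magnitude, SATURATION_DEG) - DEAD_ZONE_DEG
    return alpha if diff > 0 else -alpha
```

The returned α would then be converted to a stick amount or angle command; that conversion is device-specific and left out here.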
  • Fig. 9 is a diagram showing another embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • The somatosensory control device 10 is included in the VR somatosensory glasses 1, which generate a remote control command according to the somatosensory information acquired by the somatosensory control device 10; the remote control command is sent by the VR somatosensory glasses 1 to the drone 2. The flight control module or the gimbal control module of the drone 2 serves as the execution device (not shown).
  • In addition, the execution device has a remote control device 3, such as the remote controller of the drone.
  • the somatosensory information acquired by the somatosensory control device 10 or the generated remote control command may be transmitted to the remote control device 3 instead of being directly transmitted to the executing device.
  • the remote control device 3 itself has a transceiver module that can directly transfer the somatosensory information or remote control commands to the execution device 2.
  • The somatosensory control device 10 includes an acquisition module 11, an instruction generation module 12, and a transmission module 13.
  • the acquisition module 11 is configured to acquire the somatosensory information;
  • the instruction generation module 12 is configured to generate a remote control instruction for controlling the execution device 20 according to the somatosensory information.
  • the sending module 13 is configured to send the somatosensory information acquired by the acquiring module 11 or the remote control command generated by the command generating module 12 to the executing device 20.
  • the execution device 20 of this embodiment also includes an execution module 21, a main control module 22, and a receiving module 23.
  • the execution module 21 is configured to perform a corresponding action according to an instruction of the control module 22.
  • the receiving module 23 of the executing device is configured to receive the somatosensory information or the remote control command from the transmitting module 13 of the somatosensory control device 10 by wire or wirelessly, and forward it to the control module 22.
  • the control module 22 generates a remote control command for controlling the execution module 21 according to the somatosensory information or a remote control command.
  • When the control module 22 receives the somatosensory information, it first generates a remote control command from the somatosensory information, and then generates an instruction for the execution module 21 according to that remote control command.
  • The somatosensory control device 10 may not send the somatosensory information or the remote control command directly to the execution device, but may instead transmit it to the remote control device 3.
  • the remote control device 3 itself has a transceiver module 31 and a control module 32.
  • The transceiver module 31 can directly forward the received somatosensory information or remote control command to the execution device 2, or process the received somatosensory information or remote control command and then send it to the execution device 2.
  • The processing may be converting the somatosensory information into a remote control command, or a fusion of remote control commands. The fusion of remote control commands is further explained below.
  • Unmanned vehicles such as drones usually have matching remote controls, and the remote control itself can generate remote commands.
  • Here, the remote control command generated by the somatosensory control device 10 is referred to as the first remote control command, and the remote control command generated by the remote control device as the second remote control command.
  • the second remote command can be generated by a user input element such as a joystick.
  • the remote control device fuses the first remote control command and the second remote control command to form a third remote control command.
  • the fusion may be a direct superposition, or may be superimposed according to a predetermined weight, or calculated according to a predetermined linear or non-linear formula, and finally the third remote control command is obtained.
  • the merging operation can be implemented by the remote control device 3 or directly by the execution device.
  • Figures 10 and 11 show these two cases, respectively: in the case shown in Figure 10, the fusion module 33 is part of the control module 32 of the remote control device 3; in the case shown in Figure 11, the fusion module 34 is part of the control module 22 of the execution device 2.
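The fusion step can be sketched as a weighted superposition of the first (somatosensory) and second (stick) commands — one of the schemes the text names. The equal default weights and the scalar command representation are illustrative assumptions:

```python
def fuse_commands(first: float, second: float,
                  w_first: float = 0.5, w_second: float = 0.5) -> float:
    # Third remote control command = weighted superposition of the first
    # (somatosensory) and second (stick) commands. With both weights set
    # to 1.0 this reduces to the direct superposition the text also mentions;
    # a nonlinear formula could be substituted here as well.
    return w_first * first + w_second * second
```

Which side performs this computation (the remote control device 3 or the execution device 2) is exactly the difference between the two figures.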
  • In this way, the posture of the VR somatosensory glasses is used to control the yaw of the drone and the pitch of the gimbal: the yaw (left-right turning) of the glasses controls the yaw of the aircraft, and the pitch (up-down nodding) of the glasses controls the pitch of the gimbal, so that the motion of the glasses corresponds directly to the motion being controlled.
  • When the VR somatosensory glasses are turned on, the glasses periodically sample the yaw axis and pitch axis data of their own IMU (Inertial Measurement Unit) and calculate the angular difference (including yaw and pitch components) from the reference pose. This difference is then used as the angular offset of the gimbal or drone, and a remote control command is generated and sent to the drone or its gimbal. After the drone or gimbal receives the angle-control remote command, it adjusts its yaw and pitch angles according to the yaw and pitch components of the angular difference indicated by the command.
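The periodic sampling loop just described might look like the following sketch. `imu.read()` and `link.send()` are assumed placeholder interfaces — no real VR-glasses or drone API is implied — and the final return command follows the behavior described for switching the glasses off:

```python
import time

def control_loop(imu, link, reference_yaw: float, reference_pitch: float,
                 period_s: float = 0.05, is_on=lambda: True) -> None:
    """Periodically sample the glasses' IMU and send angle-control commands."""
    while is_on():
        yaw, pitch = imu.read()              # current yaw/pitch of the glasses
        d_yaw = yaw - reference_yaw          # yaw component of the difference
        d_pitch = pitch - reference_pitch    # pitch component of the difference
        link.send({"yaw": d_yaw, "pitch": d_pitch})
        time.sleep(period_s)
    # When the glasses are turned off, command the execution device to return.
    link.send({"return": True})
```

Sending the *difference* rather than the raw pose is what makes the pose at power-on act as the neutral reference.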
  • "Somatosensory control" has a higher priority than "remote control"; therefore, the control of the remote control device may be set to be ineffective while somatosensory control is active. Further, as a specific embodiment, when the VR somatosensory glasses are turned off, the periodic acquisition of somatosensory information stops, and at that point a remote control command is transmitted to command the drone or the gimbal serving as the execution device to return.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A body sensing remote control method and a body sensing control apparatus using said method, used for remote control of an actuator by means of a body sensing mode. The body sensing remote control method comprises: acquiring body sensing information; generating a remote control command used to control the actuator according to said body sensing information; the actuator executing an action according to the remote control command. The body sensing control apparatus at least comprises: an acquisition module, used to acquire body sensing information; and a command generation module, used to generate a remote control command used to control the execution apparatus according to the body sensing information. The present invention also relates to a corresponding gimbal and unmanned aerial vehicle.

Description

Somatosensory remote control method, control device, gimbal and unmanned aerial vehicle

Copyright statement

The disclosure of this patent document contains material that is subject to copyright protection. The copyright belongs to the copyright holder. The copyright holder has no objection to anyone reproducing the patent document or the patent disclosure as it appears in the official records and files of the Patent and Trademark Office.

Technical field

The invention belongs to the technical field of remote control, in particular to the field of controlling unmanned aerial vehicles, and specifically relates to a somatosensory remote control method and a corresponding somatosensory control device, as well as a corresponding gimbal and unmanned aerial vehicle. The invention can also be applied to other devices that require remote control, such as various unmanned vehicles.

Background

Unmanned aerial vehicles have the advantages of good maneuverability, low cost and convenient use, and have been applied in many industries, such as aerial photography, agricultural plant protection, and surveying. A virtual reality head-mounted display device, i.e. a VR (Virtual Reality) headset, includes VR glasses, VR eye masks, VR helmets, and the like. A UAV can be connected with a virtual reality head-mounted display device to realize a first-person perspective: the images captured by the camera on the UAV are transmitted back to the head-mounted display in real time. Through a mobile terminal, the operator can directly control the throttle, attitude angle and flight speed of the aircraft, achieving very precise control. At present, however, there is no VR glasses product in the field specifically designed for maneuvering unmanned aerial vehicles; most can only display the picture and cannot control the attitude of the drone.

Summary of the invention

One aspect of the present invention provides a somatosensory remote control method for remotely controlling an execution device by somatosensory means, comprising: acquiring somatosensory information; generating, according to the somatosensory information, a remote control command for controlling the execution device; and the execution device performing an action according to the remote control command.

Another aspect of the present invention provides a somatosensory control device, comprising: an acquisition module for acquiring somatosensory information; and an instruction generation module for generating, according to the somatosensory information, a remote control command for controlling an execution device.

Another aspect of the present invention provides a gimbal, comprising: a receiving module for receiving a remote control command generated from somatosensory information; and an execution module for performing an action according to the remote control command.

Another aspect of the present invention provides an unmanned aerial vehicle, comprising: a receiving module for receiving a remote control command generated from somatosensory information; and an execution module for performing an action according to the remote control command.

Another aspect of the present invention provides a VR somatosensory device including the somatosensory control device.

附图说明DRAWINGS

为了更完整地理解本发明及其优势，现在将参考结合附图的以下描述，其中：For a more complete understanding of the present invention and its advantages, reference will now be made to the following description taken in conjunction with the accompanying drawings, in which:

图1显示了本发明的体感控制装置的一个实施例的模块图。Figure 1 shows a block diagram of one embodiment of a somatosensory control device of the present invention.

图2显示了本发明的体感控制装置的另一个实施例的模块图。Fig. 2 is a block diagram showing another embodiment of the somatosensory control device of the present invention.

图3显示了本发明的体感控制装置与执行装置之间的信息交互方式的一个实施例的示意图。Fig. 3 is a diagram showing an embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.

图4显示了图3实施例中的体感控制装置与执行装置的模块架构图。Fig. 4 is a block diagram showing the module of the somatosensory control device and the executing device in the embodiment of Fig. 3.

图5至图8显示了所述体感控制装置的控制区域与VR体感眼镜的对应姿态示意图。5 to 8 are schematic diagrams showing the corresponding postures of the control area of the somatosensory control device and the VR body-sensing glasses.

图9显示了本发明的体感控制装置与执行装置之间的信息交互方式的另一个实施例的示意图。Fig. 9 is a diagram showing another embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.

图10和图11显示了图9实施例中的体感控制装置、执行装置及执行装置的遥控装置的模块架构图。10 and 11 are block diagrams showing the configuration of the somatosensory control device, the executing device, and the remote control device of the executing device in the embodiment of Fig. 9.

具体实施方式detailed description

根据结合附图对本发明示例性实施例的以下详细描述，本发明的其它方面、优势和突出特征对于本领域技术人员将变得显而易见。Other aspects, advantages, and salient features of the present invention will become apparent to those skilled in the art from the following detailed description of exemplary embodiments of the present invention, taken in conjunction with the accompanying drawings.

在本发明中,术语“包括”和“含有”及其派生词意为包括而非限制;术语“或”是包含性的,意为“和/或”。In the present invention, the terms "include" and "include" and their derivatives are intended to be inclusive and not limiting; the term "or" is inclusive, meaning "and/or".

在本说明书中，下述用于描述本发明原理的各种实施例只是说明，不应该以任何方式解释为限制发明的范围。参照附图的下述描述用于帮助全面理解由权利要求及其等同物限定的本发明的示例性实施例。下述描述包括多种具体细节来帮助理解，但这些细节应认为仅仅是示例性的。因此，本领域普通技术人员应认识到，在不背离本发明的范围和精神的情况下，可以对本文中描述的实施例进行多种改变和修改。此外，为了清楚和简洁起见，省略了公知功能和结构的描述。此外，贯穿附图，相同附图标记用于相同或相似的功能和操作。In the present specification, the various embodiments described below for explaining the principles of the present invention are merely illustrative and should not be construed in any way as limiting the scope of the invention. The following description, made with reference to the accompanying drawings, is intended to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. The description includes numerous specific details to assist in that understanding, but these details are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness. Furthermore, the same reference numerals are used throughout the drawings for the same or similar functions and operations.

总的来说，本发明提出一种体感遥控方法，即通过体感方式来遥控执行装置。这里所说的“体感”是指身体感知，包括对人的各种身体状态信息的感知。身体状态信息又包括身体的运动状态信息，包括运动时间、运动速度、运动加速度、运动角度、运动角速度、运动姿势等。这里所说的执行装置则泛指一切能够在指令下执行任何动作的设备或设备的一部分，例如无人机，或者无人机上的云台等。In general, the present invention provides a somatosensory remote control method, in which an execution device is remotely controlled in a somatosensory manner. The term "somatosensory" as used herein refers to body sensing, including the perception of various body state information of a person. The body state information in turn includes motion state information of the body, such as motion time, motion speed, motion acceleration, motion angle, motion angular velocity, motion posture, and the like. The term "execution device" as used herein refers generally to any device, or part of a device, capable of performing an action under a command, such as a drone, or a gimbal mounted on a drone.

无人机通常由带有控制手柄的遥控装置遥控，操作者对无人机的遥控离不开对手柄进行控制。如果无人机的操作者本身的体感信息也能被良好地利用，则可以大大降低操作者的操作负荷。A drone is usually remotely controlled by a remote control device with control sticks, and the operator's remote control of the drone is inseparable from manipulating those sticks. If the somatosensory information of the drone operator himself can also be put to good use, the operator's workload can be greatly reduced.

因此，首先，本发明的出发点是利用操作者的体感信息，即首先需获取体感信息。目前已存在多种获取体感信息的技术，本发明都可以利用。即，本发明并不限于如何获取体感信息以及什么类型的体感信息。Therefore, first of all, the starting point of the present invention is to utilize the operator's somatosensory information; that is, the somatosensory information must first be acquired. Various techniques for acquiring somatosensory information already exist, any of which can be utilized by the present invention. That is, the present invention is not limited as to how the somatosensory information is acquired or what type of somatosensory information is used.

其次,本发明需要根据所述体感信息产生用于控制执行装置的遥控指令;执行装置由此可根据所述遥控指令执行动作。举例来说,无人机可以基于遥控器的遥控指令来执行动作,这是众所周知的。但是,体感信息并不能直接作为遥控指令,因此,本发明提出将体感信息进行转换,使之成为能被无人机读取和识别的预定指令格式。由此,执行装置就能够以通常的方式根据遥控指令来执行相应的动作。Secondly, the present invention entails generating a remote command for controlling the execution device based on the somatosensory information; the executing device can thereby perform an action in accordance with the remote command. For example, a drone can perform an action based on a remote command of a remote controller, which is well known. However, the somatosensory information cannot be directly used as a remote control command. Therefore, the present invention proposes to convert the somatosensory information into a predetermined command format that can be read and recognized by the drone. Thereby, the executing device can perform the corresponding action in accordance with the remote control command in the usual manner.
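As an illustration of the conversion step described above — the patent does not specify a command format, so the frame layout, field names, and checksum scheme below are all assumptions — somatosensory information (here, yaw/pitch angle differences) might be packed into a fixed binary format that an execution device can read and recognize:

```python
import struct

# Hypothetical sketch, not from the patent: the frame layout, field
# names, and checksum scheme are all assumptions, illustrating how
# somatosensory information (yaw/pitch angle differences in degrees)
# could become a predetermined command format.
CMD_HEADER = 0xA5    # assumed frame marker
CMD_ATTITUDE = 0x01  # assumed command type: attitude offset

def encode_command(yaw_diff, pitch_diff):
    """Pack angle differences into a little-endian binary frame."""
    payload = struct.pack("<BBff", CMD_HEADER, CMD_ATTITUDE,
                          yaw_diff, pitch_diff)
    checksum = sum(payload) & 0xFF  # simple additive checksum
    return payload + bytes([checksum])

def decode_command(frame):
    """Reverse of encode_command; validates header and checksum."""
    header, cmd_type, yaw, pitch = struct.unpack("<BBff", frame[:-1])
    if header != CMD_HEADER or cmd_type != CMD_ATTITUDE:
        raise ValueError("unrecognized frame")
    if (sum(frame[:-1]) & 0xFF) != frame[-1]:
        raise ValueError("checksum mismatch")
    return yaw, pitch
```

The receiving side (the drone's flight controller or gimbal controller) would decode such a frame and act on the recovered angle components.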

尽管本发明不限于具体的获取体感信息的装置或元件,但是考虑到操控的便利性和现场感,本发明优选为使用可穿戴设备来采集体感信息。可穿戴设备例如是智能手环、智能手表、虚拟头盔、智能眼镜、智能运动鞋等,通常,可穿戴设备可以采集使用者的至少一部位的运动信息或姿态信息作为体感信息。也就是说,本发明可以在可穿戴设备中嵌入体感控制装置。Although the present invention is not limited to a specific device or element for acquiring somatosensory information, in view of convenience of handling and sense of presence, the present invention preferably uses a wearable device to collect somatosensory information. The wearable device is, for example, a smart bracelet, a smart watch, a virtual helmet, smart glasses, smart sports shoes, etc. Generally, the wearable device can collect motion information or posture information of at least one part of the user as the body feeling information. That is, the present invention can embed a somatosensory control device in a wearable device.

然而，对于遥控设备来说，特别是对于无人机等无人载具进行遥控的方式来说，本发明更优选为采用具有虚拟现实（VR）功能的可穿戴设备，例如VR头盔、VR眼罩、VR眼镜等。已存在将无人载具实时拍摄的画面传送到操控者的虚拟现实设备的技术，从而使得操控者通过遥控设备操控无人载具的时候，其VR设备在人的眼前实时再现无人载具的拍摄画面，从而使操作者产生第一视角的身临其境感。但是，现有的VR设备不具有获得操控者的体感信息的功能。由此，本发明提出使VR设备具有获得体感信息的功能，从而将人的运动或人的某个部位的运动转化为对无人载具的操控信息，以使得操控者获得一种沉浸式的操控感受。However, for remote control, and in particular for remotely controlling unmanned vehicles such as drones, the present invention more preferably employs a wearable device with virtual reality (VR) capability, such as a VR helmet, VR eye mask, or VR glasses. Technology already exists for transmitting the picture captured in real time by an unmanned vehicle to the operator's virtual reality device, so that while the operator controls the vehicle through a remote control device, the VR device reproduces the vehicle's camera feed before the operator's eyes in real time, giving the operator an immersive first-person view. However, existing VR devices do not have the function of acquiring the operator's somatosensory information. The present invention therefore proposes to give the VR device this function, converting the motion of the person, or of some part of the person, into control information for the unmanned vehicle, so that the operator obtains an immersive control experience.

下面以VR眼镜为例作为体感控制装置的载体进行说明。Hereinafter, the VR glasses will be described as a carrier of the body feeling control device as an example.

图1显示了本发明的体感控制装置的一个实施例的模块图。如图1所示，体感控制装置10可包含于一个VR体感眼镜1中，其包括一个获取模块11和一个指令产生模块12。获取模块11用于获取体感信息；指令产生模块12用于根据所述体感信息产生用于控制执行装置（未示出）的遥控指令。获取模块11所获取的体感信息或指令产生模块12产生的遥控指令可以直接转送给VR体感眼镜1，由VR体感眼镜1完成体感信息或遥控指令的传送及与执行装置的交互。Figure 1 shows a block diagram of one embodiment of the somatosensory control device of the present invention. As shown in FIG. 1, the somatosensory control device 10 may be included in VR somatosensory glasses 1 and includes an acquisition module 11 and a command generation module 12. The acquisition module 11 is configured to acquire somatosensory information; the command generation module 12 is configured to generate, according to the somatosensory information, a remote control command for controlling an execution device (not shown). The somatosensory information acquired by the acquisition module 11, or the remote control command generated by the command generation module 12, can be forwarded directly to the VR somatosensory glasses 1, which then transmit the somatosensory information or the remote control command and handle the interaction with the execution device.

图2显示了本发明的体感控制装置的另一个实施例的模块图。如图2所示,与图1所示的实施例不同的是,该体感控制装置10还包括有一个发送模块13。其用于发送获取模块11所获取的体感信息或指令产生模块12产生的遥控指令给所述执行装置或该执行装置的遥控装置。也就是说,该实施例的体感控制装置10自身完成体感信息或遥控指令的传送及与执行装置的交互。Fig. 2 is a block diagram showing another embodiment of the somatosensory control device of the present invention. As shown in FIG. 2, unlike the embodiment shown in FIG. 1, the somatosensory control device 10 further includes a transmitting module 13. It is used to send the somatosensory information acquired by the acquisition module 11 or the remote control command generated by the instruction generation module 12 to the execution device or the remote control device of the execution device. That is, the body feeling control device 10 of this embodiment itself performs the transmission of the body feeling information or the remote control command and the interaction with the execution device.

图3显示了本发明的体感控制装置与执行装置之间的信息交互方式的一个实施例的示意图。如图3所示，体感控制装置10包含于VR体感眼镜1中，所述VR体感眼镜1根据体感控制装置10获取的体感信息产生遥控指令，且遥控指令由VR体感眼镜1发送给无人机2。无人机2的飞控模块或云台控制模块作为所述的执行装置（图中未示）。Fig. 3 is a schematic diagram of an embodiment of the manner of information interaction between the somatosensory control device and the execution device of the present invention. As shown in FIG. 3, the somatosensory control device 10 is included in the VR somatosensory glasses 1, which generate a remote control command according to the somatosensory information acquired by the somatosensory control device 10, and the remote control command is sent by the VR somatosensory glasses 1 to the drone 2. The flight control module or the gimbal control module of the drone 2 serves as the execution device (not shown).

图4显示了图3实施例中的体感控制装置与执行装置的模块架构图。如图4所示，体感控制装置10包括获取模块11、指令产生模块12和发送模块13。获取模块11用于获取体感信息；指令产生模块12用于根据所述体感信息产生用于控制执行装置20的遥控指令。发送模块13用于发送获取模块11所获取的体感信息或指令产生模块12产生的遥控指令给所述执行装置20。Fig. 4 is a block diagram of the somatosensory control device and the execution device in the embodiment of Fig. 3. As shown in FIG. 4, the somatosensory control device 10 includes an acquisition module 11, a command generation module 12, and a sending module 13. The acquisition module 11 is configured to acquire somatosensory information; the command generation module 12 is configured to generate, according to the somatosensory information, a remote control command for controlling the execution device 20. The sending module 13 is configured to send the somatosensory information acquired by the acquisition module 11, or the remote control command generated by the command generation module 12, to the execution device 20.

执行装置20包括执行模块21、主控模块22和接收模块23。执行模块21用于根据控制模块22的指令执行相应的动作。当执行装置20是无人机时，执行模块可以是控制无人机飞行状态的部件，例如发动机、旋翼、机舵等。当执行装置是搭载于无人机的云台时，执行模块可以是控制云台的姿态的致动机构。本发明并不限于具体的执行装置和执行模块的具体构成。The execution device 20 includes an execution module 21, a main control module 22, and a receiving module 23. The execution module 21 is configured to perform a corresponding action according to an instruction from the control module 22. When the execution device 20 is a drone, the execution module may be a component that controls the flight state of the drone, such as an engine, a rotor, or a rudder. When the execution device is a gimbal mounted on the drone, the execution module may be an actuating mechanism that controls the attitude of the gimbal. The present invention is not limited to any specific execution device or any specific configuration of the execution module.

执行装置的接收模块23用于从体感控制装置10的发送模块13通过有线或无线方式接收体感信息或遥控指令，并将其转送给控制模块22。控制模块22根据所述体感信息或遥控指令产生控制所述执行模块21的遥控指令。当控制模块22接收的是体感信息时，其先根据体感信息来产生遥控指令，再根据遥控指令来产生执行模块21的遥控指令。The receiving module 23 of the execution device is configured to receive, by wire or wirelessly, the somatosensory information or the remote control command from the sending module 13 of the somatosensory control device 10, and to forward it to the control module 22. The control module 22 generates, according to the somatosensory information or the remote control command, a command for controlling the execution module 21. When the control module 22 receives somatosensory information, it first generates a remote control command according to the somatosensory information, and then generates the command for the execution module 21 according to that remote control command.

在图3显示的实施例中，获取模块11在所述获取体感信息之前，还包括在VR体感眼镜1开启时获取当前体感控制装置10的姿态信息的步骤。所述姿态包括方向、角度等，并将VR体感眼镜1开启时的体感控制装置10的姿态信息存储以作为参考信息。获取体感信息的步骤可以通过获取体感控制装置10的当前姿态信息，根据所述当前姿态信息与所述参考信息的差值，生成体感信息。由于体感控制装置10内置于VR体感眼镜1，因此，体感控制装置10的姿态信息也可以当作是VR体感眼镜1的姿态信息。In the embodiment shown in FIG. 3, before acquiring the somatosensory information, the acquisition module 11 further performs the step of acquiring the attitude information of the somatosensory control device 10 when the VR somatosensory glasses 1 are turned on. The attitude includes direction, angle, and the like, and the attitude information of the somatosensory control device 10 at the moment the VR somatosensory glasses 1 are turned on is stored as reference information. The step of acquiring the somatosensory information may acquire the current attitude information of the somatosensory control device 10 and generate the somatosensory information according to the difference between the current attitude information and the reference information. Since the somatosensory control device 10 is built into the VR somatosensory glasses 1, the attitude information of the somatosensory control device 10 can also be regarded as the attitude information of the VR somatosensory glasses 1.

所述VR体感眼镜1的参考信息以及当前姿态信息可通过惯性测量元件获取，即获取模块可以由惯性测量元件来实现，惯性测量元件可以包括陀螺仪和加速度计。The reference information and the current attitude information of the VR somatosensory glasses 1 may be acquired by an inertial measurement element; that is, the acquisition module may be implemented by an inertial measurement element, which may include a gyroscope and an accelerometer.
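The reference-capture and difference step described above can be sketched as follows (a minimal illustration; the class and method names are hypothetical, not from the patent):

```python
# Illustrative sketch (assumed names, not from the patent): store the
# attitude measured when the VR glasses are switched on as the
# reference, then report later IMU readings as differences from that
# reference; the differences serve as the somatosensory information.
class AttitudeTracker:
    def __init__(self, reference_yaw, reference_pitch):
        # captured once, when the glasses are turned on
        self.ref_yaw = reference_yaw
        self.ref_pitch = reference_pitch

    def somatosensory_info(self, current_yaw, current_pitch):
        """Return (yaw_diff, pitch_diff) relative to the reference."""
        return (current_yaw - self.ref_yaw,
                current_pitch - self.ref_pitch)
```

In practice the two angles would come from the gyroscope/accelerometer fusion inside the glasses' inertial measurement element.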

此外，该实施例中，开启所述VR体感眼镜1时，指令产生模块12产生一个回中指令，并通过发送模块13发送给所述执行装置20，以使所述执行装置20的控制模块22控制执行装置使得执行装置20的姿态处于初始姿态。对于无人机或无人机的云台来说，该初始姿态可以是无人机或云台的三个轴相互垂直，且航向轴（yaw轴）为竖直方向。In addition, in this embodiment, when the VR somatosensory glasses 1 are turned on, the command generation module 12 generates a return-to-center command and sends it through the sending module 13 to the execution device 20, so that the control module 22 of the execution device 20 controls the execution device to bring the attitude of the execution device 20 to an initial attitude. For a drone, or the gimbal of a drone, the initial attitude may be that the three axes of the drone or gimbal are perpendicular to each other and the heading axis (yaw axis) is vertical.

优选地，所述体感控制装置10的控制区域包括控制死区以及控制有效区，当所述当前姿态和参考信息的差值位于所述死区范围内时，则指令产生模块12不产生遥控指令；当所述当前姿态和参考信息的差值位于所述控制有效区范围内时，则根据所述差值生成遥控指令。更优选地，所述体感控制装置10的控制区域还包括控制饱和区，当所述当前姿态和参考信息的差值大于或等于预设阈值时，则遥控指令不变。Preferably, the control region of the somatosensory control device 10 includes a control dead zone and a control effective zone. When the difference between the current attitude and the reference information lies within the dead zone, the command generation module 12 does not generate a remote control command; when the difference lies within the control effective zone, a remote control command is generated according to the difference. More preferably, the control region of the somatosensory control device 10 further includes a control saturation zone: when the difference between the current attitude and the reference information is greater than or equal to a preset threshold, the remote control command remains unchanged.

图5至图8显示了所述体感控制装置的控制区域与VR体感眼镜的对应姿态示意图。该示意图中以航向轴(yaw轴)的角度作为姿态信息的例子,但应理解的是,姿态信息也可以是其他角度。如图5所示,当VR体感眼镜1开启时,获取模块获取当前体感控制装置20的姿态信息,并将此时获取的姿态信息作为参考信息,其中,当前的yaw轴角度作为参考yaw轴角度,yaw轴角度差为0。 5 to 8 are schematic diagrams showing the corresponding postures of the control area of the somatosensory control device and the VR body-sensing glasses. In the diagram, the angle of the heading axis (yaw axis) is taken as an example of the attitude information, but it should be understood that the attitude information may be other angles. As shown in FIG. 5, when the VR body glasses 1 is turned on, the acquisition module acquires the posture information of the current body feeling control device 20, and uses the posture information acquired at this time as reference information, wherein the current yaw axis angle is used as a reference yaw axis angle. The yaw axis angle difference is 0.

如图6所示，当VR体感眼镜转动一个角度时，体感控制装置10获取的当前yaw轴角度发生偏转，产生不为0的yaw轴角度差。当该yaw轴角度差值位于一个较小的角度范围内时，很可能是操作者的无意识动作造成，因此，本发明将该角度范围设为死区，当yaw轴角度差值位于死区内时，体感控制装置10的指令产生模块12不产生遥控指令，从而不对执行装置发送遥控指令。这样，只有当佩带VR体感眼镜的操控者的动作幅度足够大时，才发送操控命令给无人机等执行装置，以使得无人机及其云台不会对用户的微小动作作出过度的反应，从而保证无人机运行状态的稳定。死区的角度范围例如是-5度至+5度之间。As shown in FIG. 6, when the VR somatosensory glasses rotate by an angle, the current yaw axis angle acquired by the somatosensory control device 10 deflects, producing a non-zero yaw axis angle difference. When this yaw axis angle difference lies within a small angular range, it is likely the result of an unintentional movement of the operator; the present invention therefore designates that range as a dead zone. When the yaw axis angle difference lies within the dead zone, the command generation module 12 of the somatosensory control device 10 does not generate a remote control command, and thus no remote control command is sent to the execution device. In this way, a control command is sent to an execution device such as a drone only when the movement of the operator wearing the VR somatosensory glasses is sufficiently large, so that the drone and its gimbal do not overreact to the user's minute movements, thereby keeping the drone's operating state stable. The angular range of the dead zone is, for example, between -5 degrees and +5 degrees.

图5至图8中的姿态球可以显示于VR体感眼镜1的屏幕中,以便用户直观地感受到其动作幅度的大小及操作的精准度。The posture balls in FIGS. 5 to 8 can be displayed on the screen of the VR body-wearing glasses 1 so that the user can intuitively feel the magnitude of the motion range and the accuracy of the operation.

如图7所示，当VR体感眼镜转动的角度使得体感控制装置10获取的当前yaw轴角度差值超出死区的范围时，则进入图中所示的“控制有效区”，此时，指令产生模块12根据yaw轴方向偏转的大小来产生遥控指令以操控执行装置20。在该实施例中，将当前的yaw轴方向减去死区边界的角度方向的差值作为控制角α，并根据该α的角度产生遥控指令，并指示执行装置进行动作。当执行装置包含于无人机的云台内时，可以控制无人机的yaw轴偏转α角，或者偏转一个与α角相关的角度。当执行装置包含于无人机中作为飞控模块时，可根据该α角产生一个使无人机yaw轴偏转一定杆量值的遥控指令，该杆量值与α角具有对应关系。As shown in FIG. 7, when the VR somatosensory glasses rotate by an angle such that the current yaw axis angle difference acquired by the somatosensory control device 10 exceeds the dead zone, the "control effective zone" shown in the figure is entered. At this time, the command generation module 12 generates a remote control command according to the magnitude of the yaw axis deflection to control the execution device 20. In this embodiment, the difference between the current yaw axis direction and the angular direction of the dead zone boundary is taken as the control angle α; a remote control command is generated according to the angle α, instructing the execution device to act. When the execution device is included in the gimbal of the drone, the yaw axis can be controlled to deflect by the angle α, or by an angle related to α. When the execution device is included in the drone as a flight control module, a remote control command may be generated according to the angle α that deflects the drone's yaw axis by a certain stick amount, the stick amount corresponding to the angle α.

如图8所示，当VR体感眼镜转动的角度使得体感控制装置10获取的当前yaw轴角度差值超出控制有效区的范围时，则进入图中所示的“饱和控制区”，此时，指令产生模块12不再根据yaw轴偏转角的大小来生成遥控指令，而是仍以控制有效区的最大边界对应的yaw轴偏转角来生成遥控指令。这样，当操控者的动作幅度过大时，不会产生使执行装置不可执行或在执行时发生危险或故障的指令。对于无人机或其云台来说，可以保证无人机及其云台在设计容限内进行动作，保证飞行或姿态的稳定。As shown in FIG. 8, when the VR somatosensory glasses rotate by an angle such that the current yaw axis angle difference acquired by the somatosensory control device 10 exceeds the control effective zone, the "saturation control zone" shown in the figure is entered. At this time, the command generation module 12 no longer generates the remote control command according to the magnitude of the yaw axis deflection angle, but instead still generates the remote control command using the yaw axis deflection angle corresponding to the maximum boundary of the control effective zone. In this way, when the operator's movement is too large, no command is produced that the execution device cannot execute, or that would be dangerous or cause a failure during execution. For the drone or its gimbal, this ensures that the drone and its gimbal operate within their design tolerances, keeping flight or attitude stable.
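The three control regions described above can be sketched as a single mapping from angle difference to control angle (the threshold values here are illustrative, not from the patent):

```python
# Sketch of the three control regions: a dead zone where no command
# is generated, an effective zone where the control angle alpha is
# the difference minus the dead-zone boundary, and a saturation zone
# that clamps the command at the effective-zone maximum.
# Threshold values are assumptions for illustration.
DEAD_ZONE = 5.0    # degrees; e.g. -5..+5 produces no command
SATURATION = 45.0  # assumed maximum boundary of the effective zone

def control_angle(angle_diff):
    """Map a yaw angle difference to a control angle, or None."""
    magnitude = abs(angle_diff)
    if magnitude <= DEAD_ZONE:
        return None                       # dead zone: no command
    sign = 1.0 if angle_diff > 0 else -1.0
    alpha = min(magnitude, SATURATION) - DEAD_ZONE
    return sign * alpha                   # saturation clamps alpha
```

For example, a 15-degree head turn yields a 10-degree control angle, while any turn beyond 45 degrees yields the same saturated 40-degree command.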

图9显示了本发明的体感控制装置与执行装置之间的信息交互方式的另一个实施例的示意图。如图9所示，与前一实施例相似，体感控制装置10包含于VR体感眼镜1中，所述VR体感眼镜1根据体感控制装置10获取的体感信息产生遥控指令，且遥控指令由VR体感眼镜1发送给无人机2。无人机2的飞控模块或云台控制模块作为所述的执行装置（图中未示）。此外，执行装置还具有一个遥控装置3，例如无人机的遥控器。这样，体感控制装置10获取的体感信息或产生的遥控指令可以不直接发送给执行装置，而是发送给该遥控装置3。遥控装置3本身具有收发模块，可以将所述体感信息或遥控指令直接转送给执行装置2。Fig. 9 is a schematic diagram of another embodiment of the manner of information interaction between the somatosensory control device and the execution device of the present invention. As shown in FIG. 9, similarly to the previous embodiment, the somatosensory control device 10 is included in the VR somatosensory glasses 1, which generate a remote control command according to the somatosensory information acquired by the somatosensory control device 10, and the remote control command is sent by the VR somatosensory glasses 1 to the drone 2. The flight control module or the gimbal control module of the drone 2 serves as the execution device (not shown). In addition, the execution device also has a remote control device 3, for example the remote controller of the drone. In this way, the somatosensory information acquired by the somatosensory control device 10, or the remote control command it generates, may be sent not directly to the execution device but to the remote control device 3. The remote control device 3 itself has a transceiver module and can forward the somatosensory information or the remote control command directly to the execution device 2.

图10和图11显示了图9实施例中的体感控制装置、执行装置及执行装置的遥控装置的模块架构图。如图10所示，体感控制装置10包括获取模块11、指令产生模块12和发送模块13。获取模块11用于获取体感信息；指令产生模块12用于根据所述体感信息产生用于控制执行装置20的遥控指令。发送模块13用于发送获取模块11所获取的体感信息或指令产生模块12产生的遥控指令给所述执行装置20。Figures 10 and 11 are block diagrams of the somatosensory control device, the execution device, and the remote control device of the execution device in the embodiment of Fig. 9. As shown in FIG. 10, the somatosensory control device 10 includes an acquisition module 11, a command generation module 12, and a sending module 13. The acquisition module 11 is configured to acquire somatosensory information; the command generation module 12 is configured to generate, according to the somatosensory information, a remote control command for controlling the execution device 20. The sending module 13 is configured to send the somatosensory information acquired by the acquisition module 11, or the remote control command generated by the command generation module 12, to the execution device 20.

同样，该实施例的执行装置20也包括执行模块21、主控模块22和接收模块23。执行模块21用于根据控制模块22的指令执行相应的动作。执行装置的接收模块23用于从体感控制装置10的发送模块13通过有线或无线方式接收体感信息或遥控指令，并将其转送给控制模块22。控制模块22根据所述体感信息或遥控指令产生控制所述执行模块21的遥控指令。当控制模块22接收的是体感信息时，其先根据体感信息来产生遥控指令，再根据遥控指令来产生执行模块21的遥控指令。另一方面，体感控制装置10也可以不将体感信息或遥控指令直接发送给执行装置，而只是发送给遥控装置3。遥控装置3本身具有收发模块31和控制模块32，收发模块31可将收到的体感信息或遥控指令直接转发给执行装置2，或者对收到的体感信息或遥控指令进行处理后发送给执行装置2。所述的处理，可以是将体感信息转换为遥控指令，也可以是对遥控指令的融合。下面进一步来说明遥控指令的融合。Likewise, the execution device 20 of this embodiment also includes an execution module 21, a main control module 22, and a receiving module 23. The execution module 21 is configured to perform a corresponding action according to an instruction from the control module 22. The receiving module 23 of the execution device is configured to receive, by wire or wirelessly, the somatosensory information or the remote control command from the sending module 13 of the somatosensory control device 10, and to forward it to the control module 22. The control module 22 generates, according to the somatosensory information or the remote control command, a command for controlling the execution module 21. When the control module 22 receives somatosensory information, it first generates a remote control command according to the somatosensory information, and then generates the command for the execution module 21 according to that remote control command. On the other hand, the somatosensory control device 10 may also send the somatosensory information or the remote control command not directly to the execution device, but only to the remote control device 3. The remote control device 3 itself has a transceiver module 31 and a control module 32; the transceiver module 31 may directly forward the received somatosensory information or remote control command to the execution device 2, or may process it and then send it to the execution device 2. The processing may be converting the somatosensory information into a remote control command, or a fusion of remote control commands. The fusion of remote control commands is further explained below.

无人机等无人载具通常具有相配的遥控器，遥控器本身能产生遥控指令。为了便于区分，在此，我们将体感控制装置10产生的遥控指令称为第一遥控指令，将遥控装置产生的遥控指令作为第二遥控指令。第二遥控指令可由操作杆等用户输入元件产生。所述遥控装置将所述第一遥控指令和第二遥控指令融合形成第三遥控指令。所述的融合可以是直接的叠加，也可以是按照预定的权值进行叠加，或者按照预定的线性或非线性公式进行计算，最终得到所述第三遥控指令。Unmanned vehicles such as drones usually have a matching remote controller, and the remote controller itself can generate remote control commands. For ease of distinction, the remote control command generated by the somatosensory control device 10 is here called the first remote control command, and the remote control command generated by the remote control device is called the second remote control command. The second remote control command can be generated by a user input element such as a joystick. The remote control device fuses the first remote control command and the second remote control command to form a third remote control command. The fusion may be a direct superposition, a superposition according to predetermined weights, or a calculation according to a predetermined linear or non-linear formula, finally yielding the third remote control command.
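One of the fusion options named above, weighted superposition, can be sketched as follows (the weight value is an assumption; the patent allows direct superposition, weighted superposition, or any predetermined linear or non-linear formula):

```python
# Sketch of weighted-superposition command fusion (weights are an
# assumption for illustration): the first (somatosensory) command
# and the second (joystick) command are combined into a third
# command on a per-axis basis.
def fuse_commands(first_cmd, second_cmd, w_somatosensory=0.7):
    """Fuse the somatosensory command (first) and the joystick
    command (second) into a third command by weighted sum."""
    w_stick = 1.0 - w_somatosensory
    return w_somatosensory * first_cmd + w_stick * second_cmd
```

With equal weights this reduces to an averaged superposition; with `w_somatosensory=1.0` the somatosensory command fully overrides the stick, matching the priority scheme described later.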

所述的融合操作既可由遥控装置3来实现，亦可由执行装置直接进行。图10和图11分别显示了这两种情形：在图10所示的情形中，融合模块33作为遥控装置3的控制模块32的一个部分；在图11所示的情形中，融合模块34则是作为执行装置2的控制模块22的一个部分。The fusion operation can be implemented either by the remote control device 3 or directly by the execution device. Figures 10 and 11 show these two cases respectively: in the case shown in FIG. 10, the fusion module 33 is a part of the control module 32 of the remote control device 3; in the case shown in FIG. 11, the fusion module 34 is a part of the control module 22 of the execution device 2.

在上述实施例中，实现了使用VR体感眼镜的姿态控制无人机的偏航以及云台的俯仰（pitch）等操作。眼镜姿态的yaw（左右转头）方向控制飞行器偏航，眼镜姿态的pitch（低头抬头）方向控制云台俯仰，对云台俯仰的控制与体感控制云台的方式一致。In the above embodiment, operations such as controlling the yaw of the drone and the pitch of the gimbal using the attitude of the VR somatosensory glasses are realized. The yaw direction of the glasses' attitude (turning the head left and right) controls the yaw of the aircraft, and the pitch direction (lowering and raising the head) controls the pitch of the gimbal; the control of the gimbal pitch is consistent with somatosensory control of a gimbal.

当VR体感眼镜开启后，眼镜周期性采样自身IMU（惯性测量元件）的yaw轴和pitch轴数据，计算与参考姿态之间的角度差（含yaw和pitch分量），然后以这个差值作为云台或无人机的角度偏移，生成遥控指令送到无人机或其云台。无人机或云台收到角度控制的遥控指令后，根据指令指示的角度差的yaw和pitch分量，分别调整无人机或云台的yaw和pitch角度。After the VR somatosensory glasses are turned on, the glasses periodically sample the yaw-axis and pitch-axis data of their own IMU (inertial measurement unit), calculate the angular difference (including yaw and pitch components) from the reference attitude, and then use this difference as the angular offset of the gimbal or drone, generating a remote control command that is sent to the drone or its gimbal. After the drone or gimbal receives the angle-control remote command, it adjusts its yaw and pitch angles according to the yaw and pitch components of the angular difference indicated by the command.
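The periodic cycle described above can be sketched as follows (function and variable names are hypothetical; a real implementation would run this against live IMU readings at a fixed sampling rate):

```python
# Illustrative sketch (assumed names): for each periodic IMU sample
# of (yaw, pitch), compute the difference from the reference pose;
# each difference is the angle-offset command sent to the drone or
# its gimbal, which then adjusts its yaw and pitch accordingly.
def control_cycle(imu_samples, ref_yaw, ref_pitch):
    """Return one (yaw_offset, pitch_offset) command per sample."""
    commands = []
    for yaw, pitch in imu_samples:
        commands.append((yaw - ref_yaw, pitch - ref_pitch))
    return commands
```

A stationary head produces zero-offset commands, so the drone and gimbal hold their attitude until the operator actually moves.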

根据另一种实施方式，“体感控制”的优先级比“遥控装置控制”的优先级更高。因此，可以设置在体感控制时，遥控装置的控制不生效。此外，作为一种具体实施方式，当VR体感眼镜关闭时，则停止周期性地采集体感信息，此时，发送遥控指令以控制作为执行装置的无人机或云台进行回中。According to another embodiment, "somatosensory control" has a higher priority than "remote control device control". Therefore, it can be arranged that while somatosensory control is active, control from the remote control device does not take effect. In addition, as a specific embodiment, when the VR somatosensory glasses are turned off, the periodic acquisition of somatosensory information stops; at this time, a remote control command is sent to return the drone or gimbal serving as the execution device to center.

尽管已经参照本发明的特定示例性实施例示出并描述了本发明，但是本领域技术人员应该理解，在不背离所附权利要求及其等同物限定的本发明的精神和范围的情况下，可以对本发明进行形式和细节上的多种改变。因此，本发明的范围不应该限于上述实施例，而是应该不仅由所附权利要求来进行确定，还由所附权利要求的等同物来进行限定。Although the present invention has been shown and described with reference to specific exemplary embodiments thereof, those skilled in the art will understand that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents. Therefore, the scope of the invention should not be limited to the above embodiments, but should be determined not only by the appended claims but also by their equivalents.

Claims (41)

一种体感遥控方法,其用于通过体感方式遥控执行装置,包括:A somatosensory remote control method for remotely executing a device by a somatosensory method, comprising: 获取体感信息;Get somatosensory information; 根据所述体感信息产生用于控制所述执行装置的遥控指令;Generating a remote control command for controlling the execution device according to the somatosensory information; 执行装置根据所述遥控指令执行动作。The executing device performs an action according to the remote command. 如权利要求1所述的体感遥控方法,其中,The somatosensory remote control method according to claim 1, wherein 所述体感信息为可穿戴设备采集的体感信息。The somatosensory information is somatosensory information collected by the wearable device. 如权利要求2所述的体感遥控方法,其中,所述可穿戴设备为VR设备,所述体感信息为该VR设备的偏转角度信息。The somatosensory remote control method according to claim 2, wherein the wearable device is a VR device, and the somatosensory information is deflection angle information of the VR device. 如权利要求2所述的体感遥控方法,其中,在所述获取体感信息之前,还包括:The somatosensory remote control method according to claim 2, wherein before the obtaining the somatosensory information, the method further comprises: 获取所述可穿戴设备开启时的姿态信息,并将所述可穿戴设备开启时的姿态信息存储以作为参考信息。Obtaining posture information when the wearable device is turned on, and storing posture information when the wearable device is turned on as reference information. 如权利要求4所述的体感遥控方法,其中,所述获取体感信息的步骤包括:The somatosensory remote control method according to claim 4, wherein the step of acquiring the somatosensory information comprises: 获取所述可穿戴设备的当前姿态信息,根据所述当前姿态信息与所述参考信息的差值,生成体感信息。Acquiring the current posture information of the wearable device, and generating the somatosensory information according to the difference between the current posture information and the reference information. 如权利要求5所述的体感遥控方法,其中,通过惯性测量单元获取所述可穿戴设备开启时的参考信息以及当前姿态信息。The somatosensory remote control method according to claim 5, wherein the reference information when the wearable device is turned on and the current posture information are acquired by the inertial measurement unit. 
7. The somatosensory remote control method according to any one of claims 2 to 6, wherein, when the wearable device is turned on, a command is sent to the execution device so that the execution device assumes an initial posture.
8. The somatosensory remote control method according to claim 5, wherein the control region of the wearable device comprises a control dead zone and a control effective zone; when the difference between the current posture and the reference information falls within the dead zone, it is determined that the motion information generates no remote control command; and when the difference falls within the effective control range, it is determined that a remote control command is generated according to the difference.
9. The somatosensory remote control method according to claim 8, wherein the control region of the wearable device further comprises a control saturation zone; when the difference between the current posture and the reference information is greater than or equal to a preset threshold, the remote control command remains unchanged.
10. The somatosensory remote control method according to claim 2, wherein the generating of the remote control command for controlling the execution device according to the somatosensory information comprises: generating, by the wearable device, the remote control command according to the somatosensory information, the remote control command being sent by the wearable device to the execution device.
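Claims 8 and 9 partition the wearable device's control region into a dead zone, an effective zone, and a saturation zone. One plausible mapping from attitude offset to a command rate is sketched below; the thresholds, the linear ramp, and the function name are illustrative assumptions, since the claims fix none of them:

```python
def attitude_to_command(current, reference, dead_zone=2.0,
                        saturation=45.0, max_rate=90.0):
    """Map an attitude offset (degrees) to a command rate (deg/s).

    Illustrative values: dead_zone/saturation thresholds and the
    linear ramp between them are assumptions, not the patented rule.
    """
    diff = current - reference
    if abs(diff) < dead_zone:
        # Control dead zone: the motion produces no remote control command.
        return 0.0
    if abs(diff) >= saturation:
        # Control saturation zone: the command no longer changes (clamped).
        return max_rate if diff > 0 else -max_rate
    # Control effective zone: command generated according to the difference,
    # here a linear ramp from 0 at the dead-zone edge to max_rate at saturation.
    span = saturation - dead_zone
    magnitude = (abs(diff) - dead_zone) / span * max_rate
    return magnitude if diff > 0 else -magnitude
```

Small tremors of the head thus produce no command, while large offsets cannot drive the command past its limit.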
11. The somatosensory remote control method according to claim 10, wherein: the remote control command generated by the wearable device according to the somatosensory information is taken as a first remote control command; and in the step of the execution device performing an action according to the remote control command, the execution device further receives a second remote control command from a remote control device, fuses the first remote control command and the second remote control command into a third remote control command, and performs an action according to the third remote control command.
12. The somatosensory remote control method according to claim 2, wherein the generating of the remote control command for controlling the execution device according to the somatosensory information comprises: the wearable device first sends the somatosensory information to a remote control device, and the remote control device generates the remote control command according to the somatosensory information.
13. The somatosensory remote control method according to claim 12, wherein: the remote control command generated by the remote control device according to the somatosensory information is taken as a first remote control command; the remote control device further receives a user operation to generate a second remote control command; the remote control device or the execution device fuses the first remote control command and the second remote control command into a third remote control command; and the execution device performs an action according to the third remote control command.
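Claims 11 and 13 fuse a somatosensory command (first) with a command from a remote control device (second) into a third command, without fixing the fusion rule. A per-axis sum clamped to a normalized stick range is one possible sketch; the function name and the [-1, 1] range are assumptions:

```python
def fuse_commands(first, second):
    """Fuse two per-axis remote control commands into a third.

    `first` and `second` map axis names to normalized stick values.
    The sum-and-clamp rule is illustrative, not the patented method.
    """
    third = {}
    for axis in ("yaw", "pitch"):
        value = first.get(axis, 0.0) + second.get(axis, 0.0)
        third[axis] = max(-1.0, min(1.0, value))  # clamp to [-1, 1]
    return third
```

Under this rule either input alone drives the execution device, and simultaneous inputs combine up to the stick limit.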
14. The somatosensory remote control method according to any one of claims 1 to 13, wherein the somatosensory information comprises motion information about a heading (yaw) axis and/or motion information about a pitch axis.
15. The somatosensory remote control method according to claim 14, wherein the motion information comprises a deflection angle and/or a deflection speed.
16. The somatosensory remote control method according to any one of claims 1 to 13, wherein the execution device is a gimbal, and the remote control command comprises a yaw motion command and/or a pitch motion command of the gimbal.
17. The somatosensory remote control method according to any one of claims 1 to 13, wherein the execution device is an unmanned aerial vehicle, and the remote control command comprises stick values for controlling the unmanned aerial vehicle.
18. A somatosensory control apparatus, comprising: an acquisition module configured to acquire somatosensory information; and a command generation module configured to generate, according to the somatosensory information, a remote control command for controlling an execution device.
19. The somatosensory control apparatus according to claim 18, further comprising a transmission module configured to send the somatosensory information or the remote control command to the execution device or to a remote control device of the execution device.
20. The somatosensory control apparatus according to claim 19, wherein the apparatus is included in a wearable device.
21. The somatosensory control apparatus according to claim 20, wherein the wearable device is a VR device.
22. The somatosensory control apparatus according to claim 20, wherein the acquisition module is further configured to acquire posture information of the wearable device at the time it is turned on, and to store that posture information as reference information.
23. The somatosensory control apparatus according to claim 22, wherein the acquisition module is configured to acquire current posture information of the wearable device and to generate the somatosensory information according to the difference between the current posture information and the reference information.
24. The somatosensory control apparatus according to claim 23, wherein the acquisition module is an inertial measurement unit configured to acquire the reference information at the time the wearable device is turned on and the current posture information.
25. The somatosensory control apparatus according to any one of claims 20 to 24, wherein the transmission module is further configured to send, when the wearable device is turned on, a command to the execution device or to the remote control device of the execution device so that the execution device assumes an initial posture.
26. The somatosensory control apparatus according to any one of claims 18 to 25, wherein the somatosensory information comprises motion information about a heading axis and/or motion information about a pitch axis.
27. The somatosensory control apparatus according to claim 26, wherein the motion information comprises a deflection angle and/or a deflection speed.
28. The somatosensory control apparatus according to any one of claims 18 to 25, wherein the remote control command comprises a yaw motion command and/or a pitch motion command for controlling a gimbal.
29. The somatosensory control apparatus according to any one of claims 18 to 25, wherein the execution device is an unmanned aerial vehicle, and the remote control command comprises stick values for controlling the unmanned aerial vehicle.
30. A gimbal, comprising: a receiving module configured to receive a remote control command generated from somatosensory information; and an execution module configured to perform an action according to the remote control command.
31. The gimbal according to claim 30, wherein the receiving module is configured to receive somatosensory information collected by a wearable device.
32. The gimbal according to claim 31, wherein: the remote control command generated by the wearable device according to the somatosensory information is taken as a first remote control command; the receiving module further receives a second remote control command from a remote control device; the gimbal further comprises a fusion module configured to fuse the first remote control command and the second remote control command into a third remote control command; and the execution module performs an action according to the third remote control command.
33. The gimbal according to any one of claims 30 to 32, wherein the somatosensory information comprises motion information about a heading axis and/or motion information about a pitch axis.
34. The gimbal according to any one of claims 30 to 32, wherein the remote control command comprises a yaw motion command and/or a pitch motion command of the gimbal.
35. An unmanned aerial vehicle, comprising: a receiving module configured to receive a remote control command generated from somatosensory information; and an execution module configured to perform an action according to the remote control command.
36. The unmanned aerial vehicle according to claim 35, wherein the receiving module is configured to receive somatosensory information collected by a wearable device.
37. The unmanned aerial vehicle according to claim 36, wherein: the remote control command generated by the wearable device according to the somatosensory information is taken as a first remote control command; the receiving module further receives a second remote control command from a remote control device; the unmanned aerial vehicle further comprises a fusion module configured to fuse the first remote control command and the second remote control command into a third remote control command; and the execution module performs an action according to the third remote control command.
38. The unmanned aerial vehicle according to any one of claims 35 to 37, further comprising a flight control module configured to control, according to the remote control command, the execution module to perform the indicated action.
39. The unmanned aerial vehicle according to claim 38, further comprising a gimbal, the gimbal comprising at least a component rotatable about a heading axis and a pitch axis, wherein: the flight control module controls, according to the remote control command, the unmanned aerial vehicle itself to move about the heading axis and/or the pitch axis; or the flight control module controls, according to the remote control command, the gimbal to move about the heading axis and/or the pitch axis.
40. The unmanned aerial vehicle according to claim 39, wherein the flight control module controls, according to the remote control command, the unmanned aerial vehicle and/or the gimbal to return to an initial state.
41. A VR somatosensory device, comprising the somatosensory control apparatus according to any one of claims 18 to 29.
PCT/CN2017/079790 2017-04-07 2017-04-07 Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle Ceased WO2018184232A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780013879.5A CN108700893A (en) 2017-04-07 2017-04-07 Body-sensing remote control method, control device, holder and unmanned vehicle
PCT/CN2017/079790 WO2018184232A1 (en) 2017-04-07 2017-04-07 Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/079790 WO2018184232A1 (en) 2017-04-07 2017-04-07 Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2018184232A1 true WO2018184232A1 (en) 2018-10-11

Family

ID=63712994

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/079790 Ceased WO2018184232A1 (en) 2017-04-07 2017-04-07 Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN108700893A (en)
WO (1) WO2018184232A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112383345A (en) * 2020-07-15 2021-02-19 刘杨 Distributed remote control device
CN114641747A (en) * 2020-12-25 2022-06-17 深圳市大疆创新科技有限公司 Control method of movable platform, somatosensory remote controller and storage medium
CN114641743A (en) * 2020-12-25 2022-06-17 深圳市大疆创新科技有限公司 Unmanned aerial vehicle, control method and system thereof, handheld control equipment and head-mounted equipment

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111247494A (en) * 2019-01-29 2020-06-05 深圳市大疆创新科技有限公司 Control method and device for movable platform and movable platform
CN110162088B (en) * 2019-05-16 2022-01-04 沈阳无距科技有限公司 Unmanned aerial vehicle control method and device, unmanned aerial vehicle, wearable device and storage medium
CN111712861A (en) * 2019-05-24 2020-09-25 深圳市大疆创新科技有限公司 Control method of remote control equipment and remote control equipment
CN111123965A (en) * 2019-12-24 2020-05-08 中国航空工业集团公司沈阳飞机设计研究所 Somatosensory operation method and operation platform for aircraft control
CN114830070A (en) * 2020-12-25 2022-07-29 深圳市大疆创新科技有限公司 Flight guidance method, motor calibration method, display device and readable storage medium
CN116710870A (en) * 2021-03-16 2023-09-05 深圳市大疆创新科技有限公司 Control method, device and storage medium based on somatosensory remote controller
CN114815880A (en) * 2022-04-01 2022-07-29 深圳互酷科技有限公司 Unmanned aerial vehicle control method, device and equipment based on handle and readable storage medium
CN119909368A (en) * 2025-04-02 2025-05-02 天津金晟保科技有限公司 Control device for unmanned equipment and unmanned equipment system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205103661U (en) * 2015-07-24 2016-03-23 刘思成 Unmanned aerial vehicle control system based on somatosensory control technology
CN105469579A (en) * 2015-12-31 2016-04-06 北京臻迪机器人有限公司 Somatosensory remote control and somatosensory remote control flying system and method
CN105739525A (en) * 2016-02-14 2016-07-06 普宙飞行器科技(深圳)有限公司 System of matching somatosensory operation to realize virtual flight
WO2016168117A2 (en) * 2015-04-14 2016-10-20 John James Daniels Wearable electric, multi-sensory, human/machine, human/human interfaces
CN106155090A (en) * 2016-08-29 2016-11-23 电子科技大学 Wearable drone control device based on somatosensory

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105573330B (en) * 2015-03-03 2018-11-09 广州亿航智能技术有限公司 Aircraft control method based on intelligent terminal
US9836117B2 (en) * 2015-05-28 2017-12-05 Microsoft Technology Licensing, Llc Autonomous drones for tactile feedback in immersive virtual reality
CN105828062A (en) * 2016-03-23 2016-08-03 常州视线电子科技有限公司 Unmanned aerial vehicle 3D virtual reality shooting system
CN106227231A (en) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 The control method of unmanned plane, body feeling interaction device and unmanned plane


Also Published As

Publication number Publication date
CN108700893A (en) 2018-10-23

Similar Documents

Publication Publication Date Title
WO2018184232A1 (en) Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle
CN106125747A (en) Based on the servo-actuated Towed bird system in unmanned aerial vehicle onboard the first visual angle mutual for VR
JP6642432B2 (en) Information processing apparatus, information processing method, and image display system
CN108769531B (en) Method for controlling shooting angle of shooting device, control device and remote controller
JP2017163265A (en) Controlling support system, information processing device, and program
JP4012749B2 (en) Remote control system
US11804052B2 (en) Method for setting target flight path of aircraft, target flight path setting system, and program for setting target flight path
WO2015154627A1 (en) Virtual reality component system
JP7390541B2 (en) Animation production system
KR20170090888A (en) Apparatus for unmanned aerial vehicle controlling using head mounted display
WO2019019398A1 (en) Remote controller and unmanned aircraft system
CN106327583A (en) Virtual reality equipment for realizing panoramic image photographing and realization method thereof
CN115175119B (en) A VR follow-up control system suitable for mobile robots
CN205983222U (en) Unmanned aerial vehicle machine carries hardware connection structure of first visual angle nacelle device
Correia Marques et al. Immersive commodity telepresence with the avatrina robot avatar
JP6744033B2 (en) Mobile control system, control signal transmission system, mobile control method, program, and recording medium
WO2021059361A1 (en) Animation production method
JP7024997B2 (en) Aircraft Maneuvering System and How to Maneuver an Aircraft Using an Aircraft Maneuvering System
KR102263227B1 (en) Method and apparatus for control unmanned vehicle using artificial reality with relative navigation information
KR101973174B1 (en) Apparatus for controlling drone based on gesture-recognition and method for using the same
JP6964302B2 (en) Animation production method
O'Keeffe et al. Oculus rift application for training drone pilots
KR101896239B1 (en) System for controlling drone using motion capture
WO2024000189A1 (en) Control method, head-mounted display device, control system and storage medium
Martínez-Carranza et al. On combining wearable sensors and visual SLAM for remote controlling of low-cost micro aerial vehicles

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17904817

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17904817

Country of ref document: EP

Kind code of ref document: A1