WO2018133593A1 - Control device and method for a smart terminal - Google Patents
Control device and method for a smart terminal
- Publication number
- WO2018133593A1 (PCT/CN2017/115961)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- smart terminal
- screen
- instruction
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to the field of UAV technologies, and in particular, to a control method and apparatus for an intelligent terminal.
- drone applications are becoming gradually more popular, and performing aerial photography with a camera carried by a drone is increasingly common, conveniently collecting images for the user.
- an object of the present invention is to provide a control method and apparatus for an intelligent terminal, so that the intelligent terminal can be controlled accurately while the user is wearing it.
- a method for controlling a smart terminal is applied to a mobile terminal, the method comprising: sensing a gesture operation, the gesture operation being formed by a user touching a screen of the mobile terminal; and recognizing the gesture operation and retrieving a gesture instruction from a database preset in the mobile terminal, where the database includes a plurality of gesture instructions and the gesture instruction corresponds to the gesture operation;
- the gesture instruction is sent to the smart terminal, where the smart terminal is worn on the head of the user and has a screen for displaying an image, and the smart terminal controls the image displayed on its screen to change according to the gesture instruction.
- when the smart terminal controls the image displayed on its screen to change according to the gesture instruction, the method includes: displaying, on the screen of the smart terminal, a field-of-view image transmitted by an external device, and controlling the field-of-view image to move in the direction corresponding to the gesture instruction.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is lower than a preset threshold;
- when the smart terminal controls the image displayed on its screen to change according to the gesture instruction, the method includes: displaying, on the screen of the smart terminal, a field-of-view image transmitted by the external device, and, upon receiving the first gesture instruction, drawing a plurality of function icons on the field-of-view image;
- the smart terminal selects at least one of the plurality of function icons according to the second gesture instruction, and executes a function corresponding to the selected function icon.
- the functions corresponding to the plurality of function icons include: sending a flight instruction to a drone that communicates with the smart terminal, to control the flight of the drone.
- the foregoing method further includes: the smart terminal drawing a gesture operation prompt on the view image to prompt the user to complete the gesture operation on a screen of the mobile terminal.
- a control device for an intelligent terminal is applied to a mobile terminal, the device comprising: a sensing module that senses a gesture operation, the gesture operation being formed by a user touching a screen of the mobile terminal; a recognition module that recognizes the gesture operation and retrieves a gesture instruction from a database preset in the mobile terminal, where the database includes a plurality of gesture instructions and the gesture instruction corresponds to the gesture operation;
- and a sending module that sends the gesture instruction to the smart terminal, where the smart terminal is worn on the head of the user and has a screen for displaying an image, and the smart terminal controls the image displayed on its screen to change according to the gesture instruction.
- the screen of the smart terminal displays a field-of-view image transmitted by the external device, and controls the field-of-view image to move in the direction corresponding to the gesture instruction.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is lower than a preset threshold; when the screen of the smart terminal displays a field-of-view image transmitted by the external device and receives the first gesture instruction, a plurality of function icons are drawn on the field-of-view image; the smart terminal selects at least one of the plurality of function icons according to the second gesture instruction, and executes the function corresponding to the selected icon.
- the functions corresponding to the plurality of function icons include: sending a flight instruction to the drone that communicates with the smart terminal, to control the flight of the drone.
- the smart terminal further draws a gesture operation prompt on the view image to prompt the user to complete the gesture operation on a screen of the mobile terminal.
- a control method and apparatus for an intelligent terminal of the present invention have at least the following advantages:
- the control of the smart terminal is implemented through gesture actions; the user can still accurately perceive the location of the mobile terminal while wearing the smart terminal, so accurate gesture operations on the mobile terminal remain possible, and the image displayed on the screen of the smart terminal can accordingly be controlled based on those gesture operations.
- FIG. 1 is a flowchart of a method for controlling an intelligent terminal according to an embodiment of the present invention
- FIG. 2 is a schematic diagram of a gesture operation of a method for controlling an intelligent terminal according to an embodiment of the present invention
- FIG. 3 is a flowchart of a method for controlling an intelligent terminal according to an embodiment of the present invention.
- FIG. 4A is a schematic diagram of a method for controlling an intelligent terminal according to an embodiment of the present invention.
- FIG. 4B is a schematic diagram of a method for controlling an intelligent terminal according to an embodiment of the present invention.
- FIG. 5 is a block diagram of a control apparatus of an intelligent terminal according to an embodiment of the present invention.
- an embodiment of the present invention provides a method for controlling an intelligent terminal, applied to a mobile terminal, where the mobile terminal includes but is not limited to a mobile phone, a tablet computer, and the like. The method includes:
- Step S110: sensing a gesture operation, where the gesture operation is formed by a user touching a screen of the mobile terminal.
- the smart terminal includes, but is not limited to, virtual reality glasses worn by the user; the user's gesture operation may be completed based on a touch screen or an attitude sensor installed on the mobile terminal. For example, the user may perform touch operations such as click, double-click, and slide on the touch screen of the mobile phone.
- Step S120: identifying the gesture operation and retrieving a gesture instruction from a database preset in the mobile terminal, where the database includes multiple gesture instructions and the gesture instruction corresponds to the gesture operation.
- the operation gestures used by the user are all based on gestures familiar from daily life; common operation gestures are shown in FIG. 2, and each of the gestures displayed there is applicable to the technical solution of this embodiment,
- forming corresponding gesture instructions to complete the various tasks required by the user.
- Gesture operations and corresponding gesture commands are stored in the mobile terminal database for later retrieval.
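The recognition flow above — a sensed gesture operation looked up in the preset database to retrieve its gesture instruction — could be sketched as follows. All gesture names and instruction codes here are illustrative assumptions, not values from the patent; only the stated top-to-bottom/bottom-to-top behavior is taken from the text.

```python
# Minimal sketch of the preset gesture database: each recognized
# gesture operation maps to a stored gesture instruction.
# Gesture names and instruction codes are illustrative assumptions.
GESTURE_DATABASE = {
    "swipe_up": "VIEW_MOVE_DOWN",    # per the text: slide up -> view moves down
    "swipe_down": "VIEW_MOVE_UP",    # per the text: slide down -> view moves up
    "tap": "CONFIRM_OPTION",
    "double_tap": "OPEN_MENU",
    "long_press": "SELECT_TAKEOFF",
}

def retrieve_instruction(gesture_operation):
    """Return the stored gesture instruction for a recognized
    gesture operation, or None when no match is preset."""
    return GESTURE_DATABASE.get(gesture_operation)

print(retrieve_instruction("swipe_down"))  # VIEW_MOVE_UP
print(retrieve_instruction("pinch"))       # None (not in the database)
```

A real implementation would persist this table in the mobile terminal's storage rather than a module-level dict, but the lookup step is the same.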
- Step S130: sending the gesture instruction to the smart terminal, where the smart terminal is worn on the head of the user and has a screen for displaying an image; the smart terminal controls the image displayed on its screen to change according to the gesture instruction.
- the control of the smart terminal is implemented through gesture actions; the user can still accurately perceive the location of the mobile terminal while wearing the smart terminal, so accurate gesture operations on the mobile terminal remain possible, and the image displayed on the screen of the smart terminal is controlled accordingly.
- an embodiment of the present invention provides a method for controlling an intelligent terminal, including:
- Step S310: sensing a gesture operation, where the gesture operation is formed by a user touching a screen of the mobile terminal.
- Step S320: identifying the gesture operation and retrieving a gesture instruction from a database preset in the mobile terminal, where the database includes multiple gesture instructions and the gesture instruction corresponds to the gesture operation.
- Step S330: sending the gesture instruction to the smart terminal; the screen of the smart terminal displays the field-of-view image transmitted by the external device and controls the field-of-view image to move in the direction corresponding to the gesture instruction.
- Step S340: when the screen of the smart terminal displays the field-of-view image transmitted by the external device and receives the first gesture instruction, drawing a plurality of function icons on the field-of-view image; the smart terminal selects at least one of the plurality of function icons according to the second gesture instruction and performs the function corresponding to the selected icon.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is lower than a preset threshold. In this embodiment, the gesture instruction is used to control the smart terminal, as follows:
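One way to realize the "time interval below a preset threshold" pairing of the first and second gesture instructions is sketched below; the 0.5 s threshold value and the event-tuple shape are assumptions for illustration only.

```python
# Sketch: treat two gesture instructions whose arrival interval is below
# a preset threshold as a linked pair (the first draws the function
# icons, the second selects one). Threshold is an assumed value.
PRESET_THRESHOLD_S = 0.5

def pair_gestures(events):
    """events: list of (timestamp_seconds, instruction) tuples in order.
    Returns (first, second) pairs whose interval is below the threshold."""
    pairs = []
    for (t1, g1), (t2, g2) in zip(events, events[1:]):
        if t2 - t1 < PRESET_THRESHOLD_S:
            pairs.append((g1, g2))
    return pairs

events = [(0.00, "show_icons"), (0.30, "select_icon"), (2.00, "show_icons")]
print(pair_gestures(events))  # [('show_icons', 'select_icon')]
```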
- the functions corresponding to the plurality of function icons include: transmitting a flight instruction to the unmanned aircraft communicating with the smart terminal, and controlling the flight of the drone.
- the first part is that when the aircraft has not yet taken off, the user can adjust the parameters of the smart terminal, watch the video, and the like.
- the user's interface in the smart terminal is mapped to the touch screen of the mobile phone.
- when the finger slides from the top to the bottom of the mobile phone screen, the display interface of the smart terminal correspondingly moves from top to bottom, and the field of view in the smart terminal's screen moves up.
- when the finger slides from the bottom to the top, the display interface of the smart terminal correspondingly moves from bottom to top, and the field of view in the smart terminal's screen moves down.
- correspondingly, a horizontal slide in one direction moves the field of view in the smart terminal's screen to the left,
- and a slide in the opposite direction moves it to the right. Due to the division of the option hierarchy, the field of view can only be switched within the same level.
- other gestures provide more functional actions, including tapping the screen to confirm the adjustment options, swiping left and right in the video to achieve fast rewind and fast forward.
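The slide-to-view mapping described in this first part could be sketched as a small lookup; the left/right rows are extrapolated by symmetry from the stated up/down behavior and are therefore assumptions, as is the same-level restriction's interface.

```python
# Sketch of the slide-direction mapping: sliding the phone interface in
# one direction moves the displayed field of view the opposite way.
SLIDE_TO_VIEW_MOVE = {
    "top_to_bottom": "view_up",     # stated in the text
    "bottom_to_top": "view_down",   # stated in the text
    "right_to_left": "view_left",   # assumed by symmetry
    "left_to_right": "view_right",  # assumed by symmetry
}

def move_view(slide_direction, same_level=True):
    """Map a slide gesture to a field-of-view move; the option
    hierarchy only allows switching within the same level."""
    if not same_level:
        return None  # cannot cross option levels
    return SLIDE_TO_VIEW_MOVE.get(slide_direction)

print(move_view("top_to_bottom"))  # view_up
```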
- the second part is that, after the system enters flight mode, the user can touch the mobile phone interface to form the gesture operation without removing the smart terminal; after the mobile phone recognizes the gesture operation, the corresponding command is sent to the aircraft to realize functions such as take-off, aerial photography, position movement, return flight, and landing. Further, after the user touches the mobile phone interface, the field of view in the smart terminal's screen moves to the "enter flight mode" option, and the user triggers the take-off option by pressing and holding the mobile phone screen for three seconds, as shown in FIG. 4A.
- the field of view in the smart terminal's screen then becomes the camera's transmitted picture; the user slides up on the mobile phone interface, the mobile phone recognizes the upward sliding gesture operation and sends a flight instruction to the aircraft, and the aircraft takes off.
- the virtual reality glasses are used to control the flight of the drone: after the user performs the gesture operation on the mobile terminal, the mobile terminal sends the instruction corresponding to the gesture operation to the virtual reality glasses. The virtual reality glasses provide a plurality of operation options, select an option according to the instruction from the mobile terminal, and send the flight instruction corresponding to that option to the drone, so that the drone flies according to the flight instruction, as shown in FIG. 4B.
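The chain described here — the phone recognizes the gesture, the glasses map it to one of their operation options, and the drone receives the resulting flight instruction — might be sketched like this. All class names, the option table, and the command strings are hypothetical.

```python
# Hypothetical sketch of the control chain:
# mobile phone -> VR glasses -> drone.
class Drone:
    def __init__(self):
        self.log = []
    def execute(self, flight_instruction):
        self.log.append(flight_instruction)  # stands in for flight control

class VRGlasses:
    # Option table mapping gesture instructions to flight instructions
    # (illustrative assumption).
    OPTIONS = {"swipe_up": "TAKE_OFF", "swipe_down": "LAND",
               "long_press": "ENTER_FLIGHT_MODE"}
    def __init__(self, drone):
        self.drone = drone
    def on_gesture_instruction(self, instruction):
        flight_instruction = self.OPTIONS.get(instruction)
        if flight_instruction:
            self.drone.execute(flight_instruction)

class MobileTerminal:
    def __init__(self, glasses):
        self.glasses = glasses
    def on_touch(self, gesture):
        # Recognize the gesture, then forward the instruction to glasses.
        self.glasses.on_gesture_instruction(gesture)

drone = Drone()
phone = MobileTerminal(VRGlasses(drone))
phone.on_touch("swipe_up")  # user slides up after entering flight mode
print(drone.log)            # ['TAKE_OFF']
```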
- Step S350: the smart terminal draws a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
- the gesture operation performed by the user is displayed on the screen of the smart terminal.
- the mobile terminal-based control process completed by the user is as follows:
- the transmitted camera picture contains parameters, maps, and prompt pop-up windows
- an embodiment of the present invention provides a control device for an intelligent terminal, which is applied to a mobile terminal.
- the mobile terminal includes but is not limited to a mobile phone, a tablet computer, and the like, and the device includes:
- the sensing module 510 senses a gesture operation, and the gesture operation is formed by a user touching on a screen of the mobile terminal.
- the smart terminal includes but is not limited to virtual reality glasses for the user to wear.
- the user's gesture operation can be completed based on a touch screen or an attitude sensor installed on the mobile terminal. For example, the user can perform a touch operation such as clicking, double-clicking, sliding, etc. on the touch screen of the mobile phone.
- the identification module 520 identifies a gesture operation, and retrieves a gesture instruction from a database preset in the mobile terminal, where the database includes multiple gesture commands, and the gesture instruction corresponds to the gesture operation.
- the operation gestures used by the user are all based on gestures familiar from daily life; common operation gestures are shown in FIG. 2, and each of the gestures displayed there is applicable to the technical solution of this embodiment,
- corresponding gesture commands are formed to complete various tasks required by the user.
- the gestures and corresponding gesture commands are stored in the mobile terminal database for later retrieval.
- the sending module 530 sends the gesture instruction to the smart terminal, where the smart terminal is worn on the head of the user and has a screen for displaying an image; the smart terminal controls the image displayed on its screen to change according to the gesture instruction.
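A minimal sketch of how the three modules (sensing 510, identification 520, sending 530) could be wired into a pipeline, under assumed interfaces — none of the method names or data shapes below come from the patent:

```python
# Sketch of the device's three-module pipeline: the sensing module
# captures the touch, the identification module looks up the gesture
# instruction, and the sending module forwards it to the smart terminal.
class SensingModule:                      # module 510
    def sense(self, raw_touch_events):
        # Toy recognizer: y decreasing across the touch trace = swipe up.
        return "swipe_up" if raw_touch_events == [(0, 5), (0, 0)] else None

class IdentificationModule:               # module 520
    DATABASE = {"swipe_up": "VIEW_MOVE_DOWN"}  # assumed entry
    def identify(self, gesture_operation):
        return self.DATABASE.get(gesture_operation)

class SendingModule:                      # module 530
    def __init__(self):
        self.sent = []
    def send(self, instruction):
        self.sent.append(instruction)     # stands in for the radio link

def control_device(raw_touch_events, sensing, identification, sending):
    gesture = sensing.sense(raw_touch_events)
    instruction = identification.identify(gesture) if gesture else None
    if instruction:
        sending.send(instruction)

tx = SendingModule()
control_device([(0, 5), (0, 0)], SensingModule(), IdentificationModule(), tx)
print(tx.sent)  # ['VIEW_MOVE_DOWN']
```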
- the control of the smart terminal is implemented through gesture actions; the user can still accurately perceive the location of the mobile terminal while wearing the smart terminal, so accurate gesture operations on the mobile terminal remain possible, and the image displayed on the screen of the smart terminal is controlled accordingly.
- an embodiment of the present invention provides a control device for an intelligent terminal, including:
- the sensing module 510, which senses a gesture operation, the gesture operation being formed by a user touching a screen of the mobile terminal.
- the identification module 520 identifies a gesture operation, and retrieves a gesture instruction from a database preset in the mobile terminal, where the database includes multiple gesture commands, and the gesture instruction corresponds to the gesture operation.
- the sending module 530 sends the gesture command to the smart terminal, and the screen of the smart terminal displays the visual field image transmitted by the external device, and controls the visual field image to move in the corresponding direction according to the direction corresponding to the gesture instruction.
- when the screen of the smart terminal displays the field-of-view image transmitted by the external device and receives the first gesture instruction, a plurality of function icons are drawn on the field-of-view image; the smart terminal selects at least one of the plurality of function icons according to the second gesture instruction and executes the function corresponding to the selected icon.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold.
- the gesture instruction is used to control the smart terminal, as follows:
- the functions corresponding to the plurality of function icons include: transmitting a flight instruction to the unmanned aircraft communicating with the smart terminal, and controlling the flight of the drone.
- the first part is that when the aircraft has not yet taken off, the user can adjust the parameters of the smart terminal, watch the video, and the like.
- the user's interface in the smart terminal is mapped to the touch screen of the mobile phone.
- when the finger slides from the top to the bottom of the mobile phone screen, the display interface of the smart terminal correspondingly moves from top to bottom, and the field of view in the smart terminal's screen moves up.
- when the finger slides from the bottom to the top, the display interface of the smart terminal correspondingly moves from bottom to top, and the field of view in the smart terminal's screen moves down.
- correspondingly, a horizontal slide in one direction moves the field of view in the smart terminal's screen to the left,
- and a slide in the opposite direction moves it to the right. Due to the division of the option hierarchy, the field of view can only be switched within the same level.
- other gestures provide more functional actions, including tapping the screen to confirm the adjustment options, swiping left and right in the video to achieve fast rewind and fast forward.
- the second part is that, after the system enters flight mode, the user can touch the mobile phone interface to form the gesture operation without removing the smart terminal; after the mobile phone recognizes the gesture operation, the corresponding command is sent to the aircraft to realize functions such as take-off, aerial photography, position movement, return flight, and landing. Further, after the user touches the mobile phone interface, the field of view in the smart terminal's screen moves to the "enter flight mode" option, and the user triggers the take-off option by pressing and holding the mobile phone screen for three seconds, as shown in FIG. 4A. The field of view in the smart terminal's screen then becomes the camera's transmitted picture; the user slides up on the mobile phone interface, the mobile phone recognizes the upward sliding gesture operation and sends a flight instruction to the aircraft, and the aircraft takes off.
- the virtual reality glasses are used to control the flight of the drone: after the user performs the gesture operation on the mobile terminal, the mobile terminal sends the instruction corresponding to the gesture operation to the virtual reality glasses. The virtual reality glasses provide a plurality of operation options, select an option according to the instruction from the mobile terminal, and send the flight instruction corresponding to that option to the drone, so that the drone flies according to the flight instruction, as shown in FIG. 4B.
- the smart terminal draws a gesture operation prompt on the view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
- the gesture operation performed by the user is displayed on the screen of the smart terminal.
- the mobile terminal-based control process completed by the user is as follows:
- the transmitted camera picture contains parameters, maps, and prompt pop-up windows
- the control of the smart terminal is implemented through gesture actions; the user can still accurately perceive the location of the mobile terminal while wearing the smart terminal, so accurate gesture operations on the mobile terminal remain possible, and the image displayed on the screen of the smart terminal can be controlled accordingly. Therefore, the invention has industrial applicability.
Abstract
The present invention relates to a control device and method for a smart terminal, applied to a mobile terminal. The method comprises: sensing a gesture operation, the gesture operation being performed by a user touching a screen of the mobile terminal (S110); recognizing the gesture operation and retrieving a gesture instruction from a database preset in the mobile terminal, the database comprising a plurality of gesture instructions, the gesture instruction corresponding to the gesture operation (S120); and sending the gesture instruction to the smart terminal, which is worn on a user's head and comprises a screen for displaying an image, the smart terminal controlling, on the basis of the gesture instruction, the image displayed on the screen of the smart terminal to change (S130). Control of the smart terminal is completed by means of hand gestures. When a user wears the smart terminal, the user can still accurately perceive the location of the mobile terminal and can thus perform accurate gesture operations on the mobile terminal, so as to control, based on the gesture operations on the mobile terminal, the image displayed on the screen of the smart terminal.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710031960.8A CN106708412B (zh) | 2017-01-17 | 2017-01-17 | Control method and device for a smart terminal |
| CN201710031960.8 | 2017-01-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018133593A1 true WO2018133593A1 (fr) | 2018-07-26 |
Family
ID=58907615
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/115961 Ceased WO2018133593A1 (fr) | 2017-12-13 | Control device and method for a smart terminal |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN106708412B (fr) |
| WO (1) | WO2018133593A1 (fr) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110928477A (zh) * | 2019-11-18 | 2020-03-27 | 钟林 | Method and device for inputting operation instructions to a smart terminal using sliding gestures |
| CN110941385A (zh) * | 2019-11-18 | 2020-03-31 | 钟林 | Method and device for adjusting smart terminal variables using slide-and-press gestures |
| CN113039550A (zh) * | 2018-10-10 | 2021-06-25 | 深圳市道通智能航空技术股份有限公司 | Gesture recognition method, VR view-angle control method, and VR system |
| CN114140884A (zh) * | 2021-12-23 | 2022-03-04 | 北京德为智慧科技有限公司 | Display control device and method, and terminal |
| CN111367458B (zh) * | 2020-03-10 | 2024-02-06 | 柯旋 | Barrier-free film and barrier-free touch method for smart touch-screen mobile terminals |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106708412B (zh) * | 2017-01-17 | 2020-06-12 | 亿航智能设备(广州)有限公司 | Control method and device for a smart terminal |
| CN107291359 (zh) | 2017-06-06 | 2017-10-24 | 歌尔股份有限公司 | Input method, device, and system |
| CN108257145B (zh) * | 2017-12-13 | 2021-07-02 | 北京华航无线电测量研究所 | AR-technology-based intelligent drone reconnaissance processing system and method |
| US10635895B2 (en) | 2018-06-27 | 2020-04-28 | Facebook Technologies, Llc | Gesture-based casting and manipulation of virtual content in artificial-reality environments |
| US10783712B2 (en) * | 2018-06-27 | 2020-09-22 | Facebook Technologies, Llc | Visual flairs for emphasizing gestures in artificial-reality environments |
| CN109120800A (zh) * | 2018-10-18 | 2019-01-01 | 维沃移动通信有限公司 | Application icon adjustment method and mobile terminal |
| CN109753148A (zh) * | 2018-11-15 | 2019-05-14 | 北京奇艺世纪科技有限公司 | Control method and device for a VR device, and control terminal |
| CN109410691A (zh) * | 2018-12-17 | 2019-03-01 | 深圳市中智仿真科技有限公司 | Driving-training simulator with gesture control function |
| CN111586894A (zh) * | 2019-02-15 | 2020-08-25 | 普天信息技术有限公司 | Mode selection method for a broadband trunking terminal |
| CN112083796A (zh) * | 2019-06-12 | 2020-12-15 | Oppo广东移动通信有限公司 | Control method, head-mounted device, mobile terminal, and control system |
| CN111340962B (zh) * | 2020-02-24 | 2023-08-15 | 维沃移动通信有限公司 | Control method, electronic device, and storage medium |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104808800A (zh) * | 2015-05-21 | 2015-07-29 | 上海斐讯数据通信技术有限公司 | Smart glasses device, mobile terminal, and mobile terminal operating method |
| CN105425952A (zh) * | 2015-11-04 | 2016-03-23 | 腾讯科技(深圳)有限公司 | Interaction method and device for a drone control interface |
| CN105808071A (zh) * | 2016-03-31 | 2016-07-27 | 联想(北京)有限公司 | Display control method and device, and electronic apparatus |
| CN106095309A (zh) * | 2016-06-03 | 2016-11-09 | 广东欧珀移动通信有限公司 | Operation control method and device for a terminal |
| CN106708412A (zh) * | 2017-01-17 | 2017-05-24 | 亿航智能设备(广州)有限公司 | Control method and device for a smart terminal |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105700541A (zh) * | 2016-03-18 | 2016-06-22 | 普宙飞行器科技(深圳)有限公司 | Drone control method, drone system, and drone |
- 2017
- 2017-01-17 CN CN201710031960.8A patent/CN106708412B/zh active Active
- 2017-12-13 WO PCT/CN2017/115961 patent/WO2018133593A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| CN106708412B (zh) | 2020-06-12 |
| CN106708412A (zh) | 2017-05-24 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17893176 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 041119) |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17893176 Country of ref document: EP Kind code of ref document: A1 |