WO2018133593A1 - Control method and apparatus for a smart terminal - Google Patents
Control method and apparatus for a smart terminal
- Publication number
- WO2018133593A1 (PCT application PCT/CN2017/115961)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- smart terminal
- screen
- instruction
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to the field of UAV technologies, and in particular, to a control method and apparatus for an intelligent terminal.
- the application of drones is becoming increasingly widespread, and it is now common to perform aerial photography with a camera carried by the drone, conveniently capturing images for the user.
- an object of the present invention is to provide a control method and apparatus for a smart terminal, so that the smart terminal can be controlled accurately while the user is wearing it.
- a method for controlling a smart terminal is applied to a mobile terminal, the method comprising: sensing a gesture operation, the gesture operation being formed by a user touching the screen of the mobile terminal; recognizing the gesture operation and retrieving a gesture instruction from a database preset in the mobile terminal, where the database contains a plurality of gesture instructions and the gesture instruction corresponds to the gesture operation;
- and sending the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head, has a screen for displaying an image, and changes the image displayed on its screen according to the gesture instruction.
- when the smart terminal changes the image displayed on its screen according to the gesture instruction, the method includes: displaying, on the screen of the smart terminal, a field-of-view image transmitted by an external device, and moving the field-of-view image in the direction corresponding to the gesture instruction.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold
- when the smart terminal changes the image displayed on its screen according to the gesture instruction, the method includes: displaying, on the screen of the smart terminal, a field-of-view image transmitted by the external device and, upon receiving the first gesture instruction, drawing a plurality of function icons on the field-of-view image;
- the smart terminal then selects at least one of the plurality of function icons according to the second gesture instruction and executes the function corresponding to the selected function icon.
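The first/second gesture pairing above hinges on the time interval between two instructions being below a preset threshold. A minimal sketch of that pairing logic follows; the class name, the 0.5-second default threshold, and the instruction strings are illustrative assumptions, not taken from the patent.

```python
# Hypothetical threshold; the patent only states "a preset threshold".
PAIR_THRESHOLD_S = 0.5

class GesturePairDetector:
    """Groups two gesture instructions into a (first, second) pair when
    the time interval between them is below the threshold."""

    def __init__(self, threshold_s=PAIR_THRESHOLD_S):
        self.threshold_s = threshold_s
        self._pending = None  # (instruction, timestamp) of a candidate first gesture

    def feed(self, instruction, timestamp):
        """Returns (first, second) when a pair completes, else None."""
        if self._pending is not None:
            first, t0 = self._pending
            if timestamp - t0 < self.threshold_s:
                self._pending = None
                return (first, instruction)
        # No pending gesture, or the interval was too long:
        # treat this instruction as a new candidate first gesture.
        self._pending = (instruction, timestamp)
        return None
```

With this sketch, a first gesture draws the function icons only when a second gesture follows quickly enough; an isolated gesture is simply held as a new candidate.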
- the functions corresponding to the multiple function icons comprise: sending a flight instruction to a drone that communicates with the smart terminal to control the flight of the drone.
- the foregoing method further includes: the smart terminal drawing a gesture operation prompt on the view image to prompt the user to complete the gesture operation on a screen of the mobile terminal.
- a control apparatus for a smart terminal is applied to a mobile terminal, the apparatus including: a sensing module that senses a gesture operation, the gesture operation being formed by a user touching the screen of the mobile terminal; and a recognition module that recognizes the gesture operation and retrieves a gesture instruction from a database preset in the mobile terminal, where the database contains a plurality of gesture instructions and the gesture instruction corresponds to the gesture operation;
- the sending module sends the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image; the smart terminal changes the image displayed on its screen according to the gesture instruction.
- the screen of the smart terminal displays a field-of-view image transmitted by an external device and moves the field-of-view image in the direction corresponding to the gesture instruction.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold; when the screen of the smart terminal displays a field-of-view image transmitted by the external device and receives the first gesture instruction, a plurality of function icons are drawn on the field-of-view image; the smart terminal selects at least one of the plurality of function icons according to the second gesture instruction and executes the function corresponding to the selected icon.
- the functions corresponding to the multiple function icons include: sending a flight instruction to the drone in communication with the smart terminal to control the flight of the drone.
- the smart terminal further draws a gesture operation prompt on the view image to prompt the user to complete the gesture operation on a screen of the mobile terminal.
- a control method and apparatus for an intelligent terminal of the present invention have at least the following advantages:
- the smart terminal is controlled through gesture actions: even while wearing the smart terminal, the user can still accurately perceive the location of the mobile terminal and can therefore perform accurate gesture operations on it, so that the image displayed on the smart terminal's screen is controlled according to the gesture operation on the mobile terminal.
- FIG. 1 is a flowchart of a method for controlling an intelligent terminal according to an embodiment of the present invention
- FIG. 2 is a schematic diagram of a gesture operation of a method for controlling an intelligent terminal according to an embodiment of the present invention
- FIG. 3 is a flowchart of a method for controlling an intelligent terminal according to an embodiment of the present invention.
- FIG. 4A is a schematic diagram of a method for controlling an intelligent terminal according to an embodiment of the present invention.
- FIG. 4B is a schematic diagram of a method for controlling an intelligent terminal according to an embodiment of the present invention.
- FIG. 5 is a block diagram of a control apparatus of an intelligent terminal according to an embodiment of the present invention.
- an embodiment of the present invention provides a method for controlling a smart terminal, applied to a mobile terminal, where the mobile terminal includes but is not limited to a mobile phone, a tablet computer, and the like; the method includes: Step S110, sensing a gesture operation, where the gesture operation is formed by the user touching the screen of the mobile terminal.
- the smart terminal includes, but is not limited to, virtual reality glasses worn by the user; the user's gesture operation may be completed via a touch screen or an attitude sensor installed on the mobile terminal; for example, the user may tap, double-tap, or swipe on the phone's touch screen.
- Step S120: recognize the gesture operation and retrieve a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation.
- the operation gestures available to the user are all based on familiar everyday interactions; common operation gestures are shown in FIG. 2, and all of the gestures shown there are applicable to the technical solution of this embodiment,
- so corresponding gesture instructions are formed without adding to the user's memory burden, completing the various tasks the user requires.
- Gesture operations and their corresponding gesture instructions are stored in the mobile terminal's database for later retrieval.
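The preset database of step S120 can be sketched as a simple lookup from a recognized gesture operation to a gesture instruction. The gesture names and instruction codes below are illustrative assumptions (the patent does not enumerate them); the swipe-to-view-direction pairs follow the mapping described later in the embodiment.

```python
# Minimal sketch of step S120: recognize a touch gesture and retrieve the
# corresponding gesture instruction from a database preset on the mobile
# terminal. All names here are illustrative assumptions.
GESTURE_DATABASE = {
    "tap":         "CONFIRM_OPTION",      # tap confirms a parameter option
    "swipe_down":  "VIEW_UP",             # finger top-to-bottom: view area moves up
    "swipe_up":    "VIEW_DOWN",           # finger bottom-to-top: view area moves down
    "swipe_right": "VIEW_LEFT",           # finger left-to-right: view moves left
    "swipe_left":  "VIEW_RIGHT",          # finger right-to-left: view moves right
}

def retrieve_instruction(gesture_operation):
    """Look up the gesture instruction matching the sensed gesture operation.
    Returns None when the gesture is not in the preset database."""
    return GESTURE_DATABASE.get(gesture_operation)
```

An unrecognized gesture simply yields no instruction, which matches the idea that only gestures stored in the database produce an instruction to send in step S130.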
- Step S130: send the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image; the smart terminal changes the image displayed on its screen according to the gesture instruction.
- the smart terminal is controlled through gesture actions: even while wearing the smart terminal, the user can still accurately perceive the location of the mobile terminal and can therefore perform accurate gesture operations on it, thereby controlling the image displayed on the smart terminal's screen according to the gesture operation on the mobile terminal.
- an embodiment of the present invention provides a method for controlling a smart terminal, including: Step S310, sensing a gesture operation, where the gesture operation is formed by the user touching the screen of the mobile terminal.
- Step S320: recognize the gesture operation and retrieve a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation.
- Step S330: send the gesture instruction to the smart terminal; the screen of the smart terminal displays the field-of-view image transmitted by the external device and moves the field-of-view image in the direction corresponding to the gesture instruction.
- Step S340: when the screen of the smart terminal displays the field-of-view image transmitted by the external device and receives the first gesture instruction, a plurality of function icons are drawn on the field-of-view image; the smart terminal selects at least one of the plurality of function icons according to the second gesture instruction and executes the function corresponding to the selected icon.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold. In this embodiment, the gesture instruction is used to control the smart terminal, as follows:
- the functions corresponding to the plurality of function icons include: sending a flight instruction to the drone communicating with the smart terminal to control the flight of the drone.
- the first part: before the aircraft takes off, the user can adjust the smart terminal's parameters, watch video, and so on.
- when the user swipes on the phone screen, the user's interface in the smart terminal is mapped onto the phone's touch screen.
- when a finger slides from the top to the bottom of the phone screen, the smart terminal's display interface moves from top to bottom, so the field-of-view area in the smart terminal's display moves up.
- when a finger slides from bottom to top, the display interface moves from bottom to top, so the field-of-view area in the smart terminal's display moves down.
- when a finger slides from left to right, the field of view in the smart terminal's display moves to the left.
- when a finger slides from right to left, the field of view moves to the right. Because the options are divided into levels, the field of view can only be switched between options at the same level.
- beyond moving the interface up, down, left, and right, other gestures provide further actions, including tapping the screen to confirm a parameter option and swiping left or right during video playback for rewind and fast-forward.
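The same-level navigation rule above (vertical swipes only switch options within one level; horizontal swipes enter or leave a level) can be sketched as follows. The option names follow the control-flow walkthrough later in the description; the two-level data layout and method names are assumptions for illustration only.

```python
# Hypothetical two-deep option hierarchy; the patent's actual menu tree is
# not specified. Key None denotes the top level.
LEVELS = {
    None: ["parameter_adjustment", "flight_mode"],
    "parameter_adjustment": ["parameter_adjustment_2"],
    "flight_mode": ["enter_flight_mode"],
}

class ViewNavigator:
    def __init__(self):
        self.parent = None  # current level (None = top level)
        self.index = 0      # position within that level

    @property
    def current(self):
        return LEVELS[self.parent][self.index]

    def swipe_vertical(self, delta):
        """Up/down swipes only switch options within the same level."""
        options = LEVELS[self.parent]
        self.index = max(0, min(len(options) - 1, self.index + delta))

    def swipe_into(self):
        """Right-to-left swipe: descend into the current option, if it has one."""
        if self.current in LEVELS:
            self.parent, self.index = self.current, 0

    def swipe_back(self):
        """Left-to-right swipe: return to the parent level (top, in this sketch)."""
        if self.parent is not None:
            self.index = LEVELS[None].index(self.parent)
            self.parent = None
```

The vertical clamp is what enforces "the field of view can only be switched at the same level": no vertical swipe ever crosses a level boundary.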
- the second part: after the system enters flight mode, the user can touch the phone interface to form the gesture operation without taking off the smart terminal; after recognizing the gesture operation, the phone sends an instruction to the aircraft to perform functions such as take-off, aerial shooting, position movement, return flight, and landing. Further, after the user's touch operation on the phone interface moves the field of view in the smart terminal's display to "enter flight mode", the user triggers the take-off option by long-pressing the phone screen for three seconds, as shown in FIG. 4A.
- the field of view in the smart terminal's display then changes to the camera's image-transmission view; the user swipes up on the phone interface, the phone recognizes the upward-swipe gesture, and a flight instruction is sent to the aircraft so that it takes off.
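The take-off sequence just described (a long press of at least three seconds touches the take-off option, after which an upward swipe sends the take-off flight instruction) can be sketched as a small state holder. The class, instruction name, and callback are assumptions; only the three-second press and the swipe-up-to-take-off ordering come from the description.

```python
class FlightModeController:
    """Sketch of the flight-mode take-off sequence: long press shows the
    camera feed, then an upward swipe sends the take-off instruction."""

    LONG_PRESS_S = 3.0  # three-second long press, per the description

    def __init__(self, send_to_aircraft):
        self.send_to_aircraft = send_to_aircraft
        self.camera_feed_shown = False

    def on_long_press(self, duration_s):
        if duration_s >= self.LONG_PRESS_S:
            # Take-off option touched: the view switches to the camera feed.
            self.camera_feed_shown = True

    def on_swipe_up(self):
        # Only after the camera feed is shown does an upward swipe
        # translate into a take-off flight instruction.
        if self.camera_feed_shown:
            self.send_to_aircraft("TAKE_OFF")
```

Note the ordering constraint: a swipe up before the long press completes does nothing, mirroring the staged interaction in the description.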
- the virtual reality glasses are used to control the flight of the drone: after the user performs a gesture operation on the mobile terminal, the mobile terminal sends the instruction corresponding to the gesture operation to the virtual reality glasses;
- the glasses provide a plurality of operation options, select one option according to the mobile terminal's instruction, and send the flight instruction corresponding to that option to the drone, which flies according to the flight instruction, as shown in FIG. 4B.
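The relay role of the glasses (pick the operation option named by the mobile terminal's instruction, then forward the matching flight instruction to the drone) can be sketched as a lookup plus a send. The option and instruction names below are illustrative assumptions; the patent only lists take-off, aerial shooting, position movement, return flight, and landing as example functions.

```python
# Hypothetical option-to-flight-instruction table held by the glasses.
OPTION_TO_FLIGHT_INSTRUCTION = {
    "take_off": "TAKE_OFF",
    "aerial_shot": "CAPTURE_PHOTO",
    "return_home": "RETURN_TO_HOME",
    "land": "LAND",
}

def relay_to_drone(selected_option, send):
    """Select the option named by the mobile terminal's instruction and send
    the corresponding flight instruction to the drone via `send`.
    Returns the instruction sent, or None if the option is unknown."""
    instruction = OPTION_TO_FLIGHT_INSTRUCTION.get(selected_option)
    if instruction is not None:
        send(instruction)
    return instruction
```

An unknown option forwards nothing, so the drone only ever receives instructions drawn from the glasses' own option set.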
- Step S350: the smart terminal draws a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
- because the user cannot see the mobile terminal while performing the gesture operation, the gesture operation performed by the user is displayed on the screen of the smart terminal.
- the mobile-terminal-based control flow completed by the user is as follows: 1. holding the phone, the user swipes from top to bottom, and the field of view moves to the "parameter adjustment" option; 2. the user swipes from right to left, and the field of view moves to the "parameter adjustment 2" option; 3. the user swipes up and down on the phone's touch screen to select the desired parameter option; 4. the user swipes from left to right to return to "parameter adjustment"; 5. the user swipes from bottom to top, and the field of view moves to "flight mode"; 6. the user swipes from right to left to jump to "enter flight mode"; 7. following the prompt, the user long-presses the screen for three seconds, and the glasses' field of view jumps to the aircraft's image-transmission view; 8. the image-transmission view contains parameters, a map, and prompt pop-ups; 9. the user swipes in from right to left to call up the flight-parameter adjustment menu; 10. the user swipes up and down to select the desired parameter in a dial-wheel manner; 11. the user swipes out from left to right to collapse the parameter-adjustment menu; 12. the user taps the screen, and a shortcut-action menu appears; 13. the user places a finger on the screen and moves it without lifting to select the desired shortcut action.
- an embodiment of the present invention provides a control apparatus for a smart terminal, which is applied to a mobile terminal.
- the mobile terminal includes but is not limited to a mobile phone, a tablet computer, and the like, and the apparatus includes:
- the sensing module 510 senses a gesture operation, and the gesture operation is formed by a user touching on a screen of the mobile terminal.
- the smart terminal includes but is not limited to virtual reality glasses for the user to wear.
- the user's gesture operation can be completed based on a touch screen or an attitude sensor installed on the mobile terminal. For example, the user can perform a touch operation such as clicking, double-clicking, sliding, etc. on the touch screen of the mobile phone.
- the identification module 520 identifies a gesture operation, and retrieves a gesture instruction from a database preset in the mobile terminal, where the database includes multiple gesture commands, and the gesture instruction corresponds to the gesture operation.
- the operation gestures available to the user are all based on familiar everyday interactions; common operation gestures are shown in FIG. 2, and all of the gestures shown there are applicable to the technical solution of this embodiment,
- so corresponding gesture instructions are formed without adding to the user's memory burden, completing the various tasks the user requires.
- the gesture operations and their corresponding gesture instructions are stored in the mobile terminal's database for later retrieval.
- the sending module 530 sends the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image; the smart terminal changes the image displayed on its screen according to the gesture instruction.
- the smart terminal is controlled through gesture actions: even while wearing the smart terminal, the user can still accurately perceive the location of the mobile terminal and can therefore perform accurate gesture operations on it, thereby controlling the image displayed on the smart terminal's screen according to the gesture operation on the mobile terminal.
- an embodiment of the present invention provides a control apparatus for a smart terminal, including: the sensing module 510, which senses a gesture operation, where the gesture operation is formed by the user touching the screen of the mobile terminal.
- the identification module 520 identifies a gesture operation, and retrieves a gesture instruction from a database preset in the mobile terminal, where the database includes multiple gesture commands, and the gesture instruction corresponds to the gesture operation.
- the sending module 530 sends the gesture instruction to the smart terminal; the screen of the smart terminal displays the field-of-view image transmitted by the external device and moves the field-of-view image in the direction corresponding to the gesture instruction.
- when the screen of the smart terminal displays the field-of-view image transmitted by the external device and receives the first gesture instruction, a plurality of function icons are drawn on the field-of-view image; the smart terminal selects at least one of the plurality of function icons according to the second gesture instruction and executes the function corresponding to the selected icon.
- the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold.
- the gesture instruction is used to control the smart terminal, as follows:
- the functions corresponding to the plurality of function icons include: sending a flight instruction to the drone communicating with the smart terminal to control the flight of the drone.
- the first part: before the aircraft takes off, the user can adjust the smart terminal's parameters, watch video, and so on.
- when the user swipes on the phone screen, the user's interface in the smart terminal is mapped onto the phone's touch screen.
- when a finger slides from the top to the bottom of the phone screen, the smart terminal's display interface moves from top to bottom, so the field-of-view area in the smart terminal's display moves up.
- when a finger slides from bottom to top, the display interface moves from bottom to top, so the field-of-view area in the smart terminal's display moves down.
- when a finger slides from left to right, the field of view in the smart terminal's display moves to the left.
- when a finger slides from right to left, the field of view moves to the right. Because the options are divided into levels, the field of view can only be switched between options at the same level.
- beyond moving the interface up, down, left, and right, other gestures provide further actions, including tapping the screen to confirm a parameter option and swiping left or right during video playback for rewind and fast-forward.
- the second part: after the system enters flight mode, the user can touch the phone interface to form the gesture operation without taking off the smart terminal; after recognizing the gesture operation, the phone sends an instruction to the aircraft to perform functions such as take-off, aerial shooting, position movement, return flight, and landing. Further, after the user's touch operation on the phone interface moves the field of view in the smart terminal's display to "enter flight mode", the user triggers the take-off option by long-pressing the phone screen for three seconds, as shown in FIG. 4A. The field of view in the smart terminal's display then changes to the camera's image-transmission view; the user swipes up on the phone interface, the phone recognizes the upward-swipe gesture, and a flight instruction is sent to the aircraft so that it takes off.
- the virtual reality glasses are used to control the flight of the drone: after the user performs a gesture operation on the mobile terminal, the mobile terminal sends the instruction corresponding to the gesture operation to the virtual reality glasses;
- the glasses provide a plurality of operation options, select one option according to the mobile terminal's instruction, and send the flight instruction corresponding to that option to the drone, which flies according to the flight instruction, as shown in FIG. 4B.
- the smart terminal draws a gesture operation prompt on the view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
- because the user cannot see the mobile terminal while performing the gesture operation, the gesture operation performed by the user is displayed on the screen of the smart terminal.
- the mobile-terminal-based control flow completed by the user is as follows: 1. holding the phone, the user swipes from top to bottom, and the field of view moves to the "parameter adjustment" option; 2. the user swipes from right to left, and the field of view moves to the "parameter adjustment 2" option; 3. the user swipes up and down on the phone's touch screen to select the desired parameter option; 4. the user swipes from left to right to return to "parameter adjustment"; 5. the user swipes from bottom to top, and the field of view moves to "flight mode"; 6. the user swipes from right to left to jump to "enter flight mode"; 7. following the prompt, the user long-presses the screen for three seconds, and the glasses' field of view jumps to the aircraft's image-transmission view; 8. the image-transmission view contains parameters, a map, and prompt pop-ups; 9. the user swipes in from right to left to call up the flight-parameter adjustment menu; 10. the user swipes up and down to select the desired parameter in a dial-wheel manner; 11. the user swipes out from left to right to collapse the parameter-adjustment menu; 12. the user taps the screen, and a shortcut-action menu appears; 13. the user places a finger on the screen and moves it without lifting to select the desired shortcut action.
- the smart terminal is controlled through gesture actions: even while wearing the smart terminal, the user can still accurately perceive the location of the mobile terminal and can therefore perform accurate gesture operations on it, so that the image displayed on the smart terminal's screen is controlled according to the gesture operation on the mobile terminal. The invention therefore has industrial applicability.
Abstract
A control method and apparatus for a smart terminal, applied to a mobile terminal. The method includes: sensing a gesture operation, where the gesture operation is formed by a user touching the screen of the mobile terminal (S110); recognizing the gesture operation and retrieving a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation (S120); and sending the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image, and the smart terminal changes the image displayed on its screen according to the gesture instruction (S130). Because the smart terminal is controlled through gesture actions, the user can still accurately perceive the location of the mobile terminal while wearing the smart terminal, and can therefore perform accurate gesture operations on the mobile terminal, thereby controlling the image displayed on the smart terminal's screen accordingly.
Description
Title of Invention: Control Method and Apparatus for a Smart Terminal
Technical Field
[0001] The present invention relates to the field of unmanned aerial vehicle (UAV) technologies, and in particular to a control method and apparatus for a smart terminal.
Background Art
[0002] At present, the application of drones is becoming increasingly widespread; most commonly, aerial photography is performed with a camera carried by the drone, conveniently capturing images for the user.
[0003] Most existing drone control schemes use a remote controller with physical buttons as the drone's control end, and the user operates the drone's speed, altitude, direction, and so on through those buttons. However, when the user wears virtual reality glasses to view the images captured by the drone, the user's view is blocked by the glasses, making the glasses difficult to control. A new technical solution is therefore needed to control the virtual reality glasses accurately.
Technical Problem
[0004] In view of the above, an object of the present invention is to provide a control method and apparatus for a smart terminal, so that the smart terminal can be controlled accurately while the user is wearing it.
Solution to Problem
Technical Solution
[0005] The technical solutions adopted by the present invention to solve the above technical problem are as follows:
[0006] According to one aspect of the present invention, a control method for a smart terminal is provided, applied to a mobile terminal. The method includes: sensing a gesture operation, where the gesture operation is formed by a user touching the screen of the mobile terminal; recognizing the gesture operation and retrieving a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation; and sending the gesture instruction to a smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image, and the smart terminal changes the image displayed on its screen according to the gesture instruction.
[0007] Optionally, in the foregoing method, the smart terminal changing the image displayed on its screen according to the gesture instruction specifically includes: when the screen of the smart terminal displays a field-of-view image transmitted by an external device, moving the field-of-view image in the direction corresponding to the gesture instruction.
[0008] Optionally, in the foregoing method, the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold, and the smart terminal changing the image displayed on its screen according to the gesture instruction specifically includes: when the screen of the smart terminal displays the field-of-view image transmitted by the external device and the first gesture instruction is received, drawing multiple function icons on the field-of-view image; and the smart terminal selecting at least one of the multiple function icons according to the second gesture instruction and executing the function corresponding to the selected icon.
[0009] Optionally, in the foregoing method, the functions corresponding to the multiple function icons include: sending a flight instruction to a drone in communication with the smart terminal, so as to control the drone's flight.
[0010] Optionally, the foregoing method further includes: the smart terminal drawing a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
[0011] According to another aspect of the present invention, a control apparatus for a smart terminal is provided, applied to a mobile terminal. The apparatus includes: a sensing module, which senses a gesture operation, where the gesture operation is formed by a user touching the screen of the mobile terminal; a recognition module, which recognizes the gesture operation and retrieves a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation; and a sending module, which sends the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image, and the smart terminal changes the image displayed on its screen according to the gesture instruction.
[0012] Optionally, in the foregoing apparatus, when the screen of the smart terminal displays a field-of-view image transmitted by an external device, the field-of-view image is moved in the direction corresponding to the gesture instruction.
[0013] Optionally, in the foregoing apparatus, the gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold; when the screen of the smart terminal displays the field-of-view image transmitted by the external device and the first gesture instruction is received, multiple function icons are drawn on the field-of-view image; and the smart terminal selects at least one of the multiple function icons according to the second gesture instruction and executes the function corresponding to the selected icon.
[0014] Optionally, in the foregoing apparatus, the functions corresponding to the multiple function icons include: sending a flight instruction to a drone in communication with the smart terminal, so as to control the drone's flight.
[0015] Optionally, in the foregoing apparatus, the smart terminal further draws a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
Advantageous Effects of the Invention
Advantageous Effects
[0016] According to the above technical solutions, the control method and apparatus for a smart terminal of the present invention have at least the following advantages:
[0017] The smart terminal is controlled through gesture actions. Even while wearing the smart terminal, the user can still accurately perceive the location of the mobile terminal and can therefore perform accurate gesture operations on it, so that the image displayed on the smart terminal's screen is controlled according to the gesture operation on the mobile terminal.
Brief Description of the Drawings
Description of the Drawings
[0018] FIG. 1 is a flowchart of a control method for a smart terminal according to an embodiment of the present invention;
[0019] FIG. 2 is a schematic diagram of gesture operations of a control method for a smart terminal according to an embodiment of the present invention;
[0020] FIG. 3 is a flowchart of a control method for a smart terminal according to an embodiment of the present invention;
[0021] FIG. 4A is a schematic diagram of a control method for a smart terminal according to an embodiment of the present invention;
[0022] FIG. 4B is a schematic diagram of a control method for a smart terminal according to an embodiment of the present invention;
[0023] FIG. 5 is a block diagram of a control apparatus for a smart terminal according to an embodiment of the present invention.
[0024] The implementation, functional features, and advantages of the objects of the present invention will be further described below with reference to the accompanying drawings in conjunction with the embodiments.
Embodiments of the Invention
[0025] To make the technical problems to be solved, the technical solutions, and the advantageous effects of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present invention and are not intended to limit it.
[0026] As shown in FIG. 1, an embodiment of the present invention provides a control method for a smart terminal, applied to a mobile terminal, where the mobile terminal includes but is not limited to a mobile phone, a tablet computer, and the like. The method includes:
[0027] Step S110: sense a gesture operation, where the gesture operation is formed by the user touching the screen of the mobile terminal. In this embodiment, the smart terminal includes but is not limited to virtual reality glasses worn by the user; the user's gesture operation may be completed via a touch screen or an attitude sensor installed on the mobile terminal. For example, the user may tap, double-tap, or swipe on the phone's touch screen.
[0028] Step S120: recognize the gesture operation and retrieve a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation. In this embodiment, the operation gestures used are all based on familiar everyday interactions; common gestures are shown in FIG. 2, and all of the gestures shown there are applicable to the technical solution of this embodiment, so that corresponding gesture instructions are formed without adding to the user's memory burden, completing the various tasks the user requires. Gesture operations and their corresponding gesture instructions are stored in the mobile terminal's database for retrieval at any time.
[0029] Step S130: send the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image; the smart terminal changes the image displayed on its screen according to the gesture instruction.
[0030] According to the technical solution of this embodiment, the smart terminal is controlled through gesture actions. Even while wearing the smart terminal, the user can still accurately perceive the location of the mobile terminal and can therefore perform accurate gesture operations on it, thereby controlling the image displayed on the smart terminal's screen according to the gesture operation on the mobile terminal.
[0031] As shown in FIG. 3, an embodiment of the present invention provides a control method for a smart terminal, including:
[0032] Step S310: sense a gesture operation, where the gesture operation is formed by the user touching the screen of the mobile terminal.
[0033] Step S320: recognize the gesture operation and retrieve a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation.
[0034] Step S330: send the gesture instruction to the smart terminal; when the smart terminal's screen displays the field-of-view image transmitted by the external device, the field-of-view image is moved in the direction corresponding to the gesture instruction.
[0035] Step S340: when the smart terminal's screen displays the field-of-view image transmitted by the external device and the first gesture instruction is received, multiple function icons are drawn on the field-of-view image; the smart terminal selects at least one of the multiple function icons according to the second gesture instruction and executes the function corresponding to the selected icon. The gesture instruction includes a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold.
[0036] In this embodiment, the gesture instruction is used to control the smart terminal, as follows:
[0037] In this embodiment, the functions corresponding to the multiple function icons include: sending a flight instruction to the drone in communication with the smart terminal, so that the drone's flight can be controlled.
[0038] In one specific embodiment, the first part is that before the aircraft takes off, the user can adjust the smart terminal's parameters, watch video, and so on. When the user swipes on the phone screen, the interface the user sees in the smart terminal is mapped onto the phone's touch screen. When a finger slides from the top to the bottom of the phone screen, the smart terminal's display interface moves from top to bottom, so the field-of-view area in the smart terminal's display moves up. When a finger slides from bottom to top, the display interface moves from bottom to top, so the field-of-view area moves down. When a finger slides from left to right, the field of view in the smart terminal's display moves to the left; when a finger slides from right to left, the field of view moves to the right. Because the options are divided into levels, the field-of-view area can only be switched within the same level. On top of moving the interface up, down, left, and right, other gestures provide further actions, including tapping the screen to confirm a parameter option and swiping left or right during video playback for rewind and fast-forward.
[0039] In another specific embodiment, the second part is that after the system enters flight mode, the user can touch the phone interface to form the gesture operation without taking off the smart terminal; after recognizing the gesture operation, the phone sends an instruction to the aircraft to perform functions such as take-off, aerial shooting, position movement, return flight, and landing. Further, after the user's touch operation on the phone interface moves the field of view in the smart terminal's display to "enter flight mode", the user triggers the take-off option by long-pressing the phone screen for three seconds, as shown in FIG. 4A. The field of view in the smart terminal's display then changes to the camera's image-transmission view; the user swipes up on the phone interface, the phone recognizes the upward-swipe gesture, and a flight instruction is sent to the aircraft so that it takes off.
[0040] In the technical solution of this embodiment, the virtual reality glasses control the drone's flight. After the user performs a gesture operation on the mobile terminal, the mobile terminal sends the instruction corresponding to the gesture operation to the virtual reality glasses; the glasses provide multiple operation options, select one option according to the mobile terminal's instruction, and send the flight instruction corresponding to that option to the drone, which then flies according to the flight instruction, as shown in FIG. 4B.
[0041] Step S350: the smart terminal draws a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal. In this embodiment, because the user cannot see the mobile terminal while performing a gesture operation on it, the gesture operation performed by the user is displayed on the smart terminal's screen. According to the technical solution of this embodiment, the mobile-terminal-based control flow completed by the user is as follows:
[0042] 1. Holding the phone, the user swipes from top to bottom, and the field of view moves to the "parameter adjustment" option;
[0043] 2. The user swipes from right to left, and the field of view moves to the "parameter adjustment 2" option;
[0044] 3. The user swipes up and down on the phone's touch screen to select the desired parameter option;
[0045] 4. The user swipes from left to right to return to "parameter adjustment";
[0046] 5. The user swipes from bottom to top, and the field of view moves to "flight mode";
[0047] 6. The user swipes from right to left to jump to "enter flight mode";
[0048] 7. Following the prompt, the user long-presses the screen for three seconds, and the glasses' field of view jumps to the aircraft's image-transmission view;
[0049] 8. The image-transmission view contains parameters, a map, and prompt pop-ups;
[0050] 9. The user swipes in from right to left to call up the flight-parameter adjustment menu;
[0051] 10. The user swipes up and down to select the desired parameter in a dial-wheel manner;
[0052] 11. The user swipes out from left to right to collapse the parameter-adjustment menu;
[0053] 12. The user taps the screen, and a shortcut-action menu appears;
[0054] 13. The user places a finger on the screen and moves it without lifting to select the desired shortcut action.
[0055] As shown in FIG. 5, an embodiment of the present invention provides a control apparatus for a smart terminal, applied to a mobile terminal, where the mobile terminal includes but is not limited to a mobile phone, a tablet computer, and the like. The apparatus includes:
[0056] The sensing module 510, which senses a gesture operation, where the gesture operation is formed by the user touching the screen of the mobile terminal. In this embodiment, the smart terminal includes but is not limited to virtual reality glasses worn by the user; the user's gesture operation may be completed via a touch screen or an attitude sensor installed on the mobile terminal. For example, the user may tap, double-tap, or swipe on the phone's touch screen.
[0057] The recognition module 520, which recognizes the gesture operation and retrieves a gesture instruction from a database preset in the mobile terminal, where the database contains multiple gesture instructions and the gesture instruction corresponds to the gesture operation. In this embodiment, the operation gestures used are all based on familiar everyday interactions; common gestures are shown in FIG. 2, and all of the gestures shown there are applicable to the technical solution of this embodiment, so that corresponding gesture instructions are formed without adding to the user's memory burden, completing the various tasks the user requires. Gesture operations and their corresponding gesture instructions are stored in the mobile terminal's database for retrieval at any time.
[0058] The sending module 530, which sends the gesture instruction to the smart terminal, where the smart terminal is worn on the user's head and has a screen for displaying an image; the smart terminal changes the image displayed on its screen according to the gesture instruction.
[0059] According to the technical solution of this embodiment, the smart terminal is controlled through gesture actions. Even while wearing the smart terminal, the user can still accurately perceive the location of the mobile terminal and can therefore perform accurate gesture operations on it, thereby controlling the image displayed on the smart terminal's screen according to the gesture operation on the mobile terminal.
[0060] 如图 5所示, 本发明的一个实施例中提供了一种智能终端的控制装置, 包括: [0061] 感应模块 510, 感应手势操作, 手势操作是通过用户在移动终端的屏幕上触摸 形成。
[0062] 识别模块 520, 识别手势操作, 从预设在移动终端的数据库中调取手势指令, 其中, 数据库中包含多个手势指令, 手势指令与手势操作相对应。
[0063] 发送模块 530, 将手势指令发送给智能终端, 智能终端的屏幕显示外部设备传 输的视野图像吋, 根据手势指令对应的方向, 控制视野图像按相应方向进行移 动。
[0064] When the smart terminal's screen displays a field-of-view image transmitted by an external device and the first gesture instruction is received, multiple function icons are drawn on the field-of-view image; the smart terminal selects at least one of the multiple function icons according to the second gesture instruction and executes the function corresponding to the selected function icon.
[0065] The gesture instructions include a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold.
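The paired-gesture condition described above (the second instruction must follow the first within a preset time threshold) can be sketched like this. The threshold value and function name are assumptions; the patent only states that the interval must be below a preset threshold.

```python
# Illustrative check: only when the second gesture instruction arrives within
# the preset interval after the first do the two form a pair (first draws the
# function icons, second selects one). The 1.5 s threshold is hypothetical.
INTERVAL_THRESHOLD = 1.5  # seconds

def is_gesture_pair(t_first, t_second, threshold=INTERVAL_THRESHOLD):
    """True when the two instructions are close enough in time to form a pair."""
    return 0 <= t_second - t_first < threshold

print(is_gesture_pair(10.0, 10.8))  # within threshold: draw icons, then select
print(is_gesture_pair(10.0, 12.5))  # too far apart: treated as independent gestures
```
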
[0066] In this embodiment, the functions corresponding to the multiple function icons include: sending a flight instruction to an unmanned aerial vehicle in communication with the smart terminal, so as to control the flight of the unmanned aerial vehicle.
[0067] In one specific embodiment, the first part concerns the state before the aircraft has taken off, when the user can adjust the smart terminal's parameters, watch videos, and so on. When the user swipes on the phone screen, the interface the user sees in the smart terminal is mapped to the phone's touchscreen. When a finger swipes from top to bottom on the phone screen, the smart terminal's displayed interface correspondingly moves from top to bottom, so the visible region within the smart terminal's display moves up. When a finger swipes from bottom to top, the displayed interface moves from bottom to top, so the visible region moves down. When a finger swipes from left to right on the phone screen, the field of view within the smart terminal's display moves left; when a finger swipes from right to left, the field of view moves right. Because the options are divided into hierarchical levels, the visible region can only switch within the same level. On top of moving the interface up, down, left, and right, other gestures provide further function actions, including lightly tapping the screen to confirm a parameter-adjustment option and swiping left or right within a video for rewind and fast-forward.
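The swipe-to-view mapping just described inverts the finger direction, because dragging the interface one way moves the visible region the other way. A minimal sketch, with illustrative direction names:

```python
# The phone screen is mapped onto the smart terminal's display: a finger
# swipe drags the interface, so the visible region moves in the opposite
# direction. Names are illustrative.
SWIPE_TO_VIEW_MOVE = {
    "down":  "view_up",    # finger top-to-bottom: visible region moves up
    "up":    "view_down",  # finger bottom-to-top: visible region moves down
    "right": "view_left",  # finger left-to-right: field of view moves left
    "left":  "view_right", # finger right-to-left: field of view moves right
}

def view_motion(swipe):
    """Return the view movement for a swipe direction, or 'none'."""
    return SWIPE_TO_VIEW_MOVE.get(swipe, "none")

print(view_motion("down"))
```
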
[0068] In another specific embodiment, the second part concerns the state after the system has entered flight mode. Without taking off the smart terminal, the user touches the phone interface to form the gesture operation; the phone recognizes the gesture operation and sends an instruction to the aircraft, implementing functions such as take-off, aerial photography, positional movement, return to home, and landing. Further, after the user's touch operations on the phone interface move the field of view shown on the smart terminal's screen to "Enter Flight Mode", the user triggers the take-off option by long-pressing the phone screen for three seconds, as shown in FIG. 4A. The field of view on the smart terminal's screen then switches to the camera's transmitted image; when the user swipes upward on the phone interface, the phone recognizes the upward-swipe gesture operation and sends a flight instruction to the aircraft, causing the aircraft to take off.
[0069] In the technical solution of this embodiment, the flight of the unmanned aerial vehicle is controlled through the virtual reality glasses. After the user performs a gesture operation on the mobile terminal, the mobile terminal sends the instruction corresponding to the gesture operation to the virtual reality glasses. The virtual reality glasses provide multiple operation options, select one of them according to the instruction from the mobile terminal, and send the flight instruction corresponding to the selected option to the unmanned aerial vehicle, which then flies according to that flight instruction, as shown in FIG. 4B.
[0070] The smart terminal draws a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal. In this embodiment, since the user cannot see the mobile terminal while performing a gesture operation on it, the gesture operation being performed is displayed on the smart terminal's screen. According to the technical solution of this embodiment, the control flow the user completes on the mobile terminal is as follows:
[0071] 1. Holding the phone, the user swipes from top to bottom; the field of view moves to the "Parameter Adjustment" function option;
[0072] 2. The user swipes from right to left; the field of view moves to the "Parameter Adjustment 2" option;
[0073] 3. The user swipes up and down on the phone's touchscreen to select the desired parameter option;
[0074] 4. The user swipes from left to right to return to "Parameter Adjustment";
[0075] 5. The user swipes from bottom to top; the field of view moves to "Flight Mode";
[0076] 6. The user swipes from right to left to jump to "Enter Flight Mode";
[0077] 7. Following the prompt, the user long-presses the screen for three seconds; the view in the glasses jumps to the aircraft's transmitted image;
[0078] 8. The transmitted image contains parameters, a map, and prompt pop-ups;
[0079] 9. The user swipes from right to left into the screen to call up the flight parameter adjustment menu;
[0080] 10. The user swipes up and down on the screen to select the desired parameter in the manner of a dial wheel;
[0081] 11. The user swipes from left to right off the screen to collapse the parameter adjustment menu;
[0082] 12. The user taps the screen, and a shortcut action menu appears;
[0083] 13. The user places a finger on the screen and moves it without lifting to select the desired shortcut action.
[0084] The preferred embodiments of the present invention have been described above with reference to the accompanying drawings, which does not thereby limit the scope of the claims of the present invention. Those skilled in the art can implement the present invention in many variant solutions without departing from the scope and essence of the present invention; for example, a feature of one embodiment may be used in another embodiment to obtain yet another embodiment. Any modification, equivalent replacement, or improvement made within the technical concept of the present invention shall fall within the scope of the claims of the present invention.
Industrial Applicability
The smart terminal is controlled through gesture actions. While wearing the smart terminal, the user can still accurately perceive the position of the mobile terminal and can therefore perform accurate gesture operations on it, thereby controlling the image displayed on the smart terminal's screen based on those gesture operations. The invention therefore has industrial applicability.
Claims
[Claim 1] A control method for a smart terminal, applied to a mobile terminal, the method comprising:
sensing a gesture operation, the gesture operation being formed by a user's touch on a screen of the mobile terminal;
recognizing the gesture operation and retrieving a gesture instruction from a database preset on the mobile terminal, wherein the database contains multiple gesture instructions and the gesture instructions correspond to gesture operations; and
sending the gesture instruction to a smart terminal, the smart terminal being configured to be worn on the user's head and having a screen for displaying images, wherein the smart terminal controls an image displayed on the screen of the smart terminal to change according to the gesture instruction.
[Claim 2] The control method for a smart terminal according to claim 1, wherein controlling, by the smart terminal, the image displayed on the screen of the smart terminal to change according to the gesture instruction specifically comprises:
when the screen of the smart terminal displays a field-of-view image transmitted by an external device, controlling the field-of-view image to move in the direction corresponding to the gesture instruction.
[Claim 3] The control method for a smart terminal according to claim 2, wherein the gesture instructions comprise a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold, and controlling, by the smart terminal, the image displayed on the screen of the smart terminal to change according to the gesture instruction specifically comprises:
when the screen of the smart terminal displays a field-of-view image transmitted by an external device and the first gesture instruction is received, drawing multiple function icons on the field-of-view image; and
selecting, by the smart terminal, at least one of the multiple function icons according to the second gesture instruction, and executing the function corresponding to the selected function icon.
[Claim 4] The control method for a smart terminal according to claim 3, wherein the functions corresponding to the multiple function icons comprise: sending a flight instruction to an unmanned aerial vehicle in communication with the smart terminal, so as to control flight of the unmanned aerial vehicle.
[Claim 5] The control method for a smart terminal according to claim 2, further comprising:
drawing, by the smart terminal, a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
[Claim 6] A control apparatus for a smart terminal, applied to a mobile terminal, the apparatus comprising:
a sensing module, which senses a gesture operation, the gesture operation being formed by a user's touch on a screen of the mobile terminal;
a recognition module, which recognizes the gesture operation and retrieves a gesture instruction from a database preset on the mobile terminal, wherein the database contains multiple gesture instructions and the gesture instructions correspond to gesture operations; and
a sending module, which sends the gesture instruction to a smart terminal, the smart terminal being configured to be worn on the user's head and having a screen for displaying images, wherein the smart terminal controls an image displayed on the screen of the smart terminal to change according to the gesture instruction.
[Claim 7] The control apparatus for a smart terminal according to claim 6, wherein
when the screen of the smart terminal displays a field-of-view image transmitted by an external device, the field-of-view image is controlled to move in the direction corresponding to the gesture instruction.
[Claim 8] The control apparatus for a smart terminal according to claim 7, wherein the gesture instructions comprise a first gesture instruction and a second gesture instruction whose time interval is below a preset threshold; when the screen of the smart terminal displays a field-of-view image transmitted by an external device and the first gesture instruction is received, multiple function icons are drawn on the field-of-view image; and the smart terminal selects at least one of the multiple function icons according to the second gesture instruction and executes the function corresponding to the selected function icon.
[Claim 9] The control apparatus for a smart terminal according to claim 8, wherein the functions corresponding to the multiple function icons comprise: sending a flight instruction to an unmanned aerial vehicle in communication with the smart terminal, so as to control flight of the unmanned aerial vehicle.
[Claim 10] The control apparatus for a smart terminal according to claim 7, wherein
the smart terminal further draws a gesture operation prompt on the field-of-view image to prompt the user to complete the gesture operation on the screen of the mobile terminal.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710031960.8A CN106708412B (zh) | 2017-01-17 | 2017-01-17 | Control method and apparatus for a smart terminal |
| CN201710031960.8 | 2017-01-17 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018133593A1 true WO2018133593A1 (zh) | 2018-07-26 |
Family
ID=58907615
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/115961 Ceased WO2018133593A1 (zh) | 2017-01-17 | 2017-12-13 | Control method and apparatus for a smart terminal |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN106708412B (zh) |
| WO (1) | WO2018133593A1 (zh) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN110928477A (zh) * | 2019-11-18 | 2020-03-27 | 钟林 | Method and apparatus for inputting operation instructions to a smart terminal using sliding gestures |
| CN110941385A (zh) * | 2019-11-18 | 2020-03-31 | 钟林 | Method and apparatus for adjusting smart terminal variables using slide-and-press gestures |
| CN113039550A (zh) * | 2018-10-10 | 2021-06-25 | 深圳市道通智能航空技术股份有限公司 | Gesture recognition method, VR viewing-angle control method, and VR system |
| CN114140884A (zh) * | 2021-12-23 | 2022-03-04 | 北京德为智慧科技有限公司 | Display control apparatus, method, and terminal |
| CN111367458B (zh) * | 2020-03-10 | 2024-02-06 | 柯旋 | Accessibility film and accessible touchscreen method for smart touchscreen mobile terminals |
Families Citing this family (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106708412B (zh) * | 2017-01-17 | 2020-06-12 | 亿航智能设备(广州)有限公司 | 智能终端的控制方法和装置 |
| CN107291359A (zh) | 2017-06-06 | 2017-10-24 | 歌尔股份有限公司 | 一种输入方法、装置和系统 |
| CN108257145B (zh) * | 2017-12-13 | 2021-07-02 | 北京华航无线电测量研究所 | 一种基于ar技术的无人机智能侦察处理系统及方法 |
| US10635895B2 (en) | 2018-06-27 | 2020-04-28 | Facebook Technologies, Llc | Gesture-based casting and manipulation of virtual content in artificial-reality environments |
| US10783712B2 (en) * | 2018-06-27 | 2020-09-22 | Facebook Technologies, Llc | Visual flairs for emphasizing gestures in artificial-reality environments |
| CN109120800A (zh) * | 2018-10-18 | 2019-01-01 | 维沃移动通信有限公司 | 一种应用程序图标调整方法及移动终端 |
| CN109753148A (zh) * | 2018-11-15 | 2019-05-14 | 北京奇艺世纪科技有限公司 | 一种vr设备的控制方法、装置及控制终端 |
| CN109410691A (zh) * | 2018-12-17 | 2019-03-01 | 深圳市中智仿真科技有限公司 | 一种手势控制功能的汽车驾培模拟机 |
| CN111586894A (zh) * | 2019-02-15 | 2020-08-25 | 普天信息技术有限公司 | 一种宽带集群终端的模式选择方法 |
| CN112083796A (zh) * | 2019-06-12 | 2020-12-15 | Oppo广东移动通信有限公司 | 控制方法、头戴设备、移动终端和控制系统 |
| CN111340962B (zh) * | 2020-02-24 | 2023-08-15 | 维沃移动通信有限公司 | 控制方法、电子设备及存储介质 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104808800A (zh) * | 2015-05-21 | 2015-07-29 | 上海斐讯数据通信技术有限公司 | Smart glasses device, mobile terminal, and mobile terminal operation method |
| CN105425952A (zh) * | 2015-11-04 | 2016-03-23 | 腾讯科技(深圳)有限公司 | UAV control interface interaction method and apparatus |
| CN105808071A (zh) * | 2016-03-31 | 2016-07-27 | 联想(北京)有限公司 | Display control method, apparatus, and electronic device |
| CN106095309A (zh) * | 2016-06-03 | 2016-11-09 | 广东欧珀移动通信有限公司 | Terminal operation control method and apparatus |
| CN106708412A (zh) * | 2017-01-17 | 2017-05-24 | 亿航智能设备(广州)有限公司 | Control method and apparatus for a smart terminal |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105700541A (zh) * | 2016-03-18 | 2016-06-22 | 普宙飞行器科技(深圳)有限公司 | UAV control method, UAV system, and UAV |
2017
- 2017-01-17 CN CN201710031960.8A patent/CN106708412B/zh active Active
- 2017-12-13 WO PCT/CN2017/115961 patent/WO2018133593A1/zh not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| CN106708412B (zh) | 2020-06-12 |
| CN106708412A (zh) | 2017-05-24 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17893176 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 041119) |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 17893176 Country of ref document: EP Kind code of ref document: A1 |