WO2014135023A1 - Human-computer interaction method and system for a smart terminal - Google Patents
Human-computer interaction method and system for a smart terminal
- Publication number
- WO2014135023A1 (application PCT/CN2014/072586)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control instruction
- smart terminal
- data
- earphone
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Definitions
- the present invention relates to the field of human-computer interaction technologies, and in particular, to a human-computer interaction method and system for an intelligent terminal.
- Face recognition technology requires the device to turn on the camera and aim it accurately at the face for identification. Therefore, in actual use, when the user is unwilling or unable to perform operations by hand, the above human-computer interaction technologies often fail to meet the user's operation requirements or accurately achieve the desired operational effect, giving the user an inconvenient and unsatisfactory operating experience.
- the present invention provides a novel human-computer interaction method and system for a smart terminal, so that even while enjoying audio/video or making detailed menu selections, the user does not need to touch the phone to operate it and can still convey the relevant control instructions.
- a human-computer interaction method for a smart terminal, wherein the smart terminal establishes a wired or wireless connection with an inductive headset, and a digital gyroscope and at least one pressure sensing device are disposed on the inductive headset, the steps of the method comprising:
- the digital gyroscope collects motion data during the movement of the earphone and sends it to the intelligent terminal;
- the processor of the smart terminal analyzes the motion data to obtain and executes a control instruction matching the motion data, the control instruction including movement of the focus mark on the user display interface;
- the pressure sensing device acquires a sensing signal caused by the tensing action of the user's ear, and outputs sensing data to the smart terminal according to the sensing signal;
- the processor of the smart terminal analyzes the sensing data to obtain and executes a matching control instruction, the control instruction comprising triggering the item pointed to by the focus mark in step b.
- the focus mark includes, but is not limited to, a cursor, a mouse pointer, and a highlighting effect on text/symbols/graphics;
- the item pointed to by the focus mark includes, but is not limited to, a button, a menu option, and text/symbols/graphics with a link function.
- In step b, the processor of the smart terminal analyzes the motion data to obtain a matching control instruction as follows: the relative motion data of the earphone is obtained by analysis using the smart terminal as a reference object, the relative motion data is matched with the preset instruction parameters, and the control instruction is determined according to the matching result; or the motion data output by the digital gyroscope is directly matched with the preset instruction parameters, and the control instruction is determined according to the matching result.
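- The following Python sketch illustrates one possible form of this matching; the instruction names, axes, and angular-rate ranges are assumptions made for illustration and are not specified by the patent:

```python
# Hypothetical sketch of the motion-data matching in step b. The instruction
# names, axes and angular-rate ranges are illustrative assumptions only.

PRESET_INSTRUCTIONS = [
    # (control instruction, gyroscope axis, (min, max) angular rate in rad/s)
    ("MOVE_FOCUS_LEFT",  "yaw",   (-6.0, -0.5)),
    ("MOVE_FOCUS_RIGHT", "yaw",   (0.5, 6.0)),
    ("MOVE_FOCUS_UP",    "pitch", (0.5, 6.0)),
    ("MOVE_FOCUS_DOWN",  "pitch", (-6.0, -0.5)),
]

def match_motion_data(sample):
    """sample: dict of axis -> angular rate. Return the control instruction
    whose preset parameter range contains the measured value, or None."""
    for instruction, axis, (low, high) in PRESET_INSTRUCTIONS:
        if low <= sample.get(axis, 0.0) <= high:
            return instruction
    return None

# Example: a head turn reported by the earphone's gyroscope.
print(match_motion_data({"yaw": 1.2, "pitch": 0.1}))  # -> MOVE_FOCUS_RIGHT
```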
- control instruction of step b may further include up/down/left/right translation and page turning of the page on the user display interface.
- the sensing data of step c includes the collected pressure value and the corresponding time value.
- In step d, analyzing the sensing data to obtain a matching control instruction includes: comparing whether the maximum pressure difference caused by a tensing or relaxing action of the user's ear reaches the pressure difference threshold in the preset instruction parameters; and comparing whether the frequency of the tensing or relaxing action matches the preset frequency reference value.
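- As an illustration only, a minimal Python sketch of these two comparisons might look as follows (the sensing-data layout, the threshold, and the frequency band are assumed, not taken from the patent):

```python
# Minimal sketch of the two comparisons in step d, assuming the sensing data is
# a series of pressure samples with matching timestamps. The threshold and the
# frequency band are invented values.

PRESSURE_DIFF_THRESHOLD = 0.15   # assumed minimum pressure difference
FREQ_REFERENCE_HZ = (0.5, 3.0)   # assumed acceptable tensing-frequency band

def sensing_data_matches(pressures, times):
    """Return True when the maximum pressure difference reaches the threshold
    and the tensing frequency lies within the reference band."""
    if max(pressures) - min(pressures) < PRESSURE_DIFF_THRESHOLD:
        return False
    # Count upward crossings of the midpoint as individual tensing actions.
    mid = (max(pressures) + min(pressures)) / 2
    crossings = sum(1 for a, b in zip(pressures, pressures[1:]) if a < mid <= b)
    duration = times[-1] - times[0]
    frequency = crossings / duration if duration > 0 else 0.0
    return FREQ_REFERENCE_HZ[0] <= frequency <= FREQ_REFERENCE_HZ[1]
```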
- the invention also discloses a human-computer interaction system for a smart terminal, comprising an inductive earphone and a smart terminal that establish a wired or wireless connection, the inductive earphone comprising:
- a connection module, used by the earphone to transmit data to the smart terminal;
- a gyroscope module for collecting motion data during the movement of the earphone
- a pressure detecting module configured to acquire a sensing signal caused by the tensing action of the user's ear, and to output sensing data according to the sensing signal;
- the smart terminal includes:
- a connection module configured to receive the data sent by the earphone;
- a motion analysis module for analyzing the motion data from the gyroscope module to obtain a matching control instruction, the control instruction including movement of the focus mark on the user display interface;
- a pressure analysis module for analyzing the sensing data from the pressure detecting module to obtain a matching control instruction, the control instruction comprising triggering the item pointed to by the focus mark on the user display interface;
- An execution module for executing the control instruction.
- the focus mark includes a cursor, a mouse pointer, and a highlighting effect on text/symbols/graphics;
- the item pointed to by the focus mark includes a button, a menu option, and text/symbols/graphics with a link function.
- the motion analysis module analyzes the motion data to obtain a matching control instruction as follows: the relative motion data of the earphone is obtained by analysis using the smart terminal as a reference object, the relative motion data is matched with the preset instruction parameters, and the control instruction is determined according to the matching result; or the motion data output by the digital gyroscope is directly matched with the preset instruction parameters, and the control instruction is determined according to the matching result.
- the sensing data includes the collected pressure values and the corresponding time values;
- the pressure analysis module analyzes the sensing data to obtain a matching control instruction by: comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure difference threshold in the preset instruction parameters; and comparing whether the frequency of the tensing or relaxing action matches the preset frequency reference value.
- the present invention provides a new human-computer interaction mode for the smart terminal: the gyroscope and the pressure sensing device in the earphone connected to the smart terminal detect the user's head motion and the tensing action applied to the earphone by the ear, and a user interface operation instruction adapted to the detection signal is executed.
- Users do not need to directly input instructions into the smart terminal, and the trouble of control signals being blocked by obstructions is also avoided. In particular, when the user is listening to audio/video or even making detailed menu selections, the relevant control instructions can be conveyed without touching the phone. With the support of existing sensing technology, the accuracy requirements of operation recognition can be met, bringing users a new and convenient operating experience.
- FIG. 1 is a schematic flowchart of implementing a human-computer interaction method of a smart phone according to an embodiment of the present invention
- FIG. 2 is a schematic block diagram of a human-machine interaction system of an intelligent terminal according to an embodiment of the present invention.
- a human-computer interaction method for a smart terminal: the smart terminal establishes a wired or wireless connection (the wireless connection including a Bluetooth connection) with an inductive headset, wherein the inductive headset is provided with a digital gyroscope and at least one pressure sensing device, the steps of the method including:
- the digital gyroscope collects motion data during the movement of the earphone and sends it to the intelligent terminal;
- the processor of the smart terminal analyzes the motion data to obtain and executes a control instruction matching the motion data, the control instruction including movement of the focus mark on the user display interface;
- the pressure sensing device acquires a sensing signal caused by the tensing action of the user's ear, and outputs sensing data to the smart terminal according to the sensing signal;
- the processor of the smart terminal analyzes the sensing data to obtain and executes a matching control instruction, the control instruction including triggering the item pointed to by the focus mark in step b (the coordinate position of the focus mark in step b is stored in the processor).
- the focus mark includes a cursor, a mouse pointer (including any visualized mark with directivity), and a highlighting effect on text/symbols/graphics;
- the item pointed to by the focus mark includes a button, a menu option, and text/symbols/graphics with a link function.
- In step b, the processor of the smart terminal analyzes the motion data to obtain a matching control instruction as follows: the relative motion data of the earphone (relative to the smart terminal) is obtained by analysis using the smart terminal as a reference object, the relative motion data is matched with the preset instruction parameters, and the control instruction is determined according to the matching result; or the motion data output by the digital gyroscope is directly matched with the preset instruction parameters, and the control instruction is determined according to the matching result.
- the analysis of the relative motion data requires the smart terminal to also be equipped with a digital gyroscope.
- the matching of the relative motion data or the motion data with the preset command parameters refers to comparing whether the relative motion data or the motion data is within the range of the preset command parameters.
- the step b further includes: setting a correspondence between the preset instruction parameter and the control instruction, where the preset instruction parameter includes a motion speed and an acceleration, and the motion speed and the acceleration are three-dimensional vectors.
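- A toy Python sketch of such a correspondence is shown below; the instruction names, units, and numeric ranges are assumptions made for illustration, not values from the patent:

```python
# Hypothetical correspondence table between preset instruction parameters and
# control instructions. Each parameter is a per-axis (min, max) range over the
# three-dimensional motion speed and acceleration vectors; all values invented.

PRESET_CORRESPONDENCE = {
    # instruction: (speed ranges per axis, acceleration ranges per axis)
    "PAGE_LEFT":  (((-9, -1), (-1, 1), (-1, 1)), ((-20, 0), (-5, 5), (-5, 5))),
    "PAGE_RIGHT": ((( 1,  9), (-1, 1), (-1, 1)), ((  0, 20), (-5, 5), (-5, 5))),
}

def within(vector, ranges):
    """True when every component of the 3-D vector lies in its preset range."""
    return all(lo <= v <= hi for v, (lo, hi) in zip(vector, ranges))

def lookup_instruction(speed, accel):
    """Map measured 3-D speed and acceleration vectors to a control instruction,
    or return None when no preset parameter range contains them."""
    for instruction, (speed_rng, accel_rng) in PRESET_CORRESPONDENCE.items():
        if within(speed, speed_rng) and within(accel, accel_rng):
            return instruction
    return None

# Example: a leftward head motion.
print(lookup_instruction((-3, 0, 0), (-8, 0, 0)))  # -> PAGE_LEFT
```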
- control instructions of step b may further include up/down/left/right panning and page turning of pages on the user display interface. That is, the user's head motion is detected by the digital gyroscope to control the up/down/left/right panning and page turning of the page.
- the sensing data of step c includes the collected pressure value and the corresponding time value.
- In step d, analyzing the sensing data to obtain a matching control instruction includes: comparing whether the maximum pressure difference caused by a tensing or relaxing action of the user's ear reaches the pressure difference threshold in the preset instruction parameters; and comparing whether the frequency of the tensing or relaxing action matches the preset frequency reference value.
- comparing whether the frequency of the tensing or relaxing action matches the preset frequency reference value comprises: comparing whether the duration of a single tensing or relaxing action matches a preset duration range, where a single tensing or relaxing action means that the pressure value collected by the pressure sensing device fluctuates upward or downward so that the pressure difference rises from below the pressure difference threshold in the preset instruction parameters to reach the value of that threshold; comparing whether the number of tensing actions matches the preset number of tensing actions; and comparing whether the interval between every two consecutive tensing actions matches the preset interval time.
- the method further includes: setting a correspondence between the preset instruction parameters and the control instructions, where the preset instruction parameters include a pressure difference threshold, a duration range, a number of tensing actions, and an interval time.
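- The sketch below shows one way these comparisons could be realized; the parameter values and the segmentation strategy are assumptions, not part of the patent:

```python
# Illustrative sketch: segment the pressure samples into individual tensing
# actions, then check duration, count and the pairwise interval against preset
# parameters. All numeric values are assumptions.

PRESET = {
    "pressure_diff_threshold": 0.15,  # minimum rise above the resting baseline
    "duration_range": (0.05, 0.60),   # seconds one tensing action may last
    "tensing_count": 2,               # e.g. a "double tense" gesture
    "interval_range": (0.10, 0.80),   # seconds allowed between two actions
}

def segment_tensing_actions(samples):
    """samples: list of (time, pressure). Return (start, end) times of spans in
    which the pressure exceeds the baseline by the preset difference threshold."""
    baseline = min(p for _, p in samples)
    actions, start = [], None
    for t, p in samples:
        above = (p - baseline) >= PRESET["pressure_diff_threshold"]
        if above and start is None:
            start = t
        elif not above and start is not None:
            actions.append((start, t))
            start = None
    if start is not None:                    # action still ongoing at the end
        actions.append((start, samples[-1][0]))
    return actions

def sensing_data_matches_preset(samples):
    """True when the segmented actions match the preset duration range, count
    and interval between consecutive actions."""
    actions = segment_tensing_actions(samples)
    if len(actions) != PRESET["tensing_count"]:
        return False
    lo, hi = PRESET["duration_range"]
    if not all(lo <= end - start <= hi for start, end in actions):
        return False
    ilo, ihi = PRESET["interval_range"]
    intervals = [b[0] - a[1] for a, b in zip(actions, actions[1:])]
    return all(ilo <= gap <= ihi for gap in intervals)
```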
- the smart terminal has a voice prompt function; the voice function is used in combination with the gyroscope module and the pressure detecting module of the earphone, with the earphone outputting prompt information, such as voice prompts of the menu options.
- applications of the human-computer interaction method of the present invention include the selection of menu options, the control of game operations, and the like. Taking the selection of menu options as an example: when the functions of the digital gyroscope and the pressure detecting device are turned on, the gyroscope detects the forward/backward/up/down/left/right movement of the user's head to control the position of the focus mark (e.g., a mouse pointer) on the screen of the smart terminal. When the focus mark falls on the position desired by the user, the user tenses or relaxes the ear; this action is detected by the pressure detecting means and triggers the menu option pointed to by the focus mark, such as the 'Next' button. When the method is applied to a game, the movement of a game object can be controlled by head movement, and a preset operation of the game object, such as 'firing a shell', can be controlled by the tensing or relaxing action of the ear.
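- Purely for illustration, the two detections can be combined in an event loop like the following sketch, in which the headset and UI objects are hypothetical placeholders and the helper functions come from the sketches above:

```python
# Illustrative event loop for the menu-selection example: head motion moves the
# focus mark, an ear-tensing gesture triggers the focused item. `headset` and
# `ui` are hypothetical placeholder objects.

def interaction_loop(headset, ui):
    while ui.active:
        motion = headset.read_gyroscope()              # motion data (step a)
        instruction = match_motion_data(motion)        # matching (step b)
        if instruction is not None:
            ui.move_focus(instruction)                 # e.g. move onto "Next"

        samples = headset.read_pressure_samples()      # sensing data (step c)
        if samples and sensing_data_matches_preset(samples):  # matching (step d)
            ui.trigger_focused_item()                  # e.g. press "Next"
```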
- according to an embodiment, the present invention further provides an implementation process of the human-computer interaction method for a smart phone, the steps of which include:
- digital gyroscope collects motion data during the movement of the earphone
- the headset sends motion data to the smart terminal via Bluetooth
- the processor of the intelligent terminal analyzes the motion data
- step 005: determining whether the motion data matches the preset instruction parameters of a control instruction; if so, performing step 006, otherwise returning to step 002;
- the intelligent terminal executes a control instruction, and moves the focus indicator on the user display interface to a position desired by the user;
- the pressure sensing device acquires a sensing signal caused by the tensing action of the user's ear
- the earphone outputs the sensing data to the smart terminal according to the sensing signal
- the processor of the smart terminal analyzes the sensing data.
- step 010: determining whether the sensing data matches the preset instruction parameters of a control instruction; if so, performing step 011, otherwise returning to step 007;
- the intelligent terminal executes a control instruction to trigger an item pointed by the focus identifier.
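- The following Python sketch mirrors this flow under the same assumptions as the earlier sketches (placeholder headset and UI objects, invented helper names):

```python
# Rough transcription of the flow in steps 001-011. Hardware access, the
# Bluetooth transport and the UI are hidden behind hypothetical placeholder
# objects; the matching helpers are the sketches given earlier.

def run_focus_phase(headset, ui):
    """Steps 001-006: loop until the motion data matches a preset instruction
    parameter, then move the focus mark to the position desired by the user."""
    while True:
        motion = headset.read_gyroscope()         # steps 001-002: collect and send
        instruction = match_motion_data(motion)   # steps 003-005: analyze and match
        if instruction is not None:
            ui.move_focus(instruction)            # step 006: execute the instruction
            return

def run_trigger_phase(headset, ui):
    """Steps 007-011: loop until the sensing data matches a preset instruction
    parameter, then trigger the item pointed to by the focus mark."""
    while True:
        samples = headset.read_pressure_samples()     # steps 007-008
        if sensing_data_matches_preset(samples):      # steps 009-010
            ui.trigger_focused_item()                 # step 011
            return
```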
- the present invention also discloses a human-computer interaction system for a smart terminal, which includes an inductive headset and a smart terminal that establish a wired or wireless connection, the inductive headset including:
- a connection module, used by the earphone to transmit data to the smart terminal;
- a gyroscope module for collecting motion data during the movement of the earphone
- a pressure detecting module configured to acquire a sensing signal caused by the tensing action of the user's ear, and to output sensing data according to the sensing signal;
- the smart terminal includes:
- a connection module configured to receive the data sent by the earphone;
- a motion analysis module for analyzing the motion data from the gyroscope module to obtain a matching control instruction, the control instruction including movement of the focus mark on the user display interface;
- a pressure analysis module for analyzing the sensing data from the pressure detecting module to obtain a matching control instruction, the control instruction comprising triggering the item pointed to by the focus mark on the user display interface;
- An execution module for executing the control instruction.
- the focus mark includes a cursor, a mouse pointer, and a highlighting effect on text/symbols/graphics;
- the item pointed to by the focus mark includes a button, a menu option, and text/symbols/graphics with a link function.
- the motion analysis module analyzes the motion data to obtain a matching control instruction as follows: the relative motion data of the earphone is obtained by analysis using the smart terminal as a reference object, the relative motion data is matched with the preset instruction parameters, and the control instruction is determined according to the matching result; or the motion data output by the digital gyroscope is directly matched with the preset instruction parameters, and the control instruction is determined according to the matching result.
- the sensing data includes the collected pressure values and the corresponding time values;
- the pressure analysis module analyzes the sensing data to obtain a matching control instruction by: comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure difference threshold in the preset instruction parameters; and comparing whether the frequency of the tensing or relaxing action matches the preset frequency reference value.
- the human-machine interaction system of the present invention further includes an instruction parameter preset module in the smart terminal, configured to set a correspondence between the preset instruction parameter and each control instruction.
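- For illustration only, the module structure described above could be organized along the lines of the following Python skeleton; the class and attribute names are assumptions, and the patent does not prescribe any API:

```python
# Skeleton of the modules enumerated above, purely for illustration; the class
# and attribute names are invented.

class InductiveEarphone:
    def __init__(self, connection, gyroscope_module, pressure_module):
        self.connection = connection              # transmits data to the smart terminal
        self.gyroscope_module = gyroscope_module  # collects earphone motion data
        self.pressure_module = pressure_module    # senses the ear-tensing pressure

class SmartTerminal:
    def __init__(self, connection, motion_analyzer, pressure_analyzer,
                 executor, parameter_presets):
        self.connection = connection                # receives data sent by the earphone
        self.motion_analyzer = motion_analyzer      # motion data -> focus-movement instruction
        self.pressure_analyzer = pressure_analyzer  # sensing data -> trigger instruction
        self.executor = executor                    # executes the matched instruction
        self.parameter_presets = parameter_presets  # preset parameter <-> instruction map
```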
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
The invention relates to a human-computer interaction method and system for a smart terminal that establishes a wired or wireless connection with an inductive earphone. The method comprises several steps. A digital gyroscope fitted in the earphone first collects motion data generated during the movement of the earphone and transmits it to the smart terminal. The smart terminal then analyzes the motion data to obtain a matching control instruction and executes the control instruction, which involves moving a focus mark on a user display interface. A pressure sensing device located inside the earphone then acquires a sensing signal induced by the tensing action of part of the user's ear, and outputs the resulting sensing data to the smart terminal according to the sensing signal. Finally, the smart terminal analyzes the sensing data to obtain a matching control instruction and executes the control instruction, which involves triggering an item designated by the focus mark. The inductive earphone detects the movement of the user's head and the tensing action applied to the earphone by the user's ear, so that while enjoying audio/video, or even while making a selection among specific menu options, the user is able to convey the relevant operation and control instruction without having to touch the mobile phone to perform any operation.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201310070794.4 | 2013-03-06 | ||
| CN2013100707944A CN103226436A (zh) | 2013-03-06 | 2013-03-06 | 一种智能终端的人机交互方法及系统 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2014135023A1 true WO2014135023A1 (fr) | 2014-09-12 |
Family
ID=48836909
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2014/072586 Ceased WO2014135023A1 (fr) | 2013-03-06 | 2014-02-26 | Procédé et système d'interaction homme-machine pour terminal intelligent |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN103226436A (fr) |
| WO (1) | WO2014135023A1 (fr) |
Families Citing this family (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103226436A (zh) * | 2013-03-06 | 2013-07-31 | 广东欧珀移动通信有限公司 | 一种智能终端的人机交互方法及系统 |
| CN103543843A (zh) * | 2013-10-09 | 2014-01-29 | 中国科学院深圳先进技术研究院 | 基于加速度传感器的人机接口设备及人机交互方法 |
| CN103763440A (zh) * | 2014-02-19 | 2014-04-30 | 联想(北京)有限公司 | 一种信息处理方法、电子设备附件及电子设备 |
| CN104935721B (zh) * | 2014-03-20 | 2018-02-13 | 宇龙计算机通信科技(深圳)有限公司 | 一种与智能终端互动的方法及装置 |
| CN104080022A (zh) * | 2014-07-14 | 2014-10-01 | 深迪半导体(上海)有限公司 | 一种带姿态控制的线控耳机 |
| CN105446467B (zh) * | 2014-08-21 | 2018-08-31 | 刘小洋 | 控制智能终端设备使用的方法和装置 |
| CN105572870A (zh) * | 2014-11-05 | 2016-05-11 | 优利科技有限公司 | 头戴式显示设备 |
| CN105867600A (zh) * | 2015-11-06 | 2016-08-17 | 乐视移动智能信息技术(北京)有限公司 | 一种交互方法和设备 |
| CN105472496A (zh) * | 2015-11-21 | 2016-04-06 | 惠州Tcl移动通信有限公司 | 蓝牙耳机及其自动通断方法 |
| CN106908642B (zh) * | 2015-12-23 | 2021-05-28 | 普源精电科技股份有限公司 | 一种探头、示波器、运动识别系统及方法 |
| CN105718777B (zh) * | 2016-01-19 | 2018-12-25 | 宇龙计算机通信科技(深圳)有限公司 | 一种终端解/锁屏方法、耳机、终端以及系统 |
| CN112969116A (zh) * | 2021-02-01 | 2021-06-15 | 深圳市美恩微电子有限公司 | 一种无线耳机与智能终端的交互控制系统 |
| CN112835453B (zh) * | 2021-03-04 | 2023-05-09 | 网易(杭州)网络有限公司 | 模拟人眼聚焦时界面效果的方法、设备和存储介质 |
| CN118870249A (zh) * | 2024-07-29 | 2024-10-29 | 深圳市迈斯高科技有限公司 | 一种应用在蓝牙耳机上的压力感应信号的处理方法 |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR100656523B1 (ko) * | 2005-07-27 | 2006-12-11 | 삼성전자주식회사 | 인체의 움직임을 이용한 시스템 및 그 방법 |
| CN101232743A (zh) * | 2007-01-24 | 2008-07-30 | 鸿富锦精密工业(深圳)有限公司 | 音频播放装置及其应用的耳机、自动控制方法 |
| US8098838B2 (en) * | 2008-11-24 | 2012-01-17 | Apple Inc. | Detecting the repositioning of an earphone using a microphone and associated action |
| US9030404B2 (en) * | 2009-07-23 | 2015-05-12 | Qualcomm Incorporated | Method and apparatus for distributed user interfaces using wearable devices to control mobile and consumer electronic devices |
-
2013
- 2013-03-06 CN CN2013100707944A patent/CN103226436A/zh active Pending
-
2014
- 2014-02-26 WO PCT/CN2014/072586 patent/WO2014135023A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2293598A2 (fr) * | 2009-07-31 | 2011-03-09 | Carlos De La Fe Dahlin | Système de menu |
| US20110206215A1 (en) * | 2010-02-21 | 2011-08-25 | Sony Ericsson Mobile Communications Ab | Personal listening device having input applied to the housing to provide a desired function and method |
| US20130055103A1 (en) * | 2011-08-29 | 2013-02-28 | Pantech Co., Ltd. | Apparatus and method for controlling three-dimensional graphical user interface (3d gui) |
| CN103226436A (zh) * | 2013-03-06 | 2013-07-31 | 广东欧珀移动通信有限公司 | 一种智能终端的人机交互方法及系统 |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106325481A (zh) * | 2015-06-30 | 2017-01-11 | 展讯通信(天津)有限公司 | 一种非接触式控制系统及方法以及移动终端 |
| CN109426498A (zh) * | 2017-08-24 | 2019-03-05 | 北京迪文科技有限公司 | 一种人机交互系统后台开发方法和装置 |
| CN109426498B (zh) * | 2017-08-24 | 2023-11-17 | 北京迪文科技有限公司 | 一种人机交互系统后台开发方法和装置 |
| CN108269571A (zh) * | 2018-03-07 | 2018-07-10 | 佛山市云米电器科技有限公司 | 一种带有摄像头功能的语音控制终端 |
| CN108269571B (zh) * | 2018-03-07 | 2024-01-09 | 佛山市云米电器科技有限公司 | 一种带有摄像头功能的语音控制终端 |
| CN114554094A (zh) * | 2022-02-25 | 2022-05-27 | 深圳市豪恩声学股份有限公司 | 基于头戴式耳机的摄像控制方法以及头戴式耳机 |
| CN114554094B (zh) * | 2022-02-25 | 2024-04-26 | 深圳市豪恩声学股份有限公司 | 基于头戴式耳机的摄像控制方法以及头戴式耳机 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN103226436A (zh) | 2013-07-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2014135023A1 (fr) | Procédé et système d'interaction homme-machine pour terminal intelligent | |
| WO2013048054A1 (fr) | Procédé d'utilisation d'un canal de communication basé sur un geste et système de terminal portable pour supporter celui-ci | |
| CN102937832B (zh) | 一种移动终端的手势捕捉方法及装置 | |
| WO2014107005A1 (fr) | Procédé pour la fourniture d'une fonction de souris et terminal mettant en oeuvre ce procédé | |
| WO2014003365A1 (fr) | Procédé et appareil pour traiter de multiples entrées | |
| WO2014030902A1 (fr) | Procédé d'entrée et appareil de dispositif portable | |
| WO2010110573A2 (fr) | Télépointeur multiple, dispositif d'affichage d'un objet virtuel et procédé de contrôle d'un objet virtuel | |
| WO2017188801A1 (fr) | Procédé de commande optimale basé sur une commande multimode de voix opérationnelle, et dispositif électronique auquel celui-ci est appliqué | |
| WO2015152487A1 (fr) | Procédé, dispositif, système et support d'enregistrement non transitoire lisible par ordinateur pour la fourniture d'interface utilisateur | |
| WO2016052778A1 (fr) | Dispositif portatif et son procédé de commande | |
| WO2015137742A1 (fr) | Appareil d'affichage et son procédé de commande | |
| CN111083684A (zh) | 控制电子设备的方法及电子设备 | |
| CN102981622A (zh) | 一种移动终端的外部控制方法及系统 | |
| WO2018004140A1 (fr) | Dispositif électronique et son procédé de fonctionnement | |
| KR20220047624A (ko) | 객체 위치 조정 방법 및 전자기기 | |
| WO2015174597A1 (fr) | Dispositif d'affichage d'image à commande vocale et procédé de commande vocale pour dispositif d'affichage d'image | |
| WO2015126197A1 (fr) | Appareil et procédé de commande à distance par toucher virtuel mis en œuvre sur un appareil photo | |
| WO2014044063A1 (fr) | Procédé et dispositif de commande de touches tactiles | |
| WO2021129745A1 (fr) | Touche tactile, procédé de commande et dispositif électronique | |
| WO2014178693A1 (fr) | Procédé pour apparier de multiples dispositifs, dispositif pour permettre leur appariement et système serveur | |
| WO2017065450A1 (fr) | Appareil d'affichage et procédé de commande de celui-ci | |
| CN111026482B (zh) | 一种应用程序控制方法及电子设备 | |
| WO2016036197A1 (fr) | Dispositif et procédé de reconnaissance de mouvement de la main | |
| CN111093133B (zh) | 无线设备控制方法、装置及计算机可读存储介质 | |
| WO2019004762A1 (fr) | Procédé et dispositif permettant de fournir une fonction d'interprétation à l'aide d'un écouteur |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14760989; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 14760989; Country of ref document: EP; Kind code of ref document: A1 |