
WO2016052859A1 - Method, device, and system for providing a user interface, and non-transitory computer-readable recording medium - Google Patents


Info

Publication number
WO2016052859A1
WO2016052859A1 (application PCT/KR2015/008747)
Authority
WO
WIPO (PCT)
Prior art keywords
input event
user
posture
movement
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2015/008747
Other languages
English (en)
Korean (ko)
Inventor
황성재
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FUTUREPLAY Inc
Original Assignee
FUTUREPLAY Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020150051767A external-priority patent/KR101916700B1/ko
Priority claimed from US14/819,151 external-priority patent/US9696815B2/en
Application filed by FUTUREPLAY Inc filed Critical FUTUREPLAY Inc
Publication of WO2016052859A1 publication Critical patent/WO2016052859A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to a method, device, system, and non-transitory computer readable recording medium for providing a user interface.
  • mobile smart devices having various communication and sensing functions and powerful computing functions such as smart phones and smart pads have been widely used.
  • Such mobile smart devices may be relatively small in size so that they can be carried by the user (e.g., smart glasses, smart watches, smart bands, smart devices in the form of rings or brooches, or smart devices attached to or embedded in the body or clothing).
  • a user may want to perform a task using two or more of his or her (heterogeneous) smart devices, or a task involving both his or her own smart device and another user's device.
  • however, conventional techniques have not properly supported such (potential) user intentions.
  • the object of the present invention is to solve all the above-mentioned problems.
  • another object of the present invention is to obtain information on the posture or movement of a first device and a second device; to detect an input event specified on the basis of a relative relationship between the posture or movement of the first device and the posture or movement of the second device; and, in response to the occurrence of the input event, to cause at least some of the content and functions provided on the first device to be provided on the second device, or at least some of the content and functions provided on the second device to be provided on the first device, thereby providing the user with a more convenient and extended user interface by using the relative relationship between the postures or movements of two or more devices.
  • according to one aspect of the present invention, there is provided a method for providing a user interface, the method comprising the steps of: (a) obtaining information on the posture or movement of a first device and a second device, and detecting an input event specified on the basis of a relative relationship between the posture or movement of the first device and the posture or movement of the second device; and (b) in response to the occurrence of the input event, causing at least some of the content and functions provided on the first device to be provided on the second device, or at least some of the content and functions provided on the second device to be provided on the first device.
  • according to another aspect of the present invention, there is provided a device for providing a user interface, the device comprising: technical means for obtaining information on the posture or movement of the device and another device associated with it, and for detecting an input event specified on the basis of a relative relationship between the posture or movement of the device and the posture or movement of the other device; and a program module for, in response to the occurrence of the input event, causing at least some of the content and functions provided on the device to be provided on the other device, or at least some of the content and functions provided on the other device to be provided on the device.
  • according to yet another aspect of the present invention, there is provided a system for providing a user interface, the system comprising: a control unit configured to obtain information on the posture or movement of a first device and a second device, to detect an input event specified on the basis of a relative relationship between the posture or movement of the first device and the posture or movement of the second device, and, in response to the occurrence of the input event, to cause at least some of the content and functions provided on the first device to be provided on the second device, or at least some of the content and functions provided on the second device to be provided on the first device; and a storage storing information received from at least one of the first device and the second device.
  • in addition, there are provided other methods, devices, and systems for implementing the invention, as well as non-transitory computer-readable recording media storing computer programs for executing the methods.
  • an effect of providing a more convenient and expanded user interface to a user is achieved by using a relative relationship between postures or movements of two or more devices.
  • according to the present invention, when a user wears the second device on a body part such as a wrist and holds the first device in the same hand, the content (or function) provision states of the first device and the second device can be switched with each other simply by the user flipping that hand (i.e., performing a flip).
  • according to the present invention, a user carrying both the first device and the second device facing different directions can use the first device and the second device regardless of the posture or direction of the hand holding each device.
  • according to the present invention, a first user and a second user, carrying the first device and the second device respectively, can seamlessly share content (or functions) simply by performing a simple action corresponding to an input event.
  • FIG. 1 is a view showing a schematic configuration of an entire system for providing a user interface according to an embodiment of the present invention.
  • FIG. 2 is a diagram conceptually illustrating a configuration in which a content (or function) provision state is switched between a first device and a second device according to an embodiment of the present invention.
  • FIGS. 3 to 5 are diagrams exemplarily illustrating a configuration for providing a user interface according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating a configuration in which a format of content provided on a first device and a format of content provided on a second device are different from each other according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating a configuration in which content stored in a first device is transmitted to a second device according to an embodiment of the present invention.
  • FIGS. 8 to 15 are diagrams exemplarily illustrating a configuration in which a user interface is provided according to another embodiment of the present invention.
  • FIGS. 16 to 22 are diagrams exemplarily illustrating how a user interface provided according to various embodiments of the present invention is actually demonstrated.
  • FIG. 1 is a view showing a schematic configuration of an entire system for providing a user interface according to an embodiment of the present invention.
  • the entire system may include a communication network 100, a user interface providing system 200, and a plurality of devices 310 and 320.
  • the communication network 100 may be configured regardless of communication modes such as wired or wireless communication, and may include various communication networks such as a local area network (LAN), a metropolitan area network (MAN), and a wide area network (WAN).
  • the communication network 100 as used herein may be a known Internet or World Wide Web (WWW).
  • the communication network 100 may include, at least in part, a known wired / wireless data communication network, a known telephone network, or a known wired / wireless television communication network without being limited thereto.
  • the user interface providing system 200 may be a digital device equipped with a microprocessor and memory means and thus having computing capability.
  • the user interface providing system 200 may be a server system.
  • the user interface providing system 200 may perform the function of allowing any one of the devices 310 and 320 to provide predetermined information or control commands to the other, or to receive predetermined information or control commands from the other, through the communication network 100.
  • as described in detail below, the user interface providing system 200 may obtain information on the posture or movement of the first device and the second device, detect a specified input event on the basis of a relative relationship between the posture or movement of the first device and the posture or movement of the second device, and, in response to the occurrence of the input event, cause at least some of the content and functions provided on the first device to be provided on the second device, or at least some of the content and functions provided on the second device to be provided on the first device, thereby performing the function of providing the user with a more convenient and extended user interface by using the relative relationship between the postures or movements of two or more devices.
  • such user interface provision may be performed by a controller (not shown) included in the user interface providing system 200.
  • the controller may exist in the form of a program module in the user interface providing system 200.
  • such a program module may take the form of an operating system, an application module, or another program module.
  • the program module may also be stored in a remote storage device capable of communicating with the user interface providing system 200.
  • meanwhile, the program module encompasses, but is not limited to, routines, subroutines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types as described below in accordance with the present invention.
  • the user interface providing system 200 may store information on the posture or movement provided from at least one of the plurality of devices 310 and 320, and allow that information to be used by at least one of the plurality of devices 310 and 320. Furthermore, the user interface providing system 200 may store information constituting the content or functions provided by at least one of the plurality of devices 310 and 320, and allow that information to be used by at least one of the plurality of devices 310 and 320. The above-described storage may be performed by a storage (not shown) included in the user interface providing system 200. Such a storage encompasses a computer-readable recording medium, and may be not only a database in the narrow sense but also a database in the broad sense including file-system-based data records.
  • although the user interface providing system 200 has been described as above, this description is exemplary, and it will be apparent to those skilled in the art that at least some of the functions or components required for the user interface providing system 200 may be implemented in, or included in, at least one of the devices 310 and 320 as necessary.
  • the plurality of devices 310 and 320 may be digital devices capable of connecting to and communicating with the user interface providing system 200 or with one another (the devices 310 and 320 preferably being separate from or externalized from each other), and any digital device equipped with memory means and a microprocessor for computing capability may be adopted as a device 310 or 320 according to the present invention.
  • the devices 310 and 320 may be so-called smart devices such as smartphones, smart pads, smart glasses, smart watches, smart bands, and smart rings, or may be more traditional devices such as desktop computers, notebook computers, workstations, PDAs, web pads, mobile phones, buttons, mice, keyboards, and electronic pens.
  • the devices 310 and 320 may be Internet of Things (IoT) devices such as a remote controller and a home appliance.
  • the devices 310 and 320 may include at least one technical means capable of receiving an operation from a user.
  • examples of such technical means include known components such as touch panels, pointing tools (e.g., a mouse, a stylus, or an electronic pen), graphical objects that can be manipulated by the user, keyboards, toggle switches, biometric information recognition sensors (e.g., fingerprint sensors), and distance sensors.
  • the device 310, 320 may include at least one technical means for obtaining physical information about the attitude or movement of the device (310, 320).
  • examples of such technical means include known components such as motion sensors, acceleration sensors, gyroscopes, magnetic sensors, positioning modules (e.g., GPS modules or beacon-based positioning (identification) modules), barometers, distance sensors, and cameras.
  • the devices 310 and 320 may further include technical means for obtaining physical information on the posture or movement of the devices 310 and 320 on the basis of biometric information acquired from the body of the user carrying the devices 310 and 320; examples of such means include electromyogram (EMG) signal measuring apparatuses.
  • the devices 310 and 320 may further include an application program that processes the above physical information to provide information or control commands to another device 310 or 320, to receive information or control commands from another device 310 or 320, or to generate such information or control commands.
  • Such an application may exist in the form of a program module in the corresponding devices 310 and 320.
  • the nature of such a program module may be generally similar to that of the controller of the user interface providing system 200 described above.
  • at least a part of the application may be replaced with a hardware device or a firmware device that can perform a substantially identical or equivalent function, as necessary.
  • the first device 310 and the second device 320 may have a predetermined association (for example, belonging to the same user, or functioning for the same user).
  • when the first device 310 and the second device 320 are recognized as having an association, for example being located substantially close to each other, or one of them being reasonably able to authenticate or permit the other, a predetermined connection may be formed between the two; this connection formation may be performed by the user interface providing system 200, or by the first device 310 and the second device 320 themselves.
  • hereinafter, configurations in which the user interface providing system 200 provides a user interface involving the plurality of devices 310 and 320 according to various embodiments of the present invention will be described in detail.
  • first, the user interface providing system 200 may obtain information on the posture or movement of the first device 310 and the second device 320, detect a specified input event on the basis of a relative relationship between the posture or movement of the first device 310 and the posture or movement of the second device 320, and, in response to the occurrence of the input event, cause at least some of the content and functions provided on the first device 310 to be provided on the second device 320, or at least some of the content and functions provided on the second device 320 to be provided on the first device 310.
  • at least one of the first device 310 and the second device 320 may move together with an input means (e.g., the user's hand) that causes an input event.
  • for example, the first device 310 may be a smartphone that the user can hold in a hand, and the second device 320 may be a smart watch that can be worn on the user's wrist.
  • the input event specified on the basis of the relative relationship between the posture or movement of the first device 310 and the posture or movement of the second device 320 may be an event in which, with respect to at least one coordinate axis, the aspect in which the direction of the first device 310 is changed and the aspect in which the direction of the second device 320 is changed are opposite or symmetrical to each other.
  • for example, the input event may be the event in which the user flips the hand that is holding the first device 310 and wearing the second device 320 on its wrist (a so-called flip).
  • as another example, the input event may be specified on the basis of a relative relationship between the posture or movement of the first device 310 with respect to a first surface and the posture or movement of the second device 320 with respect to a second surface.
  • here, the first surface or the second surface may correspond to the upper, lower, left, right, front, or rear surface of the first device 310 or the second device 320.
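The "opposite or symmetrical" direction-change test above can be sketched as a check on the roll changes of the two devices about a shared axis. The threshold, function names, and sign conventions are assumptions for illustration only:

```python
FLIP_THRESHOLD_DEG = 120.0  # assumed minimum direction change to count as a flip

def roll_delta(before_deg: float, after_deg: float) -> float:
    """Signed change in roll about one coordinate axis, wrapped to [-180, 180)."""
    return (after_deg - before_deg + 180.0) % 360.0 - 180.0

def is_flip_event(first_before: float, first_after: float,
                  second_before: float, second_after: float) -> bool:
    """Fire when both devices' directions change substantially about the axis,
    i.e. when the two changes mirror each other; a stricter check could also
    compare the signs of the two deltas."""
    d1 = roll_delta(first_before, first_after)
    d2 = roll_delta(second_before, second_after)
    return abs(d1) >= FLIP_THRESHOLD_DEG and abs(d2) >= FLIP_THRESHOLD_DEG
```

For example, a half turn of each device (0 to 180 degrees and 180 to 0 degrees) would qualify as the flip input event, while small wrist jitter would not.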
  • FIG. 2 is a diagram conceptually illustrating a configuration in which a content (or function) provision state is switched between a first device and a second device according to an embodiment of the present invention.
  • when the above action is performed, the first device 310 or the second device 320, which had been providing content (or a function) to the user with its display screen facing the user before the action was performed, may no longer have its display screen facing the user after the action and may thus stop providing the content (or function) to the user.
  • accordingly, the user interface providing system 200 may recognize, as an input event, the case where the user carries both the first device 310 and the second device 320 facing different directions and the direction of the smartphone-type first device 310 and the direction of the smart watch-type second device 320 are reversed; some of the content 201 and 202 may then be provided on the second device 320 (or the first device 310) facing the user after the input event occurs. Therefore, according to the present invention, content (or functions) can be provided to the user with continuity maintained across both the first device 310 and the second device 320. FIGS. 16 and 17 show this embodiment actually demonstrated.
  • FIGS. 3 to 5 are diagrams exemplarily illustrating a configuration for providing a user interface according to an embodiment of the present invention.
  • assume a case where a user wearing the second device 320 on the left wrist and holding the first device 310 in the same left hand turns over (i.e., flips) that left hand.
  • as a result, the second device 320 that was not facing the user may come to face the user (see FIG. 3), or conversely, the first device 310 that was not facing the user may come to face the user (see FIG. 4).
  • as illustrated in FIG. 5, physical information on the posture or direction of the first device 310 and the second device 320 may be obtained from gyroscopes provided in the first device 310 and the second device 320, and the user interface providing system 200 according to an embodiment of the present invention may detect whether the input event occurs on the basis of such physical information.
  • FIGS. 5A and 5B show the gyroscope measurements of the first device 310 and the second device 320, respectively; in each figure, the yellow, red, and green graphs indicate the measured values in the azimuth, pitch, and roll directions, respectively.
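A flip like the one visible in the FIG. 5 roll traces could, for example, be detected from a short window of roll samples by looking for a net change of roughly a half turn. The window handling and the 30-degree tolerance below are illustrative assumptions:

```python
def detect_flip(roll_samples_deg, tolerance_deg=30.0):
    """Return True if the net roll change across the sample window is within
    `tolerance_deg` of a half turn (180 degrees)."""
    if len(roll_samples_deg) < 2:
        return False
    net = abs(roll_samples_deg[-1] - roll_samples_deg[0]) % 360.0
    net = min(net, 360.0 - net)  # shortest angular distance
    return abs(net - 180.0) <= tolerance_deg

# Example traces: a hand flip versus small wrist jitter.
flip_trace = [0.0, 40.0, 95.0, 150.0, 178.0]
jitter_trace = [0.0, 3.0, -2.0, 1.0]
```

A production detector would additionally time-align the two devices' windows and require the relative-relationship condition described above.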
  • in response to the input event, at least a part of the content being provided on the first device 310 facing the user may be transmitted to the second device 320, or at least a part of the content being provided on the second device 320 facing the user may be transmitted to the first device 310. Likewise, a process being executed on the first device 310 facing the user may be executed on the second device 320, or a process being executed on the second device 320 facing the user may be executed on the first device 310.
  • meanwhile, in response to an input event generated by the user's flip action, the format of the content provided on the first device 310 (or the second device 320) before the input event occurs and the format of the content provided on the second device 320 (or the first device 310) after the input event occurs may differ from each other.
  • FIG. 6 is a diagram illustrating a configuration in which a format of content provided on a first device and a format of content provided on a second device are different from each other according to an embodiment of the present invention.
  • for example, content that was provided in a visual form on the first device 310 before the input event occurs may be provided on the second device 320 in an auditory or tactile form, such as voice or vibration, after the input event occurs.
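This modality switch can be pictured as a simple mapping from device type to output form. The device names, the mapping table, and the `render_for` helper are hypothetical illustrations of the FIG. 6 idea:

```python
# Assumed mapping, per the FIG. 6 idea: visual on the smartphone,
# auditory (or tactile) on the smart watch.
MODALITY_BY_DEVICE = {
    "smartphone": "visual",
    "smartwatch": "auditory",  # could equally be "tactile" (vibration)
}

def render_for(device_type: str, content_text: str):
    """Return an (action, payload) pair appropriate to the device's modality."""
    modality = MODALITY_BY_DEVICE.get(device_type, "visual")
    if modality == "auditory":
        return ("speak", content_text)
    if modality == "tactile":
        return ("vibrate", content_text)
    return ("display", content_text)
```

On handoff, the same content item would simply be re-rendered through this dispatch for the receiving device.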
  • the input event may be specified further based not only on the relative relationship between the posture or movement of the first device 310 and the posture or movement of the second device 320, but also on a user manipulation input to the first device 310 or the second device 320.
  • a user operation input to the first device 310 or the second device 320 may include a touch operation, a keyboard operation, a voice recognition operation, and the like.
  • for example, among the content provided on the first device 310 facing the user, specific content associated with the above touch manipulation may be provided on the second device 320; conversely, among the content provided on the second device 320, specific content associated with the touch manipulation may be provided on the first device 310.
  • FIG. 7 is a diagram illustrating a configuration in which content stored in a first device is transmitted to a second device according to an embodiment of the present invention.
  • specific content 701 and 702 (e.g., specific sound source files) selected by the user's touch manipulation 710 from among the content provided on the first device 310 before an input event occurs may be provided on the second device 320 after the input event occurs.
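The FIG. 7 handoff can be sketched as filtering the first device's content by the touch-selected identifiers; the dictionary-based data model below is an assumption for illustration:

```python
def handoff_selected(first_device_content, selected_ids):
    """Items selected by touch manipulation before the input event are the
    ones provided on the second device after the event occurs."""
    return [item for item in first_device_content if item["id"] in selected_ids]

library = [
    {"id": 701, "title": "sound source A"},
    {"id": 702, "title": "sound source B"},
    {"id": 703, "title": "sound source C"},
]
to_second_device = handoff_selected(library, {701, 702})
```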
  • according to an embodiment of the present invention, as the user performs a flip so that the second device 320, rather than the first device 310, comes to face the user, the user interface providing system 200 may register as a bookmark the content that was being provided on the first device 310 facing the user, and display a list including that bookmark on the second device 320; a corresponding operation may be performed as the user flips so that the first device 310 comes to face the user.
  • however, the present invention is not necessarily limited thereto, and it is noted that the input event may be specified by any number of other actions within the scope of the object of the present invention (e.g., flipping the hand more than once).
  • likewise, the present invention is not necessarily limited thereto, and it will be appreciated that the first device and the second device may be implemented in other forms such as a smart pad, smart glasses, a smart band, a smart ring, and the like within the scope of the object of the present invention.
  • FIGS. 8 to 15 are diagrams exemplarily illustrating a configuration in which a user interface is provided according to another embodiment of the present invention.
  • a user interface providing system 200 according to another embodiment of the present invention may recognize, as an input event, the case where the user carries both the first device 310 and the second device 320 facing different directions and the direction of the smartphone-type first device 310 and the direction of the smart watch-type second device 320 are changed in mutually opposite or symmetrical ways.
  • in response to the occurrence of the above input event, the user interface providing system 200 according to another embodiment of the present invention may mirror at least a portion 810 of the content (or function) that was being provided on the first device 310 whose display screen faced the user before the input event occurred, and provide the mirrored content (or function) 810 not only on the first device 310 but also on the second device 320 whose display screen faces the user after the input event occurs. As a result, even when the user rotates his or her wrist so that the first device 310 faces the other party, the user can check, through the second device 320, what content (or function) is being provided to the other party on the first device 310.
  • in response to the occurrence of an input event in which the user performs the flip action mentioned with reference to FIG. 8, the user interface providing system 200 may divide the content (or function) 910 that was being provided on the first device 310 whose display screen faced the user into public content 920 and private content 930; the public content 920 may be displayed on the first device 310 facing the other party after the input event occurs, and the private content 930 may be displayed on the second device 320 facing the user after the input event occurs.
  • here, the public content and the private content may be classified and displayed according to the user's situation.
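One minimal way to realize the FIG. 9 split, assuming each content item carries a `private` flag (an attribute invented here for illustration, not part of the disclosure):

```python
def route_on_flip(content_items):
    """After the flip input event, public items go to the first device
    (facing the other party) and private items go to the second device
    (facing the user)."""
    public = [c for c in content_items if not c.get("private", False)]
    private = [c for c in content_items if c.get("private", False)]
    return {"first_device": public, "second_device": private}

items = [
    {"name": "presentation slide", "private": False},
    {"name": "speaker notes", "private": True},
]
routing = route_on_flip(items)
```

In practice the flag could instead be derived from the user's situation, as the bullet above suggests.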
  • in response to the occurrence of an input event in which the user performs the flip action mentioned with reference to FIG. 8, the user interface providing system 200 may translate into another language the content (or function) 1010 that was being provided on the first device 310 facing the user and display it on the first device 310 whose display screen faces the other party after the input event occurs, while content 1030 in Korean may be displayed on the second device 320 facing the user after the input event occurs.
  • in response to the occurrence of an input event in which the user performs the flip action described with reference to FIG. 8, the user interface providing system 200 may display, on the display screen facing the user after the input event occurs, information associated with a payment processing device (e.g., an RFID reader), such as a payment amount 1110 (see FIG. 11). Likewise, after the input event occurs, a function 1210 supporting the input of a key such as a password may be provided on the display screen facing the user (see FIG. 12).
  • in response to the occurrence of an input event in which the user performs the flip action described with reference to FIG. 8 in addition to a touch manipulation of selecting a destination on the first device 310, the user interface providing system 200 may provide various content related to exercise (e.g., the current location, the distance or time remaining to the destination, music, etc.) on the second device 320, which the user can easily manipulate during the exercise.
  • in response to the occurrence of an input event in which the user performs the flip action described with reference to FIG. 8, the user interface providing system 200 may display, on the second device 320 facing the user after the input event occurs, a user manipulation interface 1410 of an IoT home appliance corresponding to the first device 310, such as a remote controller.
  • meanwhile, assume a case where the user holds the smartphone-type first device 310 in one hand and wears the smart watch-type second device 320 on the wrist of the other hand. According to an embodiment of the present invention, as the user performs an operation such as turning the wrist of the hand wearing the second device 320, the user interface providing system 200 may display, on the second device 320, security-protected content (or a function) provided on the first device 310 (e.g., a security key code). According to another embodiment, as the user performs such an operation, information on a translation of, or dictionary search results for, text displayed on the first device 310 may be displayed on the second device 320. FIGS. 18 and 19 show these embodiments actually demonstrated.
  • assume a case where a first user wearing a smart watch-type first device on the right wrist and a second user wearing a smart watch-type second device on the left wrist shake hands with the right hand and the left hand, respectively.
  • in response to the occurrence of the above input event, the user interface providing system 200 may cause at least some of the content (or functions) being provided on the first device worn by the first user to be provided on the second device worn by the second user, or at least some of the content (or functions) being provided on the second device worn by the second user to be provided on the first device worn by the first user.
  • that is, the user may receive content from another user's device, or provide content to another user's device, simply by performing a flip while shaking hands with the other user using the hand wearing his or her device.
  • Embodiments according to the present invention described above may be implemented in the form of program instructions that may be executed by various computer components, and may be recorded on a non-transitory computer readable recording medium.
  • the non-transitory computer readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
  • the program instructions recorded on the non-transitory computer readable recording medium may be those specially designed and configured for the present invention, or may be known and available to those skilled in the computer software arts.
  • examples of the non-transitory computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware device may be configured to operate as one or more software modules to perform the process according to the invention, and vice versa.


Abstract

According to one aspect, the present invention relates to a method for providing a user interface, the method comprising: (a) acquiring information on the postures or movements of a first device and a second device, and detecting a specified input event based on a relative relationship between the posture or movement of the first device and the posture or movement of the second device; and (b) in response to the occurrence of the input event, causing at least part of the content and functions provided on the first device to be provided on the second device, or causing at least part of the content and functions provided on the second device to be provided on the first device.
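Steps (a) and (b) can be illustrated with a short, hypothetical sketch; the Device class, the posture rule, and the transfer helper are assumptions made for illustration and do not appear in the application itself:

```python
# Hypothetical sketch of steps (a) and (b): detect a specified input event
# from the relative postures of two devices, then provide at least part of
# one device's content on the other. All names and rules are illustrative.

class Device:
    def __init__(self, name, content):
        self.name = name
        self.content = list(content)

    def provide(self, items):
        """Make additional content available on this device."""
        self.content.extend(items)

def relative_event(posture_a, posture_b):
    """Step (a): an assumed rule in which the specified input event occurs
    when the two postures oppose each other (one palm up, one palm down)."""
    if posture_a == "palm_up" and posture_b == "palm_down":
        return "transfer"
    return None

def handle(event, src, dst, at_least=1):
    """Step (b): on the input event, provide at least part of src's
    content on dst."""
    if event == "transfer" and src.content:
        dst.provide(src.content[:at_least])

first = Device("first", ["photo.jpg", "card.vcf"])
second = Device("second", [])
handle(relative_event("palm_up", "palm_down"), first, second)
assert second.content == ["photo.jpg"]
```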
PCT/KR2015/008747 2014-10-02 2015-08-21 Method, device and system for providing a user interface, and non-transitory computer-readable recording medium Ceased WO2016052859A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR20140132936 2014-10-02
KR10-2014-0132936 2014-10-02
KR10-2015-0051767 2015-04-13
KR1020150051767A KR101916700B1 (ko) 2014-10-02 2015-04-13 Method, device, system and non-transitory computer-readable recording medium for providing a user interface
US14/819,151 2015-08-05
US14/819,151 US9696815B2 (en) 2014-10-02 2015-08-05 Method, device, system and non-transitory computer-readable recording medium for providing user interface

Publications (1)

Publication Number Publication Date
WO2016052859A1 true WO2016052859A1 (fr) 2016-04-07

Family

ID=55630851

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/008747 Ceased WO2016052859A1 (fr) Method, device and system for providing a user interface, and non-transitory computer-readable recording medium

Country Status (1)

Country Link
WO (1) WO2016052859A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140045547A1 (en) * 2012-08-10 2014-02-13 Silverplus, Inc. Wearable Communication Device and User Interface
US20140135631A1 (en) * 2012-06-22 2014-05-15 Fitbit, Inc. Biometric monitoring device with heart rate measurement activated by a single user-gesture
US20140139454A1 (en) * 2012-11-20 2014-05-22 Samsung Electronics Company, Ltd. User Gesture Input to Wearable Electronic Device Involving Movement of Device
WO2014084634A1 (fr) * 2012-11-29 2014-06-05 주식회사 매크론 Mouse for a spectacles-type display device, and corresponding control method
US20140181954A1 (en) * 2012-12-26 2014-06-26 Charles Cameron Robertson System for conveying an identity and method of doing the same



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15847602

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15847602

Country of ref document: EP

Kind code of ref document: A1