WO2016052859A1 - Method, device, and system for providing user interface and non-transitory computer-readable recording medium - Google Patents
Method, device, and system for providing user interface and non-transitory computer-readable recording medium
- Publication number
- WO2016052859A1 (PCT/KR2015/008747)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input event
- user
- posture
- movement
- content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Description
- the present invention relates to a method, device, system, and non-transitory computer readable recording medium for providing a user interface.
- in recent years, mobile smart devices such as smartphones and smart pads, which have various communication and sensing functions and powerful computing capabilities, have come into wide use.
- some of these mobile smart devices are relatively small and can be carried or worn by the user (e.g., smart glasses, smart watches, smart bands, smart devices in the form of rings or brooches, and smart devices attached to or embedded directly in the body or clothing).
- in this situation, a user may want to perform a task using two or more of his or her (heterogeneous) smart devices, or a task that involves both his or her own smart device and another user's device.
- however, according to the prior art, the (potential) intention of such a user could not be properly supported.
- the object of the present invention is to solve all the above-mentioned problems.
- another object of the present invention is to obtain information about the postures or movements of a first device and a second device, detect an input event specified based on the relative relationship between the posture or movement of the first device and the posture or movement of the second device, and, in response to the input event occurring, cause at least some of the content and functions provided on the first device to be provided on the second device, or at least some of the content and functions provided on the second device to be provided on the first device, thereby providing the user with a more convenient and expanded user interface by using the relative relationship between the postures or movements of two or more devices.
- according to one aspect of the present invention, there is provided a method for providing a user interface, comprising: (a) obtaining information about the postures or movements of a first device and a second device, and detecting an input event specified based on the relative relationship between the posture or movement of the first device and the posture or movement of the second device; and (b) in response to the input event occurring, causing at least some of the content and functions provided on the first device to be provided on the second device, or causing at least some of the content and functions provided on the second device to be provided on the first device.
- according to another aspect of the present invention, there is provided a device for providing a user interface, comprising: technical means for obtaining information about the postures or movements of the device and another device associated with it, and for detecting an input event specified based on the relative relationship between the posture or movement of the device and the posture or movement of the other device; and a program module that, in response to the input event occurring, causes at least some of the content and functions provided on the device to be provided on the other device, or at least some of the content and functions provided on the other device to be provided on the device.
- according to yet another aspect of the present invention, there is provided a system for providing a user interface, comprising: a control unit that obtains information about the postures or movements of a first device and a second device, detects an input event specified based on the relative relationship between the posture or movement of the first device and the posture or movement of the second device, and, in response to the input event occurring, causes at least some of the content and functions provided on the first device to be provided on the second device, or at least some of the content and functions provided on the second device to be provided on the first device; and a storage that stores information received from at least one of the first device and the second device.
- in addition, there are further provided other methods, devices, and systems for implementing the present invention, and a non-transitory computer-readable recording medium for recording a computer program for executing the method.
- an effect of providing a more convenient and expanded user interface to a user is achieved by using a relative relationship between postures or movements of two or more devices.
- in addition, according to the present invention, when a user who is wearing the second device on a body part such as the wrist while holding the first device in the same hand simply flips that hand, the content (or function) provision states of the first device and the second device can be switched with each other.
- in addition, according to the present invention, a user holding both a first device and a second device that face different directions can receive content or functions through the first device and the second device with their continuity maintained, irrespective of the posture or direction of the hand holding the devices.
- in addition, according to the present invention, a first user and a second user carrying a first device and a second device, respectively, can share content (or functions) seamlessly between the two devices simply by performing a simple action corresponding to an input event.
- FIG. 1 is a view showing a schematic configuration of an entire system for providing a user interface according to an embodiment of the present invention.
- FIG. 2 is a diagram conceptually illustrating a configuration in which a content (or function) provision state is switched between a first device and a second device according to an embodiment of the present invention.
- FIGS. 3 to 5 are diagrams exemplarily illustrating a configuration for providing a user interface according to an embodiment of the present invention.
- FIG. 6 is a diagram illustrating a configuration in which a format of content provided on a first device and a format of content provided on a second device are different from each other according to an embodiment of the present invention.
- FIG. 7 is a diagram illustrating a configuration in which content stored in a first device is transmitted to a second device according to an embodiment of the present invention.
- FIGS. 8 to 15 are diagrams exemplarily illustrating a configuration in which a user interface is provided according to another embodiment of the present invention.
- FIGS. 16 to 22 are diagrams exemplarily illustrating how a user interface provided according to various embodiments of the present invention is actually demonstrated.
- FIG. 1 is a view showing a schematic configuration of an entire system for providing a user interface according to an embodiment of the present invention.
- the entire system may include a communication network 100, a user interface providing system 200, and a plurality of devices 310 and 320.
- the communication network 100 may be configured in any communication mode, such as wired or wireless communication, and may be composed of various communication networks such as a local area network (LAN), a metropolitan area network (MAN), and a wide area network (WAN).
- preferably, the communication network 100 as used herein may be the known Internet or World Wide Web (WWW).
- however, the communication network 100 is not limited thereto, and may include, at least in part, a known wired/wireless data communication network, a known telephone network, or a known wired/wireless television communication network.
- the user interface providing system 200 may be a digital device equipped with a memory means and a microprocessor and thus having computing capability.
- the user interface providing system 200 may be a server system.
- the user interface providing system 200 may perform a function of mediating, through the communication network 100, so that one of the devices 310 and 320 provides predetermined information or a control command to the other, or receives predetermined information or a control command from the other.
- to this end, the user interface providing system 200 may, as described in detail below, obtain information about the postures or movements of the first device and the second device, detect a specified input event based on the relative relationship between the posture or movement of the first device and the posture or movement of the second device, and, in response to the input event occurring, cause at least some of the content and functions provided on the first device to be provided on the second device, or at least some of the content and functions provided on the second device to be provided on the first device, thereby performing a function of providing a more convenient and expanded user interface to the user by using the relative relationship between the postures or movements of two or more devices.
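As an illustration only (not the patent's actual implementation), the following minimal Python sketch shows one way the control flow described above could be organized: posture reports received from the two associated devices are stored, an injected detector decides whether the specified input event has occurred, and, if so, the device on which the content is provided is switched. All identifiers (PostureReport, ControlUnit, and the detector callback) are hypothetical.

```python
import time
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional

@dataclass
class PostureReport:
    """One posture/movement sample reported by a device (hypothetical format)."""
    device_id: str
    azimuth: float   # degrees
    pitch: float     # degrees
    roll: float      # degrees
    timestamp: float = field(default_factory=time.time)

class ControlUnit:
    """Sketch of a control unit for the user interface providing system (200).

    It stores posture information received from the first and second devices
    and, when the specified input event is detected, switches the device on
    which the current content is provided."""

    def __init__(self, detector: Callable[[List[PostureReport], List[PostureReport]], bool]):
        self._storage: Dict[str, List[PostureReport]] = {}   # per-device posture history
        self._detector = detector
        self.providing_on: Optional[str] = None               # device currently showing the content

    def receive_report(self, report: PostureReport) -> None:
        self._storage.setdefault(report.device_id, []).append(report)

    def set_initial_provider(self, device_id: str) -> None:
        self.providing_on = device_id

    def update(self, first_id: str, second_id: str) -> None:
        """Check for the input event and, if it occurred, swap the provider."""
        first = self._storage.get(first_id, [])
        second = self._storage.get(second_id, [])
        if first and second and self._detector(first, second):
            self.providing_on = second_id if self.providing_on == first_id else first_id

# Example wiring with a placeholder detector that never fires.
unit = ControlUnit(detector=lambda first, second: False)
unit.set_initial_provider("smartphone")
unit.receive_report(PostureReport("smartphone", 0.0, 0.0, 0.0))
unit.receive_report(PostureReport("smartwatch", 0.0, 0.0, 180.0))
unit.update("smartphone", "smartwatch")
print(unit.providing_on)  # still "smartphone"
```

A concrete detector corresponding to the flip gesture is sketched further below.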
- such user interface provision may be performed by a controller (not shown) included in the user interface providing system 200.
- a controller may exist in the form of a program module in the user interface providing system 200.
- Such program modules may take the form of operating systems, application modules or other program modules.
- the program module may be stored in a remote storage device that can communicate with the user interface providing system 200.
- the program module includes, but is not limited to, routines, subroutines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types, which will be described later, according to the present invention.
- the user interface providing system 200 may store information about postures or movements received from at least one of the plurality of devices 310 and 320, and may further perform a function of allowing this information to be used by at least one of the plurality of devices 310 and 320. Furthermore, the user interface providing system 200 may store information constituting the content or functions provided on at least one of the plurality of devices 310 and 320, and may further perform a function of allowing this information to be used by at least one of the plurality of devices 310 and 320. The above-described storage may be performed by a storage (not shown) included in the user interface providing system 200. Such storage is a concept that includes a computer-readable recording medium, and may be a broad database including not only a database in the narrow sense but also a data record based on a file system.
- although the user interface providing system 200 has been described as above, this description is exemplary, and it will be apparent to those skilled in the art that at least some of the functions or components required for the user interface providing system 200 may, as necessary, be implemented in or included in at least one of the devices 310 and 320 that are the subjects of manipulation.
- the plurality of devices 310 and 320 are digital devices that include a function of communicating after connecting to the user interface providing system 200 or to a counterpart among the plurality of devices 310 and 320 (preferably, the plurality of devices 310 and 320 may be separated from each other or externalized), and any digital device equipped with a memory means and a microprocessor and thus having computing capability may be employed as a device 310, 320 according to the present invention.
- the devices 310 and 320 may be so-called smart devices such as smartphones, smart pads, smart glasses, smart watches, smart bands, and smart rings, or may be more traditional devices such as desktop computers, notebook computers, workstations, PDAs, web pads, mobile phones, buttons, mice, keyboards, and electronic pens.
- the devices 310 and 320 may be Internet of Things (IoT) devices such as a remote controller and a home appliance.
- the devices 310 and 320 may include at least one technical means capable of receiving an operation from a user.
- examples of such technical means include known components such as touch panels, pointing tools (e.g., a mouse, a stylus, an electronic pen, etc.), graphical objects that can be manipulated by the user, keyboards, toggle switches, biometric information (fingerprint, etc.) recognition sensors, and distance sensors.
- the device 310, 320 may include at least one technical means for obtaining physical information about the attitude or movement of the device (310, 320).
- examples of such technical means include known components such as motion sensors, acceleration sensors, gyroscopes, magnetic sensors, positioning modules (GPS modules, beacon-based positioning (identification) modules, etc.), barometers, distance sensors, and cameras.
- the devices 310 and 320 may further include technical means for obtaining physical information about the posture or movement of the devices 310 and 320 based on biometric information obtained from the human body of the user carrying the devices 310 and 320.
- examples of such technical means include an electromyogram (EMG) signal measuring apparatus and the like.
- the devices 310 and 320 may further include an application program for processing the above physical information so as to provide information or control commands to another device 310 or 320, to receive information or control commands from another device 310 or 320, or to generate such information or control commands.
- Such an application may exist in the form of a program module in the corresponding devices 310 and 320.
- the nature of this program module may be generally similar to that of the controller of the user interface providing system 200 described above.
- the application may be replaced with a hardware device or a firmware device, at least a part of which may perform a function substantially the same or equivalent thereto.
- the first device 310 and the second device 320 may have a predetermined association (for example, an association of belonging to the same user, an association of functioning for the same user, an association of being placed substantially close to each other, or an association in which it is reasonable for one of them to authenticate or allow the other).
- when it is recognized that the first device 310 and the second device 320 have such an association, a predetermined connection may be formed between the two devices; this recognition or connection may be performed by the user interface providing system 200 or by the first device 310 and the second device 320 themselves.
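The association and connection formation described above could, for example, be checked on the server side. The sketch below is an assumption-laden illustration: it treats "belonging to the same user" as a shared account identifier and "substantially close" as a simple distance threshold on last known locations; DeviceInfo, are_associated, form_connection, and the 10 m threshold are all hypothetical.

```python
import math
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DeviceInfo:
    """Minimal device descriptor (hypothetical fields)."""
    device_id: str
    owner_id: str                        # account the device is registered to
    last_location: Tuple[float, float]   # (latitude, longitude) of the last known fix

def _distance_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    # Crude equirectangular approximation, good enough for a "substantially close" check.
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2.0)
    y = lat2 - lat1
    return math.hypot(x, y) * 6_371_000  # mean Earth radius in metres

def are_associated(first: DeviceInfo, second: DeviceInfo, proximity_m: float = 10.0) -> bool:
    """True if the devices belong to the same user or are substantially close to each other."""
    same_user = first.owner_id == second.owner_id
    nearby = _distance_m(first.last_location, second.last_location) <= proximity_m
    return same_user or nearby

def form_connection(first: DeviceInfo, second: DeviceInfo) -> Optional[frozenset]:
    """Form a predetermined connection (represented here as a simple pair) if associated."""
    if are_associated(first, second):
        return frozenset({first.device_id, second.device_id})
    return None

phone = DeviceInfo("phone-1", owner_id="user-a", last_location=(37.5665, 126.9780))
watch = DeviceInfo("watch-1", owner_id="user-a", last_location=(37.5665, 126.9781))
print(form_connection(phone, watch))  # frozenset({'phone-1', 'watch-1'})
```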
- hereinafter, specific examples in which the user interface providing system 200 provides a user interface involving the plurality of devices 310 and 320 according to various embodiments will be described in detail.
- according to one embodiment of the present invention, the user interface providing system 200 may obtain information about the postures or movements of the first device 310 and the second device 320, detect a specified input event based on the relative relationship between the posture or movement of the first device 310 and the posture or movement of the second device 320, and, in response to the occurrence of the input event, cause at least some of the content and functions provided on the first device 310 to be provided on the second device 320, or at least some of the content and functions provided on the second device 320 to be provided on the first device 310.
- specifically, at least one of the first device 310 and the second device 320 may be combined with, and move together with, an input means that triggers the input event.
- the first device 310 may be a smart phone that the user can hold by hand
- the second device 320 may be a smart watch that may be worn on the user's wrist.
- the input event specified based on the relative relationship between the posture or movement of the first device 310 and the posture or movement of the second device 320 may refer to an event in which, with respect to at least one coordinate axis, the aspect in which the direction of the first device 310 changes and the aspect in which the direction of the second device 320 changes are opposite or symmetric to each other.
- for example, it may refer to an event in which the user, while holding the first device 310 and wearing the second device 320 on the wrist of the same hand, turns that hand over (a so-called flip).
- meanwhile, the input event may be specified based on the relative relationship between the posture or movement of the first device 310 with reference to a first surface and the posture or movement of the second device 320 with reference to a second surface, where the first surface or the second surface may correspond to the upper surface, lower surface, left surface, right surface, front surface, rear surface, or the like of the first device 310 or the second device 320.
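One simplified reading of such an input event is that the device whose display faced the user before the action no longer does afterwards, and vice versa. The sketch below expresses that predicate; the Facing abstraction (how "faces the user" is derived from posture data) and the function names are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Facing:
    """Whether each device's display screen faces the user at a given moment
    (derived elsewhere from posture information; hypothetical abstraction)."""
    first_faces_user: bool
    second_faces_user: bool

def is_flip_input_event(before: Facing, after: Facing) -> bool:
    """Simplified reading of the event described above: about the relevant axis,
    the two devices change direction in opposite/symmetric ways, so the device
    that faced the user before the action no longer does, and vice versa."""
    first_reversed = before.first_faces_user != after.first_faces_user
    second_reversed = before.second_faces_user != after.second_faces_user
    opposite = before.first_faces_user != before.second_faces_user
    return first_reversed and second_reversed and opposite

# Example: the phone faced the user and the watch faced away; after the flip the roles swap.
print(is_flip_input_event(Facing(True, False), Facing(False, True)))  # True
```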
- FIG. 2 is a diagram conceptually illustrating a configuration in which a content (or function) provision state is switched between a first device and a second device according to an embodiment of the present invention.
- in general, when a user who is holding the first device 310 and wearing the second device 320 on the wrist of the same hand turns that hand over (i.e., flips), the first device 310 or the second device 320 that had been providing content (or a function) to the user with its display screen facing the user before the action may, after the action, no longer have its display screen facing the user and thus may have difficulty continuing to provide the content (or function) to the user.
- referring to FIG. 2, when a user carrying both the first device 310 and the second device 320, which face different directions, performs the flip action of turning the hand over, the user interface providing system 200 according to one embodiment of the present invention may recognize the action as an input event in which the aspect in which the direction of the smartphone-type first device 310 changes and the aspect in which the direction of the smart-watch-type second device 320 changes are opposite or symmetric to each other, and may cause at least some (201, 202) of the content (or functions) that were being provided on the first device 310 (or the second device 320) whose display screen faced the user before the input event occurred to be provided on the second device 320 (or the first device 310) whose display screen faces the user after the input event occurs. Therefore, according to the present invention, the content (or function) provided to the user can be provided with its continuity maintained across both the first device 310 and the second device 320. Referring to FIGS. 16 and 17, it can be seen that the above embodiment is actually demonstrated.
- FIGS. 3 to 5 are diagrams exemplarily illustrating a configuration for providing a user interface according to an embodiment of the present invention.
- referring to FIGS. 3 and 4, a case may be assumed in which a user who is wearing the second device 320 on the left wrist and holding the first device 310 with the same left hand turns that left hand over (i.e., flips).
- in this case, while the first device 310 that had been facing the user no longer faces the user, the second device 320 that had not been facing the user comes to face the user (see FIG. 3); conversely, while the second device 320 that had been facing the user no longer faces the user, the first device 310 that had not been facing the user comes to face the user (see FIG. 4).
- referring to FIG. 5, physical information about the postures or directions of the first device 310 and the second device 320 may be obtained as measured by gyroscopes provided in the first device 310 and the second device 320, and the user interface providing system 200 according to one embodiment may detect whether an input event occurs based on physical information such as that illustrated in FIG. 5. Specifically, FIGS. 5(a) and 5(b) show the gyroscope measurements of the first device 310 and the second device 320, respectively, and in each of FIGS. 5(a) and 5(b) the yellow, red, and green graphs indicate the measured values in the azimuth, pitch, and roll directions, respectively.
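For illustration, the sketch below shows one way such orientation readings might be used to detect the flip: the roll angle reported by each device is kept in a short sliding window, and an event is reported when both devices undergo a large roll change within the same window. The window length and the 150-degree threshold are illustrative assumptions, not values taken from the patent.

```python
from collections import deque
from typing import Deque, Dict, Tuple

class RollFlipDetector:
    """Watches the roll angle reported by both devices (cf. the FIG. 5 gyroscope
    traces) and reports an input event when both devices undergo a large roll
    change within the same short time window."""

    def __init__(self, window_s: float = 1.0, min_roll_change_deg: float = 150.0):
        self.window_s = window_s
        self.min_roll_change_deg = min_roll_change_deg
        # Per-device buffer of (timestamp, roll in degrees) samples.
        self._samples: Dict[str, Deque[Tuple[float, float]]] = {
            "first": deque(), "second": deque()
        }

    def add_sample(self, device: str, timestamp: float, roll_deg: float) -> None:
        buf = self._samples[device]
        buf.append((timestamp, roll_deg))
        # Drop samples that have fallen outside the observation window.
        while buf and timestamp - buf[0][0] > self.window_s:
            buf.popleft()

    def _roll_span(self, device: str) -> float:
        rolls = [roll for _, roll in self._samples[device]]
        return (max(rolls) - min(rolls)) if len(rolls) >= 2 else 0.0

    def event_detected(self) -> bool:
        return (self._roll_span("first") >= self.min_roll_change_deg
                and self._roll_span("second") >= self.min_roll_change_deg)

# Synthetic traces: both devices roll by about 180 degrees within one second.
detector = RollFlipDetector()
for i in range(11):
    t = i * 0.1
    detector.add_sample("first", t, 18.0 * i)           # 0 -> 180 degrees
    detector.add_sample("second", t, 90.0 - 18.0 * i)   # 90 -> -90 degrees
print(detector.event_detected())  # True
```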
- according to one embodiment of the present invention, in response to an input event caused by the user's flip action, at least some of the content being provided on the first device 310 facing the user may be transmitted to the second device 320, or, conversely, at least some of the content being provided on the second device 320 facing the user may be transmitted to the first device 310.
- likewise, in response to such an input event, a process that is being executed on the first device 310 facing the user may be executed on the second device 320, or, conversely, a process that is being executed on the second device 320 facing the user may be executed on the first device 310.
- in addition, in response to an input event caused by the user's flip action, the format of the content that was being provided on the first device 310 (or the second device 320) before the input event occurred and the format of the content provided on the second device 320 (or the first device 310) after the input event occurs may be different from each other.
- FIG. 6 is a diagram illustrating a configuration in which a format of content provided on a first device and a format of content provided on a second device are different from each other according to an embodiment of the present invention.
- referring to FIG. 6, content that was being provided in a visual form on the first device 310 before the input event occurred may be provided in an auditory or tactile form on the second device 320 after the input event occurs. For example, news content that was being provided in the form of text on the first device 310, which has a relatively large display screen, may be provided in the form of voice or vibration on the second device 320, which has a relatively small display screen and is worn close to the user's wrist.
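A format change of this kind could be driven by the capabilities of the device that receives the content after the input event. The following sketch is purely illustrative; the capability names, the priority order (screen, then speech, then vibration), and all identifiers are assumptions rather than anything specified in the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Content:
    body_text: str           # e.g. an article provided as text on the first device

@dataclass
class TargetDevice:
    name: str
    capabilities: List[str]  # e.g. ["large_screen"] or ["speaker", "vibration"]

def present_on(content: Content, device: TargetDevice) -> str:
    """Choose the form in which the content is provided on the target device."""
    if "large_screen" in device.capabilities:
        return f"[{device.name}] display text: {content.body_text!r}"
    if "speaker" in device.capabilities:
        return f"[{device.name}] read aloud via text-to-speech: {content.body_text!r}"
    if "vibration" in device.capabilities:
        return f"[{device.name}] signal via a vibration pattern derived from the content"
    return f"[{device.name}] no suitable output means"

article = Content(body_text="News article shown on the smartphone")
print(present_on(article, TargetDevice("smartphone", ["large_screen"])))
print(present_on(article, TargetDevice("smartwatch", ["speaker", "vibration"])))
```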
- the input event may be further specified based not only on the relative relationship between the posture or movement of the first device 310 and the posture or movement of the second device 320, but also on a user manipulation input to the first device 310 or the second device 320.
- for example, the user manipulation input to the first device 310 or the second device 320 may include a touch manipulation, a keyboard manipulation, a voice recognition manipulation, and the like.
- specifically, in response to the occurrence of an input event in which the user touches specific content displayed on the display screen of the first device 310 or the second device 320 together with performing the flip action, the specific content associated with the touch manipulation among the content being provided on the first device 310 facing the user may be provided on the second device 320, or, conversely, the specific content associated with the touch manipulation among the content being provided on the second device 320 facing the user may be provided on the first device 310.
- FIG. 7 is a diagram illustrating a configuration in which content stored in a first device is transmitted to a second device according to an embodiment of the present invention.
- referring to FIG. 7, only the specific content 701 and 702 (e.g., specific sound source files) selected by the user's touch manipulation 710 among the content provided on the first device 310 before the input event occurs may be provided on the second device 320 after the input event occurs.
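The selective transfer described above might be implemented by remembering which items were touched before the input event and handing over only those items when the event occurs. The sketch below illustrates this under assumed names (SelectiveHandover, on_touch, on_flip_event); the item identifiers 701-703 merely echo the reference numerals of FIG. 7.

```python
from typing import Dict, List, Set

class SelectiveHandover:
    """Tracks which content items the user has touched on the first device and,
    when the flip input event occurs, hands over only those items (as with the
    sound-source files 701 and 702 in FIG. 7). Hypothetical sketch."""

    def __init__(self, library: Dict[str, str]):
        self.library = library           # item id -> item payload on the first device
        self.selected: Set[str] = set()  # ids touched by the user before the event

    def on_touch(self, item_id: str) -> None:
        if item_id in self.library:
            self.selected.add(item_id)

    def on_flip_event(self) -> List[str]:
        """Return the payloads to be provided on the second device."""
        handover = [self.library[i] for i in sorted(self.selected)]
        self.selected.clear()
        return handover

handover = SelectiveHandover({"701": "song_a.mp3", "702": "song_b.mp3", "703": "song_c.mp3"})
handover.on_touch("701")
handover.on_touch("702")
print(handover.on_flip_event())  # ['song_a.mp3', 'song_b.mp3']
```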
- in addition, in response to the occurrence of an input event in which, as the user flips so that the second device 320 comes to face the user, the user also touches specific content (e.g., a web page) displayed on the display screen of the first device 310, the user interface providing system 200 may register the content being provided on the first device 310 facing the user as a bookmark (or favorite) and display a list including the bookmark on the second device 320.
- conversely, in response to the occurrence of an input event in which, as the user flips so that the first device 310 comes to face the user, the user also selects (i.e., touches) a specific bookmark from a bookmark list (e.g., a list of bookmarked web pages) displayed on the display screen of the second device 320, the user interface providing system 200 may display content corresponding to the selected bookmark on the first device 310. Referring to FIGS. 20 to 22, it can be seen that the above embodiment is actually demonstrated.
- meanwhile, although the above embodiments have mainly been described with respect to the case in which the input event is specified by the user flipping his or her hand once, the present invention is not necessarily limited thereto, and it is noted that the input event may be specified by any number of other actions (e.g., flipping the hand more than once) within a scope in which the object of the present invention can be achieved.
- likewise, the present invention is not necessarily limited to the case in which the first device is a smartphone and the second device is a smart watch; it will be appreciated that the first device and the second device may be implemented in other forms such as a smart pad, smart glasses, a smart band, a smart ring, and the like within the scope of the object of the present invention.
- FIGS. 8 to 15 are diagrams exemplarily illustrating a configuration in which a user interface is provided according to another embodiment of the present invention.
- the user interface providing system 200 according to another embodiment of the present invention may, when a user carrying both the first device 310 and the second device 320, which face different directions, performs the flip action, recognize the action as an input event in which the aspect in which the direction of the smartphone-type first device 310 changes and the aspect in which the direction of the smart-watch-type second device 320 changes are opposite or symmetric to each other.
- in response to the occurrence of the above input event, the user interface providing system 200 according to another embodiment of the present invention may mirror at least a portion 810 of the content (or function) that was being provided on the first device 310, whose display screen faced the user before the input event occurred, so that the mirrored content (or function) 810 is provided not only on the first device 310 but also on the second device 320, whose display screen faces the user after the input event occurs. As a result, even when the user rotates his or her wrist so that the first device 310 faces the other party, the user can check through the second device 320 what content (or function) is being provided to the other party on the first device 310.
- in addition, in response to an input event in which the user performs the flip action mentioned with reference to FIG. 8, the user interface providing system 200 may divide the content (or function) 910 that was being provided on the first device 310, whose display screen faced the user, into public content 920 and private content 930, display the public content 920 on the first device 310 facing the other party after the input event occurs, and display the private content 930 on the second device 320 facing the user after the input event occurs.
- here, the public content and the private content may be classified and displayed according to the situation of the user.
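Such a public/private split amounts to routing each content item either to the device that faces the other party or to the device that faces the user after the input event. The sketch below illustrates this with a per-item flag standing in for whatever situation-dependent classification the system would actually apply; all names and the flag itself are assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Item:
    title: str
    private: bool   # classification could depend on the user's situation; a flag is a placeholder

def route_after_flip(items: List[Item]) -> Dict[str, List[str]]:
    """Split the content that was shown to the user into a public part for the
    device now facing the other party and a private part for the device now
    facing the user (cf. public content 920 and private content 930)."""
    return {
        "first_device_facing_other_party": [i.title for i in items if not i.private],
        "second_device_facing_user": [i.title for i in items if i.private],
    }

screen = [Item("Presentation slide", private=False),
          Item("Incoming personal message", private=True)]
print(route_after_flip(screen))
```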
- in addition, in response to an input event in which the user performs the flip action mentioned with reference to FIG. 8, the user interface providing system 200 may translate the language of the content (or function) 1010 that was being provided on the first device 310 facing the user into another language; after the input event occurs, the translated content may be displayed on the first device 310, whose display screen faces the other party, and the content 1030 in Korean may be displayed on the second device 320 facing the user.
- in addition, in response to the occurrence of an input event in which the user performs the flip action described with reference to FIG. 8, the user interface providing system 200 may cause information (e.g., the payment amount) 1110 associated with a payment processing device (e.g., an RFID reader) to be displayed on the display screen after the input event occurs (see FIG. 11).
- likewise, in response to the occurrence of such an input event, a function 1210 that supports the input of a key may be provided on the display screen after the input event occurs (see FIG. 12).
- in addition, in response to the occurrence of an input event in which the user performs the flip action described with reference to FIG. 8 together with a touch manipulation of selecting a destination on the first device 310, the user interface providing system 200 may provide various content related to the exercise (for example, the current location, the distance or time remaining to the destination, music, etc.) on the second device 320, which the user can easily manipulate during the exercise.
- in addition, in response to the occurrence of an input event in which the user performs the flip action described with reference to FIG. 8, the user interface providing system 200 may display the user manipulation interface 1410 of the first device 310, which is an IoT home appliance such as a remote controller, on the second device 320 that the user is viewing after the input event occurs.
- a case may also be assumed in which the user holds the first device 310 in the form of a smartphone with one hand and wears the second device 320 in the form of a smart watch on the wrist of the other hand, and in which security needs to be ensured.
- in such a case, as the user performs an action such as turning the wrist of the hand wearing the second device 320, the user interface providing system 200 may cause content (or a function) provided on the first device 310 (e.g., a security key code) to be provided on the second device 320.
- likewise, as the user performs an action such as turning the wrist of the hand wearing the second device 320, information about a translation of, or dictionary search results for, text provided on the first device 310 may be displayed on the second device 320. Referring to FIGS. 18 and 19, it can be seen that the above embodiments are actually demonstrated.
- in addition, a case may be assumed in which a first user wearing a smart-watch-type first device on the right wrist and a second user wearing a smart-watch-type second device on the left wrist shake hands with the right hand and the left hand, respectively, so that an input event specified based on this action occurs.
- in response to the occurrence of the above input event, the user interface providing system 200 may cause at least some of the content (or functions) being provided on the first device worn by the first user to be provided on the second device worn by the second user, or cause at least some of the content (or functions) being provided on the second device worn by the second user to be provided on the first device worn by the first user.
- the user may receive content from another user's device or provide content to another user's device by simply flipping while shaking hands with the other user with a hand wearing the device.
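As a rough illustration of how such a cross-user gesture might be recognized, the sketch below checks that both users' wrist-worn devices report a handshake-like shaking motion (repeatedly reversing acceleration) at roughly the same time; the additional flip mentioned above is omitted for brevity. The thresholds, the report format, and all function names are assumptions, not details from the patent.

```python
from typing import Dict, List, Tuple

def _is_shaking(accel_samples: List[float], min_reversals: int = 4, min_amplitude: float = 5.0) -> bool:
    """Crude check for a handshake-like motion: the vertical acceleration keeps
    changing sign with a noticeable amplitude (thresholds are assumptions)."""
    reversals = sum(1 for a, b in zip(accel_samples, accel_samples[1:]) if a * b < 0)
    amplitude = max(accel_samples) - min(accel_samples) if accel_samples else 0.0
    return reversals >= min_reversals and amplitude >= min_amplitude

def handshake_event(reports: Dict[str, Tuple[float, List[float]]], max_skew_s: float = 0.5) -> bool:
    """reports maps device id -> (start timestamp, acceleration samples).
    The event is recognized when both wrist devices shake at roughly the same time."""
    if len(reports) != 2:
        return False
    (t1, s1), (t2, s2) = reports.values()
    return abs(t1 - t2) <= max_skew_s and _is_shaking(s1) and _is_shaking(s2)

shake = [6.0, -6.5, 6.2, -5.8, 6.1, -6.0]
print(handshake_event({"first_user_watch": (10.00, shake),
                       "second_user_watch": (10.20, shake)}))  # True
```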
- Embodiments according to the present invention described above may be implemented in the form of program instructions that may be executed by various computer components, and may be recorded on a non-transitory computer readable recording medium.
- the non-transitory computer readable recording medium may include program instructions, data files, data structures, etc. alone or in combination.
- the program instructions recorded on the non-transitory computer readable recording medium may be those specially designed and configured for the present invention, or may be known and available to those skilled in the computer software arts.
- examples of the non-transitory computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tape, optical recording media such as CD-ROM and DVD, magneto-optical media such as floptical disks, and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
- program instructions include not only machine code generated by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.
- the hardware device may be configured to operate as one or more software modules to perform the process according to the invention, and vice versa.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
λ³Έ λ°λͺ μ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νκΈ° μν λ°©λ², λλ°μ΄μ€, μμ€ν λ° λΉμΌμμ±μ μ»΄ν¨ν° νλ κ°λ₯ν κΈ°λ‘ λ§€μ²΄μ κ΄ν κ²μ΄λ€.The present invention relates to a method, device, system, and non-transitory computer readable recording medium for providing a user interface.
κ·Όλμ λ€μ΄, μ€λ§νΈν°, μ€λ§νΈ ν¨λ λ±κ³Ό κ°μ΄ λ€μν ν΅μ λ° μΌμ± κΈ°λ₯κ³Ό κ°λ ₯ν μ°μ° κΈ°λ₯μ κ°μΆ μ΄λμ μ€λ§νΈ λλ°μ΄μ€κ° λ리 μ¬μ©λκ³ μλ€. μ΄λ¬ν μ΄λμ μ€λ§νΈ λλ°μ΄μ€ μ€μλ μ¬μ©μκ° μ 체μ μ°©μ©νμ¬ ν΄λν μ μλ λΉκ΅μ μμ ν¬κΈ°μ κ²(μλ₯Ό λ€λ©΄, μ€λ§νΈ κΈλμ€, μ€λ§νΈ μμΉ, μ€λ§νΈ λ°΄λ, λ§μ΄λ λΈλ‘μΉμ κ°μ ννμ μ€λ§νΈ λλ°μ΄μ€, μ 체λ μλ₯μ μ§μ μ μΌλ‘ λΆμ°©λκ±°λ 맀립λλ μ€λ§νΈ λλ°μ΄μ€ λ±)λ μλ€.In recent years, mobile smart devices having various communication and sensing functions and powerful computing functions such as smart phones and smart pads have been widely used. Such mobile smart devices may be relatively small in size and can be carried by the user (eg, smart glasses, smart watches, smart bands, smart devices in the form of rings or broaches, directly on the body or clothing). Smart devices that are attached or embedded).
μ΄λ¬ν μν©μμ μ¬μ©μλ λ³ΈμΈμ λ κ° μ΄μμ (μ΄μ’ μ) μ€λ§νΈ λλ°μ΄μ€λ₯Ό μ¬μ©νμ¬ μ΄λ€ μμ μ μννλ €κ³ νκ±°λ, λ³ΈμΈμ μ€λ§νΈ λλ°μ΄μ€μ λ€λ₯Έ μ¬μ©μμ λλ°μ΄μ€κ° ν¨κ» κ°μ λ νμκ° μλ μ΄λ€ μμ μ΄ μνλκ²λ νλ €κ³ ν μ μλ€. κ·Έλ¬λ, μ’ λμ κΈ°μ μ λ°λ₯΄λ©΄, μ΄λ¬ν μ¬μ©μμ (μ μ¬μ μΈ) μλκ° μ μ νκ² μν¬ν λ μ μμλ€.In this situation, a user might want to perform some task using his or her two or more (heterogeneous) smart devices, or want to perform some task that needs to intervene with his smart device and another user's device. Can be. However, according to the prior art, the (potential) intention of such a user could not be properly supported.
λ³Έ λ°λͺ μ μμ ν λ¬Έμ μ μ λͺ¨λ ν΄κ²°νλ κ²μ κ·Έ λͺ©μ μΌλ‘ νλ€.The object of the present invention is to solve all the above-mentioned problems.
λν, λ³Έ λ°λͺ μ μ 1 λλ°μ΄μ€ λ° μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μμ κ΄ν μ 보λ₯Ό νλνκ³ , μ 1 λλ°μ΄μ€μ μμΈ λλ μμ§μκ³Ό μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λλ μ λ ₯ μ΄λ²€νΈλ₯Ό κ°μ§νκ³ , μ λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μ 1 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μ 2 λλ°μ΄μ€ μμμ μ 곡λλλ‘ νκ±°λ μ 2 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μ 1 λλ°μ΄μ€ μμμ μ 곡λλλ‘ ν¨μΌλ‘μ¨, λ κ° μ΄μμ λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³λ₯Ό μ΄μ©νμ¬ μ¬μ©μμκ² λ³΄λ€ νΈλ¦¬νκ³ νμ₯λ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡ν μ μλλ‘ νλ κ²μ λ€λ₯Έ λͺ©μ μΌλ‘ νλ€.In addition, the present invention obtains information about the attitude or movement of the first device and the second device, and inputs an input event that is specified based on the relative relationship between the attitude or movement of the first device and the attitude or movement of the second device. Detect and in response to the input event occur, cause at least some of the content and functions provided on the first device to be provided on the second device or provide at least some of the content and functions provided on the second device on the first device It is another object of the present invention to provide a user with a more convenient and expanded user interface by using a relative relationship between postures or movements of two or more devices.
μκΈ° λͺ©μ μ λ¬μ±νκΈ° μν λ³Έ λ°λͺ μ λνμ μΈ κ΅¬μ±μ λ€μκ³Ό κ°λ€.Representative configuration of the present invention for achieving the above object is as follows.
λ³Έ λ°λͺ μ μΌ νμμ λ°λ₯΄λ©΄, μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νκΈ° μν λ°©λ²μΌλ‘μ, (a) μ 1 λλ°μ΄μ€ λ° μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μμ κ΄ν μ 보λ₯Ό νλνκ³ , μκΈ° μ 1 λλ°μ΄μ€μ μμΈ λλ μμ§μκ³Ό μκΈ° μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λλ μ λ ₯ μ΄λ²€νΈλ₯Ό κ°μ§νλ λ¨κ³, λ° (b) μκΈ° μ λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μκΈ° μ 1 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μκΈ° μ 2 λλ°μ΄μ€ μμμ μ 곡λλλ‘ νκ±°λ, μκΈ° μ 2 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μκΈ° μ 1 λλ°μ΄μ€ μμμ μ 곡λλλ‘ νλ λ¨κ³λ₯Ό ν¬ν¨νλ λ°©λ²μ΄ μ 곡λλ€.According to an aspect of the present invention, there is provided a method for providing a user interface, comprising: (a) obtaining information about a posture or movement of a first device and a second device, wherein the posture or movement of the first device and the second device are obtained; Detecting an input event specified based on a relative relationship between a device's posture or movement, and (b) in response to the input event occurring, at least some of the content and functionality provided on the first device To be provided on a second device or to cause at least some of the content and functionality provided on the second device to be provided on the first device.
λ³Έ λ°λͺ μ λ€λ₯Έ νμμ λ°λ₯΄λ©΄, μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νκΈ° μν λλ°μ΄μ€λ‘μ, μκΈ° λλ°μ΄μ€ λ° μκΈ° λλ°μ΄μ€μ μ°κ΄λ λ€λ₯Έ λλ°μ΄μ€μ μμΈ λλ μμ§μμ κ΄ν μ 보λ₯Ό νλνκ³ , μκΈ° λλ°μ΄μ€μ μμΈ λλ μμ§μκ³Ό μκΈ° λ€λ₯Έ λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λλ μ λ ₯ μ΄λ²€νΈλ₯Ό κ°μ§νλ κΈ°μ μλ¨, λ° μκΈ° μ λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μκΈ° λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μκΈ° λ€λ₯Έ λλ°μ΄μ€ μμμ μ 곡λλλ‘ νκ±°λ, μκΈ° λ€λ₯Έ λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μκΈ° λλ°μ΄μ€ μμμ μ 곡λλλ‘ νλ νλ‘κ·Έλ¨ λͺ¨λμ ν¬ν¨νλ λλ°μ΄μ€κ° μ 곡λλ€.According to another aspect of the present invention, a device for providing a user interface, comprising: obtaining information about the posture or movement of the device and other devices associated with the device, the posture or movement of the device and the posture or of the other device; Technical means for sensing a specified input event based on a relative relationship between movements, and in response to the input event occurring, at least some of the content and functions provided on the device are provided on the other device, or A device is provided that includes a program module to allow at least some of the content and functions provided on another device to be provided on the device.
λ³Έ λ°λͺ μ λ λ€λ₯Έ νμμ λ°λ₯΄λ©΄, μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νκΈ° μν μμ€ν μΌλ‘μ, μ 1 λλ°μ΄μ€ λ° μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μμ κ΄ν μ 보λ₯Ό νλνκ³ , μκΈ° μ 1 λλ°μ΄μ€μ μμΈ λλ μμ§μκ³Ό μκΈ° μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λλ μ λ ₯ μ΄λ²€νΈλ₯Ό κ°μ§νκ³ , μκΈ° μ λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μκΈ° μ 1 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μκΈ° μ 2 λλ°μ΄μ€ μμμ μ 곡λλλ‘ νκ±°λ, μκΈ° μ 2 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μκΈ° μ 1 λλ°μ΄μ€ μμμ μ 곡λλλ‘ νλ μ μ΄λΆ, λ° μκΈ° μ 1 λλ°μ΄μ€ λ° μκΈ° μ 2 λλ°μ΄μ€ μ€ μ μ΄λ νλλ‘λΆν° μ 곡 λ°μ μ 보λ₯Ό μ μ₯νλ μ μ₯μλ₯Ό ν¬ν¨νλ μμ€ν μ΄ μ 곡λλ€.According to yet another aspect of the present invention, a system for providing a user interface, comprising: obtaining information about a posture or movement of a first device and a second device, wherein the posture or movement of the first device and the second device Detect a specified input event based on a relative relationship between posture or movement, and in response to the input event occurring, at least some of the content and functions provided on the first device to be provided on the second device; And a control unit for allowing at least some of the contents and functions provided on the second device to be provided on the first device, and a storage storing information received from at least one of the first device and the second device. This is provided.
μ΄ μΈμλ, λ³Έ λ°λͺ μ ꡬννκΈ° μν λ€λ₯Έ λ°©λ², λλ°μ΄μ€, μμ€ν λ° μκΈ° λ°©λ²μ μ€ννκΈ° μν μ»΄ν¨ν° νλ‘κ·Έλ¨μ κΈ°λ‘νκΈ° μν λΉμΌμμ±μ μ»΄ν¨ν° νλ κ°λ₯ν κΈ°λ‘ λ§€μ²΄κ° λ μ 곡λλ€.In addition, there is further provided a non-transitory computer readable recording medium for recording another method, device, system, and computer program for executing the method.
λ³Έ λ°λͺ μ μνλ©΄, λ κ° μ΄μμ λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³λ₯Ό μ΄μ©νμ¬ μ¬μ©μμκ² λ³΄λ€ νΈλ¦¬νκ³ νμ₯λ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡ν μ μκ² λλ ν¨κ³Όκ° λ¬μ±λλ€.According to the present invention, an effect of providing a more convenient and expanded user interface to a user is achieved by using a relative relationship between postures or movements of two or more devices.
λν, λ³Έ λ°λͺ μ μνλ©΄, μλͺ© λ±μ μ 체 λΆμμ μ 2 λλ°μ΄μ€λ₯Ό μ°©μ©νκ³ μμΌλ©΄μ κ°μ μμ μ 1 λλ°μ΄μ€λ₯Ό λ€κ³ μλ μ¬μ©μκ° μμ μ μμ λ€μ§λ νμ(μ¦, ν립(Flip))λ₯Ό ννλ κ²λ§μΌλ‘λ μ 1 λλ°μ΄μ€μ μ 2 λλ°μ΄μ€μμμ 컨ν μΈ (λλ κΈ°λ₯) μ 곡 μνκ° μνΈ κ°μ μ νλλλ‘ ν μ μκ² λλ ν¨κ³Όκ° λ¬μ±λλ€.In addition, according to the present invention, a user holding the first device in the same hand while wearing the second device on a body part such as a wrist, even if the user flips his or her hand (that is, flips), An effect is achieved that enables the content (or function) provision state at the first device and the second device to be switched between each other.
λν, λ³Έ λ°λͺ μ μνλ©΄, μλ‘ λ€λ₯Έ λ°©ν₯μ λ°λΌλ³΄κ³ μλ μ 1 λλ°μ΄μ€ λ° μ 2 λλ°μ΄μ€λ₯Ό λͺ¨λ μμ§νκ³ μλ μ¬μ©μκ° μ΄λ€ λλ°μ΄μ€λ₯Ό μμ§νκ³ μλ μμ μμΈ λλ λ°©ν₯κ³Ό κ΄κ³ μμ΄ μ 1 λλ°μ΄μ€μ μ 2 λλ°μ΄μ€λ₯Ό ν΅ν΄ 컨ν μΈ λλ κΈ°λ₯μ κ·Έ μ°μμ±(Continuity)μ΄ μ μ§λλ μνλ‘ μ 곡 λ°μ μ μκ² λλ ν¨κ³Όκ° λ¬μ±λλ€.In addition, according to the present invention, a user having both the first device and the second device looking at different directions may use the first device and the second device irrespective of the posture or the direction of the hand holding the device. Through this, the effect that the content or function can be provided while maintaining its continuity is achieved.
λν, λ³Έ λ°λͺ μ μνλ©΄, κ°κ° μ 1 λλ°μ΄μ€ λ° μ 2 λλ°μ΄μ€λ₯Ό μμ§νκ³ μλ μ 1 μ¬μ©μ λ° μ 2 μ¬μ©μκ° μ λ ₯ μ΄λ²€νΈμ ν΄λΉνλ κ°λ¨ν νμλ₯Ό ννλ κ²λ§μΌλ‘ μ 1 λλ°μ΄μ€μ μ 2 λλ°μ΄μ€ μ¬μ΄μμ 컨ν μΈ (λλ κΈ°λ₯)κ° μ°μμ± μκ² κ³΅μ λ μ μκ² λλ ν¨κ³Όκ° λ¬μ±λλ€.In addition, according to the present invention, the first user and the second user each having the first device and the second device, respectively, by simply performing a simple action corresponding to an input event, the content (or The effect is that the functions) can be shared continuously.
λ 1μ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νκΈ° μν μ 체 μμ€ν μ κ°λ΅μ μΈ κ΅¬μ±μ λνλ΄λ λλ©΄μ΄λ€.1 is a view showing a schematic configuration of an entire system for providing a user interface according to an embodiment of the present invention.
λ 2λ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ 1 λλ°μ΄μ€μ μ 2 λλ°μ΄μ€ μνΈ κ°μ 컨ν μΈ (λλ κΈ°λ₯) μ 곡 μνκ° μ νλλ ꡬμ±μ κ°λ μ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.2 is a diagram conceptually illustrating a configuration in which a content (or function) provision state is switched between a first device and a second device according to an embodiment of the present invention.
λ 3 λ΄μ§ λ 5λ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νλ ꡬμ±μ μμμ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.3 to 5 are diagrams exemplarily illustrating a configuration for providing a user interface according to an embodiment of the present invention.
λ 6μ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ 1 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ μ νμκ³Ό μ 2 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ μ νμμ΄ μλ‘ λ¬λΌμ§λ ꡬμ±μ μμμ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.6 is a diagram illustrating a configuration in which a format of content provided on a first device and a format of content provided on a second device are different from each other according to an embodiment of the present invention.
λ 7μ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ 1 λλ°μ΄μ€μ μ μ₯λμ΄ μλ 컨ν μΈ κ° μ 2 λλ°μ΄μ€λ‘ μ μ‘λλ ꡬμ±μ μμμ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.FIG. 7 is a diagram illustrating a configuration in which content stored in a first device is transmitted to a second device according to an embodiment of the present invention.
λ 8 λ΄μ§ λ 15λ λ³Έ λ°λͺ μ λ€λ₯Έ μ€μμμ λ°λΌ μ¬μ©μ μΈν°νμ΄μ€κ° μ 곡λλ ꡬμ±μ μμμ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.8 to 15 are diagrams exemplarily illustrating a configuration in which a user interface is provided according to another embodiment of the present invention.
λ 16 λ΄μ§ λ 22λ λ³Έ λ°λͺ μ λ€μν μ€μμμ λ°λΌ μ 곡λλ μ¬μ©μ μΈν°νμ΄μ€κ° μ€μ λ‘ μμ°λκ³ μλ λͺ¨μ΅μ μμμ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.16 to 22 are diagrams exemplarily illustrating how a user interface provided according to various embodiments of the present invention is actually demonstrated.
μ 체 μμ€ν μ ꡬμ±Configuration of the entire system
λ 1μ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νκΈ° μν μ 체 μμ€ν μ κ°λ΅μ μΈ κ΅¬μ±μ λνλ΄λ λλ©΄μ΄λ€.1 is a view showing a schematic configuration of an entire system for providing a user interface according to an embodiment of the present invention.
λ 1μ λμλ λ°μ κ°μ΄, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯Έ μ 체 μμ€ν
μ ν΅μ λ§(100), μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200), λ° λ³΅μμ λλ°μ΄μ€(310, 320)λ₯Ό ν¬ν¨νμ¬ κ΅¬μ±λ μ μλ€.As shown in FIG. 1, the entire system according to an embodiment of the present invention may include a
λ¨Όμ , λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯Έ ν΅μ λ§(100)μ μ μ ν΅μ μ΄λ 무μ ν΅μ κ³Ό κ°μ ν΅μ μνλ₯Ό κ°λ¦¬μ§ μκ³ κ΅¬μ±λ μ μμΌλ©°, 근거리 ν΅μ λ§(LAN; Local Area Network), λμκΆ ν΅μ λ§(MAN; Metropolitan Area Network), κ΄μ ν΅μ λ§(WAN; Wide Area Network) λ± λ€μν ν΅μ λ§μΌλ‘ ꡬμ±λ μ μλ€. λ°λμ§νκ²λ, λ³Έ λͺ
μΈμμμ λ§νλ ν΅μ λ§(100)μ 곡μ§μ μΈν°λ· λλ μλμμ΄λμΉ(WWW; World Wide Web)μΌ μ μλ€. κ·Έλ¬λ, ν΅μ λ§(100)μ, κ΅³μ΄ μ΄μ κ΅νλ νμ μμ΄, 곡μ§μ μ 무μ λ°μ΄ν° ν΅μ λ§, 곡μ§μ μ νλ§ λλ 곡μ§μ μ 무μ ν
λ λΉμ ν΅μ λ§μ κ·Έ μ μ΄λ μΌλΆμ μμ΄μ ν¬ν¨ν μλ μλ€.First, the
λ€μμΌλ‘, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯Έ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ λ©λͺ¨λ¦¬ μλ¨μ ꡬλΉνκ³ λ§μ΄ν¬λ‘ νλ‘μΈμλ₯Ό νμ¬νμ¬ μ°μ° λ₯λ ₯μ κ°μΆ λμ§νΈ κΈ°κΈ°μΌ μ μλ€. μ΄λ¬ν μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ μλ² μμ€ν
μΌ μ μλ€. μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ ν΅μ λ§(100)μ ν΅νμ¬ λλ°μ΄μ€(310, 320) μ€ νλκ° λ€λ₯Έ νλλ‘ μμ μ μ 보λ μ μ΄ λͺ
λ Ήμ μ 곡νκ±°λ, νλκ° λ€λ₯Έ νλλ‘λΆν° μμ μ μ 보λ μ μ΄ λͺ
λ Ήμ μ 곡 λ°λλ‘ λ§€κ°νλ κΈ°λ₯μ μνν μ μλ€.Next, the user
μ΄λ₯Ό μνμ¬, μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ, μλμμ μμΈνκ² μ€λͺ
λλ λ°μ κ°μ΄, μ 1 λλ°μ΄μ€ λ° μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μμ κ΄ν μ 보λ₯Ό νλνκ³ , μ 1 λλ°μ΄μ€μ μμΈ λλ μμ§μκ³Ό μ 2 λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λλ μ
λ ₯ μ΄λ²€νΈλ₯Ό κ°μ§νκ³ , μ
λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μ 1 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν
μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μ 2 λλ°μ΄μ€ μμμ μ 곡λλλ‘ νκ±°λ μ 2 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν
μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μ 1 λλ°μ΄μ€ μμμ μ 곡λλλ‘ ν¨μΌλ‘μ¨, λ κ° μ΄μμ λλ°μ΄μ€μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³λ₯Ό μ΄μ©νμ¬ μ¬μ©μμκ² λ³΄λ€ νΈλ¦¬νκ³ νμ₯λ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νλ κΈ°λ₯μ μνν μ μλ€.To this end, the user
μ΄μ κ°μ μ¬μ©μ μΈν°νμ΄μ€ μ 곡μ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ μνμ¬ ν¬ν¨λλ μ μ΄λΆ(λ―Έλμλ¨)μ μνμ¬ μνλ μ μλ€. μ΄λ¬ν μ μ΄λΆλ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200) λ΄μμ νλ‘κ·Έλ¨ λͺ¨λμ ννλ‘ μ‘΄μ¬ν μ μλ€. μ΄λ¬ν νλ‘κ·Έλ¨ λͺ¨λμ μ΄μ μμ€ν
, μμ© νλ‘κ·Έλ¨ λͺ¨λ λλ κΈ°ν νλ‘κ·Έλ¨ λͺ¨λμ ννλ₯Ό κ°μ§ μ μλ€. λν, νλ‘κ·Έλ¨ λͺ¨λμ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)κ³Ό ν΅μ κ°λ₯ν μ격 κΈ°μ΅ μ₯μΉμ μ μ₯λ μλ μλ€. ννΈ, νλ‘κ·Έλ¨ λͺ¨λμ λ³Έ λ°λͺ
μ λ°λΌ νμ ν νΉμ μ
무λ₯Ό μννκ±°λ νΉμ μΆμ λ°μ΄ν° μ νμ μ€ννλ 루ν΄, μλΈλ£¨ν΄, νλ‘κ·Έλ¨, μ€λΈμ νΈ, μ»΄ν¬λνΈ, λ°μ΄ν° ꡬ쑰 λ±μ ν¬κ΄νμ§λ§, μ΄μ μ νλμ§λ μλλ€.Such user interface provision may be performed by a controller (not shown) included by the user
λν, μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ 볡μμ λλ°μ΄μ€(310, 320) μ€ μ μ΄λ νλλ‘λΆν° μ 곡 λ°μ μμΈ λλ μμ§μμ κ΄ν μ 보λ₯Ό μ μ₯νκ³ , μ΄κ²μ΄ 볡μμ λλ°μ΄μ€(310, 320) μ€ μ μ΄λ νλμ μνμ¬ μ΄μ©λλλ‘ νλ κΈ°λ₯μ λ μνν μ μλ€. λμκ°, μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ 볡μμ λλ°μ΄μ€(310, 320) μ€ μ μ΄λ νλμμ μ 곡λκ³ μλ 컨ν
μΈ λλ κΈ°λ₯μ ꡬμ±νλ μ 보λ₯Ό μ μ₯νκ³ , μ΄λ¬ν μ λ³΄κ° λ³΅μμ λλ°μ΄μ€(310, 320) μ€ μ μ΄λ νλμ μνμ¬ μ΄μ©λλλ‘ νλ κΈ°λ₯μ λ μνν μ μλ€. μμ ν μ μ₯μ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ μνμ¬ ν¬ν¨λλ μ μ₯μ(λ―Έλμλ¨)μ μνμ¬ μνλ μ μλ€. μ΄λ¬ν μ μ₯μλ μ»΄ν¨ν° νλ
κ°λ₯ν κΈ°λ‘ λ§€μ²΄λ₯Ό ν¬ν¨νλ κ°λ
μΌλ‘μ, νμμ λ°μ΄ν°λ² μ΄μ€λΏλ§ μλλΌ νμΌ μμ€ν
μ κΈ°λ°μ λ λ°μ΄ν° κΈ°λ‘ λ±μ ν¬ν¨νλ κ΄μμ λ°μ΄ν°λ² μ΄μ€μΌ μλ μλ€.In addition, the user
μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ κΈ°λ₯μ κ΄νμ¬λ μλμμ λ μμΈνκ² μμλ³΄κΈ°λ‘ νλ€. ννΈ, μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ κ΄νμ¬ μμ κ°μ΄ μ€λͺ
λμμΌλ, μ΄λ¬ν μ€λͺ
μ μμμ μΈ κ²μ΄κ³ , μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ μꡬλλ κΈ°λ₯μ΄λ ꡬμ±μμμ μ μ΄λ μΌλΆκ° νμμ λ°λΌ μ‘°μμ λμμ΄ λλ 볡μμ λλ°μ΄μ€(310, 320) μ€ μ μ΄λ νλμμ μ€νλκ±°λ ν¬ν¨λ μλ μμμ λΉμ
μμκ² μλͺ
νλ€.The function of the user
λ§μ§λ§μΌλ‘, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯Έ 볡μμ λλ°μ΄μ€(310, 320)λ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200) λλ 볡μμ λλ°μ΄μ€(310, 320) μ€ μλλλ κ²(λ°λμ§νκ²λ, 볡μμ λλ°μ΄μ€(310, 320)λ μλ‘ λΆλ¦¬λμ΄ μκ±°λ μΈλΆν(externalization) λμ΄ μμ μ μμ)μ μ μν ν ν΅μ ν μ μλ κΈ°λ₯μ ν¬ν¨νλ λμ§νΈ κΈ°κΈ°λ‘μ, λ©λͺ¨λ¦¬ μλ¨μ ꡬλΉνκ³ λ§μ΄ν¬λ‘ νλ‘μΈμλ₯Ό νμ¬νμ¬ μ°μ° λ₯λ ₯μ κ°μΆ λμ§νΈ κΈ°κΈ°λΌλ©΄ μΌλ§λ μ§ λ³Έ λ°λͺ
μ λ°λ₯Έ λλ°μ΄μ€(310, 320)λ‘μ μ±νλ μ μλ€. λλ°μ΄μ€(310, 320)λ μ€λ§νΈν°, μ€λ§νΈ ν¨λ, μ€λ§νΈ κΈλμ€, μ€λ§νΈ μμΉ, μ€λ§νΈ λ°΄λ, μ€λ§νΈ λ§ λ±κ³Ό κ°μ μμ μ€λ§νΈ λλ°μ΄μ€μ΄κ±°λ λ°μ€ν¬ν μ»΄ν¨ν°, λ
ΈνΈλΆ μ»΄ν¨ν°, μν¬μ€ν
μ΄μ
, PDA, μΉ ν¨λ, μ΄λ μ νκΈ°, λ²νΌ, λ§μ°μ€, ν€λ³΄λ, μ μ ν λ±κ³Ό κ°μ λ€μ μ ν΅μ μΈ λλ°μ΄μ€μΌ μ μλ€. λν, λλ°μ΄μ€(310, 320)λ 리λͺ¨μ»¨, κ°μ κΈ°κΈ° λ±μ IoT(Internet of Things) λλ°μ΄μ€μΌ μ μλ€.Lastly, the plurality of
νΉν, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, λλ°μ΄μ€(310, 320)μλ μ¬μ©μλ‘λΆν°μ μ‘°μμ μ
λ ₯ λ°μ μ μλ κΈ°μ μλ¨μ΄ μ μ΄λ νλ ν¬ν¨λ μ μλ€. μ΄λ¬ν κΈ°μ μλ¨μ μλ‘μ, 곡μ§μ ꡬμ±μμμΈ, ν°μΉ ν¨λ, ν¬μΈν
λꡬ(μλ₯Ό λ€λ©΄, λ§μ°μ€, μ€νμΌλ¬μ€, μ μ ν λ±), μ¬μ©μ μ‘°μμ΄ κ°λ₯ν κ·Έλν½ κ°μ²΄, ν€λ³΄λ, ν κΈ μ€μμΉ, μμ± μ 보(μ§λ¬Έ λ±) μΈμ μΌμ, 거리 μΌμ λ±μ λ€ μ μλ€.In particular, according to an embodiment of the present invention, the
λν, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, λλ°μ΄μ€(310, 320)μλ κ·Έ λλ°μ΄μ€(310, 320)μ μμΈλ μμ§μμ κ΄ν 물리μ μΈ μ 보λ₯Ό νλν μ μλ κΈ°μ μλ¨μ΄ μ μ΄λ νλ ν¬ν¨λ μ μλ€. μ΄λ¬ν κΈ°μ μλ¨μ μλ‘μ, 곡μ§μ ꡬμ±μμμΈ, μμ§μ μΌμ, κ°μλ μΌμ, μμ΄λ‘μ€μ½ν, μκΈ° μΌμ, μμΉ κ²°μ λͺ¨λ(GPS λͺ¨λ, λΉμ½ κΈ°λ°μ μμΉ κ²°μ (νμΈ) λͺ¨λ λ±), κΈ°μκ³, 거리 μΌμ, μΉ΄λ©λΌ λ±μ λ€ μ μλ€.In addition, according to an embodiment of the present invention, the
λν, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, λλ°μ΄μ€(310, 320)μλ ν΄λΉ λλ°μ΄μ€(310, 320)λ₯Ό μμ§νκ³ μλ μ¬μ©μμ μΈμ²΄λ‘λΆν° νλλλ μ체 μ 보μ κΈ°μ΄νμ¬ ν΄λΉ λλ°μ΄μ€(310, 320)μ μμΈλ μμ§μμ κ΄ν 물리μ μΈ μ 보λ₯Ό νλν μ μλ κΈ°μ μλ¨μ΄ ν¬ν¨λ μ μλ€. μ΄λ¬ν κΈ°μ μλ¨μ μλ‘μ, κ·Όμ λ μ νΈ μΈ‘μ μ₯μΉ λ±μ λ€ μ μλ€.In addition, according to an embodiment of the present invention, the
λν, λλ°μ΄μ€(310, 320)μλ μμ κ°μ 물리μ μΈ μ 보λ₯Ό μ²λ¦¬νμ¬ λ€λ₯Έ λλ°μ΄μ€(310, 320 λ±)μ λνμ¬ μ 보λ μ μ΄ λͺ
λ Ήμ μ 곡νκ±°λ λ€λ₯Έ λλ°μ΄μ€(310, 320 λ±)λ‘λΆν° μ 보λ μ μ΄ λͺ
λ Ήμ μ 곡 λ°κ±°λ μ΄λ¬ν μ 보λ μ μ΄ λͺ
λ Ήμ μμ±νκΈ° μν μ ν리μΌμ΄μ
νλ‘κ·Έλ¨μ΄ λ ν¬ν¨λμ΄ μμ μ μλ€. μ΄λ¬ν μ ν리μΌμ΄μ
μ ν΄λΉ λλ°μ΄μ€(310, 320) λ΄μμ νλ‘κ·Έλ¨ λͺ¨λμ ννλ‘ μ‘΄μ¬ν μ μλ€. μ΄λ¬ν νλ‘κ·Έλ¨ λͺ¨λμ μ±κ²©μ μ μ ν λ°μ κ°μ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ μ μ΄λΆμ μ λ°μ μΌλ‘ μ μ¬ν μ μλ€. μ¬κΈ°μ, μ ν리μΌμ΄μ
μ κ·Έ μ μ΄λ μΌλΆκ° νμμ λ°λΌ κ·Έκ²κ³Ό μ€μ§μ μΌλ‘ λμΌνκ±°λ κ· λ±ν κΈ°λ₯μ μνν μ μλ νλμ¨μ΄ μ₯μΉλ νμ¨μ΄ μ₯μΉλ‘ μΉνλ μλ μλ€.In addition, the
ννΈ, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ 1 λλ°μ΄μ€(310)μ μ 2 λλ°μ΄μ€(320)κ° μμ μ μ°κ΄μ±(μλ₯Ό λ€λ©΄, λμΌν μ¬μ©μμκ² μνλ κ²μ΄λΌλ μ°κ΄μ±μ΄λ, λμΌν μ¬μ©μλ₯Ό μνμ¬ κΈ°λ₯νλ κ²μ΄λΌλ μ°κ΄μ±μ΄λ, μλ‘ μ€μ§μ μΌλ‘ κ°κΉμ΄ κ³³μ λ°°μΉλμ΄ μλ κ²μ΄λΌλ μ°κ΄μ±μ΄λ, μ΄λ€ μ€ νλκ° λ€λ₯Έ νλλ₯Ό μΈμ¦νκ±°λ νμ©νλ κ²μ΄ ν©λΉνλ€λ μ°κ΄μ±)μ κ°λ κ²μ΄λΌλ μ μ΄ μΈμλλ κ²½μ°μ, μ 1 λλ°μ΄μ€(310)μ μ 2 λλ°μ΄μ€(320) κ°μ μμ μ μ°κ²°μ΄ νμ±λ μ μλλ°, μ΄λ¬ν μΈμ λλ μ°κ²°μ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ μν΄ μνλκ±°λ μ 1 λλ°μ΄μ€(310) λ° μ 2 λλ°μ΄μ€(320)μ μν΄ μνλ μ μλ€.Meanwhile, according to an embodiment of the present invention, the
μ€μμλ€Examples
μλμμλ, λ³Έ λ°λͺ
μ λ°λ₯Έ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ΄ μ¬λ¬ μ€μμμ λ°λΌ 볡μμ λλ°μ΄μ€(310, 320)κ° κ°μ
λλ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νλ ꡬ체μ μΈ μμ κ΄νμ¬ μμΈνκ² μ΄ν΄λ³΄κΈ°λ‘ νλ€.Hereinafter, a detailed example in which the user
λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ, μ 1 λλ°μ΄μ€(310) λ° μ 2 λλ°μ΄μ€(320)μ μμΈ λλ μμ§μμ κ΄ν μ 보λ₯Ό νλνκ³ , μ 1 λλ°μ΄μ€(310)μ μμΈ λλ μμ§μκ³Ό μ 2 λλ°μ΄μ€(320)μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λλ μ
λ ₯ μ΄λ²€νΈλ₯Ό κ°μ§νκ³ , μ
λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μ 1 λλ°μ΄μ€(310) μμμ μ 곡λλ 컨ν
μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μ 2 λλ°μ΄μ€(320) μμμ μ 곡λλλ‘ νκ±°λ μ 2 λλ°μ΄μ€(320) μμμ μ 곡λλ 컨ν
μΈ λ° κΈ°λ₯ μ€ μ μ΄λ μΌλΆκ° μ 1 λλ°μ΄μ€(310) μμμ μ 곡λλλ‘ ν μ μλ€.According to an embodiment of the present invention, the user
ꡬ체μ μΌλ‘, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ 1 λλ°μ΄μ€(310) λ° μ 2 λλ°μ΄μ€(320) μ€ μ μ΄λ νλλ μ
λ ₯ μ΄λ²€νΈλ₯Ό μ λ°μν€λ μ
λ ₯ μλ¨κ³Ό μΌμ²΄λ‘ κ²°ν©λμ΄ μμ§μΌ μ μλ€. μλ₯Ό λ€λ©΄, μ 1 λλ°μ΄μ€(310)λ μ¬μ©μκ° μμΌλ‘ νμ§ν μ μλ μ€λ§νΈν°μΌ μ μκ³ , μ 2 λλ°μ΄μ€(320)λ μ¬μ©μμ μλͺ©μ μ°©μ©λ μ μλ μ€λ§νΈ μμΉμΌ μ μλ€.In detail, according to one embodiment of the present invention, at least one of the
λν, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ 1 λλ°μ΄μ€(310)μ μμΈ λλ μμ§μκ³Ό μ 2 λλ°μ΄μ€(320)μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λλ μ
λ ₯ μ΄λ²€νΈλ, μ μ΄λ νλμ μ’ν μΆμ κΈ°μ€μΌλ‘ ν λ μ 1 λλ°μ΄μ€(310)μ λ°©ν₯μ΄ λ³ννλ μμκ³Ό μ 2 λλ°μ΄μ€(320)μ λ°©ν₯μ΄ λ³ννλ μμμ΄ μλ‘ λ°λμ΄κ±°λ λμΉμΈ μ΄λ²€νΈλ₯Ό κ°λ¦¬ν¬ μ μλ€. μλ₯Ό λ€λ©΄, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ¬μ©μκ° μ 1 λλ°μ΄μ€(310)λ₯Ό νμ§νκ³ μκ³ μλͺ©μ μ 2 λλ°μ΄μ€(320)λ₯Ό μ°©μ©νκ³ μλ μμ λ€μ§λ νμ(μΌλͺ
, ν립(Flip))λ₯Ό ννλ μ΄λ²€νΈλ₯Ό κ°λ¦¬ν¬ μ μλ€.In addition, according to an embodiment of the present invention, the input event specified based on a relative relationship between the posture or movement of the
ννΈ, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ
λ ₯ μ΄λ²€νΈλ, μ 1 λ©΄μ κΈ°μ€μΌλ‘ νλ μ 1 λλ°μ΄μ€(310)μ μμΈ λλ μμ§μκ³Ό μ 2 λ©΄μ κΈ°μ€μΌλ‘ νλ μ 2 λλ°μ΄μ€(310)μ μμΈ λλ μμ§μ μ¬μ΄μ μλμ μΈ κ΄κ³μ κΈ°μ΄νμ¬ νΉμ λ μ μλ€. μ¬κΈ°μ, μ 1 λ©΄ λλ μ 2 λ©΄μ, μ 1 λλ°μ΄μ€(310) λλ μ 2 λλ°μ΄μ€(320)μ μλ©΄, νλ©΄, μ’λ©΄, μ°λ©΄, μ λ©΄, νλ©΄ λ±μ ν΄λΉν μ μλ€.Meanwhile, according to an embodiment of the present invention, the input event may include a posture or movement of the
λ 2λ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ 1 λλ°μ΄μ€μ μ 2 λλ°μ΄μ€ μνΈ κ°μ 컨ν μΈ (λλ κΈ°λ₯) μ 곡 μνκ° μ νλλ ꡬμ±μ κ°λ μ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.2 is a diagram conceptually illustrating a configuration in which a content (or function) provision state is switched between a first device and a second device according to an embodiment of the present invention.
μΌλ°μ μΌλ‘, μμΌλ‘ μ 1 λλ°μ΄μ€(310)λ₯Ό νμ§νκ³ μκ³ κ°μ μμ μλͺ©μ μ 2 λλ°μ΄μ€(320)λ₯Ό μ°©μ©νκ³ μλ μ¬μ©μκ° ν΄λΉ μμ λ€μ§λ νμ(μ¦, ν립)λ₯Ό ννλ κ²½μ°μ, μμ νμκ° νν΄μ§κΈ° μ μ νμ νλ©΄μ΄ μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μνλ‘ μ¬μ©μμκ² μ»¨ν
μΈ (λλ κΈ°λ₯)μ μ 곡νκ³ μλ μ 1 λλ°μ΄μ€(310) λλ μ 2 λλ°μ΄μ€(320)λ μμ νμκ° νν΄μ§κ³ λ νμλ κ·Έ νμ νλ©΄μ΄ μ¬μ©μ μͺ½μ λ°λΌλ³΄μ§ λͺ»νκ² λμ΄ λ μ΄μ μ¬μ©μμκ² μ»¨ν
μΈ (λλ κΈ°λ₯)μ μ 곡νκΈ° μ΄λ ΅κ² λ μ μλ€.In general, when the user holding the
λ 2λ₯Ό μ°Έμ‘°νλ©΄, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯Έ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ, μ¬μ©μκ° μλ‘ λ€λ₯Έ λ°©ν₯μ λ°λΌλ³΄κ³ μλ μ 1 λλ°μ΄μ€(310) λ° μ 2 λλ°μ΄μ€(320)λ₯Ό λͺ¨λ μμ§νκ³ μλ μμ λ€μ§λ ν립 νμλ₯Ό ννλ κ²½μ°μ, μμ νμλ₯Ό μ€λ§νΈν° ννμ μ 1 λλ°μ΄μ€(310)μ λ°©ν₯μ΄ λ³ννλ μμκ³Ό μ€λ§νΈ μμΉ ννμ μ 2 λλ°μ΄μ€(320)μ λ°©ν₯μ΄ λ³ννλ μμμ΄ μλ‘ λ°λμ΄κ±°λ λμΉμΈ μ
λ ₯ μ΄λ²€νΈλ‘μ μΈμνκ³ , μ
λ ₯ μ΄λ²€νΈκ° λ°μνκΈ° μ μ νμ νλ©΄μ΄ μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μ 1 λλ°μ΄μ€(310)(λλ μ 2 λλ°μ΄μ€(320)) μμμ μ 곡λκ³ μλ 컨ν
μΈ (λλ κΈ°λ₯) μ€ μ μ΄λ μΌλΆ(201, 202)κ° μ
λ ₯ μ΄λ²€νΈκ° λ°μν μ΄νμ νμ νλ©΄μ΄ μ¬μ©μ μͺ½μ λ°λΌλ³΄κ² λ μ 2 λλ°μ΄μ€(320)(λλ μ 1 λλ°μ΄μ€(310)) μμμ μ 곡λλλ‘ ν μ μλ€. λ°λΌμ, λ³Έ λ°λͺ
μ λ°λ₯΄λ©΄, μ¬μ©μμκ² μ 곡λλ 컨ν
μΈ (λλ κΈ°λ₯)κ° μ 1 λλ°μ΄μ€(310)μ μ 2 λλ°μ΄μ€(320)λ₯Ό λͺ¨λ ν΅ν΄μ μ°μμ±(continuity)μ μ μ§ν μ±λ‘ μ 곡λ μ μκ² λλ€. λ 16 λ° λ 17μ μ°Έμ‘°νλ©΄, μμ μ€μμκ° μ€μ λ‘ μμ°λλ λͺ¨μ΅μ νμΈν μ μλ€.Referring to FIG. 2, the
λ 3 λ΄μ§ λ 5λ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ¬μ©μ μΈν°νμ΄μ€λ₯Ό μ 곡νλ ꡬμ±μ μμμ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.3 to 5 are diagrams exemplarily illustrating a configuration for providing a user interface according to an embodiment of the present invention.
λ 3 λ° λ 4λ₯Ό μ°Έμ‘°νλ©΄, μΌμͺ½ μλͺ©μ μ 2 λλ°μ΄μ€(320)λ₯Ό μ°©μ©νκ³ μκ³ κ°μ μΌμͺ½ μμΌλ‘ μ 1 λλ°μ΄μ€(310)λ₯Ό νμ§νκ³ μλ μ¬μ©μκ° μμ μ μΌμμ λ€μ§λ νμ(μ¦, ν립)λ₯Ό ννλ κ²½μ°λ₯Ό κ°μ ν μ μλ€. μ΄λ¬ν κ²½μ°μ, μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μ 1 λλ°μ΄μ€(310)κ° λ μ΄μ μ¬μ©μ μͺ½μ λ°λΌλ³΄μ§ λͺ»νκ² λλ λμ μ μ¬μ©μ μͺ½μ λ°λΌλ³΄μ§ μκ³ μλ μ 2 λλ°μ΄μ€(320)κ° μ¬μ©μ μͺ½μ λ°λΌλ³΄κ² λ μ μκ³ (λ 3 μ°Έμ‘°), λ°λλ‘, μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μ 2 λλ°μ΄μ€(320)κ° λ μ΄μ μ¬μ©μ μͺ½μ λ°λΌλ³΄μ§ λͺ»νκ² λλ λμ μ μ¬μ©μ μͺ½μ λ°λΌλ³΄μ§ μκ³ μλ μ 1 λλ°μ΄μ€(310)κ° μ¬μ©μ μͺ½μ λ°λΌλ³΄κ² λ μ μλ€(λ 4 μ°Έμ‘°).3 and 4, a user wearing the
λ 5λ₯Ό μ°Έμ‘°νλ©΄, μ 1 λλ°μ΄μ€(310) λ° μ 2 λλ°μ΄μ€(320)μ ꡬλΉλμ΄ μλ μμ΄λ‘μ€μ½ν(Gyroscope)μ μνμ¬ μΈ‘μ λλ μ 1 λλ°μ΄μ€(310) λ° μ 2 λλ°μ΄μ€(320)μ μμΈ λλ λ°©ν₯μ κ΄ν 물리μ μΈ μ λ³΄κ° νλλ μ μμΌλ©°, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯Έ μ¬μ©μ μΈν°νμ΄μ€ μ 곡 μμ€ν
(200)μ, λ 5μ λμλ λ°μ κ°μ 물리μ μΈ μ 보μ κΈ°μ΄νμ¬ μ
λ ₯ μ΄λ²€νΈκ° λ°μνλμ§ μ¬λΆλ₯Ό κ°μ§ν μ μλ€. ꡬ체μ μΌλ‘, λ 5μ (a) λ° (b)λ κ°κ° μ 1 λλ°μ΄μ€(310) λ° μ 2 λλ°μ΄μ€(320)μ λν μμ΄λ‘μ€μ½ν μΈ‘μ κ°μ κ°λ¦¬ν€λ©°, λ 5μ (a) λ° (b) κ°κ°μμ λ
Έλμ, λΆμμ λ° μ΄λ‘μ κ·Έλνλ κ°κ° λ°©μκ°(azimuth). νΌμΉ(pitch) λ° λ‘€(roll) λ°©ν₯μ μΈ‘μ κ°μ κ°λ¦¬ν¨λ€.Referring to FIG. 5, a posture or direction of the
ννΈ, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ¬μ©μμ ν립 νμμ λ°λ₯Έ μ
λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μ 1 λλ°μ΄μ€(310) μμμ μ 곡λκ³ μλ 컨ν
μΈ μ€ μ μ΄λ μΌλΆκ° μ 2 λλ°μ΄μ€(320)μκ² μ μ‘λλλ‘ νκ±°λ, λ°λλ‘, μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μ 2 λλ°μ΄μ€(320) μμμ μ 곡λκ³ μλ 컨ν
μΈ μ€ μ μ΄λ μΌλΆκ° μ 1 λλ°μ΄μ€(310)μκ² μ μ‘λλλ‘ ν μ μλ€.Meanwhile, according to an embodiment of the present disclosure, in response to an input event caused by a user's flipping action, at least a part of content provided on the
λν, λ³Έ λ°λͺ
μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ¬μ©μμ ν립 νμμ λ°λ₯Έ μ
λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μ 1 λλ°μ΄μ€(310) μμμ μ€νλκ³ μλ νλ‘μΈμ€κ° μ 2 λλ°μ΄μ€(320) μμμ μ€νλλλ‘ νκ±°λ, λ°λλ‘, μ¬μ©μ μͺ½μ λ°λΌλ³΄κ³ μλ μ 2 λλ°μ΄μ€(320) μμμ μ€νλκ³ μλ νλ‘μΈμ€κ° μ 1 λλ°μ΄μ€(310) μμμ μ€νλλλ‘ ν μ μλ€.In addition, according to an embodiment of the present invention, in response to an input event caused by a user's flipping action, a process that is being executed on the
λν, λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λ₯΄λ©΄, μ¬μ©μμ ν립 νμμ λ°λ₯Έ μ λ ₯ μ΄λ²€νΈκ° λ°μνλ κ²μ λμνμ¬, μ λ ₯ μ΄λ²€νΈκ° λ°μνκΈ° μ μ μ 1 λλ°μ΄μ€ μ(310)(λλ μ 2 λλ°μ΄μ€(320))μμ μ 곡λκ³ μλ 컨ν μΈ μ νμκ³Ό μ λ ₯ μ΄λ²€νΈκ° λ°μν μ΄νμ μ 2 λλ°μ΄μ€(320)(λλ μ 1 λλ°μ΄μ€(310)) μμμ μ 곡λλ 컨ν μΈ μ νμμ΄ μλ‘ λ€λ₯Ό μ μλ€.In addition, according to an embodiment of the present invention, in response to an input event generated according to a user's flipping action, provided on the first device 310 (or the second device 320) before the input event occurs. The format of the content that is being used and the format of the content provided on the second device 320 (or the first device 310) after the input event occurs may be different.
λ 6μ λ³Έ λ°λͺ μ μΌ μ€μμμ λ°λΌ μ 1 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ μ νμκ³Ό μ 2 λλ°μ΄μ€ μμμ μ 곡λλ 컨ν μΈ μ νμμ΄ μλ‘ λ¬λΌμ§λ ꡬμ±μ μμμ μΌλ‘ λνλ΄λ λλ©΄μ΄λ€.6 is a diagram illustrating a configuration in which a format of content provided on a first device and a format of content provided on a second device are different from each other according to an embodiment of the present invention.
Referring to FIG. 6, content that was being provided in a visual form on the first device 310 before the input event occurred may be provided in an auditory or tactile form on the second device 320 after the input event occurs. For example, news content that was being provided in the form of text on the first device 310, whose display screen is relatively large, may be provided in the form of voice or vibration on the second device 320, whose display screen is relatively small and which is worn closely on the user's wrist.
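A minimal sketch of how the format change of FIG. 6 could be decided is given below; the DeviceProfile fields, the 4-inch threshold, and the fallback order (text, then voice, then vibration) are assumptions made for this example and are not prescribed by the description above.

```python
from dataclasses import dataclass


@dataclass
class DeviceProfile:
    name: str
    screen_inches: float
    has_speaker: bool
    has_vibration_motor: bool


def choose_presentation(content_kind: str, target: DeviceProfile) -> str:
    """Pick a presentation format for the target device.

    Hypothetical policy: large screens keep text, small worn devices fall back
    to voice if a speaker is available, otherwise to vibration.
    """
    if content_kind == "news_text":
        if target.screen_inches >= 4.0:
            return "text"
        if target.has_speaker:
            return "voice"       # e.g. read the article aloud (TTS assumed)
        if target.has_vibration_motor:
            return "vibration"   # e.g. a haptic notification pattern
    return "text"


phone = DeviceProfile("first device", 5.5, True, True)
watch = DeviceProfile("second device", 1.5, False, True)
print(choose_presentation("news_text", phone))  # text
print(choose_presentation("news_text", watch))  # vibration
```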
Meanwhile, according to one embodiment of the present invention, the input event may be specified on the basis of not only the relative relationship between the posture or movement of the first device 310 and the posture or movement of the second device 320 but also a user manipulation input to the first device 310 or the second device 320. For example, the user manipulation input to the first device 310 or the second device 320 may include a touch manipulation, a keyboard manipulation, a voice recognition manipulation, and the like.
Specifically, according to one embodiment of the present invention, in response to the occurrence of an input event in which the user touches specific content displayed on the display screen of the first device 310 or the second device 320 while performing the flipping action, the specific content associated with the touch manipulation, among the content being provided on the first device 310 facing the user, may be provided on the second device 320, or conversely, the specific content associated with the touch manipulation, among the content being provided on the second device 320 facing the user, may be provided on the first device 310.
FIG. 7 is a diagram exemplarily illustrating a configuration in which content stored in the first device is transmitted to the second device according to one embodiment of the present invention.
Referring to FIG. 7, among the content being provided on the first device 310 before the input event occurs, only the specific content 701 and 702 (for example, specific music files) selected by the user's touch manipulation 710 may be provided on the second device 320 after the input event occurs.
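The combined condition described above (a flip gesture plus a touch manipulation selecting particular items, as in FIG. 7) could be expressed roughly as follows; the function name and the list-based representation of content are hypothetical stand-ins used only for illustration.

```python
def on_flip_with_touch(flip_detected: bool, touched_items: list[str],
                       provided_items: list[str]) -> list[str]:
    """Return the items to hand over to the other device.

    Hypothetical rule matching FIG. 7: if content was being touched when the
    flip occurred, only the touched content moves; otherwise nothing moves here
    (a caller could fall back to transferring all provided content instead).
    """
    if not flip_detected:
        return []
    return [item for item in touched_items if item in provided_items]


# Example: the two music files selected by touch manipulation 710 are the only
# items transferred to the second device.
print(on_flip_with_touch(True,
                         ["song_701.mp3", "song_702.mp3"],
                         ["song_701.mp3", "song_702.mp3", "song_703.mp3"]))
```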
In addition, according to one embodiment of the present invention, in response to the occurrence of an input event in which the second device 320 comes to face the user as the user performs the flipping action while touching specific content (for example, a web page) displayed on the display screen of the first device 310, the user interface providing system 200 may cause the content being provided on the first device 310 facing the user to be registered as a bookmark (or favorite) and cause a list including the bookmark to be displayed on the second device 320. In addition, according to one embodiment of the present invention, in response to the occurrence of an input event in which the first device 310 comes to face the user as the user performs the flipping action while selecting (i.e., touching) a specific bookmark from a bookmark list (for example, a list of bookmarked web pages) displayed on the display screen of the second device 320, the user interface providing system 200 may cause the content corresponding to the specific bookmark selected on the second device 320 facing the user to be displayed on the first device 310. Referring to FIGS. 20 to 22, it can be seen how the above embodiments are actually realized.
Meanwhile, although the embodiments in which the input event is specified by the action of the user flipping his or her hand once have been mainly described above, the present invention is not necessarily limited thereto, and it is noted that the input event may be specified by any other action (for example, an action of flipping the hand two or more times) within the scope in which the objects of the present invention can be achieved.
In addition, although the embodiments in which the first device is a smart phone held in the user's hand and the second device is a smart watch worn on the user's wrist have been mainly described above, the present invention is not necessarily limited thereto, and it is noted that the first device and the second device may also be implemented in other forms such as a smart pad, smart glasses, a smart band, and a smart ring within the scope in which the objects of the present invention can be achieved.
FIGS. 8 to 15 are diagrams exemplarily illustrating configurations in which a user interface is provided according to other embodiments of the present invention.
First, referring to FIG. 8, when a user who is holding both the first device 310 and the second device 320, which face different directions, with one hand performs a flipping action of turning over that hand, the user interface providing system 200 according to another embodiment of the present invention may recognize the above action as an input event in which the manner in which the direction of the smart-phone-type first device 310 changes and the manner in which the direction of the smart-watch-type second device 320 changes are opposite or symmetric to each other. Subsequently, in response to the occurrence of such an input event, the user interface providing system 200 according to another embodiment of the present invention may mirror at least some content (or functions) 810 being provided on the first device 310, whose display screen was facing the user before the input event occurred, and cause the mirrored content (or functions) 810 to be provided not only on the first device 310 but also on the second device 320, whose display screen comes to face the user after the input event occurs. Accordingly, even in a state in which the user has rotated his or her wrist so that the first device 310 faces the other party, the user can check, through the second device 320, what content (or functions) are being provided to the other party on the first device 310.
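One possible, simplified way to test whether the two devices' direction changes are "opposite or symmetric" and then start mirroring is sketched below; the 90-degree threshold and the use of the dominant rotation component are illustrative assumptions, not the claimed criterion.

```python
def rotation_delta(before, after):
    """Component-wise change in (azimuth, pitch, roll), in degrees."""
    return tuple(a - b for a, b in zip(after, before))


def opposite_rotations(delta_first, delta_second, min_deg: float = 90.0) -> bool:
    """True when both devices rotate strongly and their dominant rotation
    components have opposite signs (an assumed stand-in for 'opposite or
    symmetric' direction changes)."""
    d1 = max(delta_first, key=abs)
    d2 = max(delta_second, key=abs)
    return abs(d1) >= min_deg and abs(d2) >= min_deg and d1 * d2 < 0


def maybe_start_mirroring(first_before, first_after, second_before, second_after,
                          mirrored_content):
    """Provide the mirrored content 810 on both devices when the event fires."""
    if opposite_rotations(rotation_delta(first_before, first_after),
                          rotation_delta(second_before, second_after)):
        return {"first device": mirrored_content, "second device": mirrored_content}
    return {"first device": mirrored_content}


# Example: the phone rolls +170 degrees while the watch rolls -170 degrees.
print(maybe_start_mirroring((0, 0, 5), (0, 0, 175), (0, 0, -5), (0, 0, -175),
                            "shared presentation"))
```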
Next, referring to FIG. 9, in response to the occurrence of an input event in which the user performs the flipping action mentioned in connection with FIG. 8, the user interface providing system 200 according to another embodiment of the present invention may divide the content (or functions) 910 being provided on the first device 310, whose display screen was facing the user before the input event occurred, into public content 920 and private content 930, cause the public content 920 to be displayed on the first device 310, whose display screen comes to face the other party after the input event occurs, and cause the private content 930 to be displayed on the second device 320, which comes to face the user after the input event occurs. Accordingly, the user can display public content and private content separately depending on the situation.
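The public/private division of FIG. 9 could be realized, for instance, by tagging each displayed item with a privacy flag and partitioning on that flag; how items are actually classified as public or private is not fixed by the description above, so the flag below is an assumption.

```python
from typing import NamedTuple


class ContentItem(NamedTuple):
    title: str
    private: bool  # assumed per-item flag; the classification policy is not specified here


def split_for_flip(items):
    """Partition displayed items into public and private sets (FIG. 9 style)."""
    public = [i for i in items if not i.private]
    private = [i for i in items if i.private]
    return public, private


items = [ContentItem("presentation slides", False),
         ContentItem("chat notifications", True)]
public_920, private_930 = split_for_flip(items)

# After the flip: public content stays on the outward-facing first device,
# private content moves to the watch that now faces the user.
routing = {"first device (faces other party)": public_920,
           "second device (faces user)": private_930}
print(routing)
```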
Next, referring to FIG. 10, in response to the occurrence of an input event in which the user performs the flipping action mentioned in connection with FIG. 8, the user interface providing system 200 according to another embodiment of the present invention may translate the language of the content (or functions) 1010 being provided on the first device 310, whose display screen was facing the user before the input event occurred, into another language, cause the content 1020 in a first language (for example, English) to be displayed on the first device 310, whose display screen comes to face the other party after the input event occurs, and cause the content 1030 in a second language (for example, Korean) to be displayed on the second device 320, which comes to face the user after the input event occurs.
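A rough sketch of the FIG. 10 routing is shown below; the translate function is a placeholder for whatever translation backend might be used and is not a real API, and the device labels are illustrative.

```python
def translate(text: str, target_lang: str) -> str:
    """Placeholder for any translation backend; not a real API call."""
    canned = {("Hello", "ko"): "μλ
νμΈμ"}
    return canned.get((text, target_lang), f"[{target_lang}] {text}")


def route_translated_content(original_text: str, first_lang: str, second_lang: str):
    """FIG. 10 style routing: the outward-facing phone shows the first language,
    while the watch facing the user shows the second language."""
    return {
        "first device (faces other party)": (first_lang, original_text),
        "second device (faces user)": (second_lang, translate(original_text, second_lang)),
    }


print(route_translated_content("Hello", "en", "ko"))
```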
Next, referring to FIGS. 11 and 12, it may be assumed that the user performs a payment based on short-range communication using the smart-phone-type first device 310 held in his or her hand. In this case, in response to the occurrence of an input event in which the user performs the flipping action mentioned in connection with FIG. 8, the user interface providing system 200 according to another embodiment of the present invention may cause various information on the payment (for example, the payment amount) 1110 to be displayed on the second device 320, which comes to face the user after the input event occurs, in place of the first device 310, whose display screen comes to face a payment processing device (for example, an RFID reader) 330 after the input event occurs (see FIG. 11). In addition, in response to the occurrence of an input event in which the user performs the flipping action mentioned in connection with FIG. 8, the user interface providing system 200 according to another embodiment of the present invention may cause a function 1210 for supporting signature input based on touch manipulation to be provided on the second device 320, which comes to face the user after the input event occurs, in place of the first device 310, whose display screen comes to face the payment processing device (for example, an RFID reader) 330 after the input event occurs (see FIG. 12).
Next, referring to FIG. 13, it may be assumed that the user performs an exercise such as running while wearing the second device 320 on his or her wrist. In this case, in response to the occurrence of an input event in which the user performs the flipping action mentioned in connection with FIG. 8 together with a touch manipulation of selecting a destination on the first device 310, the user interface providing system 200 according to another embodiment of the present invention may cause various content on the exercise (for example, the current location, the remaining distance or time to the destination, music, and the like) to be provided on the second device 320, which the user can easily manipulate while exercising.
Next, referring to FIG. 14, it may be assumed that the user grips an IoT home appliance such as a remote controller (corresponding to the first device 310) with the same hand while wearing the second device 320 on his or her wrist. In this case, in response to the occurrence of an input event in which the user performs the flipping action mentioned in connection with FIG. 8, the user interface providing system 200 according to another embodiment of the present invention may cause a user manipulation interface 1410 for the IoT home appliance such as a remote controller, i.e., the first device 310, to be displayed on the second device 320, which comes to face the user after the input event occurs.
Next, referring to FIG. 15, it may be assumed that the user performs a task requiring security while holding the smart-phone-type first device 310 with one hand and wearing the smart-watch-type second device 320 on the wrist of the other hand. In this case, in response to the occurrence of an input event in which both the first device 310 and the second device 320 come to face the user as the user turns the wrist of the hand wearing the second device 320, the user interface providing system 200 according to another embodiment of the present invention may cause additional content (or a function) (for example, a security key code to be entered) 1520 associated with the content (or function) (for example, a function for supporting the user's input of a security key code) 1510 provided on the first device 310 to be provided on the second device 320. Accordingly, convenience and security can be enhanced when the user performs a task of entering security information such as a password.
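As a loose illustration of the FIG. 15 split, the sketch below generates a one-time key code to be shown only on the user-facing second device while the first device keeps the entry form; how the code is generated, delivered, and verified is not specified above, so this entire flow is an assumption made for the example.

```python
import secrets
import string


def issue_security_key_code(length: int = 6) -> str:
    """Generate a one-time numeric key code to show only on the worn device."""
    return "".join(secrets.choice(string.digits) for _ in range(length))


def split_secure_task():
    """Assumed FIG. 15 flow: the phone shows the entry form 1510,
    the watch shows the code 1520, and the user copies it across."""
    code = issue_security_key_code()
    return {
        "first device (entry form)": "security key code input field",
        "second device (worn, faces user)": f"key code to enter: {code}",
    }


print(split_secure_task())
```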
Meanwhile, in response to the occurrence of an input event in which the user performs a touch manipulation of selecting specific text among the content displayed on the first device 310 together with both the first device 310 and the second device 320 coming to face the user as the user turns the wrist of the hand wearing the second device 320, the user interface providing system 200 according to another embodiment of the present invention may cause information on a translation or dictionary search result for the specific text to be displayed on the second device. Referring to FIGS. 18 and 19, it can be seen how the above embodiments are actually realized.
Meanwhile, it may be assumed that an input event occurs in which a first user wearing a smart-watch-type first device on the right wrist and a second user wearing a smart-watch-type second device on the left wrist shake hands with the right hand and the left hand, respectively, and the first user and the second user together perform a flipping action of turning over the clasped hands. In this case, in response to the occurrence of such an input event, the user interface providing system 200 according to another embodiment of the present invention may cause at least some of the content (or functions) being provided on the first device worn by the first user to be provided on the second device worn by the second user, or conversely, cause at least some of the content (or functions) being provided on the second device worn by the second user to be provided on the first device worn by the first user. Accordingly, simply by performing the flipping action while shaking another user's hand with the hand wearing his or her device, a user can receive content from the other user's device or provide content to the other user's device.
The embodiments according to the present invention described above may be implemented in the form of program instructions that can be executed through various computer components and recorded on a non-transitory computer-readable recording medium. The non-transitory computer-readable recording medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the non-transitory computer-readable recording medium may be those specially designed and configured for the present invention, or may be known to and usable by those skilled in the computer software field. Examples of the non-transitory computer-readable recording medium include magnetic media such as hard disks, floppy disks, and magnetic tape; optical recording media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of the program instructions include not only machine language code produced by a compiler but also high-level language code that can be executed by a computer using an interpreter or the like. The above hardware devices may be configured to operate as one or more software modules to perform the processes according to the present invention, and vice versa.
Although the present invention has been described above in terms of specific matters such as particular components and by means of limited embodiments and drawings, these are provided only to help a more general understanding of the invention, and the present invention is not limited to the above embodiments. Those of ordinary skill in the art to which the invention pertains may make various modifications and changes from these descriptions.
Therefore, the spirit of the present invention should not be limited to the embodiments described above, and not only the appended claims but also all modifications equal or equivalent to the claims should be construed as falling within the scope of the spirit of the present invention.
- Explanation of Signs -
100: communication network
200: user interface providing system
310: first device
320: second device
Claims (16)
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2014-0132936 | 2014-10-02 | ||
| KR20140132936 | 2014-10-02 | ||
| KR10-2015-0051767 | 2015-04-13 | ||
| KR1020150051767A KR101916700B1 (en) | 2014-10-02 | 2015-04-13 | Method, device, system and non-transitory computer-readable recording medium for providing user interface |
| US14/819,151 | 2015-08-05 | ||
| US14/819,151 US9696815B2 (en) | 2014-10-02 | 2015-08-05 | Method, device, system and non-transitory computer-readable recording medium for providing user interface |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016052859A1 true WO2016052859A1 (en) | 2016-04-07 |
Family
ID=55630851
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2015/008747 (WO2016052859A1, Ceased) | Method, device, and system for providing user interface and non-transitory computer-readable recording medium | 2014-10-02 | 2015-08-21 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016052859A1 (en) |
- 2015-08-21: WO PCT/KR2015/008747 patent/WO2016052859A1/en, not_active (Ceased)
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140135631A1 (en) * | 2012-06-22 | 2014-05-15 | Fitbit, Inc. | Biometric monitoring device with heart rate measurement activated by a single user-gesture |
| US20140045547A1 (en) * | 2012-08-10 | 2014-02-13 | Silverplus, Inc. | Wearable Communication Device and User Interface |
| US20140139454A1 (en) * | 2012-11-20 | 2014-05-22 | Samsung Electronics Company, Ltd. | User Gesture Input to Wearable Electronic Device Involving Movement of Device |
| WO2014084634A1 (en) * | 2012-11-29 | 2014-06-05 | μ£Όμνμ¬ λ§€ν¬λ‘ | Mouse apparatus for eye-glass type display device, and method for driving same |
| US20140181954A1 (en) * | 2012-12-26 | 2014-06-26 | Charles Cameron Robertson | System for conveying an identity and method of doing the same |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2015152487A1 (en) | Method, device, system and non-transitory computer-readable recording medium for providing user interface | |
| WO2018151449A1 (en) | Electronic device and methods for determining orientation of the device | |
| WO2018074877A1 (en) | Electronic device and method for acquiring fingerprint information | |
| WO2018026202A1 (en) | Touch sensing device for determining information related to pen, control method therefor, and pen | |
| WO2017095123A1 (en) | Method, device, and system for providing user interface, and non-temporary computer-readable recording medium | |
| WO2016088981A1 (en) | Method, device, and system for providing user interface, and non-transitory computer-readable recording medium | |
| US9696815B2 (en) | Method, device, system and non-transitory computer-readable recording medium for providing user interface | |
| WO2020130356A1 (en) | System and method for multipurpose input device for two-dimensional and three-dimensional environments | |
| WO2014129787A1 (en) | Electronic device having touch-sensitive user interface and related operating method | |
| WO2020130667A1 (en) | Method and electronic device for controlling augmented reality device | |
| WO2018004140A1 (en) | Electronic device and operating method therefor | |
| WO2018105955A2 (en) | Method for displaying object and electronic device thereof | |
| WO2014178693A1 (en) | Method for matching multiple devices, device for enabling matching thereof and server system | |
| WO2016080596A1 (en) | Method and system for providing prototyping tool, and non-transitory computer-readable recording medium | |
| WO2018135903A1 (en) | Electronic device and method for displaying screen by the same | |
| WO2014014240A1 (en) | Contact type finger mouse and operating method thereof | |
| WO2014185753A1 (en) | Method for matching multiple devices, and device and server system for enabling matching | |
| WO2012093779A2 (en) | User terminal supporting multimodal interface using user touch and breath and method for controlling same | |
| WO2019203591A1 (en) | High efficiency input apparatus and method for virtual reality and augmented reality | |
| CN104461231A (en) | Information display control device and information display control method | |
| JP2023527906A (en) | Control method, device, terminal and storage medium | |
| US10095309B2 (en) | Input device, system and method for finger touch interface | |
| WO2018074824A1 (en) | Electronic device comprising electromagnetic interference sensor | |
| JP2022074167A (en) | Input control system | |
| WO2015056886A1 (en) | Method for controlling touch screen by detecting position of line of sight of user |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15847602; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15847602; Country of ref document: EP; Kind code of ref document: A1 |