
WO2019032014A1 - A touch-based virtual-reality interaction system - Google Patents


Info

Publication number
WO2019032014A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
user
sensitive apparatus
touch sensitive
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/SE2018/050781
Other languages
French (fr)
Inventor
Kristofer JAKOBSON
Tomas Christiansson
Mattias KRUS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FlatFrog Laboratories AB
Original Assignee
FlatFrog Laboratories AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FlatFrog Laboratories AB
Publication of WO2019032014A1
Current legal status: Ceased

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304: Detection arrangements using opto-electronic means
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547: Touch pads, in which fingers can move on a surface
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures


Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A touch-based virtual-reality (VR) interaction system is disclosed. The VR interaction system comprises a touch sensitive apparatus configured to receive touch input from a user, a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, a positioning unit configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user, and a processing unit configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system. The processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input. A related method is also disclosed.

Description

A touch-based virtual-reality interaction system
Technical Field
The present invention relates generally to the field of virtual-reality (VR) interaction systems. More particularly, the present invention relates to a touch-based VR interaction system and a related method.
Background
To an increasing extent, touch-sensitive panels are being used for providing input data to computers, gaming devices, presentation- and
conference systems etc. Alongside this development is the growing field of virtual-reality systems and applications. Virtual reality presents the user with an environment partially, if not fully, disconnected from the actual physical environment of the user. Various ways of interacting with this environment have been tried, including IR-tracked gloves, IR-tracked wands or other gesturing tools, and gyroscope-/accelerometer-tracked objects. The IR-tracked objects are typically tracked using one or more IR sensors configured to view and triangulate IR light sources on the tracked objects. Such interaction systems provide high-latency, low-accuracy user input to the virtual
environment. It would thus be advantageous to provide a VR interaction system with a high-precision interface.
Summary
It is an objective of the invention to at least partly overcome one or more of the above-identified limitations of the prior art.
One objective is to provide a VR interaction system with a high-precision interface.
Another objective is to provide a touch-based VR interaction system in which a VR user interacts with a high-precision touch sensitive apparatus in the physical reality whilst viewing the interaction in the virtual reality. One or more of these objectives, and other objectives that may appear from the description below, are at least partly achieved by means of a touch-based VR interaction system and a related method according to the independent claims, embodiments thereof being defined by the dependent claims.
According to a first aspect a touch-based virtual-reality (VR) interaction system is provided. The VR interaction system comprises a touch sensitive apparatus configured to receive touch input from a user, a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, a positioning unit configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user, and a processing unit configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system. The processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.
According to a second aspect a method in a touch-based VR interaction system is provided. The system has a touch sensitive apparatus configured to receive touch input from a user, and a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space. The method comprises providing spatial information of the position of the touch sensitive apparatus relative to the user, mapping the spatial position information of the touch sensitive apparatus to the VR environment coordinate system, and
communicating a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.
According to a third aspect a computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the second aspect.
Further examples of the invention are defined in the dependent claims, wherein features for the second and third aspects of the disclosure are as for the first aspect mutatis mutandis. Some examples of the disclosure provide for a VR interaction system with a high-precision interface.
Some examples of the disclosure provide for a touch-based VR interaction system in which a VR user interacts with a high-precision touch sensitive apparatus in the physical reality whilst viewing the interaction in the virtual reality.
Some examples of the disclosure provide for an enhanced VR experience via interaction with a touch panel.
Some examples of the disclosure provide for capturing input from a user's interaction with a VR environment with high accuracy.
It should be emphasized that the term "comprises/comprising" when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
Brief Description of the Drawings
These and other aspects, features and advantages of which examples of the invention are capable of will be apparent and elucidated from the following description of examples of the present invention, reference being made to the accompanying schematic drawings, in which:
Fig. 1 shows a touch-based virtual-reality (VR) interaction system according to examples of the disclosure;
Fig. 2 shows a touch-based VR interaction system according to examples of the disclosure;
Fig. 3 shows a touch-based VR interaction system according to examples of the disclosure;
Fig. 4 shows a touch-based VR interaction system according to examples of the disclosure;
Fig. 5 shows a touch-based VR interaction system according to examples of the disclosure;
Fig. 6 shows a touch-based VR interaction system according to examples of the disclosure;
Fig. 7 shows a touch-based VR interaction system according to examples of the disclosure; Fig. 8 shows a touch-based VR interaction system according to examples of the disclosure;
Fig. 9 shows a touch-based VR interaction system according to examples of the disclosure;
Fig. 10 shows a VR environment in which a plurality of virtual
representations of a touch sensitive apparatus is shown, according to examples of the disclosure; and
Fig. 11 is a flowchart of a method in a touch-based VR interaction system according to examples of the disclosure.
Detailed Description
Specific examples of the invention will now be described with reference to the accompanying drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these examples are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. The terminology used in the detailed description of the examples illustrated in the accompanying drawings is not intended to be limiting of the invention. In the drawings, like numbers refer to like elements.
Fig. 1 is a schematic illustration of a touch-based virtual-reality (VR) interaction system 100 comprising a touch sensitive apparatus 101 configured to receive touch input from a user, and a VR output device 102 configured to display a position of the user and a virtual representation of the touch input in a
VR environment coordinate system within a virtual space. The touch sensitive apparatus 101 may be configured to receive input using e.g. one or more fingers, a pointer or stylus etc. on a touch panel 101' of the touch sensitive apparatus 101. The VR output device 102 may be configured to be wearable by the user, and may thus comprise a VR headset. The VR output device 102 presents a virtual space to the user, as well as a virtual representation of the touch input, when the user provides touch input to the touch sensitive apparatus 101. A virtual representation of the user, such as one or more fingers, and/or a pointer or stylus may be presented in the VR output device 102 to facilitate orientation in the virtual space. The objects presented in the virtual space, such as the user, the virtual representation of the touch input, or any other (interactable) VR objects, thus have coordinates determined in the VR environment coordinate system for visualization via the VR output device 102. The VR coordinates may be determined by sensor devices configured to detect the location and movements of these objects. Further, the touch-based VR interaction system 100 comprises a positioning unit 103 configured to provide spatial position information of the position of the touch sensitive apparatus 101 relative to the user, and a processing unit 104 configured to map the spatial position information of the touch sensitive apparatus 101 to the VR environment coordinate system. The processing unit 104 is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus 101 to the VR output device 102 so that the touch sensitive apparatus 101 is displayed within the virtual space together with the virtual representation of the touch input. The VR user may thus reliably interact with a high-precision touch sensitive apparatus 101 in the physical reality whilst viewing the interaction in VR.
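To make the mapping concrete, the sketch below chains the panel's pose in the VR frame with a touch point expressed in the panel's own plane. It is a minimal illustration only; the 4x4 homogeneous-matrix representation, the coordinate units (metres) and the example pose are assumptions made for clarity rather than anything prescribed by the disclosure.
```python
import numpy as np

def pose_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def touch_to_vr(touch_uv_m: tuple, panel_pose_in_vr: np.ndarray) -> np.ndarray:
    """Map a 2D touch point, given in metres on the touch panel, to VR
    environment coordinates using the panel's pose in the VR frame."""
    u, v = touch_uv_m
    point_panel = np.array([u, v, 0.0, 1.0])      # the touch lies in the panel plane (z = 0)
    return (panel_pose_in_vr @ point_panel)[:3]   # 3D position in the VR coordinate system

# Example: panel rotated 90 degrees about x and placed 1.5 m in front of the user (assumed pose).
R = np.array([[1, 0, 0], [0, 0, -1], [0, 1, 0]], dtype=float)
panel_pose = pose_matrix(R, np.array([0.0, 1.0, 1.5]))
print(touch_to_vr((0.120, 0.080), panel_pose))    # VR coordinates of the touch
```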
Various input from the user's interaction with a VR environment may thus be captured with an increased accuracy. For example, touch input of fine details of a component for a machine presented in the VR space may be captured with the increased accuracy and low latency of the touch sensitive apparatus 101, details that otherwise would not be resolved by typical spatial sensors in previous VR systems. Mapping the position of the touch sensitive apparatus 101 to the VR environment further provides for an enhanced VR experience, combining the freedom of customizing different VR environments to the user's tasks with the tactile interaction provided by the touch sensitive apparatus 101. Moreover, the simultaneous interaction with the touch sensitive apparatus 101 allows for a more viable handling of user input from a VR environment, such as the communication of a user's input to various related systems and applications. A realistic and more practical utilization of VR may thus be provided, across a range of applications and technical fields.
There are numerous known techniques for providing touch sensitivity to the touch panel 101', e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, by using cameras to directly observe the objects interacting with the panel, or by incorporating resistive wire grids, capacitive sensors, strain gauges, etc. into the panel. In one category of touch-sensitive panels known as 'above surface optical touch systems', a plurality of optical emitters and optical receivers are arranged around the periphery of the touch surface of the panel 101' to create a grid of intersecting light paths (otherwise known as detection lines) above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.
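As an illustration of how a touch location can be recovered in such an above-surface optical system, the sketch below intersects two blocked detection lines. The emitter/receiver coordinates are invented for the example, and a real controller would combine many detection lines and work with attenuation levels rather than a binary blocked/clear decision.
```python
import numpy as np

def line_intersection(p1, p2, p3, p4):
    """Intersection of the line through p1-p2 with the line through p3-p4 (2D)."""
    d1, d2 = np.subtract(p2, p1), np.subtract(p4, p3)
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None                                # parallel detection lines never intersect
    t = ((p3[0] - p1[0]) * d2[1] - (p3[1] - p1[1]) * d2[0]) / denom
    return np.asarray(p1, dtype=float) + t * d1

def estimate_touch(blocked_lines):
    """Average the pairwise intersections of blocked detection lines.
    Each line is an (emitter_xy, receiver_xy) pair on the panel periphery."""
    points = []
    for i in range(len(blocked_lines)):
        for j in range(i + 1, len(blocked_lines)):
            p = line_intersection(*blocked_lines[i], *blocked_lines[j])
            if p is not None:
                points.append(p)
    return np.mean(points, axis=0) if points else None

# Two blocked paths crossing near (0.3, 0.2) on a 0.6 x 0.4 m panel (coordinates assumed).
blocked = [((0.3, 0.0), (0.3, 0.4)), ((0.0, 0.2), (0.6, 0.2))]
print(estimate_touch(blocked))                     # -> approximately [0.3, 0.2]
```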
The touch-based VR interaction system 100 may comprise at least one spatial marker 105, 105', arranged on the touch sensitive apparatus 101, as schematically illustrated in Fig. 2. The positioning unit 103 may be configured to track the at least one spatial marker 105, 105', to determine an associated position of the touch sensitive apparatus 101 relative to the user. The at least one spatial marker 105, 105', may comprise IR markers such as IR light sources, or any other marker configured for allowing tracking by the positioning unit 103, such as markers of different shapes and configurations being physically provided on parts of the touch sensitive apparatus 101 and/or displayed on the touch panel 101' thereof. Accurate mapping of the obtained spatial position information to the VR environment coordinate system may then be provided. Fig. 2 illustrates first and second spatial markers 105, 105', but it is conceivable that the number of spatial markers may be varied to provide for an optimized position detection.
The touch-based VR interaction system 100 may comprise an image sensor device 106 configured to be wearable by the user, as schematically illustrated in Figs. 3 - 8. The image sensor device 106 may be configured to capture image data 107, 107', 107", 107"', associated with the position of the touch sensitive apparatus 101 and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data, such as by a triangulation process of the obtained image data. Since the image sensor device 106 may be arranged at the position of the user, i.e. by being wearable, the relative position between the user and the touch sensitive apparatus 101 may be accurately determined. This provides for accurately determining the VR environment coordinates of the touch sensitive apparatus 101 and a precise positioning of the touch sensitive apparatus in the virtual space. Such precise positioning in the virtual space facilitates the interaction with the touch sensitive apparatus 101 when the user is immersed in the VR experience, since the virtual representation of the touch sensitive apparatus 101 may be precisely aligned with the physical touch sensitive apparatus 101. The touch-based VR interaction system 100 thus enables high-resolution input and allows more complex tasks to be carried out by the user in the VR space. The image sensor device 106 may be configured to capture image data 107 of the at least one spatial marker 105, 105', and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data. Fig. 3 illustrates an example where the image sensor device 106 locates the position of the touch sensitive apparatus 101 based on spatial markers 105, 105'. The processing unit 104 may then accurately map the retrieved spatial position information to the VR environment coordinate system.
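One conventional way a positioning unit could recover the panel's pose from the wearable camera's view of spatial markers with known positions on the apparatus is a perspective-n-point solve. The snippet below uses OpenCV purely for illustration; the disclosure does not prescribe any particular library, and the marker layout, detected pixel positions and camera intrinsics are assumed values.
```python
import cv2
import numpy as np

# Known 3D positions of the spatial markers in the panel's own frame (metres, assumed layout).
marker_points_panel = np.array([[0.0, 0.0, 0.0],
                                [0.6, 0.0, 0.0],
                                [0.6, 0.4, 0.0],
                                [0.0, 0.4, 0.0]], dtype=np.float32)

# Pixel positions of the same markers as detected in the wearable camera image (assumed).
marker_points_image = np.array([[310, 240], [820, 255], [805, 610], [300, 590]],
                               dtype=np.float32)

# Intrinsics of the wearable image sensor device (assumed calibration).
K = np.array([[900.0, 0.0, 640.0],
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)

# Pose of the panel relative to the camera, i.e. relative to the user wearing it.
ok, rvec, tvec = cv2.solvePnP(marker_points_panel, marker_points_image, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)        # rotation matrix from the panel frame to the camera frame
    print("panel position relative to user (m):", tvec.ravel())
```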
The image sensor device 106 may be configured to capture image data 107' displayed by the touch sensitive apparatus 101 and communicate the image data to the positioning unit 103, as schematically illustrated in Fig. 4. The image data 107' displayed by the touch sensitive apparatus 101 may comprise objects of varying shapes and configurations that allow for a calibration of the position of the touch sensitive apparatus 101 in the VR environment coordinate system. A flexible and highly optimizable calibration may thus be provided, since the displayed image data may be varied for different conditions and applications.
The touch sensitive apparatus 101 may be configured to display image data comprising at least one orientation tag 107", as schematically illustrated in Fig. 5. The positioning unit 103 may be configured to track the position of the at least one orientation tag 107" to determine an associated position of the touch sensitive apparatus 101 relative to the user. The number of tags 107" displayed and the configurations thereof may vary to provide for a precise positioning procedure and a VR environment which is accurately anchored to the physical reality, i.e. the touch sensitive apparatus 101.
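Because the orientation tags are rendered at known display coordinates on a planar panel, tag detections in the camera image give planar correspondences from which the panel plane can be located, for instance via a homography. The following sketch is one assumed way to do this; the tag positions and detections are invented, and the disclosure does not name this estimator.
```python
import cv2
import numpy as np

# Positions of the displayed orientation tags in the panel's display coordinates (mm, assumed).
tags_on_panel = np.array([[50, 50], [550, 50], [550, 350], [50, 350]], dtype=np.float32)

# Where the same tags were detected in the wearable camera image (pixels, assumed detector output).
tags_in_image = np.array([[212, 148], [901, 161], [884, 589], [198, 570]], dtype=np.float32)

# Homography mapping panel display coordinates to image coordinates.
# With more tags, a robust method (e.g. cv2.RANSAC) could discard misdetections.
H, _ = cv2.findHomography(tags_on_panel, tags_in_image)
print(H)
# Given the camera intrinsics, H can further be decomposed into the panel's rotation and
# translation relative to the user (e.g. with cv2.decomposeHomographyMat).
```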
The touch sensitive apparatus 101 may be configured to display a calibration image 108 at (or at a defined distance to) the position of a user input device 109 on the touch sensitive apparatus 101 when the touch sensitive apparatus receives touch input from the user input device 109, as schematically illustrated in Fig. 6. The image sensor device 106 may be configured to capture image data comprising the calibration image 108 and the user input device 109 and/or the user 111. The positioning unit 103 may be configured to determine an orientation of the user input device 109 and/or the user 111 (such as one or more fingers, hand or lower arm of the user) relative to the touch sensitive apparatus 101 based on a projected image 110 of the user input device 109 and/or the user 111 on the calibration image 108. Thus, by observing which parts of the calibration image 108 are obscured by the user input device 109 and/or the user 111, the positioning unit 103 may determine the orientation, position, or dynamics of the movement, such as the speed or acceleration, of the user input device 109 and/or the user 111. Such spatial position information is then mapped to the VR environment coordinate system as described, which provides for a facilitated interaction with the touch sensitive apparatus 101, e.g. by displaying a virtual representation of the user input device 109 and/or the user 111 in the VR space. This also provides sufficient information to allow effective palm rejection, e.g. by identifying a stylus tip from the projected image and ignoring all other touches around that stylus tip position. As the user is usually looking at their hand when interacting with the touch panel, the calibration image 108 is advantageously displayed around the user input device 109 and/or the hand of the user 111.
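A minimal form of that palm-rejection logic might look like the sketch below; the touch layout, the radius threshold and the data format are assumptions for illustration, not values from the disclosure.
```python
def reject_palm_touches(touches, stylus_tip, radius=0.08):
    """Keep the contact closest to the identified stylus tip and drop every other
    contact within `radius` metres of the tip (assumed to be the resting palm).

    touches    -- list of (x, y) contacts reported by the touch sensitive apparatus
    stylus_tip -- (x, y) tip position identified from the projected image
    """
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

    if not touches:
        return []
    # The contact nearest the stylus tip is treated as the intended input.
    pen = min(touches, key=lambda t: dist2(t, stylus_tip))
    # Everything else near the tip is assumed to be palm contact and ignored.
    return [pen] + [t for t in touches if t is not pen and dist2(t, stylus_tip) > radius ** 2]

touches = [(0.31, 0.20), (0.36, 0.22), (0.35, 0.18), (0.10, 0.35)]
print(reject_palm_touches(touches, stylus_tip=(0.31, 0.20)))
```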
The touch sensitive apparatus 101 may be configured to display the calibration image 108 tracking the position of the user input device 109, and/or the user 111, on the touch sensitive apparatus 101. The calibration image 108 may thus follow the position of the user input device 109, and/or the user 111, on the touch sensitive apparatus 101, which may improve the detection of the above-mentioned spatial position information.
The touch-based VR interaction system 100 may comprise a light emitter 116 arranged at a determined spatial position relative to the touch sensitive apparatus 101, as schematically illustrated in Fig. 8. The image sensor device 106 may be configured to capture image data 107"' of light emitted by the light emitter and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data. The light may be IR light or light of any other wavelength suitable for detection by the image sensor device 106.
The image sensor device 106 may be arranged at the VR output device 102, as schematically illustrated in Figs. 3 - 8. It is conceivable however that the image sensor device 106 may be displaced from the VR output device 102 but at a predetermined distance from the touch sensitive apparatus 101 and communicating with the positioning unit 103, so that the image data 107 - 107"' may be received by the positioning unit 103.
The touch-based VR interaction system 100 may comprise a second image sensor device 113, 113', arranged on the touch sensitive apparatus 101, as schematically illustrated in Fig. 7. The second image sensor device 113, 113', may be configured to capture image data of the user 111 and/or a user input device 109 and communicate the image data to the positioning unit 103, which is configured to determine an orientation of the user 111 and/or a user input device 109 relative to the touch sensitive apparatus 101 based on the captured image data. The second image sensor device 113, 113', may comprise depth cameras for accurately determining the spatial positioning information. Inertia sensors may also track the movement of the user input device 109 for defined periods of time, such as the time between letters when writing a word. The positioning unit 103 may determine the orientation, position, or dynamics of the movement, such as the speed or acceleration, of the user input device 109 and/or the user 111 from the image data. The processing unit 104 may subsequently map such spatial position information to the VR environment coordinate system as described above for providing a precise representation of the user input device 109 and/or the user 111 in the VR space. The accuracy of the virtual representation of the touch input in the VR environment coordinate system may thus be improved so that the user may experience a more direct connection between physical movements of e.g. the input device 109 and the resulting virtual presentation, which is critical for fine touch input gestures, e.g. in high-resolution tasks. Such improved VR representation and tracking of the user input device 109 and/or the user 111 is also advantageous for avoiding disorientation of the user.
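The movement dynamics mentioned above (speed and acceleration of the input device) can be approximated from timestamped tracked positions by finite differences, as in the following sketch; the sampling rate and sample values are assumed.
```python
import numpy as np

def movement_dynamics(timestamps, positions):
    """Estimate per-sample velocity and acceleration of a tracked user input
    device from timestamped 3D positions, using finite differences."""
    t = np.asarray(timestamps, dtype=float)
    p = np.asarray(positions, dtype=float)
    velocity = np.gradient(p, t, axis=0)             # m/s
    acceleration = np.gradient(velocity, t, axis=0)  # m/s^2
    speed = np.linalg.norm(velocity, axis=1)
    return velocity, acceleration, speed

# Stylus tip positions sampled at roughly 60 Hz while writing (values assumed).
t = [0.000, 0.016, 0.033, 0.050]
p = [[0.10, 0.20, 0.0], [0.11, 0.21, 0.0], [0.13, 0.21, 0.0], [0.16, 0.20, 0.0]]
vel, acc, speed = movement_dynamics(t, p)
print(speed)
```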
The processing unit 104 may thus be configured to map spatial position information associated with the determined orientation of the user 111 and/or a user input device 109 to the VR environment coordinate system, and the VR output device 102 may be configured to display the orientation of the user 111 and/or a user input device 109 in the virtual space.
The positioning unit 103 may be configured to determine a calibration position of a user input device 109 in the VR environment coordinate system when touching at least one physical coordinate 112 on the touch sensitive apparatus 101 (i.e. on the touch panel 101' thereof). The processing unit 104 may be configured to map the position of the at least one physical coordinate to the VR environment coordinate system by registering the at least one physical coordinate to the calibration position when detecting the touch of the at least one physical coordinate 112. Thus, if the user has a tracked user input device 109, such as VR gloves or the like, the user may calibrate the position of the touch sensitive apparatus 101 in the VR space with a few touches on the touch panel 101'. Each touch with the user input device 109 connects the respective physical coordinate at the touch site of the touch panel 101' with the coordinate of the user input device 109 in the VR environment coordinate system, captured at the same point in time.
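Registering a handful of touched physical coordinates to the tracked calibration positions amounts to estimating a rigid transform between two small point sets. One standard way to do this is the SVD-based (Kabsch) solution sketched below; the point values are assumed and the disclosure does not mandate this particular algorithm.
```python
import numpy as np

def rigid_transform(panel_points, vr_points):
    """Least-squares rotation R and translation t with vr = R @ panel + t,
    estimated from corresponding points (Kabsch / SVD)."""
    P = np.asarray(panel_points, dtype=float)
    Q = np.asarray(vr_points, dtype=float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t

# Three touched physical coordinates on the panel (panel frame, metres, assumed).
panel = [[0.05, 0.05, 0.0], [0.55, 0.05, 0.0], [0.55, 0.35, 0.0]]
# Positions of the tracked user input device in VR coordinates at the same instants (assumed).
vr = [[1.02, 0.74, 1.48], [1.52, 0.74, 1.48], [1.52, 1.04, 1.48]]
R, t = rigid_transform(panel, vr)
print(np.round(R @ np.array([0.05, 0.05, 0.0]) + t, 3))   # reproduces the first VR point
```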
The VR output device 102 may be configured to display the touch sensitive apparatus as a plurality of virtual representations 114 thereof in the virtual space, as schematically illustrated in Fig. 10. The processing unit 104 may be configured to associate at least a second virtual representation 115 of the plurality of virtual representations 114 with a second set of VR environment coordinates in response to a user input so that the VR output device 102 displays the second virtual representation 115 as being separated within the virtual space from a virtual representation 115' of the touch sensitive apparatus receiving touch input. For example, it is conceivable that the VR output device 102 displays a presentation session in the VR space in one application, in which a plurality of virtual representations 114 of a touch sensitive apparatus is displayed to a user 111 or a plurality of users. A user 111 may interact with a first virtual representation 115' of the touch sensitive apparatus. The user may subsequently provide a dedicated touch input, such as a swipe gesture, to shift the first virtual representation 115' to a different location in the VR space (e.g. as denoted by reference 115 in Fig. 10) and continue interaction with another virtual representation of the touch sensitive apparatus in the VR space, but with the same physical touch sensitive apparatus 101. Hence, a plurality of virtual representations 114 may be arranged in the VR space for viewing and further interaction by the participating VR users. A user may then 'activate' any of the virtual representations 114 for touch input, by again anchoring a virtual representation 115 to the VR coordinates represented by the touch sensitive apparatus 101. The virtual representation 115' aligned with the physical touch sensitive apparatus 101 may be highlighted, e.g. with a different color, in the VR space to facilitate user orientation. The touch-based VR interaction system 100 thus provides for a highly dynamic interaction with the freedom to utilize the VR space while ensuring that all of the user's input is structured and retained, with high resolution and accuracy. It is conceivable that several touch sensitive apparatuses 101 are connected over a communication network, where the touch-based VR interaction system 100 incorporates the touch sensitive apparatuses 101 so that simultaneous input to the plurality of touch panels 101' can be provided and mapped to the VR space for simultaneous interaction and viewing by a plurality of users in a network.
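The bookkeeping behind several virtual representations of one physical panel can be reduced to tracking which representation is currently anchored to the panel's VR coordinates. The following schematic model is an assumption made for illustration (class names, gesture handling and poses are invented) and is not taken from the disclosure.
```python
from dataclasses import dataclass, field

@dataclass
class VirtualPanel:
    pose_vr: tuple                 # VR environment coordinates of this representation
    content: list = field(default_factory=list)
    anchored: bool = False         # True when aligned with the physical apparatus

class PanelSessionManager:
    """Keeps one virtual representation anchored to the physical touch panel and
    lets a dedicated gesture park it elsewhere in the virtual space."""

    def __init__(self, physical_pose_vr):
        self.physical_pose_vr = physical_pose_vr
        self.panels = [VirtualPanel(pose_vr=physical_pose_vr, anchored=True)]

    @property
    def active(self):
        return next(p for p in self.panels if p.anchored)

    def on_touch(self, point):
        self.active.content.append(point)          # touch input lands on the anchored panel

    def park_active(self, new_pose_vr):
        """Swipe gesture: move the current representation away and spawn a fresh
        anchored one at the physical panel's VR coordinates."""
        panel = self.active
        panel.anchored = False
        panel.pose_vr = new_pose_vr
        self.panels.append(VirtualPanel(pose_vr=self.physical_pose_vr, anchored=True))

    def activate(self, panel: VirtualPanel):
        """Re-anchor a parked representation to the physical panel for further input."""
        self.active.anchored = False
        panel.anchored = True
        panel.pose_vr = self.physical_pose_vr

mgr = PanelSessionManager(physical_pose_vr=(0.0, 1.0, 1.5))
mgr.on_touch((0.2, 0.1))
mgr.park_active(new_pose_vr=(-1.0, 1.2, 1.5))      # shift the filled sheet aside
mgr.on_touch((0.3, 0.2))                           # input now lands on the new sheet
print(len(mgr.panels), [p.anchored for p in mgr.panels])
```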
Fig. 11 illustrates a flow chart of a method 200 in a touch-based VR interaction system. The order in which the steps of the method 200 are described and illustrated should not be construed as limiting and it is
conceivable that the steps can be performed in varying order. As mentioned, the touch-based VR interaction system 100 has a touch sensitive apparatus 101 configured to receive touch input from a user, and a VR output device 102 configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space. The method 200 comprises providing 201 spatial information of the position of the touch sensitive apparatus 101 relative to the user, and mapping 202 the spatial position information of the touch sensitive apparatus 101 to the VR environment coordinate system. The method 200 comprises communicating 203 a set of VR environment coordinates of the touch sensitive apparatus to the VR output device 102 so that the touch sensitive apparatus 101 is displayed 204 within the virtual space together with the virtual representation of the touch input. The method 200 thus provides for the advantageous benefits as described above in relation to the system 100 and Figs. 1 - 10.
A computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method 200.
The present invention has been described above with reference to specific examples. However, other examples than the above described are equally possible within the scope of the invention. The different features and steps of the invention may be combined in other combinations than those described. The scope of the invention is only limited by the appended patent claims.
More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings of the present invention is/are used.

Claims

Claims
1. A touch-based virtual-reality (VR) interaction system (100) comprising a touch sensitive apparatus (101) configured to receive touch input from a user,
a VR output device (102) configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space,
a positioning unit (103) configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user,
a processing unit (104) configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system, whereby the processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.
2. Touch-based VR interaction system according to claim 1, comprising at least one spatial marker (105, 105') arranged on the touch sensitive apparatus, wherein the positioning unit is configured to track the at least one spatial marker to determine an associated position of the touch sensitive apparatus relative to the user.
3. Touch-based VR interaction system according to claim 1 or 2, comprising
an image sensor device (106) configured to be wearable by the user, and wherein the image sensor device is configured to capture image data (107, 107', 107", 107"') associated with the position of the touch sensitive apparatus and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.
4. Touch-based VR interaction system according to claims 2 and 3, wherein the image sensor device is configured to capture image data (107) of the at least one spatial marker and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.
5. Touch-based VR interaction system according to claim 3 or 4, wherein the image sensor device is configured to capture image data (107') displayed by the touch sensitive apparatus and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.
6. Touch-based VR interaction system according to claim 5, wherein the touch sensitive apparatus is configured to display image data comprising at least one orientation tag (107"), and wherein the positioning unit is configured to track the position of the at least one orientation tag to determine an associated position of the touch sensitive apparatus relative to the user.
7. Touch-based VR interaction system according to any of claims 3 - 6, wherein the touch sensitive apparatus is configured to display a calibration image (108) at the position of a user input device (109) on the touch sensitive apparatus when the touch sensitive apparatus receives touch input from the user input device, whereby the image sensor device is configured to capture image data comprising the calibration image and the user input device and/or the user (111), wherein the positioning unit is configured to determine an orientation of the user input device and/or the user relative to the touch sensitive apparatus based on a projected image (110) of the user input device and/or the user on the calibration image.
8. Touch-based VR interaction system according to claim 7, wherein the touch sensitive apparatus is configured to display the calibration image tracking the position of the user input device on the touch sensitive apparatus.
9. Touch-based VR interaction system according to any of claims 3 - 8, comprising a light emitter (116) arranged at a determined spatial position relative to the touch sensitive apparatus, and wherein the image sensor device is configured to capture image data (107"') of light emitted by the light emitter and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine the position of the touch sensitive apparatus relative to the user based on the captured image data.
10. Touch-based VR interaction system according to any of claims 3 - 9, wherein the image sensor device is arranged at the VR output device.
11. Touch-based VR interaction system according to any of claims 1 - 10, comprising a second image sensor device (113, 113') arranged on the touch sensitive apparatus, wherein the second image sensor device is configured to capture image data of the user (111) and/or a user input device (109) and communicate the image data to the positioning unit, wherein the positioning unit is configured to determine an orientation of the user and/or a user input device relative to the touch sensitive apparatus based on the captured image data.
12. Touch-based VR interaction system according to claim 7 or 11, wherein the processing unit is configured to map spatial position information associated with the determined orientation of the user and/or a user input device to the VR environment coordinate system, and wherein the VR output device is configured to display the orientation of the user and/or a user input device in the virtual space.
13. Touch-based VR interaction system according to any of claims 1 - 12, wherein the positioning unit is configured to determine a calibration position of a user input device (109) in the VR environment coordinate system when touching at least one physical coordinate (112) on the touch sensitive apparatus, whereby the processing unit is configured to map the position of the at least one physical coordinate to the VR environment coordinate system by registering the at least one physical coordinate to the calibration position when detecting the touch of the at least one physical coordinate.
14. Touch-based VR interaction system according to any of claims 1 - 13, wherein the VR output device is configured to display the touch sensitive apparatus as a plurality of virtual representations (114) thereof in the virtual space, wherein the processing unit is configured to associate at least a second virtual representation (115) of the plurality of virtual representations with a second set of VR environment coordinates in response to a user input so that the VR output device displays the second virtual representation as being separated within the virtual space from a virtual representation (115') of the touch sensitive apparatus receiving touch input.
15. A method (200) in a touch-based virtual-reality (VR) interaction system (100) having a touch sensitive apparatus (101) configured to receive touch input from a user, and a VR output device (102) configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, the method comprising
providing (201) spatial information of the position of the touch sensitive apparatus relative to the user,
mapping (202) the spatial position information of the touch sensitive apparatus to the VR environment coordinate system, and
communicating (203) a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed (204) within the virtual space together with the virtual representation of the touch input.
16. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to claim 15.