WO2016041088A1 - System and method for locating wearable peripherals in augmented reality and virtual reality applications - Google Patents
System and method for locating wearable peripherals in augmented reality and virtual reality applications
- Publication number
- WO2016041088A1 (PCT/CA2015/050918)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- hmd
- location
- orientation
- motion
- physical environment
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/66—Sonar tracking systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/66—Tracking systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- the following relates generally to wearable technologies, and more specifically to tracking of a head mounted device, user gestures and peripheral devices in augmented reality and virtual reality environments.
- AR augmented reality
- VR virtual reality
- a system for generating a rendered image stream for display to a user of a head mounted device (HMD) for augmented reality and virtual reality applications comprising: a location, motion and orientation (LMO) system disposed upon each of a plurality of tracked dynamic objects, including the HMD, within the physical environment, the LMO system being configured to provide location, motion and orientation information for each tracked dynamic object; a memory having stored thereon data comprising a correlation between the location, motion and orientation information for the HMD and a field of view of the HMD; and a processor configured to: map the physical environment to a virtual map; obtain from each LMO system, the location, motion and orientation information for each tracked dynamic object; repeatedly and substantially continuously map to the virtual map the substantially real-time location and orientation of each tracked dynamic object within the physical environment; for the HMD, obtain the data from the memory, determine the location and orientation of the field of view based on the correlation and the location and orientation of the HMD, and repeatedly and substantially continuously map to the virtual map the field of view.
- a method for generating and displaying a rendered image stream to a user of a head mounted device (HMD) for augmented reality and virtual reality applications comprising: obtaining from a memory, data comprising a correlation between the location, motion and orientation information for the HMD and a field of view of the HMD; mapping the physical environment to a virtual map; obtaining from a plurality of tracked dynamic objects, including the HMD, within the physical environment, location, motion and orientation information; repeatedly and substantially continuously mapping to the virtual map the substantially real-time location and orientation of each tracked dynamic object within the physical environment; by one or more processors, for the HMD, determining the location and orientation of the field of view based on the correlation and the location and orientation of the HMD, and repeatedly and substantially continuously mapping to the virtual map the field of view; rendering computer generated images related to at least one of the tracked dynamic objects; transforming the coordinates of the computer generated images from a coordinate system of the virtual map to a coordinate system of a display of the HMD; and displaying, on the display, the rendered image stream comprising the transformed computer generated images.
- a system for tracking dynamic objects in a physical environment to render an image stream for a head mounted device (HMD) for augmented reality and virtual reality applications comprising: a location, motion and orientation (LMO) system disposed upon each tracked dynamic object, configured to provide location, motion and orientation information for each respective tracked dynamic object; and a processor communicatively coupled to the location, motion and orientation systems, the processor configured to: map the physical environment to a virtual map; obtain the location, motion and orientation information for each tracked dynamic object; and repeatedly and substantially continuously map to the virtual map the location and orientation of each tracked dynamic object within the physical environment.
- LMO location, motion and orientation
- a method for tracking dynamic objects in a physical environment to render an image stream for a head mounted device (HMD) for augmented reality and virtual reality applications comprising: mapping the physical environment to a virtual map; obtaining from a plurality of tracked dynamic objects, within the physical environment, location, motion and orientation information; and by one or more processors, repeatedly and substantially continuously mapping to the virtual map the location and orientation of each tracked dynamic object within the physical environment.
- HMD head mounted device
- a head mounted device for augmented reality and virtual reality applications
- the HMD comprising: a location, motion and orientation system disposed upon the HMD for providing location, motion and orientation information
- the location, motion and orientation system comprising: a positioning unit to provide a determined location and orientation of the HMD with reference to a physical environment surrounding the HMD; an inertial measurement unit to provide motion information, comprising a gyroscope and an accelerometer; and a processor communicatively coupled to the location, motion and orientation system, configured to calculate the substantially real-time location and orientation of the HMD based on adding the motion information from the inertial measurement unit to a most recent determined location and orientation from the information provided by the positioning unit.
- a method to determine a substantially real-time location and orientation of a head mounted device (HMD) for augmented reality and virtual reality applications comprising: obtaining an absolute location and orientation of the HMD with reference to a physical environment surrounding the HMD; subsequently obtaining motion and orientation information from a gyroscope disposed upon the HMD and further motion and orientation information from an accelerometer disposed upon the HMD to correct a rotational bias of the gyroscope; and calculating, by one or more processors, the substantially real-time location and orientation of the HMD based on correcting the rotational bias of the gyroscope and adding the corrected motion and orientation information from the gyroscope to a most recent absolute location and orientation from the information provided by the positioning unit.
- HMD head mounted device
- Fig. 1 illustrates in schematic form a system for tracking dynamic objects in a physical environment occupied by a plurality of users equipped with head mounted devices;
- Fig. 2 illustrates an exemplary configuration of location, motion and orientation systems upon a user equipped with an HMD;
- Fig. 3 illustrates an exemplary configuration of a head mounted device;
- Fig. 4 illustrates a method for mapping to a virtual map the respective locations and orientations of dynamic objects in a physical environment;
- Fig. 5 illustrates a field of view for a head mounted device in a physical environment; and
- Fig. 6 illustrates a method for generating a rendered image stream for a head mounted device.
- Any module, unit, component, server, computer, terminal or device exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical discs, or tape.
- Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto.
- any processor or controller set out herein may be implemented as a singular processor or as a plurality of processors. The plurality of processors may be arrayed or distributed, and any processing function referred to herein may be carried out by one or by a plurality of processors, even though a single processor may be exemplified. Any method, application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media and executed by the one or more processors.
- Referring to Fig. 1 and Fig. 3, an exemplary scenario is shown in which multiple users occupy a physical environment.
- the users are equipped with HMDs 12 and peripherals 13.
- Each dynamic object may be equipped with a location, motion and orientation (LMO) system to provide LMO information.
- LMO location, motion and orientation
- a processor 130 obtains the LMO information from the dynamic objects and maps their respective locations and orientations in a virtual map, as hereinafter described in greater detail.
- the processor 130 may use the mapped orientations and locations for the dynamic objects to generate dynamic object-related computer generated imagery (CGI).
- CGI computer generated imagery
- the processor 130 may generate a rendered image stream comprising the CGI for display to the user on the display of the user's HMD 12.
- the processor may be distributed amongst the components occupying the physical environment, elsewhere within the physical environment, or in a server 300 in communication with a network 17 accessible from the physical environment.
- the processor 130 may be distributed between the HMDs 12 and a console 11, or over the Internet via the network 17.
- Each user's HMD 12 may communicate with the user's peripheral 13, or the HMDs and peripherals 13 may communicate directly with the console 11 or the server 300 located over a network 17 accessible from the physical environment, as shown.
- Referring to Fig. 2, an exemplary configuration of LMO systems 21 disposed upon a user 1 is shown.
- the user 1 may be equipped with an HMD 12, as well as LMO systems 21 disposed upon her hands and feet, and one which is disposed upon the HMD 12.
- Each LMO system 21 may provide LMO information for the limb or body part upon which it is disposed.
- Referring to Fig. 3, an exemplary HMD 12 embodied as a helmet is shown; however, other embodiments are contemplated.
- the HMD 12 may comprise a processor 130 for generating a rendered image stream comprising CGI.
- the processor may be located apart from the HMD 12.
- the processor may be communicatively coupled to the following components of the HMD 12: (i) a scanning system 132 for scanning the physical environment surrounding the HMD 12; (ii) an LMO system 141 comprising a local positioning unit for determining the HMD's 12 location within the physical environment, a motion sensing module for detecting the HMD's 12 motions within the physical environment, and/or an orientation detection module for detecting the orientation of the HMD 12; (iii) an imaging system 123, such as, for example, a camera system comprising one or more cameras, to capture a physical image stream of the physical environment; (iv) a display system 121 for displaying to a user of the HMD 12 the physical image stream and/or the rendered image stream; (v) a power management system 113 for distributing power to the components; (vi) a sensory feedback system 120 comprising, for example, haptic feedback devices, for providing sensory feedback to the user of the HMD 12.
- the HMD 12 may further comprise a wireless communication system 126 having, for example, antennae, to communicate with other components in an AR and/or VR system, such as, for example, other HMDs, peripherals, actors, a gaming console, or a router.
- An exemplary peripheral in the form of a controller 13 is shown.
- the controller 13 comprises a toggle-type actuator 131 for receiving user input.
- Each LMO system may comprise a communication module configured to transmit LMO information for the system to the processor, or communication of LMO information may be routed through HMDs, as shown in Fig. 1.
- communication between an LMO system worn by a user or disposed upon a peripheral held by a user may be effected via a matched transmitter-receiver signal pair between an HMD and the LMO system.
- an LMO system for a user's peripheral, or which is worn by a user may comprise a transmitter of a receiver-transmitter pair; the receiver of the receiver-transmitter pair may be disposed upon an HMD worn by the user.
- the transmitter and receiver may be configured to cooperatively provide LMO information to the processor via the HMD.
- the receiver may be disposed upon the peripheral while the transmitter may be disposed upon the HMD, and the LMO information for the HMD and peripheral may be provided to the processor via the peripheral.
- the transmitter may transmit a signal comprising the LMO information for the transmitter relative to the receiver.
- the transmitter obtains LMO information from its corresponding LMO system and encodes the LMO information into a signal.
- the transmitter transmits the signal to the paired receiver.
- the transmitter and receiver repeatedly and continuously (or substantially continuously, at a suitable frequency) transmit and receive, respectively, signals to exchange substantially real-time LMO information for the LMO system.
- the receiver provides the transmitted LMO information from the transmitter to the processor.
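- As a minimal sketch of this relay pattern (the message fields, names and encoding below are illustrative assumptions; the patent does not prescribe a wire format), the transmitter may encode its LMO readings into a payload that the paired receiver decodes and forwards to the processor:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class LMOReading:
    """One location/motion/orientation sample, relative to the paired receiver."""
    device_id: str
    t: float            # sample timestamp (seconds)
    position: tuple     # (x, y, z) relative to the receiver
    orientation: tuple  # (roll, pitch, yaw) in radians
    velocity: tuple     # (vx, vy, vz)

def encode(reading: LMOReading) -> bytes:
    # Transmitter side: encode the reading into a transmittable payload
    # (JSON here for clarity; a real link would likely use a compact binary frame).
    return json.dumps(asdict(reading)).encode()

def on_receive(payload: bytes, forward_to_processor) -> None:
    # Receiver side: decode and forward the substantially real-time reading.
    reading = LMOReading(**json.loads(payload))
    forward_to_processor(reading)

# Example exchange, as would be repeated at a suitable frequency:
sample = LMOReading("wrist_lmo_1", time.time(),
                    (0.1, -0.3, 0.4), (0.0, 0.1, 1.2), (0.0, 0.0, 0.0))
on_receive(encode(sample), forward_to_processor=print)
```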
- the transmitter and receiver pair may communicate using ZigBee, Bluetooth, WiFi, radio-frequency (RF) or other suitable wireless communication protocol.
- the transmitter may be a magnetic source and the receiver may be a magnetic receiver/antenna, or vice versa.
- the pair can be configured so that the receiver is able to detect the field of the source, so that the transmitter-receiver pair may serve as both an LMO system and a communication system.
- the LMO information of the peripheral may be provided to the processor as LMO information relative to the LMO information of the HMD.
- the processor may then map the substantially real-time location and orientation of the peripheral according to a transformation to transform the LMO information of the peripheral, which is defined relative to the LMO information of the HMD, to LMO information relative to the physical environment.
- the LMO system of the HMD may return the substantially instantaneous location and orientation of the HMD relative to the coordinates of the physical environment, while the LMO system of the peripheral may cooperate with the LMO system of the HMD to provide LMO information relative to the HMD.
- the transformation may therefore be implemented in order to redefine the LMO information from the peripheral into the coordinate system defining the physical environment.
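- A minimal sketch of this transformation, assuming poses are represented as 4x4 homogeneous matrices (a common convention; the patent does not fix a representation): composing the HMD's pose in the physical environment with the peripheral's HMD-relative pose redefines the peripheral's coordinates in the coordinate system of the physical environment.

```python
import numpy as np

def pose_matrix(position, yaw):
    """Build a 4x4 homogeneous pose from a position and a yaw angle
    (rotation about z only, to keep the sketch short)."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = position
    return T

def peripheral_world_pose(T_world_hmd, T_hmd_peripheral):
    # Composing the two poses re-expresses the peripheral's LMO information
    # in the coordinate system of the physical environment.
    return T_world_hmd @ T_hmd_peripheral

T_world_hmd = pose_matrix([2.0, 1.0, 1.7], yaw=np.pi / 2)  # HMD pose from its LMO system
T_hmd_per = pose_matrix([0.3, 0.0, -0.4], yaw=0.0)         # peripheral pose relative to the HMD
print(peripheral_world_pose(T_world_hmd, T_hmd_per)[:3, 3])  # peripheral location in world coordinates
```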
- a user may be equipped with a peripheral comprising a paired transmitter-receiver LMO system, while a display in the physical environment which the user is occupying may comprise the opposite half of the paired transmitter-receiver LMO system.
- the display may be, for example, an LCD, LED, OLED or plasma television or monitor.
- a processor within the physical environment or within the display, and in communication with the display and its LMO system, may be configured to obtain the LMO information of the peripheral relative to the location and orientation of the display so that the user's gestures may be tracked by the processor for engaging with the display. A user may thereby interact with the display using gestures.
- An LMO system may comprise one or more of the following: (i) a positioning sensor configured to track a location, the positioning sensor comprising, for example, one or more of a magnetic, ultrasound, radio-frequency (RF), LiDAR or radar type of sensor, whether alone or in combination; and (ii) a motion sensor configured to track motions, the motion sensor comprising, for example, one or more of the following: an accelerometer, a gyroscope, a magnetometer, whether alone or in combination.
- the sensors in the LMO system may be, for example, MEMS-type sensors. Certain sensor configurations may provide all LMO information as a single suite.
- the LMO system may comprise a 3D magnetic positioning unit, which alone may provide real-time location, motion and orientation information.
- the magnetic positioning unit provides the location and orientation in absolute terms.
- "absolute" refers to locations and orientations determined relative to other features within the surrounding physical environment at a given point in time.
- dead reckoning or relative positioning relies on determining positions and orientations based on measuring and adding motion information.
- a magnetic positioning unit alone may be sufficiently fast and accurate to provide location and orientation information.
- in some applications, however, the typical refresh rates of a magnetic positioning unit may be too low to be suitable.
- certain magnetic positioning units are susceptible to magnetic "noise" in some environments which may render their readings ambiguous or unreliable. In such environments, it may be preferable, therefore, to implement other or further sensing systems.
- the LMO system and the processor cooperate to implement dead-reckoning techniques, i.e., techniques in which relative changes to the motion and orientation of the LMO system are added to a known initial location and orientation to return the location of the LMO system at a given point in time.
- Dead reckoning location calculation may suffer from cumulative errors and, further, the initial location and orientation of the sensor must be known or determined in certain applications. Therefore, in an embodiment of an LMO system using dead-reckoning techniques, the LMO system comprises an inertial measurement unit (IMU) having a combination of an accelerometer and a gyroscope. Gyroscopes frequently experience latent rotational bias, while accelerometers may provide relatively less accurate dead-reckoning measurements.
- IMU inertial measurement unit
- the accelerometer preferably serves as a reference for the readings of the gyroscope from which the processor may determine the degree of rotational bias exhibited by the gyroscope and adjust the readings accordingly.
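- One common way to realize this correction is a complementary filter, sketched below for a single pitch axis (the filter choice and the constant alpha are assumptions; the patent only requires that the accelerometer serve as a reference against gyroscope bias):

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel, dt, alpha=0.98):
    """Fuse a gyroscope rate with an accelerometer gravity reference.

    angle_prev : previous pitch estimate (radians)
    gyro_rate  : gyroscope angular rate about the pitch axis (rad/s)
    accel      : (ax, ay, az) accelerometer reading at roughly 1 g
    alpha      : trust in the integrated gyro; (1 - alpha) corrects its drift
    """
    ax, ay, az = accel
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))  # gravity-referenced pitch
    gyro_pitch = angle_prev + gyro_rate * dt           # dead-reckoned pitch
    # The accelerometer term continuously pulls the estimate back,
    # keeping the gyroscope's latent rotational bias from accumulating.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

pitch = 0.0
for gyro_rate, accel in [(0.02, (0.0, 0.0, 9.81))] * 100:  # biased gyro, level device
    pitch = complementary_filter(pitch, gyro_rate, accel, dt=0.01)
print(f"pitch after 1 s: {pitch:.4f} rad")  # stays bounded despite the bias
```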
- the LMO system may comprise a positioning sensor, or the initial orientation may be determined from a motion sensor depending on the type of motion sensor used.
- the processor may be configured to assume aspects of orientation based on the parameters returned by the motion sensor at rest.
- the resting reading of an accelerometer may provide the orientation of the LMO system with reference to the earth's gravity: the acceleration vector measured by an IMU at rest may be assumed by the processor to point towards the earth.
- the processor may use this information to assign an initial orientation, subsequent deviation from which will be provided by the IMU as motion information.
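- A sketch of this initialization, assuming the device is at rest so the accelerometer measures gravity alone (note that yaw is unobservable from gravity and must come from another source, such as a positioning unit):

```python
import math

def initial_roll_pitch(ax, ay, az):
    """Derive roll and pitch from a resting accelerometer reading.

    At rest the measured acceleration is gravity alone, so the direction of
    the vector gives the device's tilt. Heading (yaw) cannot be recovered
    from gravity and must be supplied by another sensor.
    """
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Device tilted 10 degrees forward, at rest:
g = 9.81
ax = -g * math.sin(math.radians(10))
az = g * math.cos(math.radians(10))
roll, pitch = initial_roll_pitch(ax, 0.0, az)
print(math.degrees(pitch))  # ~10.0
```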
- the processor may determine the orientation of the LMO system relative to the physical environment at a given point in time.
- the LMO system may comprise a local positioning unit from which the initial orientation and location may be determined, as previously described.
- the processor may determine the location and orientation of the HMD relative to an initial scan (or another prior scan) of the physical environment.
- the processor may identify features common to both the initial scan (or another prior scan) and a later scan to determine the change in location and orientation of the HMD giving rise to any deviation between the scans apparent in the common features.
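- A sketch of recovering that pose change from features common to two scans; the Kabsch (orthogonal Procrustes) alignment used here is an assumption, as the patent does not name a registration algorithm:

```python
import numpy as np

def rigid_transform(P, Q):
    """Find rotation R and translation t with Q ~ R @ P + t, given matched
    feature points P (from the prior scan) and Q (from the later scan),
    each of shape (3, N). The recovered (R, t) is the HMD's pose change."""
    p_bar, q_bar = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - p_bar) @ (Q - q_bar).T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = q_bar - R @ p_bar
    return R, t

# Synthetic check: rotate some scanned features by 5 degrees and shift them.
rng = np.random.default_rng(0)
P = rng.normal(size=(3, 20))
theta = np.radians(5)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
Q = R_true @ P + np.array([[0.1], [0.0], [0.02]])
R, t = rigid_transform(P, Q)
print(np.allclose(R, R_true), t.ravel())  # -> True, t ~ [0.1, 0.0, 0.02]
```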
- the use of the scanning system to determine the location and orientation of the HMD may be preferable to use of a magnetic positioning unit.
- the processor obtains readings from an IMU and a magnetic positioning unit in order to periodically verify the accuracy of, and correct, any dead reckoning calculations. Since the magnetic positioning unit provides absolute instead of relative orientation and location information, the processor may incorporate that information to adjust the calculated location and orientation of the LMO system. If the rate at which the IMU provides LMO information exceeds the same rate for the magnetic positioning unit, the IMU may be used by the processor for interim analysis between readings from the magnetic positioning unit.
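- A minimal sketch of this interleaving, with hypothetical rates (IMU at 100 Hz, magnetic positioning at 10 Hz): each absolute fix replaces the dead-reckoned estimate, bounding the cumulative error between fixes.

```python
def fuse(imu_samples, magnetic_fixes, dt):
    """imu_samples: list of per-step velocity estimates (vx, vy, vz) at rate 1/dt.
    magnetic_fixes: dict mapping step index -> absolute (x, y, z) position.
    Returns the estimated trajectory."""
    x = list(magnetic_fixes[0])  # start from an absolute fix
    trajectory = [tuple(x)]
    for i, v in enumerate(imu_samples, start=1):
        if i in magnetic_fixes:
            x = list(magnetic_fixes[i])                 # absolute reading: discard drift
        else:
            x = [xi + vi * dt for xi, vi in zip(x, v)]  # dead-reckon between fixes
        trajectory.append(tuple(x))
    return trajectory

# IMU at 100 Hz, magnetic positioning at 10 Hz (every 10th step):
imu = [(1.0, 0.0, 0.0)] * 20
fixes = {0: (0.0, 0.0, 0.0), 10: (0.101, 0.0, 0.0), 20: (0.199, 0.0, 0.0)}
print(fuse(imu, fixes, dt=0.01)[-1])  # (0.199, 0.0, 0.0)
```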
- a user equipped with an HMD 12 may be equipped with additional LMO systems 21 configured to provide LMO information for various body parts to which the LMO systems 21 are coupled. For example, a user may wear a wrist band comprising an LMO system 21, as shown in Fig. 2. As the user moves her hand and/or arm, the LMO system 21 may provide real-time LMO information which the processor may use to map the user's motions and gestures to the virtual map.
- peripherals or other wearable devices may comprise myography sensors, such as, for example, electromyography (EMG), mechanomyography (MMG) or phonomyography sensors, to measure the user's gestures and body movements.
- the myography sensors may provide motion information based on measurements to the processor so that the user's gestures may be mapped to the virtual map.
- the processor may access a library of "gestures" against which to compare the output of the myography sensors in order to apply the gestures to a given application. For example, a user equipped with a myography sensor configured to measure finger movements may depress a virtual trigger of a virtual gun by contracting her index finger.
- the processor may map the motion to the virtual map for rendering, while determining that the motion corresponds to a trigger motion in the gesture library. If the processor determines that the motion does correspond to the trigger motion, the processor will record the event to actuate an outcome according to predefined rules, such as, for example game parameters. Alternatively, the user may actuate the trigger event by, for example, depressing a trigger or other button on a peripheral device with her index finger.
- the processor may obtain an approximate finger motion from the gesture library and map the user's finger motion to the virtual map based on an approximate finger motion expected to actuate the trigger motion.
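- A sketch of comparing myography output against a gesture library by nearest-template matching (the gesture names, feature vectors and threshold are illustrative assumptions):

```python
import math

# Hypothetical library: gesture name -> reference feature vector derived
# from myography channels (e.g. normalized EMG amplitudes per channel).
GESTURE_LIBRARY = {
    "trigger_pull": [0.9, 0.1, 0.1, 0.0],
    "open_hand":    [0.1, 0.8, 0.7, 0.6],
    "fist":         [0.8, 0.8, 0.8, 0.8],
}

def classify(features, threshold=0.5):
    """Return the library gesture closest to the measured features,
    or None if nothing is close enough to count as a match."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    name, d = min(((g, dist(features, ref)) for g, ref in GESTURE_LIBRARY.items()),
                  key=lambda item: item[1])
    return name if d < threshold else None

measured = [0.85, 0.15, 0.05, 0.05]     # index-finger contraction
if classify(measured) == "trigger_pull":
    print("actuate trigger event")       # then apply predefined rules, e.g. game parameters
```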
- an LMO system may be added to an existing peripheral so that the processor may map the motion of the peripheral to the virtual map.
- the processor may use the LMO information to derive LMO information for virtual objects and actions represented in the rendered image streams. For example, the processor may correlate a given LMO sensor to a virtual device, such as a virtual gun, so that any virtual bullet fired therefrom is represented in the rendered image stream according to the location, motion and orientation of the virtual gun at a time of firing. The trajectory of the virtual bullet may be mapped to the virtual map so that CGI representing the virtual bullet may be rendered according to the mapped trajectory.
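- A sketch of deriving the mapped trajectory of a virtual bullet from the virtual gun's location and orientation at the time of firing (the muzzle speed, time step and simple ballistics are illustrative assumptions):

```python
def bullet_trajectory(muzzle_pos, direction, speed=100.0, dt=0.01, steps=50, g=9.81):
    """Sample the virtual bullet's path through the virtual map.

    muzzle_pos : gun muzzle location at firing time, from the gun's LMO sensor
    direction  : unit vector along the barrel, from the gun's orientation
    """
    x, y, z = muzzle_pos
    vx, vy, vz = (speed * d for d in direction)
    path = []
    for _ in range(steps):
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        vz -= g * dt               # gravity acts on the virtual projectile
        path.append((x, y, z))     # each point is mapped so CGI can be rendered along it
    return path

# Fired level along +x from shoulder height:
points = bullet_trajectory((0.0, 0.0, 1.5), (1.0, 0.0, 0.0))
print(points[0], points[-1])
```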
- the LMO systems of dynamic objects coupled to a particular HMD may provide LMO information as values or vectors relative to the location and orientation of the HMD at a point in time.
- the HMD and a peripheral may each comprise a 3D magnetic positioning sensor configured to provide location and orientation information for the dynamic object relative to the HMD.
- the processor may map the location and orientation of the peripheral with reference to the HMD's location and orientation.
- the processor may render a rendered image stream comprising CGI for all mapped dynamic objects in the physical environment according to the method illustrated in Fig. 4.
- the processor generates or acquires a map of the physical environment, at block 401.
- the processor may generate the map based on information provided by the scanning system.
- the processor may obtain the coordinates from the scanning system of one or more HMDs for all scanned physical features of the physical environment and generate a virtual map as, for example, a 3D point cloud in which each point is defined according to its real-world (i.e., physical environment) dimensions.
- the processor may acquire the map from a map library in which an already generated map of static features within the physical environment is stored.
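- A minimal sketch of such a virtual map, holding a static point cloud in real-world coordinates alongside the substantially real-time poses of the tracked dynamic objects (the structure is illustrative; the patent does not prescribe a data layout):

```python
from dataclasses import dataclass, field

@dataclass
class VirtualMap:
    """Virtual map of the physical environment in real-world coordinates."""
    static_points: list = field(default_factory=list)    # 3D point cloud from scans or a map library
    dynamic_objects: dict = field(default_factory=dict)  # object id -> (position, orientation)

    def update_object(self, obj_id, position, orientation):
        # Called repeatedly with substantially real-time LMO information.
        self.dynamic_objects[obj_id] = (position, orientation)

vmap = VirtualMap(static_points=[(0.0, 0.0, 0.0), (5.0, 0.0, 0.0), (5.0, 4.0, 0.0)])
vmap.update_object("hmd_1", (2.0, 1.0, 1.7), (0.0, 0.0, 1.57))
vmap.update_object("peripheral_1", (2.3, 1.0, 1.3), (0.0, 0.2, 1.57))
print(vmap.dynamic_objects["hmd_1"])
```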
- the processor obtains the location and orientation information from the various LMO systems in the physical environment and maps their locations and orientations to the virtual map.
- the processor may repeatedly and continuously refresh the map by updating the locations and orientations of the dynamic objects according to the substantially real-time LMO information provided by their respective LMO systems.
- the processor may generate a rendered image stream comprising dynamic object-related CGI.
- the display of the HMD with which a user is equipped may display a physical image stream as well as the rendered image stream.
- the physical image stream may be captured by an imaging system, such as one or more cameras, and relayed to the display.
- the processor may therefore generate a rendered image stream whose CGI will be displayed by the display over corresponding features of the physical environment within the physical image stream. For example, in a game in which a first user and a second user occupy a physical environment, the processor may generate a rendered image stream for the first user in which the second user is represented as a zombie when the second user appears within the field of view of the first user's physical imaging system.
- the first user may perceive the second user as a zombie.
- the HMD 12 comprises a camera 123 to capture a physical image stream of the physical environment 431, and an LMO system 21.
- the field of view of the camera 123 is shown by the dashed lines emanating outwardly therefrom.
- the processor may ascertain the field of view for the camera 123 from memory or directly from the camera.
- the LMO system 21 of the HMD provides substantially real-time location and orientation information for the HMD 12.
- the processor may ascertain the relative location and orientation (shown as coordinates Xc, Yc, Zc, αc, βc, γc) of the camera 123 with respect to the HMD 12 from memory, so that the processor may determine the field of view of the camera 123 at a given time based on the LMO information of the HMD 12 at that time.
- the processor may thereby map the field of view to the virtual map as a notional field of view.
- the processor may map a notional field of view which has the same parameters as the field of view of the camera 123; however, the processor may map a notional field of view with different parameters. For example, if the rendered image stream is displayed alone (i.e., without the physical image stream), as in a pure VR application, then the processor may define any parameters for the notional field of view.
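- A sketch of testing whether a tracked object falls within the notional field of view, reduced to yaw-only geometry for brevity (the full system would apply the stored 6-DOF correlation between the HMD and the camera):

```python
import math

def in_notional_fov(hmd_pos, hmd_yaw, cam_yaw_offset, target_pos,
                    half_angle=math.radians(45), max_range=20.0):
    """Return True if target_pos lies inside the notional field of view.

    The view orientation is the HMD's orientation plus the stored
    HMD-to-camera offset (the 'correlation' held in memory)."""
    view_yaw = hmd_yaw + cam_yaw_offset
    dx = target_pos[0] - hmd_pos[0]
    dy = target_pos[1] - hmd_pos[1]
    dist = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx)
    off_axis = (bearing - view_yaw + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    return dist <= max_range and abs(off_axis) <= half_angle

# Second user 3 m ahead and slightly left of the first user's camera axis:
print(in_notional_fov((0, 0), hmd_yaw=0.0, cam_yaw_offset=0.0, target_pos=(3.0, 0.5)))  # True
```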
- A method for calibrating the rendered image stream is illustrated in Fig. 6. Assuming the processor has initiated a virtual map of the physical environment, as described above with respect to Fig. 4, at block 601 the processor determines the location and orientation of the HMD based on the LMO information provided by the LMO system of the HMD. At block 603, the processor obtains, from a memory or from a camera system on the HMD, data comprising a correlation between the LMO information for the HMD and the field of view of the HMD. The correlation may be between the LMO information for the HMD and the actual parameters of the field of view of the camera system, or between the LMO information for the HMD and a predetermined field of view having any defined parameters stored in the memory.
- the processor maps the notional field of view to the virtual map based on the correlation and the LMO information.
- the processor repeatedly and continuously updates the notional field of view in the virtual map based on substantially real-time LMO information provided by the LMO system of the HMD.
- the processor maps and repeatedly and continuously updates in the virtual map the locations and orientations of the dynamic objects within the physical environment, as described with respect to blocks 403 and 405 of Fig. 4.
- the processor either renders CGI for the entire physical environment, or renders CGI only for those features which are within the notional fields of view of any users occupying the physical environment.
- a rendered image stream may comprise only that CGI which is within the notional field of view.
- the processor applies a transformation, which may be provided by the imaging system, the display system or from the memory, to the CGI to transform the coordinates of the CGI from the coordinate system of the virtual map to the coordinate system of the display.
- the transformed CGI is then provided to the display as a rendered image stream.
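- A sketch of that coordinate transformation as a standard view transform followed by a pinhole projection (an assumption; as noted, the actual transformation may be supplied by the imaging system, the display system or the memory):

```python
import numpy as np

def world_to_display(point_world, T_world_cam, focal_px, cx, cy):
    """Project a virtual-map point into display pixel coordinates.

    T_world_cam : 4x4 pose of the display's viewpoint in the virtual map
    focal_px    : focal length in pixels; (cx, cy): display principal point
    """
    # View transform: virtual-map (world) coordinates -> camera coordinates.
    p = np.linalg.inv(T_world_cam) @ np.array([*point_world, 1.0])
    x, y, z = p[:3]
    if z <= 0:
        return None  # behind the viewpoint; not drawn
    # Perspective projection: camera coordinates -> pixel coordinates.
    return (cx + focal_px * x / z, cy - focal_px * y / z)

T = np.eye(4)  # viewpoint at the origin, looking along +z
print(world_to_display((0.2, 0.1, 2.0), T, focal_px=800, cx=640, cy=360))
# -> (720.0, 320.0): the CGI element is drawn at this display location
```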
- a physical image stream generated by the imaging system may be provided to the display substantially simultaneously to the rendered image stream, depending on whether AR or VR is to be displayed.
- the processor may generate multiple rendered image streams, one for each LMO system-equipped HMD occupying the physical environment.
- Each rendered image stream would account for the substantially real-time LMO information for the HMD to which it is provided, thereby enabling multiple users to experience individualised AR or VR according to their respective locations and orientations within the physical environment.
- the processor may adjust the rendered image stream to change the field of view so that the user experiences a field of view that substantially correlates to the field of view he would experience if he were moving throughout the environment without wearing an HMD. For example, a first user whose natural field of view includes a second user would expect the second user's relative location within the field of view to change as the user turns his head; if the first user turns his head leftwards, the second user should displace rightwards within the first user's field of view.
- the processor may simulate this effect by invoking the method described above with respect to Fig. 6.
- the notional field of view preferably has the same real-world parameters (such as dimensions, focal length and orientation) as the physical world field of view of the user's HMD.
- the user's display may then simultaneously display both the physical and rendered image streams to present an AR.
- the foregoing systems and methods may enable a user equipped with a wearable LMO system to perform exercises, the motions of which may be mapped and tracked for analysis by, for example, the user's trainer.
- the foregoing systems and methods may enable a user to use gesture controls to interact with a processor, for example to initiate AR or VR gameplay in a physical environment.
- the foregoing systems and methods may enable the processor to generate CGI for dynamic objects in a physical environment, even while those dynamic objects lie outside the field of view or capture region of a user's HMD.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Acoustics & Sound (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention concerns tracking dynamic objects in a physical environment in order to present them to a user equipped with a head mounted device. A system is disclosed comprising the head mounted device and a processor in communication with a display of the head mounted device and with location, motion and orientation systems disposed upon the head mounted device, as well as upon other dynamic objects in the physical environment. The processor maps the substantially real-time location and orientation of the dynamic objects to a virtual map of the physical environment, based on location, motion and orientation information provided by each of the systems. The processor repeatedly and continuously updates the virtual map based on changes to the location and orientation of the dynamic objects. The processor may also simulate a changing field of view experienced by the user of the head mounted device.
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201462052863P | 2014-09-19 | 2014-09-19 | |
| US62/052,863 | 2014-09-19 | ||
| US201462097331P | 2014-12-29 | 2014-12-29 | |
| US62/097,331 | 2014-12-29 | ||
| US201562099418P | 2015-01-02 | 2015-01-02 | |
| US62/099,418 | 2015-01-02 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016041088A1 (fr) | 2016-03-24 |
Family
ID=55532396
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CA2015/050918 Ceased WO2016041088A1 (fr) | 2014-09-19 | 2015-09-18 | System and method for locating wearable peripherals in augmented reality and virtual reality applications |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016041088A1 (fr) |
Cited By (63)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105785373A (zh) * | 2016-04-26 | 2016-07-20 | 上海与德通讯技术有限公司 | Virtual reality position recognition system and method |
| WO2017172661A1 (fr) * | 2016-03-31 | 2017-10-05 | Microsoft Technology Licensing, Llc | Electromagnetic tracking of objects for mixed reality |
| US20170351094A1 (en) * | 2016-06-06 | 2017-12-07 | Adam G. Poulos | Optically augmenting electromagnetic tracking in mixed reality |
| WO2018022657A1 (fr) * | 2016-07-25 | 2018-02-01 | Ctrl-Labs Corporation | System and method for measuring the movements of articulated rigid bodies |
| WO2018075270A1 (fr) * | 2016-10-17 | 2018-04-26 | Microsoft Technology Licensing, Llc | Generating and displaying a computer generated image on a future pose of a real world object |
| WO2018118728A3 (fr) * | 2016-12-22 | 2018-07-26 | Microsoft Technology Licensing, Llc | Magnetic interference detection and correction |
| US10151606B1 (en) | 2016-02-24 | 2018-12-11 | Ommo Technologies, Inc. | Tracking position and movement using a magnetic field |
| US10276289B1 (en) | 2018-06-01 | 2019-04-30 | Ommo Technologies, Inc. | Rotating a permanent magnet in a position detection system |
| CN110168475A (zh) * | 2016-11-14 | 2019-08-23 | 罗技欧洲公司 | System for importing user interface devices into virtual reality/augmented reality |
| US10409371B2 (en) | 2016-07-25 | 2019-09-10 | Ctrl-Labs Corporation | Methods and apparatus for inferring user intent based on neuromuscular signals |
| US10460455B2 (en) | 2018-01-25 | 2019-10-29 | Ctrl-Labs Corporation | Real-time processing of handstate representation model estimates |
| US10489986B2 (en) | 2018-01-25 | 2019-11-26 | Ctrl-Labs Corporation | User-controlled tuning of handstate representation model parameters |
| CN110521203A (zh) * | 2017-04-25 | 2019-11-29 | Ati科技无限责任公司 | Display pacing in a multi-head-mounted-display virtual reality configuration |
| US10496168B2 (en) | 2018-01-25 | 2019-12-03 | Ctrl-Labs Corporation | Calibration techniques for handstate representation modeling using neuromuscular signals |
| US10504286B2 (en) | 2018-01-25 | 2019-12-10 | Ctrl-Labs Corporation | Techniques for anonymizing neuromuscular signal data |
| CN110825219A (zh) * | 2018-08-14 | 2020-02-21 | 三星电子株式会社 | Electronic device, control method of the electronic device, and electronic system |
| US10592001B2 (en) | 2018-05-08 | 2020-03-17 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
| US10684692B2 (en) | 2014-06-19 | 2020-06-16 | Facebook Technologies, Llc | Systems, devices, and methods for gesture identification |
| US10687759B2 (en) | 2018-05-29 | 2020-06-23 | Facebook Technologies, Llc | Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods |
| GB2580915A (en) * | 2019-01-29 | 2020-08-05 | Sony Interactive Entertainment Inc | Peripheral tracking system and method |
| US10772519B2 (en) | 2018-05-25 | 2020-09-15 | Facebook Technologies, Llc | Methods and apparatus for providing sub-muscular control |
| US10817795B2 (en) | 2018-01-25 | 2020-10-27 | Facebook Technologies, Llc | Handstate reconstruction based on multiple inputs |
| US10825241B2 (en) | 2018-03-16 | 2020-11-03 | Microsoft Technology Licensing, Llc | Using a one-dimensional ray sensor to map an environment |
| US10842407B2 (en) | 2018-08-31 | 2020-11-24 | Facebook Technologies, Llc | Camera-guided interpretation of neuromuscular signals |
| US10905383B2 (en) | 2019-02-28 | 2021-02-02 | Facebook Technologies, Llc | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces |
| US10921764B2 (en) | 2018-09-26 | 2021-02-16 | Facebook Technologies, Llc | Neuromuscular control of physical objects in an environment |
| US10928888B2 (en) | 2016-11-14 | 2021-02-23 | Logitech Europe S.A. | Systems and methods for configuring a hub-centric virtual/augmented reality environment |
| US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
| CN112601975A (zh) * | 2018-05-31 | 2021-04-02 | 奇跃公司 | Radar head pose localization |
| US10970936B2 (en) | 2018-10-05 | 2021-04-06 | Facebook Technologies, Llc | Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment |
| US10970374B2 (en) | 2018-06-14 | 2021-04-06 | Facebook Technologies, Llc | User identification and authentication with neuromuscular signatures |
| US10990174B2 (en) | 2016-07-25 | 2021-04-27 | Facebook Technologies, Llc | Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors |
| US11000211B2 (en) | 2016-07-25 | 2021-05-11 | Facebook Technologies, Llc | Adaptive system for deriving control signals from measurements of neuromuscular activity |
| US11045137B2 (en) | 2018-07-19 | 2021-06-29 | Facebook Technologies, Llc | Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device |
| US11069148B2 (en) | 2018-01-25 | 2021-07-20 | Facebook Technologies, Llc | Visualization of reconstructed handstate information |
| US11079846B2 (en) | 2013-11-12 | 2021-08-03 | Facebook Technologies, Llc | Systems, articles, and methods for capacitive electromyography sensors |
| US11112856B2 (en) | 2016-03-13 | 2021-09-07 | Logitech Europe S.A. | Transition between virtual and augmented reality |
| US11179066B2 (en) | 2018-08-13 | 2021-11-23 | Facebook Technologies, Llc | Real-time spike detection and identification |
| US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
| CN114174959A (zh) * | 2019-09-11 | 2022-03-11 | 脸谱科技有限责任公司 | Artificial reality triggered by physical objects |
| US11331045B1 (en) | 2018-01-25 | 2022-05-17 | Facebook Technologies, Llc | Systems and methods for mitigating neuromuscular signal artifacts |
| US11481031B1 (en) | 2019-04-30 | 2022-10-25 | Meta Platforms Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
| US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
| US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
| US11567573B2 (en) | 2018-09-20 | 2023-01-31 | Meta Platforms Technologies, Llc | Neuromuscular text entry, writing and drawing in augmented reality systems |
| US11635736B2 (en) | 2017-10-19 | 2023-04-25 | Meta Platforms Technologies, Llc | Systems and methods for identifying biological structures associated with neuromuscular source signals |
| US11666264B1 (en) | 2013-11-27 | 2023-06-06 | Meta Platforms Technologies, Llc | Systems, articles, and methods for electromyography sensors |
| US11797087B2 (en) | 2018-11-27 | 2023-10-24 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
| US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
| US11907423B2 (en) | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
| US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
| US11961494B1 (en) | 2019-03-29 | 2024-04-16 | Meta Platforms Technologies, Llc | Electromagnetic interference reduction in extended reality environments |
| US12026527B2 (en) | 2022-05-10 | 2024-07-02 | Meta Platforms Technologies, Llc | World-controlled and application-controlled augments in an artificial-reality environment |
| US12056268B2 (en) | 2021-08-17 | 2024-08-06 | Meta Platforms Technologies, Llc | Platformization of mixed reality objects in virtual reality environments |
| US12086932B2 (en) | 2021-10-27 | 2024-09-10 | Meta Platforms Technologies, Llc | Virtual object structures and interrelationships |
| US12093447B2 (en) | 2022-01-13 | 2024-09-17 | Meta Platforms Technologies, Llc | Ephemeral artificial reality experiences |
| US12089953B1 (en) | 2019-12-04 | 2024-09-17 | Meta Platforms Technologies, Llc | Systems and methods for utilizing intrinsic current noise to measure interface impedances |
| US12106440B2 (en) | 2021-07-01 | 2024-10-01 | Meta Platforms Technologies, Llc | Environment model with surfaces and per-surface volumes |
| US12118745B2 (en) | 2019-05-15 | 2024-10-15 | Trumpf Tracking Technologies Gmbh | Method for coupling co-ordinate systems, and computer-assisted system |
| WO2025018552A1 (fr) * | 2023-07-20 | 2025-01-23 | 삼성전자주식회사 | Wearable device for communicating with an external electronic device, and method therefor |
| US12254581B2 (en) | 2020-08-31 | 2025-03-18 | Meta Platforms Technologies, Llc | Artificial reality augments and surfaces |
| US12272012B2 (en) | 2021-06-02 | 2025-04-08 | Meta Platforms Technologies, Llc | Dynamic mixed reality content in virtual reality |
| US12504816B2 (en) | 2022-08-30 | 2025-12-23 | Meta Platforms Technologies, Llc | Wearable devices and associated band structures for sensing neuromuscular signals using sensor pairs in respective pods with communicative pathways to a common processor |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5844482A (en) * | 1997-05-20 | 1998-12-01 | Guthrie; Warren E. | Tagging system using motion detector |
| US6054951A (en) * | 1995-08-28 | 2000-04-25 | Sypniewski; Jozef | Multi-dimensional tracking sensor |
| US20050116823A1 (en) * | 2003-12-03 | 2005-06-02 | Torsten Paulsen | System for tracking object locations using self-tracking tags |
| US20120092328A1 (en) * | 2010-10-15 | 2012-04-19 | Jason Flaks | Fusing virtual content into real content |
| US20120195460A1 (en) * | 2011-01-31 | 2012-08-02 | Qualcomm Incorporated | Context aware augmentation interactions |
| US20120320169A1 (en) * | 2011-06-17 | 2012-12-20 | Microsoft Corporation | Volumetric video presentation |
- 2015-09-18: WO application PCT/CA2015/050918 filed, published as WO2016041088A1 (legal status: not active, Ceased)
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6054951A (en) * | 1995-08-28 | 2000-04-25 | Sypniewski; Jozef | Multi-dimensional tracking sensor |
| US5844482A (en) * | 1997-05-20 | 1998-12-01 | Guthrie; Warren E. | Tagging system using motion detector |
| US20050116823A1 (en) * | 2003-12-03 | 2005-06-02 | Torsten Paulsen | System for tracking object locations using self-tracking tags |
| US20120092328A1 (en) * | 2010-10-15 | 2012-04-19 | Jason Flaks | Fusing virtual content into real content |
| US20120195460A1 (en) * | 2011-01-31 | 2012-08-02 | Qualcomm Incorporated | Context aware augmentation interactions |
| US20120320169A1 (en) * | 2011-06-17 | 2012-12-20 | Microsoft Corporation | Volumetric video presentation |
Cited By (86)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11921471B2 (en) | 2013-08-16 | 2024-03-05 | Meta Platforms Technologies, Llc | Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source |
| US11079846B2 (en) | 2013-11-12 | 2021-08-03 | Facebook Technologies, Llc | Systems, articles, and methods for capacitive electromyography sensors |
| US11666264B1 (en) | 2013-11-27 | 2023-06-06 | Meta Platforms Technologies, Llc | Systems, articles, and methods for electromyography sensors |
| US10684692B2 (en) | 2014-06-19 | 2020-06-16 | Facebook Technologies, Llc | Systems, devices, and methods for gesture identification |
| US10704929B1 (en) | 2016-02-24 | 2020-07-07 | Ommo Technologies, Inc. | Tracking position and movement using a magnetic field |
| US10151606B1 (en) | 2016-02-24 | 2018-12-11 | Ommo Technologies, Inc. | Tracking position and movement using a magnetic field |
| US11112856B2 (en) | 2016-03-13 | 2021-09-07 | Logitech Europe S.A. | Transition between virtual and augmented reality |
| WO2017172661A1 (fr) * | 2016-03-31 | 2017-10-05 | Microsoft Technology Licensing, Llc | Electromagnetic tracking of objects for mixed reality |
| CN105785373A (zh) * | 2016-04-26 | 2016-07-20 | 上海与德通讯技术有限公司 | Virtual reality position recognition system and method |
| US10254546B2 (en) | 2016-06-06 | 2019-04-09 | Microsoft Technology Licensing, Llc | Optically augmenting electromagnetic tracking in mixed reality |
| US20170351094A1 (en) * | 2016-06-06 | 2017-12-07 | Adam G. Poulos | Optically augmenting electromagnetic tracking in mixed reality |
| WO2018022657A1 (fr) * | 2016-07-25 | 2018-02-01 | Ctrl-Labs Corporation | System and method for measuring the movements of articulated rigid bodies |
| US10409371B2 (en) | 2016-07-25 | 2019-09-10 | Ctrl-Labs Corporation | Methods and apparatus for inferring user intent based on neuromuscular signals |
| US10990174B2 (en) | 2016-07-25 | 2021-04-27 | Facebook Technologies, Llc | Methods and apparatus for predicting musculo-skeletal position information using wearable autonomous sensors |
| US11000211B2 (en) | 2016-07-25 | 2021-05-11 | Facebook Technologies, Llc | Adaptive system for deriving control signals from measurements of neuromuscular activity |
| US11337652B2 (en) | 2016-07-25 | 2022-05-24 | Facebook Technologies, Llc | System and method for measuring the movements of articulated rigid bodies |
| US10656711B2 (en) | 2016-07-25 | 2020-05-19 | Facebook Technologies, Llc | Methods and apparatus for inferring user intent based on neuromuscular signals |
| WO2018075270A1 (fr) * | 2016-10-17 | 2018-04-26 | Microsoft Technology Licensing, Llc | Generating and displaying a computer generated image on a future pose of a real world object |
| US10134192B2 (en) | 2016-10-17 | 2018-11-20 | Microsoft Technology Licensing, Llc | Generating and displaying a computer generated image on a future pose of a real world object |
| US10928888B2 (en) | 2016-11-14 | 2021-02-23 | Logitech Europe S.A. | Systems and methods for configuring a hub-centric virtual/augmented reality environment |
| EP3539087A4 (fr) * | 2016-11-14 | 2020-09-30 | Logitech Europe S.A. | System for importing user interface devices into virtual/augmented reality |
| CN110168475A (zh) * | 2016-11-14 | 2019-08-23 | 罗技欧洲公司 | System for importing user interface devices into virtual reality/augmented reality |
| CN110088711A (zh) * | 2016-12-22 | 2019-08-02 | 微软技术许可有限责任公司 | Magnetic interference detection and correction |
| CN110088711B (zh) * | 2016-12-22 | 2022-04-15 | 微软技术许可有限责任公司 | Magnetic interference detection and correction |
| WO2018118728A3 (fr) * | 2016-12-22 | 2018-07-26 | Microsoft Technology Licensing, Llc | Magnetic interference detection and correction |
| US10746815B2 (en) | 2016-12-22 | 2020-08-18 | Microsoft Technology Licensing, Llc | Magnetic interference detection and correction |
| CN110521203A (zh) * | 2017-04-25 | 2019-11-29 | Ati科技无限责任公司 | Display pacing in a multi-head-mounted-display virtual reality configuration |
| CN110521203B (zh) * | 2017-04-25 | 2022-03-11 | Ati科技无限责任公司 | Display pacing in a multi-head-mounted-display virtual reality configuration |
| US11635736B2 (en) | 2017-10-19 | 2023-04-25 | Meta Platforms Technologies, Llc | Systems and methods for identifying biological structures associated with neuromuscular source signals |
| US10504286B2 (en) | 2018-01-25 | 2019-12-10 | Ctrl-Labs Corporation | Techniques for anonymizing neuromuscular signal data |
| US11331045B1 (en) | 2018-01-25 | 2022-05-17 | Facebook Technologies, Llc | Systems and methods for mitigating neuromuscular signal artifacts |
| US10817795B2 (en) | 2018-01-25 | 2020-10-27 | Facebook Technologies, Llc | Handstate reconstruction based on multiple inputs |
| US11163361B2 (en) | 2018-01-25 | 2021-11-02 | Facebook Technologies, Llc | Calibration techniques for handstate representation modeling using neuromuscular signals |
| US10496168B2 (en) | 2018-01-25 | 2019-12-03 | Ctrl-Labs Corporation | Calibration techniques for handstate representation modeling using neuromuscular signals |
| US11127143B2 (en) | 2018-01-25 | 2021-09-21 | Facebook Technologies, Llc | Real-time processing of handstate representation model estimates |
| US10950047B2 (en) | 2018-01-25 | 2021-03-16 | Facebook Technologies, Llc | Techniques for anonymizing neuromuscular signal data |
| US11587242B1 (en) | 2018-01-25 | 2023-02-21 | Meta Platforms Technologies, Llc | Real-time processing of handstate representation model estimates |
| US10489986B2 (en) | 2018-01-25 | 2019-11-26 | Ctrl-Labs Corporation | User-controlled tuning of handstate representation model parameters |
| US11069148B2 (en) | 2018-01-25 | 2021-07-20 | Facebook Technologies, Llc | Visualization of reconstructed handstate information |
| US10460455B2 (en) | 2018-01-25 | 2019-10-29 | Ctrl-Labs Corporation | Real-time processing of handstate representation model estimates |
| US11361522B2 (en) | 2018-01-25 | 2022-06-14 | Facebook Technologies, Llc | User-controlled tuning of handstate representation model parameters |
| US10825241B2 (en) | 2018-03-16 | 2020-11-03 | Microsoft Technology Licensing, Llc | Using a one-dimensional ray sensor to map an environment |
| US11036302B1 (en) | 2018-05-08 | 2021-06-15 | Facebook Technologies, Llc | Wearable devices and methods for improved speech recognition |
| US10937414B2 (en) | 2018-05-08 | 2021-03-02 | Facebook Technologies, Llc | Systems and methods for text input using neuromuscular information |
| US10592001B2 (en) | 2018-05-08 | 2020-03-17 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
| US11216069B2 (en) | 2018-05-08 | 2022-01-04 | Facebook Technologies, Llc | Systems and methods for improved speech recognition using neuromuscular information |
| US10772519B2 (en) | 2018-05-25 | 2020-09-15 | Facebook Technologies, Llc | Methods and apparatus for providing sub-muscular control |
| US10687759B2 (en) | 2018-05-29 | 2020-06-23 | Facebook Technologies, Llc | Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods |
| US11129569B1 (en) | 2018-05-29 | 2021-09-28 | Facebook Technologies, Llc | Shielding techniques for noise reduction in surface electromyography signal measurement and related systems and methods |
| CN112601975A (zh) * | 2018-05-31 | 2021-04-02 | Magic Leap, Inc. | Radar head pose localization |
| US10276289B1 (en) | 2018-06-01 | 2019-04-30 | Ommo Technologies, Inc. | Rotating a permanent magnet in a position detection system |
| US10970374B2 (en) | 2018-06-14 | 2021-04-06 | Facebook Technologies, Llc | User identification and authentication with neuromuscular signatures |
| US11045137B2 (en) | 2018-07-19 | 2021-06-29 | Facebook Technologies, Llc | Methods and apparatus for improved signal robustness for a wearable neuromuscular recording device |
| US11179066B2 (en) | 2018-08-13 | 2021-11-23 | Facebook Technologies, Llc | Real-time spike detection and identification |
| CN110825219B (zh) * | 2018-08-14 | 2022-04-22 | Samsung Electronics Co., Ltd. | Electronic device, control method of electronic device, and electronic system |
| CN110825219A (zh) * | 2018-08-14 | 2020-02-21 | Samsung Electronics Co., Ltd. | Electronic device, control method of electronic device, and electronic system |
| US10905350B2 (en) | 2018-08-31 | 2021-02-02 | Facebook Technologies, Llc | Camera-guided interpretation of neuromuscular signals |
| US10842407B2 (en) | 2018-08-31 | 2020-11-24 | Facebook Technologies, Llc | Camera-guided interpretation of neuromuscular signals |
| US11567573B2 (en) | 2018-09-20 | 2023-01-31 | Meta Platforms Technologies, Llc | Neuromuscular text entry, writing and drawing in augmented reality systems |
| US10921764B2 (en) | 2018-09-26 | 2021-02-16 | Facebook Technologies, Llc | Neuromuscular control of physical objects in an environment |
| US10970936B2 (en) | 2018-10-05 | 2021-04-06 | Facebook Technologies, Llc | Use of neuromuscular signals to provide enhanced interactions with physical objects in an augmented reality environment |
| US11941176B1 (en) | 2018-11-27 | 2024-03-26 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
| US11797087B2 (en) | 2018-11-27 | 2023-10-24 | Meta Platforms Technologies, Llc | Methods and apparatus for autocalibration of a wearable electrode sensor system |
| GB2580915A (en) * | 2019-01-29 | 2020-08-05 | Sony Interactive Entertainment Inc | Peripheral tracking system and method |
| GB2580915B (en) * | 2019-01-29 | 2021-06-09 | Sony Interactive Entertainment Inc | Peripheral tracking system and method |
| US11602684B2 (en) | 2019-01-29 | 2023-03-14 | Sony Interactive Entertainment Inc. | Peripheral tracking system and method |
| US10905383B2 (en) | 2019-02-28 | 2021-02-02 | Facebook Technologies, Llc | Methods and apparatus for unsupervised one-shot machine learning for classification of human gestures and estimation of applied forces |
| US11481030B2 (en) | 2019-03-29 | 2022-10-25 | Meta Platforms Technologies, Llc | Methods and apparatus for gesture detection and classification |
| US11961494B1 (en) | 2019-03-29 | 2024-04-16 | Meta Platforms Technologies, Llc | Electromagnetic interference reduction in extended reality environments |
| US11481031B1 (en) | 2019-04-30 | 2022-10-25 | Meta Platforms Technologies, Llc | Devices, systems, and methods for controlling computing devices via neuromuscular signals of users |
| US12118745B2 (en) | 2019-05-15 | 2024-10-15 | Trumpf Tracking Technologies Gmbh | Method for coupling co-ordinate systems, and computer-assisted system |
| US11493993B2 (en) | 2019-09-04 | 2022-11-08 | Meta Platforms Technologies, Llc | Systems, methods, and interfaces for performing inputs based on neuromuscular control |
| CN114174959A (zh) * | 2019-09-11 | 2022-03-11 | Facebook Technologies, Llc | Artificial reality triggered by physical object |
| US12197634B2 (en) | 2019-09-11 | 2025-01-14 | Meta Platforms Technologies, Llc | Artificial reality triggered by physical object |
| US11907423B2 (en) | 2019-11-25 | 2024-02-20 | Meta Platforms Technologies, Llc | Systems and methods for contextualized interactions with an environment |
| US12089953B1 (en) | 2019-12-04 | 2024-09-17 | Meta Platforms Technologies, Llc | Systems and methods for utilizing intrinsic current noise to measure interface impedances |
| US12254581B2 (en) | 2020-08-31 | 2025-03-18 | Meta Platforms Technologies, Llc | Artificial reality augments and surfaces |
| US11868531B1 (en) | 2021-04-08 | 2024-01-09 | Meta Platforms Technologies, Llc | Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof |
| US12272012B2 (en) | 2021-06-02 | 2025-04-08 | Meta Platforms Technologies, Llc | Dynamic mixed reality content in virtual reality |
| US12106440B2 (en) | 2021-07-01 | 2024-10-01 | Meta Platforms Technologies, Llc | Environment model with surfaces and per-surface volumes |
| US12056268B2 (en) | 2021-08-17 | 2024-08-06 | Meta Platforms Technologies, Llc | Platformization of mixed reality objects in virtual reality environments |
| US12086932B2 (en) | 2021-10-27 | 2024-09-10 | Meta Platforms Technologies, Llc | Virtual object structures and interrelationships |
| US12093447B2 (en) | 2022-01-13 | 2024-09-17 | Meta Platforms Technologies, Llc | Ephemeral artificial reality experiences |
| US12026527B2 (en) | 2022-05-10 | 2024-07-02 | Meta Platforms Technologies, Llc | World-controlled and application-controlled augments in an artificial-reality environment |
| US12504816B2 (en) | 2022-08-30 | 2025-12-23 | Meta Platforms Technologies, Llc | Wearable devices and associated band structures for sensing neuromuscular signals using sensor pairs in respective pods with communicative pathways to a common processor |
| WO2025018552A1 (fr) * | 2023-07-20 | 2025-01-23 | Samsung Electronics Co., Ltd. | Wearable device for communicating with an external electronic device, and method therefor |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2016041088A1 (fr) | | System and method for locating portable peripherals in augmented reality and virtual reality applications |
| US10905950B2 (en) | | Head-mounted display tracking |
| US11928838B2 (en) | | Calibration system and method to align a 3D virtual scene and a 3D real world for a stereoscopic head-mounted display |
| US10852847B2 (en) | | Controller tracking for multiple degrees of freedom |
| EP3343320B1 (fr) | | Information processing apparatus, information processing system, and information processing method |
| EP3000011B1 (fr) | | Body-locked placement of augmented reality objects |
| US20170336220A1 (en) | | Multi-Sensor Position and Orientation Determination System and Device |
| US20140192164A1 (en) | | System and method for determining depth information in augmented reality scene |
| CN106774844A (zh) | | Method and device for virtual positioning |
| KR20100047563A (ko) | | Augmented reality device for simulation training and virtual image synthesis method |
| US12198283B2 (en) | | Smooth object correction for augmented reality devices |
| JP6859447B2 (ja) | | Information processing system and object information acquisition method |
| US20230062045A1 (en) | | Display control device, display control method, and recording medium |
| CN108734721B (zh) | | Tracking system and tracking method |
| WO2021177132A1 (fr) | | Information processing device, information processing system, information processing method, and program |
| US11845001B2 (en) | | Calibration system and method for handheld controller |
| CN113029190B (zh) | | Motion tracking system and method |
| KR102391539B1 (ko) | | Space-sharing mixed reality education system with signal delay correction |
| WO2024259520A1 (fr) | | Method, system, and apparatus for spatial alignment of mobile devices |
| WO2024177912A1 (fr) | | Drift correction and optical-inertial sensor data fusion for virtual reality |
| CN119533860A (zh) | | Calibration method, apparatus, device, and medium for a motion capture device |
| WO2025075521A1 (fr) | | Method for calibrating a set of cameras for optical tracking |
| KR20210064692A (ko) | | Space-sharing mixed reality system with signal delay correction |
| JP2017215873A (ja) | | Display device, information display system, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15841840; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 15841840; Country of ref document: EP; Kind code of ref document: A1 |