WO2023064762A1 - Computer-implemented methods for virtual reality systems - Google Patents
- Publication number
- WO2023064762A1 (PCT/US2022/077897)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- collider
- physics
- virtual
- dynamic
- virtual representation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/57—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
- A63F13/577—Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
Definitions
- VR systems can generate VR environments, and can interact with one or more users.
- VR systems typically include a VR engine, input/output (I/O), application software, and a database.
- the VR system can receive user input (via the I/O device(s)), and can react or modify the VR environment based on the user input.
- One aspect of the invention provides a method for virtual reality (VR) tracking.
- the method includes: generating, in a VR system, a first virtual representation of a first object; coupling, in the VR system, a first dynamic collider to a surface of the first virtual representation; generating, in the VR system, a second virtual representation of a second object; coupling, in the VR system, a second dynamic collider to a surface of the second virtual representation; and normalizing, in the VR system, a position of the first dynamic collider to a position of the second dynamic collider for physics computations between the first virtual representation and the second virtual representation.
- the first dynamic collider and the second dynamic collider can be two-dimensional.
- the method can further include: determining a position of the first dynamic collider in relation to the second dynamic collider; and modifying a position of the first dynamic collider on the surface of the first virtual representation based on the determination.
- the method can further include: determining that the first dynamic collider contacts the second dynamic collider; and calculating physics vectors for the second virtual representation based on the contact.
- the method can further include: calculating physics vectors for the first virtual representation during a time step when the contact occurs, wherein the calculating the physics vectors for the second virtual representation is based on the calculated physics vectors for the first virtual representation during the time step.
- the method can further include: calculating physics vectors for the second virtual representation during a time step when the contact occurs, wherein the calculating the physics vectors for the second virtual representation is based on the calculated physics vectors for the second virtual representation during the time step.
- the physics vectors can include one or more selected from the group consisting of: speed vectors, rotation vectors, or a combination thereof.
- the first object, the second object, or both, can include sports equipment.
- the first object can include a hockey stick.
- the second object can include a hockey puck.
- the method can further include: generating, in the VR system, a third virtual representation of a third object; and coupling a compound collider to a surface of the third virtual representation, wherein the compound collider remains unassociated with both the first dynamic collider and the second dynamic collider.
- Another aspect of the invention provides a method for computations in a virtual reality (VR) system.
- the method includes: identifying a distance between a first virtual object and a second virtual object within a rendered VR environment; and adjusting a VR computational parameter of the rendered VR environment based on the distance.
- the first virtual object can be a hockey stick
- the second virtual object can be a hockey puck.
- the VR computational parameter can be a physical frame length, a physical parameter of virtual objects, or a combination thereof.
- the physical parameter of the virtual objects can include a drag parameter, a friction parameter, a gravity parameter, or a combination thereof.
- the adjustment can be a linear adjustment proportional to a change in the distance between the first virtual object and the second virtual object.
- the computer-implemented method includes: determining a physics simulation quality threshold for a VR environment; determining a number of physics frames to be included in a rendering frame for the VR environment; identifying a compensating factor based on the number of physics frames and processing limitations of the VR system; adjusting one or more physics parameters of a virtual object in the VR environment according to the compensating factor; and generating the VR environment according to the number of physics frames and the adjusted one or more physics parameters.
- FIG. 1 depicts a virtual reality (VR) system according to embodiments of the present disclosure.
- FIGS. 2 and 3 depict images of dynamic colliders in a VR environment according to embodiments of the present disclosure.
- FIG. 4 depicts a process flow for implementing dynamic colliders in a VR environment according to an embodiment of the present disclosure.
- FIGS. 5 and 6 depict timing diagrams of a VR engine for rendering a VR environment according to embodiments of the present disclosure.
- the term “about” is understood as within a range of normal tolerance in the art, for example within 2 standard deviations of the mean. “About” can be understood as within 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, 1%, 0.5%, 0.1%, 0.05%, or 0.01% of the stated value. Unless otherwise clear from context, all numerical values provided herein are modified by the term about.
- FIG. 1 depicts a VR system 100 according to embodiments of the present disclosure.
- the VR system 100 can include a VR engine 105, input/output (I/O) devices 110, application software 115, and a database 120. While embodiments of the systems described herein refer to VR systems, the system can alternatively be an Augmented Reality (AR) system, an Extended Reality (XR) system, a Mixed Reality (MR) system, and the like.
- the I/O devices 110 can include input devices, output devices, or both.
- the input devices can transmit signals to the VR system 100 regarding commands or actions performed by a system user.
- An input device can include a tracking device, a point-input device, a biocontroller, a voice device, a video-game controller, or a combination thereof.
- the input devices can include sports equipment worn or used by a user.
- input devices can include a hockey stick, a puck, a baseball bat, a glove, a baseball, a football, a soccer ball, a shoe, a golf club, a tennis racquet, protective equipment such as helmets, shoulder pads, shin guards, and the like, or a combination thereof.
- Input devices can be coupled with conventional sports equipment.
- an input device can be mounted on the shaft and/or the blade of an ice-hockey stick, on an ice-hockey goalie glove, and the like.
- an input device can include motion sensors (e.g., electromagnetic, ultrasonic, optical, mechanical, gyroscopic, etc.) such as a three-axis accelerometer, location sensors, pressure sensors, audio sensors, and the like.
- the sensors can be in communication with the VR engine 105, the application software 115, the database 120, other I/O devices 110, or a combination thereof.
- Output devices can receive feedback from the VR engine 105 and relay this feedback to a user of the system 100.
- the output devices can be, for example, graphic devices, audio devices, haptic devices, and the like.
- the output devices can include a graphics device, such as a VR headset display or a projection screen, and an audio device that relays audio to the user (e.g., integrated into the VR headset).
- the VR engine 105 can facilitate the execution of the VR software.
- the VR engine 105 can generate graphical models, object rendering, lighting, mapping, texturing, simulation, and display for the VR system 100.
- the VR engine 105 can also interface with the I/O devices 110 and/or users of the VR system 100.
- the VR engine 105 hardware can include one or more central processing units, one or more graphical processing units, one or more memory systems, and the like.
- the application software 115 can facilitate the designing, developing, and maintenance of the virtual environment of the VR system 100.
- the application software 115 can include development software and modeling software.
- development software can include virtual-world authoring software, VR software development kits (SDK), application program interfaces (API), and the like.
- Colliders can be disposed and/or generated on virtual representations of an I/O device for calculating the physical movements and reactions of the corresponding I/O device.
- the I/O device can be a hockey stick, which can include one or more motion sensors coupled or integrated into the hockey stick.
- the virtual representation of the hockey stick can include one or more colliders representing surfaces of the hockey stick that can come in contact with a virtual representation of a hockey puck.
- the VR engine can determine physics parameters of the hockey puck and collider during the collision to determine resulting physics parameters of the hockey puck after the collision occurs.
- the VR engine can identify time of collision, velocity vectors of the hockey puck at the time of collision, and velocity vectors of the collider of the hockey stick at the time of collision. From these values, the VR engine can calculate resulting velocity vectors for the virtual hockey puck immediately after the collision.
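- As a hedged sketch of the collision response just described (Python; all names, the elastic-reflection model, and the restitution parameter are illustrative assumptions, not taken from the patent), the resulting puck velocity can be computed from the puck and collider velocities and the contact normal:

```python
import math

def post_collision_velocity(puck_v, stick_v, contact_normal, restitution=1.0):
    """Resulting puck velocity immediately after contacting the stick collider.

    puck_v, stick_v: (x, y, z) velocity vectors at the time of collision.
    contact_normal: outward normal of the collider surface at the contact point.
    restitution: fraction of normal speed retained (1.0 = perfectly elastic).
    """
    # Normalize the contact normal.
    norm = math.sqrt(sum(c * c for c in contact_normal))
    n = tuple(c / norm for c in contact_normal)
    # Work in the stick collider's frame of reference.
    rel = tuple(p - s for p, s in zip(puck_v, stick_v))
    vn = sum(r * c for r, c in zip(rel, n))
    if vn >= 0:
        return tuple(puck_v)  # already separating; no impulse to apply
    # Reflect the approaching normal component, scaled by restitution.
    new_rel = tuple(r - (1.0 + restitution) * vn * c for r, c in zip(rel, n))
    return tuple(nr + s for nr, s in zip(new_rel, stick_v))
```

A moving stick collider imparts extra speed to the puck, which is why the engine needs the collider's velocity vectors at the time of collision, not just the puck's.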
- VR systems typically implement compound colliders.
- Compound colliders, when aggregated with other compound colliders, approximate the shape of the I/O device surface (e.g., the hockey stick blade and puck).
- Compound colliders are limited in that each compound collider is a simplified shape, such as a 2-dimensional square, a box, a sphere, a capsule, and the like, which can include sharp edges. These sharp edges, along with the static nature of the compound collider, can compromise the accuracy of collision calculations. For example, if the virtual hockey puck comes in contact with the collider edge, as opposed to the center of the collider, the VR engine can calculate incorrect physics vectors for the hockey puck.
- dynamic colliders may be implemented within the VR system.
- a first dynamic collider can be disposed or generated on a surface of a virtual object.
- the virtual object may be a virtual representation of an input device, such as I/O device 110 of FIG. 1.
- a second dynamic collider may be disposed and/or generated on a surface of another virtual object.
- the first and second dynamic colliders can correspond to one another, such that the first dynamic collider and the second dynamic collider interact with each other when the surfaces of the virtual objects contact one another.
- a dynamic collider can be repositioned along the surface of its corresponding virtual object based on the position of the other corresponding dynamic collider.
- the first dynamic collider may be disposed on a virtual representation of a hockey stick blade.
- the second dynamic collider may be disposed on a virtual representation of a hockey puck.
- the dynamic collider of the hockey puck can be repositioned along the hockey puck based on the location of the dynamic collider on the hockey stick blade, and vice versa.
- the dynamic colliders can reposition along the respective surfaces such that, when the hockey puck contacts the hockey stick blade, the dynamic colliders are flush with one another (as opposed to only the dynamic collider edges contacting one another). This can increase the chances that physics vectors calculated from the collision (between the hockey puck and the hockey stick blade) are not impacted by the shapes and limitations of the underlying colliders. This can reduce glitches, simulate smooth edges of the virtual objects, and the like.
- the repositioning of the dynamic colliders with respect to one another may occur after crossing a proximity threshold.
- the proximity threshold can be, e.g., 1 or 2 meters.
- the VR system can identify this crossing of the proximity threshold and may initiate the repositioning of the dynamic colliders. This may reserve VR system resources (e.g., processing resource, etc.).
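- A minimal sketch of this proximity gating (Python; the 2 m threshold follows the example above, and the function names are illustrative):

```python
import math

PROXIMITY_THRESHOLD_M = 2.0  # e.g., 1 or 2 meters, per the example above

def update_dynamic_colliders(collider_a_pos, collider_b_pos, reposition):
    """Reposition the paired dynamic colliders only once they are within
    the proximity threshold, conserving VR system processing resources."""
    distance = math.dist(collider_a_pos, collider_b_pos)
    if distance <= PROXIMITY_THRESHOLD_M:
        reposition()   # begin normalizing the colliders to one another
        return True
    return False       # too far apart; skip the repositioning work
```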
- the dynamic repositioning can be performed such that the center of a face of at least one of the pair of dynamic colliders contacts or has a minimum distance from a contact point of the other of the pair of dynamic colliders. For example, if a hockey puck is approaching a hockey stick, the hockey puck will be traveling along a vector (that may change due to collisions with the ice, boards, players, and the like) towards the stick.
- the dynamic collider on the stick can shift to be positioned as close as possible to intersect with that vector, e.g., as the player moves the stick.
- Multiple dynamic collider combinations may exist in a VR environment. For example, while there may be a one-to-one pairing between dynamic colliders (e.g., the hockey stick blade and the hockey puck), multiple pairings may exist (e.g., between another hockey stick blade and another collider on the single hockey puck, and the like). Further, in some cases compound colliders can also be implemented in the VR environment in conjunction with the dynamic colliders. For example, compound colliders may be disposed and/or generated along the hockey rink boards (e.g., for collisions between the hockey puck and the boards).
- FIGS. 2 and 3 depict images of dynamic colliders in a VR environment according to embodiments of the present disclosure.
- a VR system such as the VR system 100 discussed with reference to FIG. 1, may generate a VR environment corresponding to a hockey simulation.
- a hockey puck may be generated, along with other features typical of a hockey game, such as an ice surface, ice rink layouts (hockey rink, stadium, lights, etc.), hockey boundary lines, and the like.
- the VR system can also include a hockey stick as an I/O device, for which the VR system can generate a virtual representation and receive user input.
- a dynamic collider 225 may be positioned on the annular surface of the virtual hockey puck.
- a corresponding dynamic collider 230 can be positioned on a surface of the virtual hockey stick blade.
- the colliders 225, 230 are computer-generated constructs that need not (and typically will not) be rendered in the graphical user interface.
- although the dynamic colliders 225, 230 are depicted as having three-dimensional depth, the dynamic colliders 225, 230 can be represented as a two-dimensional planar shape, or the leading planar face of a three-dimensional dynamic collider can be utilized for physics calculations.
- the dynamic colliders 225, 230 can be rendered proud of, tangential with, or recessed from a depicted boundary of an object such as an ice-hockey stick or puck.
- the dynamic collider 225 may be configured to reposition along the annular surface based on the location of the dynamic collider 230.
- the VR system can determine a position of the dynamic collider 230 in a particular physics frame (Step 405 of FIG. 4).
- the VR system can also determine a vector normal to the dynamic collider 225, such as a vector normal to the surface 235 of the dynamic collider 225 (Step 410).
- the VR system can determine a new orientation of the normal vector for normalizing the normal vector to the position of the dynamic collider 230 (Step 415).
- the VR system can reposition the dynamic collider 225 (e.g., in the next physics frame) such that the vector normal to the dynamic collider is also normal to the position of the dynamic collider 230 (Step 420).
- the VR system can determine an angle between the current position of the normal vector and the new position, and can adjust the position of the dynamic collider 225 according to the angle.
- the VR system may monitor VR geocoordinates, and the VR system can determine a geocoordinate for repositioning the dynamic collider 225.
- one dynamic collider (e.g., dynamic collider 225) may be repositionable, while the corresponding collider may be statically positioned.
- both dynamic colliders may be repositionable based on the location of the other collider. In these cases, both dynamic colliders may implement the steps discussed above.
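- The repositioning flow (Steps 405-420) can be sketched for a collider that slides along the puck's annular surface (Python; the names and the rotation-about-the-vertical-axis assumption are illustrative, not taken from the patent):

```python
import math

def reposition_annular_collider(puck_center, other_collider_pos, current_angle):
    """Rotate the puck's dynamic collider about the puck's vertical axis so
    its outward normal points at the other collider (cf. Steps 405-420)."""
    # Direction from the puck's center to the other collider, projected
    # onto the puck's horizontal plane (x-z).
    dx = other_collider_pos[0] - puck_center[0]
    dz = other_collider_pos[2] - puck_center[2]
    target_angle = math.atan2(dz, dx)
    # Angle between the current normal direction and the target (Step 415),
    # wrapped to (-pi, pi] so the collider rotates the short way around.
    delta = (target_angle - current_angle + math.pi) % (2 * math.pi) - math.pi
    return current_angle + delta  # new angular position (Step 420)
```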
- a common problem with VR systems is the “rubber band” effect, along with physics glitches and other unwanted behavior.
- the movement of virtual objects in the VR environment can lag behind the movement of the corresponding I/O devices of the VR system.
- the VR engine of the system can operate discretely and can simplify the VR physics calculations compared to the real physics of the corresponding I/O devices (e.g., in order to calculate the physics vectors more quickly).
- this can result in the “rubber band” effect, which can manifest as speeding up, or slowing down, of the VR graphics displayed to the user.
- glitches or other unwanted behavior may occur due to these computing limitations.
- Insufficient computing power of the VR system can result in a freeze or inaccurate depictions (e.g., a glitch) of the rendering frame.
- the freeze or inaccuracies can cause problems for users of the VR system.
- a freeze can occur when the physics engine has not completed the number of physics calculations required for the rendering frame (e.g., 13 ms), which can be a predefined number of physics frames (e.g., approximately 7 physics frames of 2 ms length).
- an impact can require an increased number of physics calculations to be performed by the VR engine.
- the engine can compensate for this increase by slowing down the time within the environment.
- the physics calculations of the rendering frame can then speed up the time within the environment to catch up, resulting in a user observing slow motion of the hockey puck leaving the hockey stick, followed by either a quick burst of the hockey puck moving at faster-than-expected speeds (if a computer freeze occurred) or an increase in speed that catches up to the expected speed of the hockey puck.
- a VR system can be configured to linearly adjust certain parameters within the VR environment for processing constraints of the VR engine.
- the VR engine can monitor a VR parameter of the VR environment, for example, a distance between a first virtual object and a second virtual object.
- the VR engine can monitor a distance between a virtual object and a position of a user within the VR environment.
- the VR engine can monitor the distance between a virtual hockey puck and a virtual hockey stick of a user.
- the VR engine can adjust a physics frame size in relation to the rendering frame based on the value of the monitored VR parameter. For example, the VR engine can monitor the distance between a virtual hockey puck and a virtual hockey stick. In this example, if the VR engine has completed a rendering frame, and if there is time to render another graphics frame (e.g., which can begin 0.01 seconds after a preceding frame), but the physics engine has completed only a subset of the physics frames required for a rendering frame (e.g., 4 out of 7 physics frames), the VR system can render the state of the game from the physics frames that the physics engine has completed. This technique can still render graphics frames at the same pace as a fully calculated rendering frame, so the rendering frames can have the same timing in the real world, but in some cases may represent a shorter time in the VR simulation.
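- The partial-physics rendering described above can be sketched as a per-interval loop (Python; the 7-frames-per-render figure comes from the example, and the other names are assumptions):

```python
def render_interval(physics_step, render, frames_required=7, frames_completed=7):
    """Render on schedule even when only a subset of the physics frames
    backing the rendering frame has completed (e.g., 4 of 7).

    Returns the number of physics frames the rendering frame represents;
    fewer than frames_required means the frame covers a shorter simulated
    time, while real-world frame pacing is preserved.
    """
    completed = min(frames_completed, frames_required)
    for _ in range(completed):
        physics_step()       # advance the simulation one physics frame
    render()                 # draw whatever state the physics engine reached
    return completed
```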
- the physics frame can be longer, so there may be fewer frames for the same amount of time, and the system may have sufficient computational power.
- the physics frame length can be linearly shortened. Thus, the same amount of real time can be divided among more physics frames. The computations performed may be more precise, but more computational power may be required for the rendering frame.
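- A hedged sketch of the linear frame-length adjustment (Python; the endpoint distances and frame lengths are illustrative assumptions, not values from the patent):

```python
def physics_frame_length_ms(distance_m, near_m=1.0, far_m=10.0,
                            short_ms=2.0, long_ms=20.0):
    """Linearly interpolate the physics frame length from the distance
    between two virtual objects (e.g., puck and stick): short, precise
    frames when they are close; longer, cheaper frames when far apart."""
    if distance_m <= near_m:
        return short_ms
    if distance_m >= far_m:
        return long_ms
    t = (distance_m - near_m) / (far_m - near_m)
    return short_ms + t * (long_ms - short_ms)
```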
- the physics engine may experience the “slo-mo” scenario described above based on the lack of computing power. However, the VR system may mitigate the “slo-mo” effect by adjusting other parameters of the VR environment.
- the VR engine can adjust other parameters of the VR environment based on the parameter monitoring. For example, the VR engine can identify that the virtual object is closer to the user compared to a previous monitoring interval. The VR engine, based on this change in distance, may adjust another parameter of the VR environment, for example, a physical parameter of a virtual object, such as a drag parameter, a friction parameter, a gravity parameter, and the like.
- a physical parameter of a virtual object such as a drag parameter, a friction parameter, a gravity parameter, and the like.
- FIG. 5 depicts a timing diagram of a VR system with sufficient processing/computing resources for generating the VR environment.
- the VR engine receives the tracking data, performs physics calculations (e.g., timing, velocities, collider data, and the like), and generates a GPU rendering frame from the tracking data and calculations.
- the VR environment timing aligns precisely (or near precisely) with the real-world timing (e.g., the VR engine initiates the GPU rendering frame as the tracking data is received, and VR environment time passes on a one-to-one ratio with the real world timing).
- FIG. 6 depicts a timing diagram for a VR system according to an embodiment of the disclosure provided herein.
- the VR engine may adjust the timing parameter of the VR environment by increasing the number of physics frames included in a rendering frame.
- the VR engine can experience a “slow motion” in a GPU rendering frame due to the increase in physics calculations performed by the VR system.
- the VR engine can mitigate for the slow-motion effect by adjusting other parameters of the VR system.
- the VR system can adjust the number of physics frames in a rendering frame in the VR environment when a ball or puck becomes closer to a user or I/O device.
- the VR system can determine a position of a virtual hockey puck in relation to a virtual hockey stick corresponding to an I/O hockey stick of a user.
- the VR system can determine that the distance between the virtual hockey puck and the virtual hockey stick falls below a predefined threshold.
- the VR system can then increase the number of physics frames within a rendering frame, which can increase the number of physics calculations performed for the rendering frame of the VR environment.
- the VR system may implement this “slow motion” scenario in order to generate a high quality physics simulation.
- the VR system can increase the number of physics frames within a given rendering frame. This increase can provide an increase in physics calculations per rendering frame.
- the VR system may also compensate for this increase in physics frames by adjusting other VR parameters (e.g., to compensate for an increase in processing resources in a process-resource-limiting system), such as physics parameters of one or more virtual objects of the VR environment.
- the increase in physics frames may be extended based on the desired quality of the physics simulations.
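- The compensating factor can be sketched as scaling the per-frame physics parameters when extra physics frames slow simulated time (Python; the proportional scaling rule is an assumption, not taken from the patent):

```python
def compensate_physics_params(params, base_frames, new_frames):
    """If a rendering frame now contains new_frames physics frames instead
    of base_frames, simulated time slows by roughly base_frames/new_frames;
    scale per-frame parameters (drag, friction, gravity, ...) by the same
    factor so virtual objects appear to behave consistently."""
    factor = base_frames / new_frames
    return {name: value * factor for name, value in params.items()}
```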
- the parameter adjustment within the VR environment may be linear in nature.
- the VR engine may adjust the timing parameter of the environment proportionally to the distance between the user and the tracked virtual object.
- the adjustments made in the VR environment may be imperceptible to a user, and may also reduce the possibility of the “rubber band” effect, or may reduce the frequency and severity of glitch occurrences or other unwanted behavior.
- the VR engine may also be configured to provide fast-paced physics calculations. Physics calculations in a VR engine are discrete: the calculations are repeated within time steps of the same length, generally between 20 and 50 ms. This time step may be insufficient for the fast and precise movements that can occur in simulations (e.g., sports simulations), and may lead to glitches and inaccuracy.
- the VR engine described herein may be configured to reduce the time step to 2 ms. This may reduce glitches and increase the accuracy of a rendered VR environment, particularly for sports-related simulations.
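- A toy illustration of why a shorter step helps (Python; the swing speed and contact width are illustrative): a fast-moving blade can "tunnel" past a puck if it travels farther than the contact region in a single discrete step.

```python
def contact_resolvable(stick_speed_mps, contact_width_m, time_step_s):
    """True if the blade moves no farther than the contact region per step,
    so discrete stepping cannot skip over the collision entirely."""
    return stick_speed_mps * time_step_s <= contact_width_m
```

For example, a 10 m/s swing against a ~3 cm contact region is missed at a 30 ms step but resolved at a 2 ms step.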
- Joints in a VR environment can connect two virtual objects together to simulate physical movement.
- joints can connect objects representing the position of a VR tracker (e.g., attached to a hockey stick) with the tracker’s virtual representation, which carries the colliders.
- ToAngleAxis(out angle, out axis);
- angularTarget = angle * axis;
- followingObject.angularVelocity = angularVelocityCoeff * angularTarget;
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Processing Or Creating Images (AREA)
Abstract
One aspect of the invention provides a method for virtual reality (VR) tracking. The method includes: generating, in a VR system, a first virtual representation of a first object; coupling, in the VR system, a first dynamic collider to a surface of the first virtual representation; generating, in the VR system, a second virtual representation of a second object; coupling, in the VR system, a second dynamic collider to a surface of the second virtual representation; and normalizing, in the VR system, a position of the first dynamic collider to a position of the second dynamic collider for physics computations between the first virtual representation and the second virtual representation.
Description
COMPUTER-IMPLEMENTED METHODS FOR VIRTUAL REALITY SYSTEMS
CROSS-REFERENCE TO RELATED APPLICATION
This application claims the benefit of priority to U.S. Provisional Patent Application Serial No. 63/262,533, filed October 14, 2021. The entire content of this application is hereby incorporated by reference herein.
BACKGROUND OF THE INVENTION
Virtual reality (VR) systems can generate VR environments, and can interact with one or more users. VR systems typically include a VR engine, input/output (I/O), application software, and a database. The VR system can receive user input (via the I/O device(s)), and can react or modify the VR environment based on the user input.
SUMMARY
One aspect of the invention provides a method for virtual reality (VR) tracking. The method includes: generating, in a VR system, a first virtual representation of a first object; coupling, in the VR system, a first dynamic collider to a surface of the first virtual representation; generating, in the VR system, a second virtual representation of a second object; coupling, in the VR system, a second dynamic collider to a surface of the second virtual representation; and normalizing, in the VR system, a position of the first dynamic collider to a position of the second dynamic collider for physics computations between the first virtual representation and the second virtual representation.
This aspect of the invention can have a variety of embodiments. The first dynamic collider and the second dynamic collider can be two-dimensional.
The method can further include: determining a position of the first dynamic collider in relation to the second dynamic collider; and modifying a position of the first dynamic collider on the surface of the first virtual representation based on the determination.
The method can further include: determining that the first dynamic collider contacts the second dynamic collider; and calculating physics vectors for the second virtual representation based on the contact. The method can further include: calculating physics vectors for the first virtual representation during a time step when the contact occurs, wherein the calculating the physics vectors for the second virtual representation is based on the calculated physics vectors for the first virtual representation during the time step. The method can further include: calculating physics vectors for the second virtual representation during a time step when the contact occurs, wherein the calculating the physics vectors for the second virtual representation is based on the calculated physics vectors for the second virtual representation during the time step. The physics vectors can include one or more selected from the group consisting of: speed vectors, rotation vectors, or a combination thereof.
The first object, the second object, or both, can include sports equipment. The first object can include a hockey stick. The second object can include a hockey puck.
The method can further include: generating, in the VR system, a third virtual representation of a third object; and coupling a compound collider to a surface of the third virtual representation, wherein the compound collider remains unassociated with both the first dynamic collider and the second dynamic collider.
Another aspect of the invention provides a method for computations in a virtual reality (VR) system. The method includes: identifying a distance between a first virtual object and a second virtual object within a rendered VR environment; and adjusting a VR computational parameter of the rendered VR environment based on the distance.
This aspect of the invention can have a variety of embodiments. The first virtual object can be a hockey stick, and the second virtual object can be a hockey puck. The VR computational parameter can be a physical frame length, a physical parameter of virtual objects, or a combination thereof. The physical parameter of the virtual objects can include a drag parameter, a friction parameter, a gravity parameter, or a combination thereof.
The adjustment can be a linear adjustment proportional to a change in the distance between the first virtual object and the second virtual object.
Another aspect of the invention provides a computer-implemented method via a Virtual Reality (VR) system. The computer-implemented method includes: determining a physics simulation quality threshold for a VR environment; determining a number of physics frames to be included in a rendering frame for the VR environment; identifying a compensating factor based on the number of physics frames and processing limitations of the VR system; adjusting one or more physics parameters of a virtual object in the VR environment according to the compensating factor; and generating the VR environment according to the number of physics frames and the adjusted one or more physics parameters.
BRIEF DESCRIPTION OF THE DRAWINGS
For a fuller understanding of the nature and desired objects of the present invention, reference is made to the following detailed description taken in conjunction with the accompanying drawing figures wherein like reference characters denote corresponding parts throughout the several views.
FIG. 1 depicts a virtual reality (VR) system according to embodiments of the present disclosure.
FIGS. 2 and 3 depict images of dynamic colliders in a VR environment according to embodiments of the present disclosure.
FIG. 4 depicts a process flow for implementing dynamic colliders in a VR environment according to an embodiment of the present disclosure.
FIGS. 5 and 6 depict timing diagrams of a VR engine for rendering a VR environment according to embodiments of the present disclosure.
DEFINITIONS
The instant invention is most clearly understood with reference to the following definitions.
As used herein, the singular form “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise.
Unless specifically stated or obvious from context, as used herein, the term “about” is understood as within a range of normal tolerance in the art, for example within 2 standard deviations of the mean. “About” can be understood as within 10%, 9%, 8%, 7%, 6%, 5%, 4%, 3%, 2%, 1%, 0.5%, 0.1%, 0.05%, or 0.01% of the stated value. Unless otherwise clear from context, all numerical values provided herein are modified by the term about.
As used in the specification and claims, the terms “comprises,” “comprising,” “containing,” “having,” and the like can have the meaning ascribed to them in U.S. patent law and can mean “includes,” “including,” and the like.
Unless specifically stated or obvious from context, the term “or,” as used herein, is understood to be inclusive.
DETAILED DESCRIPTION OF THE INVENTION
VR System
FIG. 1 depicts a VR system 100 according to embodiments of the present disclosure. The VR system 100 can include a VR engine 105, input/output (I/O) devices 110, application software 115, and a database 120. While embodiments of the systems described herein will refer to VR systems, the system can alternatively be an Augmented Reality (AR) system, an Extended Reality (XR) system, a Mixed Reality (MR) system, and the like.
The I/O devices 110 can include input devices, output devices, or both. The input devices can transmit signals to the VR system 100 regarding commands or actions performed by a system user. An input device can include a tracking device, a point-input device, a biocontroller, a voice device, a video-game controller, or a combination thereof. In the context of a sports VR system, the input devices can include sports equipment worn or used by a user. For example, input devices can include a hockey stick, a puck, a baseball bat, a glove, a baseball, a football, a soccer ball, a shoe, a glove, a golf club, a tennis racquet, protective equipment such as helmets, shoulder pads, shin guards, and the like, or a combination thereof. Input devices can be coupled with conventional sports equipment. For example, an input device can be mounted on the shaft and/or the blade of an ice-hockey stick, on an ice-hockey goalie glove, and the like.
Various sensors can be coupled or integrated into the input devices, which can be in communication with other components of the VR system 100. For example, an input device can include motion sensors (e.g., electromagnetic, ultrasonic, optical, mechanical, gyroscopic, etc.) such as a three-axis accelerometer, location sensors, pressure sensors, audio sensors, and the like. The sensors can be in communication with the VR engine 105, the application software 115, the database 120, other I/O devices 110, or a combination thereof.
Output devices can receive feedback from the VR engine 105 and relay this feedback to a user of the system 100. The output devices can be, for example, graphic devices, audio devices, haptic devices, and the like. For example, in the context of a sports VR system, the output devices can include a graphics device, such as a VR headset display or a projection screen, and audio devices that relay audio to the user (e.g., that can be integrated into the VR headset).
The VR engine 105 can facilitate the execution of the VR software. For example, the VR engine 105 can generate graphical models, object rendering, lighting, mapping, texturing, simulation, and display for the VR system 100. The VR engine 105 can also interface with I/O
devices 110 and/or users of the VR system 100. The VR engine 105 hardware can include one or more central processing units, one or more graphical processing units, one or more memory systems, and the like.
The application software 115 can facilitate the designing, developing, and maintenance of the virtual environment of the VR system 100. The application software 115 can include development software and modeling software. For example, development software can include virtual-world authoring software, VR software development kits (SDK), application program interfaces (API), and the like.
Dynamic Colliders
Colliders can be disposed and/or generated on virtual representations of an I/O device for the calculation of physical movements and reactions of the corresponding I/O device. For example, for a hockey VR environment, the I/O device can be a hockey stick, which can include one or more motion sensors coupled to or integrated into the hockey stick. The virtual representation of the hockey stick can include one or more colliders representing surfaces of the hockey stick that can come in contact with a virtual representation of a hockey puck. When a collider contacts the virtual hockey puck, the VR engine can determine physics parameters of the hockey puck and collider during the collision to determine resulting physics parameters of the hockey puck after the collision occurs. For example, when a collision occurs between the virtual hockey stick and the virtual hockey puck, the VR engine can identify the time of collision, velocity vectors of the hockey puck at the time of collision, and velocity vectors of the collider of the hockey stick at the time of collision. From these values, the VR engine can calculate resulting velocity vectors for the virtual hockey puck immediately after the collision.
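As a rough illustration of the collision computation described above, the following sketch derives a puck's post-impact velocity from the velocities at the moment of contact. This is not the disclosed implementation; the function name, the two-dimensional reduction, and the restitution coefficient are assumptions for illustration only.

```python
def resolve_collision(puck_v, blade_v, normal, restitution=0.9):
    """Return the puck's velocity after impact against a moving blade collider.

    puck_v, blade_v: (vx, vy) velocity vectors at the time of collision.
    normal: unit vector normal to the blade collider face at the contact point.
    """
    # Relative velocity of the puck with respect to the blade.
    rel = (puck_v[0] - blade_v[0], puck_v[1] - blade_v[1])
    # Component of the relative velocity along the contact normal.
    vn = rel[0] * normal[0] + rel[1] * normal[1]
    if vn >= 0:
        return puck_v  # already separating; no impulse to apply
    # Reflect the normal component, scaled by the restitution coefficient.
    j = -(1 + restitution) * vn
    return (puck_v[0] + j * normal[0], puck_v[1] + j * normal[1])
```

For example, a puck moving at -3 m/s along the contact normal into a blade moving at +1 m/s rebounds at roughly 4.6 m/s when the restitution is 0.9.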
However, due to the complicated surfaces of I/O devices (e.g., the curved blade of a hockey stick and the curved sides of a hockey puck), VR systems typically implement compound colliders. Compound colliders, when aggregated with other compound colliders, approximate the shape of the I/O device surface (e.g., the hockey stick blade and puck). Compound colliders are limited in that each compound collider is a simplified shape, such as a two-dimensional square, a box, a sphere, a capsule, and the like, which can include sharp edges. These sharp edges, along with the static nature of the compound collider, can compromise the accuracy of collision calculations. For example, if the virtual hockey puck comes in contact with the collider edge, as opposed to
the center of the collider, the VR engine can calculate incorrect physics vectors for the hockey puck.
According to the present disclosure, dynamic colliders may be implemented within a VR system. A first dynamic collider can be disposed or generated on a surface of a virtual object. In some cases, the virtual object may be a virtual representation of an input device, such as I/O device 110 of FIG. 1. A second dynamic collider may be disposed and/or generated on a surface of another virtual object. The first and second dynamic colliders can correspond to one another, such that the first dynamic collider and the second dynamic collider interact with each other when the surfaces of the virtual objects contact one another.
For example, a dynamic collider can be repositioned along the surface of its corresponding virtual object based on the position of the other corresponding dynamic collider. For example and as depicted in FIGS. 2 and 3, in some cases the first dynamic collider may be disposed on a virtual representation of a hockey stick blade. The second dynamic collider may be disposed on a virtual representation of a hockey puck.
As the hockey puck travels, the dynamic collider of the hockey puck can be repositioned along the hockey puck based on the location of the dynamic collider on the hockey stick blade, and vice versa. For example, when the hockey puck approaches the hockey stick blade, the dynamic colliders can reposition along the respective surfaces such that, when the hockey puck contacts the hockey stick blade, the dynamic colliders are flush with one another (as opposed to only the dynamic collider edges contacting one another). This can increase the chances that physics vectors calculated from the collision (between the hockey puck and the hockey stick blade) are not impacted by the shapes and limitations of the underlying colliders. This can reduce glitches, simulate smooth edges of the virtual objects, and the like.
In some cases, the repositioning of the dynamic colliders with respect to one another may occur after crossing a proximity threshold. For example, the hockey puck of the above example may move towards the hockey stick blade, and may cross within a predefined distance threshold (e.g., 1 or 2 meters). The VR system can identify this crossing of the proximity threshold and may initiate the repositioning of the dynamic colliders. This may reserve VR system resources (e.g., processing resource, etc.).
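A minimal sketch of such a proximity gate, using hypothetical names and the 2-meter example threshold mentioned above, might look like the following:

```python
import math

# Illustrative proximity gate: dynamic-collider repositioning is only
# activated once the two objects cross a distance threshold, which can
# conserve processing resources while they are far apart.
PROXIMITY_THRESHOLD = 2.0  # meters (example value from the text)

def should_reposition(puck_pos, blade_pos, threshold=PROXIMITY_THRESHOLD):
    """Return True when the puck is within the repositioning threshold."""
    dx = puck_pos[0] - blade_pos[0]
    dy = puck_pos[1] - blade_pos[1]
    dz = puck_pos[2] - blade_pos[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= threshold
```

The repositioning logic would then run only on physics frames where this check returns True.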
The dynamic repositioning can be performed such that the center of a face of at least one of the pair of dynamic colliders contacts or has a minimum distance from a contact point of the
other of the pair of dynamic colliders. For example, if a hockey puck is approaching a hockey stick, the hockey puck will be traveling along a vector (that may change due to collisions with the ice, boards, players, and the like) towards the stick. The dynamic collider on the stick can shift to be positioned as close as possible to intersect with that vector, e.g., as the player moves the stick.
Multiple dynamic collider combinations may exist in a VR environment. For example, while there may be a one-to-one pairing between dynamic colliders (e.g., the hockey stick blade and the hockey puck), multiple pairings may exist (e.g., between another hockey stick blade and another collider on the single hockey puck, and the like). Further, in some cases compound colliders can also be implemented in the VR environment in conjunction with the dynamic colliders. For example, compound colliders may be disposed and/or generated along the hockey rink boards (e.g., for collisions between the hockey puck and the boards).
FIGS. 2 and 3 depict images of dynamic colliders in a VR environment according to embodiments of the present disclosure. A VR system, such as the VR system 100 discussed with reference to FIG. 1, may generate a VR environment corresponding to a hockey simulation. A hockey puck may be generated, along with other features typical of a hockey game, such as an ice surface, ice rink layouts (hockey rink, stadium, lights, etc.), hockey boundary lines, and the like. The VR system can also include a hockey stick as an I/O device, for which the VR system can generate a virtual representation and receive user input.
A dynamic collider 225 may be positioned on the annular surface of the virtual hockey puck. A corresponding dynamic collider 230 can be positioned on a surface of the virtual hockey stick blade. The colliders 225, 230 are computer-generated constructs that need not (and typically will not) be rendered in the graphical user interface. Similarly, although the dynamic colliders 225, 230 are depicted as having three-dimensional depth, the dynamic colliders 225, 230 can be represented as a two-dimensional planar shape, or the leading planar shape of a three-dimensional dynamic collider can be utilized for physics calculations. As depicted in FIGS. 2 and 3, the dynamic colliders 225, 230 can be rendered proud of, tangential with, or recessed from a depicted boundary of an object such as an ice-hockey stick or puck.
The dynamic collider 225 may be configured to reposition along the annular surface based on the location of the dynamic collider 230. For example, the VR system can determine a position of the dynamic collider 230 in a particular physics frame (Step 405 of FIG. 4). The VR
system can also determine a vector normal to the dynamic collider 225, such as a vector normal to the surface 235 of the dynamic collider 225 (Step 410). The VR system can determine a position for the normal vector for normalizing the normal vector to the position of the dynamic collider 230 (Step 415). The VR system can reposition the dynamic collider 225 (e.g., in the next physics frame) such that the vector normal to the dynamic collider is also normal to the position of the dynamic collider 230 (Step 420). For example, the VR system can determine an angle between the current position of the normal vector and the new position, and can adjust the position of the dynamic collider 225 according to the angle. In some cases, the VR system may monitor VR geocoordinates, and the VR system can determine a geocoordinate for repositioning the dynamic collider 225.
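The repositioning steps above (Steps 405-420) can be sketched in a simplified two-dimensional form. The function name and the 2-D reduction are illustrative assumptions, not the disclosed implementation:

```python
import math

def reposition_normal(collider_pos, collider_normal, other_pos):
    """Return the target unit normal for the next physics frame, aimed at
    the other collider, plus the rotation angle needed to get there."""
    # Desired normal: unit vector from this collider toward the other collider
    # (Steps 405 and 415).
    dx = other_pos[0] - collider_pos[0]
    dy = other_pos[1] - collider_pos[1]
    dist = math.hypot(dx, dy)
    desired = (dx / dist, dy / dist)
    # Angle between the current normal and the desired normal.
    dot = collider_normal[0] * desired[0] + collider_normal[1] * desired[1]
    angle = math.acos(max(-1.0, min(1.0, dot)))
    # In a full engine the collider would be rotated along the object surface
    # by `angle` (Step 420); here we simply return the target orientation.
    return desired, angle
```

For instance, a collider whose face normal points along +x, with the other collider directly above it, would be rotated by 90 degrees so the faces meet flush.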
In some cases, one dynamic collider (e.g., dynamic collider 225) may be repositionable, while the corresponding collider may be statically positioned. In other cases, both dynamic colliders may be repositionable based on the location of the other collider. In these cases, both dynamic colliders may implement the steps discussed above.
Mitigation of Effects Related to Insufficient Computing Power for Physics Calculations
A common problem with VR systems is the "rubber band" effect, along with physics glitches and other unwanted behavior. The movement of virtual objects in the VR environment can lag behind the movement of the corresponding I/O devices of the VR system. The VR engine of the system can behave discretely and simplify the VR physics calculations compared to the real physics of the corresponding I/O devices (e.g., in order to calculate the physics vectors more quickly). However, this can result in the "rubber band" effect, which can manifest as speeding up, or slowing down, of the VR graphics displayed to the user. In some cases, glitches or other unwanted behavior may occur due to these computing limitations.
Insufficient computing power of the VR system can result in a freeze or inaccurate depictions (e.g., a glitch) of the rendering frame. The freeze or inaccuracies can cause problems for users of the VR system. If computing power is lacking for the physics calculations, the physics engine will not have completed the number of physics calculations required for the rendering frame (e.g., 13 ms), which can be a predefined number of physics frames (e.g., approximately 7 physics frames of 2 ms each).
For example, in the example of the hockey VR simulation, as the hockey puck contacts a hockey stick blade in the VR environment, the impact can require an increasing number of
physics calculations performed by the VR engine. The engine can compensate for this increase by slowing down the time within the environment. However, as the engine "catches up" to the time of the physical world and the physics calculations for the virtual objects, the physics calculations of the rendering frame can speed up the time within the environment. This can result in a user observing slow motion of the hockey puck leaving the hockey stick, followed by either a quick burst of the hockey puck moving at faster-than-expected speeds if a computer freeze occurred, or an increase in speed that catches up to the expected speed of the hockey puck.
According to the present disclosure, a VR system can be configured to linearly adjust certain parameters within the VR environment for processing constraints of the VR engine. The VR engine can monitor a VR parameter of the VR environment, for example, a distance between a first virtual object and a second virtual object. In some cases, the VR engine can monitor a distance between a virtual object and a position of a user within the VR environment. For example, in the context of a hockey VR simulation, the VR engine can monitor the distance between a virtual hockey puck and a virtual hockey stick of a user.
The VR engine can adjust a physics frame size in relation to the rendering frame based on the value of the monitored VR parameter. For example, the VR engine can monitor the distance between a virtual hockey puck and a virtual hockey stick. In this example, if the VR engine has completed a rendering frame, and if there is time to render another graphics frame (e.g., which can begin 0.01 seconds after a preceding frame), but the physics engine has completed only a subset of the physics frames required for a rendering frame (e.g., 4 out of 7 physics frames), the VR system can render the state of the game from the physics frames that the physics engine has completed. This technique can still render graphics frames at the same pace as a fully calculated rendering frame, so the rendering frames can have the same timing in the real world, but in some cases may represent a shorter time in the VR simulation.
When the virtual hockey puck is far away from the virtual hockey stick, the physics frame can be longer, so there may be fewer frames for the same amount of time and the system may have sufficient computational power. When the virtual hockey puck approaches the virtual hockey stick, the physics frame length can be linearly shortened. Thus, the same amount of real time can be divided among more physics frames. The computations performed may be more precise, but more computational power may be required for the rendering frame. The physics engine may experience the "slo-mo" scenario described above based on the lack of
computing power. However, the VR system may mitigate the “slo-mo” effect by adjusting other parameters of the VR environment.
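One way to picture the linear shortening of the physics frame length is a simple interpolation between a coarse step (puck far from the stick) and a fine step (puck at the stick). All constants below are illustrative assumptions, not values from the disclosure:

```python
# Illustrative constants (assumptions): fine 2 ms step near contact,
# coarse 8 ms step at or beyond a 10 m separation.
MIN_STEP_MS = 2.0
MAX_STEP_MS = 8.0
FAR_DISTANCE = 10.0  # meters

def physics_step_ms(distance):
    """Linearly interpolate the physics frame length from the puck-stick
    distance, clamped to the [MIN_STEP_MS, MAX_STEP_MS] range."""
    t = max(0.0, min(1.0, distance / FAR_DISTANCE))
    return MIN_STEP_MS + t * (MAX_STEP_MS - MIN_STEP_MS)
```

As the puck closes from 10 m to contact, the step shrinks linearly from 8 ms to 2 ms, so the same real time is divided among more physics frames.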
The VR engine can adjust other parameters of the VR environment based on the parameter monitoring. For example, the VR engine can identify that the virtual object is closer to the user compared to a previous monitoring interval. The VR engine, based on this change in distance, may adjust another parameter of the VR environment, for example, a physical parameter of a virtual object, such as a drag parameter, a friction parameter, a gravity parameter, and the like.
FIG. 5 depicts a timing diagram of a VR system with the necessary processing/computing resources for generating the VR environment. As shown, the VR engine receives the tracking data, performs physics calculations (e.g., timing, velocities, collider data, and the like), and generates a GPU rendering frame from the tracking data and calculations. In an ideal VR engine (e.g., one with sufficient computing capabilities regardless of the computation burden), the VR environment timing aligns precisely (or near precisely) with the real-world timing (e.g., the VR engine initiates the GPU rendering frame as the tracking data is received, and VR environment time passes at a one-to-one ratio with the real-world timing).
FIG. 6 depicts a timing diagram for a VR system according to an embodiment of the disclosure provided herein. As depicted, the VR engine may adjust the timing parameter of the VR environment by increasing the number of physics frames included in a rendering frame. For example, the VR engine can experience "slow motion" in a GPU rendering frame due to the increase in physics calculations performed by the VR system. The VR engine can mitigate the slow-motion effect by adjusting other parameters of the VR system. In some cases, such as in the context of a sports VR simulation, the VR system can adjust the number of physics frames in a rendering frame in the VR environment when a ball or puck comes closer to a user or I/O device. For example, the VR system can determine a position of a virtual hockey puck in relation to a virtual hockey stick corresponding to an I/O hockey stick of a user. The VR system can determine that the distance between the virtual hockey puck and the virtual hockey stick falls below a predefined threshold. The VR system can then increase the number of physics frames within a rendering frame, which can increase the number of physics calculations for the rendering frame of the VR environment.
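The threshold-based adjustment of the physics frame count could be sketched as follows; the threshold and frame counts are chosen only for illustration (the 7-frame figure echoes the example earlier in this section):

```python
def physics_frames_per_render(distance, near_threshold=2.0,
                              base_frames=4, boosted_frames=7):
    """Return how many physics frames to pack into the next rendering frame.

    More physics frames (finer, more precise steps) are used once the puck
    falls within the near-contact threshold of the stick; all parameter
    values here are illustrative assumptions.
    """
    return boosted_frames if distance < near_threshold else base_frames
```

The engine would evaluate this once per rendering frame and size its physics steps accordingly.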
In some cases, the VR system may implement this “slow motion” scenario in order to generate a high quality physics simulation. For example, the VR system can increase the number of physics frames within a given rendering frame. This increase can provide an increase in physics calculations per rendering frame. However, the VR system may also compensate for this increase in physics frames by adjusting other VR parameters (e.g., to compensate for an increase in processing resources in a process-resource-limiting system), such as physics parameters of one or more virtual objects of the VR environment. Further, in some cases, the increase in physics frames may be extended based on the desired quality of the physics simulations.
In some cases, the parameter adjustment within the VR environment may be linear in nature. For example, the VR engine may adjust the timing parameter of the environment proportionally to the distance of the tracked virtual object from the user. Thus, the adjustments made in the VR environment may be imperceptible to a user, and may reduce the possibility of the "rubber band" effect, or may reduce the frequency and severity of glitch occurrences and other unwanted behavior.
Fast-Paced Physics Calculations
The VR engine may also be configured to provide fast-paced physics calculations. Physics calculations in a VR engine are discrete. The calculations are repeated within time steps of the same length, which are generally between 20 and 50 ms. This time step may be insufficient for the fast and precise movements that can occur in simulations (e.g., for sports simulations), and may lead to glitches and inaccuracy. The VR engine described herein may be configured to reduce the time step to 2 ms. This may reduce glitches and increase the accuracy of a rendered VR environment, particularly for sports-related simulations.
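A fixed 2 ms time step is commonly driven by an accumulator loop that consumes real elapsed time in fixed increments. The following generic sketch illustrates the pattern; it is a common technique, not necessarily the engine's actual loop:

```python
PHYSICS_STEP = 0.002  # fixed 2 ms physics time step, in seconds

def run_physics(accumulator, elapsed, step_fn):
    """Consume `elapsed` seconds of real time in fixed 2 ms physics steps.

    Returns the leftover time (carried into the next call) and the number
    of physics steps executed.
    """
    accumulator += elapsed
    steps = 0
    while accumulator >= PHYSICS_STEP:
        step_fn(PHYSICS_STEP)  # advance the simulation by one fixed step
        accumulator -= PHYSICS_STEP
        steps += 1
    return accumulator, steps
```

A 13 ms rendering interval then yields six full 2 ms physics steps, with roughly 1 ms carried into the next interval.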
VR Joints
Joints in a VR environment can connect two virtual objects together to simulate a physics movement. In some cases, joints can connect objects representing the position of a VR tracker (e.g., attached to a hockey stick) with the tracker's virtual representation with colliders.
However, multiple built-in joints may be insufficient for fast-paced physics simulations and may cause a "rubber band" effect. The VR system described herein may incorporate specific joints that can add kinetic energy and torque to the mapped VR equipment object, which can follow the real-world positioning of the equipment quickly and precisely. Sample pseudocode for the joint of the VR system is provided below:
```csharp
Vector3 positionDelta = targetPosition - followingObjectPosition;
Quaternion rotationDelta = targetRotation * Quaternion.Inverse(followingObjectRotation);
followingObject.velocity = positionDelta * velocityCoeff;

float angle;
Vector3 axis;
// Gets the axis of rotation and an angle around that axis representing the final rotation.
rotationDelta.ToAngleAxis(out angle, out axis);
Vector3 angularTarget = angle * axis;
followingObject.angularVelocity = angularVelocityCoeff * angularTarget;
```
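A rough Python transcription of the position-tracking half of the pseudocode above, reduced to one dimension (engine types such as vectors and quaternions are omitted), shows the idea: on each step the joint sets the follower's velocity proportional to its positional error, so the follower converges on the tracked target:

```python
def joint_step(follower_pos, target_pos, velocity_coeff, dt):
    """One physics step of a velocity-tracking joint (1-D toy model)."""
    position_delta = target_pos - follower_pos
    velocity = position_delta * velocity_coeff  # kinetic "pull" toward target
    return follower_pos + velocity * dt

# Drive the follower toward a target 1 m away over 100 steps of 10 ms each.
pos = 0.0
for _ in range(100):
    pos = joint_step(pos, 1.0, velocity_coeff=10.0, dt=0.01)
# pos has now converged to just under the 1.0 m target
```

With `velocity_coeff = 10` and a 10 ms step, the follower closes a 1 m gap to within a fraction of a millimeter after 100 steps, which is the quick, precise tracking behavior described in the text.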
EQUIVALENTS
Although preferred embodiments of the invention have been described using specific terms, such description is for illustrative purposes only, and it is to be understood that changes and variations may be made without departing from the spirit or scope of the following claims.
INCORPORATION BY REFERENCE
The entire contents of all patents, published patent applications, and other references cited herein are hereby expressly incorporated herein in their entireties by reference.
Claims
1. A method for virtual reality (VR) tracking comprising: generating, in a VR system, a first virtual representation of a first object; coupling, in the VR system, a first dynamic collider to a surface of the first virtual representation; generating, in the VR system, a second virtual representation of a second object; coupling, in the VR system, a second dynamic collider to a surface of the second virtual representation; and normalizing, in the VR system, a position of the first dynamic collider to a position of the second dynamic collider for physics computations between the first virtual representation and the second virtual representation.
2. The method of claim 1, wherein the first dynamic collider and the second dynamic collider are two-dimensional.
3. The method of claim 1, further comprising: determining a position of the first dynamic collider in relation to the second dynamic collider; and modifying a position of the first dynamic collider on the surface of the first virtual representation based on the determination.
4. The method of claim 1, further comprising: determining that the first dynamic collider contacts the second dynamic collider; and calculating physics vectors for the second virtual representation based on the contact.
5. The method of claim 4, further comprising: calculating physics vectors for the first virtual representation during a time step when the contact occurs, wherein the calculating the physics vectors for the second virtual representation is based on the calculated physics vectors for the first virtual representation during the time step.
6. The method of claim 4, further comprising: calculating physics vectors for the second virtual representation during a time step when the contact occurs, wherein the calculating the physics vectors for the second virtual representation is based on the calculated physics vectors for the second virtual representation during the time step.
7. The method of claim 4, wherein the physics vectors comprise one or more selected from the group consisting of: speed vectors, rotation vectors, or a combination thereof.
8. The method of claim 1, wherein the first object, the second object, or both, comprise sports equipment.
9. The method of claim 8, wherein the first object comprises a hockey stick.
10. The method of claim 8, wherein the second object comprises a hockey puck.
11. The method of claim 1, further comprising: generating, in the VR system, a third virtual representation of a third object; and coupling a compound collider to a surface of the third virtual representation, wherein the compound collider remains unassociated with both the first dynamic collider and the second dynamic collider.
12. A method for computations in a virtual reality (VR) system, the method comprising: identifying a distance between a first virtual object and a second virtual object within a rendered VR environment; and adjusting a VR computational parameter of the rendered VR environment based on the distance.
13. The method of claim 12, wherein the first virtual object is a hockey stick, and the second virtual object is a hockey puck.
14. The method of claim 12, wherein the VR computational parameter is a physical frame length, a physical parameter of virtual objects, or a combination thereof.
15. The method of claim 14, wherein the physical parameter of the virtual objects comprises a drag parameter, a friction parameter, a gravity parameter, or a combination thereof.
16. The method of claim 12, wherein the adjustment is a linear adjustment proportional to a change in the distance between the first virtual object and the second virtual object.
17. A computer-implemented method via a Virtual Reality (VR) system, the computer- implemented method comprising: determining a physics simulation quality threshold for a VR environment; determining a number of physics frames to be included in a rendering frame for the VR environment; identifying a compensating factor based on the number of physics frames and processing limitations of the VR system; adjusting one or more physics parameters of a virtual object in the VR environment according to the compensating factor; and generating the VR environment according to the number of physics frames and the adjusted one or more physics parameters.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202163262533P | 2021-10-14 | 2021-10-14 | |
| US63/262,533 | 2021-10-14 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023064762A1 true WO2023064762A1 (en) | 2023-04-20 |
Family
ID=85988044
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2022/077897 Ceased WO2023064762A1 (en) | 2021-10-14 | 2022-10-11 | Computer-implemented methods for virtual reality systems |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2023064762A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN116650940A (en) * | 2023-05-10 | 2023-08-29 | 哈尔滨工业大学 | A method for realizing curling virtual reality competition |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8576222B2 (en) * | 1998-07-17 | 2013-11-05 | 3D Systems, Inc. | Systems and methods for interfacing with a virtual object in a haptic virtual environment |
| KR20180095588A (en) * | 2015-12-17 | 2018-08-27 | 뷰리시티 게엠베하 | Method and apparatus for motion analysis of sports apparatus |
| US20190318522A1 (en) * | 2018-04-11 | 2019-10-17 | Electronic Arts Inc. | Collision Detection and Resolution in Virtual Environments |
| US10661149B2 (en) * | 2017-03-07 | 2020-05-26 | vSports, LLC | Mixed-reality sports tracking and simulation |
Non-Patent Citations (1)
| Title |
|---|
| WANG BIN, ZHANG RUIQI, XI CHONG, SUN JING, YANG XIAOCHUN: "Virtual and Real-Time Synchronous Interaction for Playing Table Tennis with Holograms in Mixed Reality", SENSORS, vol. 20, no. 17, pages 4857, XP093057055, DOI: 10.3390/s20174857 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11508116B2 (en) | Method and system for automated camera collision and composition preservation | |
| KR101660089B1 (en) | Augmented reality device, method for simulating interactions using the same and computer-readable storage medium | |
| US8451278B2 (en) | Determine intended motions | |
| US8597142B2 (en) | Dynamic camera based practice mode | |
| Fujimura et al. | Geometric analysis and quantitative evaluation of sport teamwork | |
| KR20160080064A (en) | Virtual sensor in a virtual environment | |
| US20130102387A1 (en) | Calculating metabolic equivalence with a computing device | |
| Bandow et al. | Comparison of a video and a virtual based environment using the temporal and spatial occlusion technique for studying anticipation in karate | |
| Dhawan et al. | Development of a novel immersive interactive virtual reality cricket simulator for cricket batting | |
| WO2023064762A1 (en) | Computer-implemented methods for virtual reality systems | |
| CN104258555A (en) | RGBD vision sensing type double-fist ball hitting fitness interaction system | |
| US8538736B1 (en) | System and method for simulating object weight in animations | |
| He et al. | Mathematical modeling and simulation of table tennis trajectory based on digital video image processing | |
| CN102024140B (en) | Drumbeating action identification method based on computer | |
| US20200380714A1 (en) | Virtual reality simulations using surface tracking | |
| US20240135618A1 (en) | Generating artificial agents for realistic motion simulation using broadcast videos | |
| JP5420518B2 (en) | Image processing apparatus, image processing program, and image processing method | |
| JP2017029577A | Game system, control method, and program | |
| Zhu et al. | Simulation and modeling of free kicks in football games and analysis on assisted training | |
| US20250082996A1 (en) | Golf simulator | |
| Mitsuhashi et al. | Motion curved surface analysis and composite for skill succession using RGBD camera | |
| Chung | Metaverse XR Components | |
| Kim et al. | On visual artifacts of physics simulation in augmented reality environment | |
| US20250285318A1 (en) | Three dimensional trajectory model and system | |
| Bogdanovych et al. | A novel approach to sports oriented video games with real-time motion streaming |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22881952; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 22881952; Country of ref document: EP; Kind code of ref document: A1 |