WO2020123841A1 - Systems and methods for rendering general virtual reality environments for vision therapy - Google Patents
- Publication number
- WO2020123841A1 (PCT application PCT/US2019/066038)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual reality
- processing device
- frame
- value
- reality display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/08—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing binocular or stereoscopic vision, e.g. strabismus
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/11—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
- A61B3/111—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring interpupillary distance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
Definitions
- Binocular dysfunctions are present in between 4% and 8% of the general population and in between 40% and 50% of the brain injury population. Some examples of binocular dysfunctions include, but are not limited to, nystagmus, strabismus, convergence insufficiency (CI), convergence excess, divergence insufficiency, and divergence excess.
- the visual symptoms of binocular dysfunctions can be exacerbated by, for example, extensive use of handheld electronic devices (e.g., smart phones, tablets, etc.) as well as by any near visual tasks (e.g., reading, computer work, etc.), adversely impacting occupational and recreational activities of those suffering from binocular dysfunctions.
- asthenopic symptoms associated with binocular dysfunctions can include, but are not limited to, double/blurred vision, eyestrain, visual fatigue, and headaches, which all negatively impact activities of daily living.
- Vision therapy is one therapeutic intervention that is commonly used to treat binocular dysfunctions.
- Exemplary embodiments of the present disclosure relate to systems, methods, apparatus, and computer-readable media for rendering virtual reality environments.
- the systems, methods, apparatus, and computer-readable media disclosed herein can be used to render the virtual reality environments for vision therapy and/or for other purposes.
- Third-party virtual reality media (e.g., games, videos, images, etc.) can be used with exemplary embodiments of the present disclosure.
- exemplary embodiments of the present disclosure can dynamically modify and/or adjust the manner in which the virtual reality media is rendered on a virtual reality display to affect the visual system vergence- accommodation conflict of a user.
- Embodiments of the present disclosure negate the need for a full-time game development team to develop customized virtual reality media specifically for vision therapy, and allow users and practitioners to leverage the full content library available on various virtual reality platforms developed for the general public.
- embodiments of the present disclosure change the game that the user is currently playing from a virtual reality media developed for the general public into a therapeutic experience.
- a virtual reality game is designed so that the distance between the two cameras in the virtual world typically matches the distance between the user’s eyes, or interpupillary distance (IPD), in the real world. This setting minimizes the effects of the visual system’s vergence-accommodation conflict to improve visual comfort of the gamer.
- Embodiments of the present disclosure interface with the underlying software and memory space in the computing device that handles communication between the virtual reality display and the game software. Embodiments of the present disclosure can intervene with the configurations and/or communications between the computing system processing the virtual reality game software and the display to render the virtual reality game software with increased disparities (differences).
- a virtual reality game designed for the general population can be modified on a frame-by-frame basis to render the virtual reality game with increased disparities between specific objects within the rendered virtual reality game that are seen by the left and right eyes in a controlled manner within the virtual environment.
- This increase in disparity causes an increase in the convergence demand required for the user to maintain a single and clear image of the objects being rendered on the display, prompting the user to rotate his/her eyes in a controlled manner.
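The relationship between eye separation, viewing distance, and convergence demand described above can be sketched numerically. The following illustrative calculation (not part of the disclosure; the 63 mm and 80 mm figures are example values) computes the total convergence angle needed to fixate a point, showing how a larger effective separation increases the demand:

```python
import math

def vergence_demand_deg(ipd_mm: float, distance_mm: float) -> float:
    """Total convergence angle (degrees) required to fixate a point
    at the given viewing distance with the given eye separation."""
    return math.degrees(2 * math.atan((ipd_mm / 2) / distance_mm))

# A typical 63 mm IPD viewing an object at 400 mm:
base = vergence_demand_deg(63, 400)       # roughly 9 degrees
# Rendering with an effective 80 mm separation raises the demand:
increased = vergence_demand_deg(80, 400)  # roughly 11.4 degrees
```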
- This stimulation of the extraocular muscles and engagement of the neural system, when done repeatedly with increased difficulty, can lead to an improvement in that person’s convergence system.
- a system, method, and non-transitory computer-readable medium are disclosed.
- Embodiments of the systems, methods, and non-transitory computer-readable media are provided for modifying the manner in which virtual reality media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user.
- Virtual reality media to be rendered on a virtual reality display is processed and one or more parameters are received from the virtual reality display.
- the one or more parameters specify the manner in which the virtual reality media is to be rendered on the virtual reality display.
- At least one of the one or more parameters is modified to alter the manner in which the virtual reality media is rendered on the virtual reality display after the one or more parameters are received and prior to transmission of the virtual reality media to the virtual reality display.
- The virtual reality media is rendered on the virtual reality display based on the at least one of the one or more parameters that have been modified to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
- The method includes receiving, by a processing device, one or more parameters from the virtual reality display, storing the one or more parameters in memory by the processing device, and modifying, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display. The value is modified based on a setting in the intermediate software.
- the method also includes retrieving, by the processing device, the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display; processing, by the processing device, the frame of virtual reality media to be rendered by the virtual reality display.
- a manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters.
- the method also includes rendering the frame of the virtual reality media on the virtual reality display as modified by the value of the at least one of the one or more parameters.
- a non-transitory computer-readable medium comprising instructions. Execution of the instructions by a processing device causes the processing device to receive one or more parameters from the virtual reality display, store the one or more parameters in memory by the processing device; and modify, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display. The value is modified based on a setting in the intermediate software.
- Execution of the instructions by the processing device also causes the processing device to retrieve the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display, and process the frame of virtual reality media to be rendered by a virtual reality display.
- a manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters.
- Execution of the instructions by the processing device also causes the processing device to transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
- a system for modifying the manner in which virtual reality media is rendered on a virtual reality display includes a virtual reality display, a memory, and a processing device.
- the processing device is operatively coupled to the memory and the virtual reality display, and is configured to receive one or more parameters from the virtual reality display, store the one or more parameters in memory, and modify, in response to execution of intermediate software, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display.
- the value is modified based on a setting in the intermediate software.
- the processing device is also configured to retrieve the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display, and process the frame of virtual reality media to be rendered by the virtual reality display.
- a manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters.
- the processing device is also configured to transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
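The receive, store, and modify flow that the system, method, and medium above all describe could be sketched as follows. This is an illustrative Python sketch; the class names, the `"ipd_mm"` key, and the 63.0/80.0 values are hypothetical, not from the disclosure:

```python
class ParameterStore:
    """In-memory space holding parameters reported by the VR display."""
    def __init__(self):
        self.params = {}

    def receive(self, name, value):
        # Store the parameter exactly as reported by the display.
        self.params[name] = value


class IntermediateSoftware:
    """Rewrites stored parameters before the next frame is processed."""
    def __init__(self, store, settings):
        self.store = store
        self.settings = settings  # parameter name -> replacement value

    def apply(self):
        for name, value in self.settings.items():
            if name in self.store.params:
                self.store.params[name] = value


store = ParameterStore()
store.receive("ipd_mm", 63.0)  # actual IPD reported by the display
IntermediateSoftware(store, {"ipd_mm": 80.0}).apply()
# The renderer now reads 80.0 mm instead of the reported 63.0 mm.
```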
- the at least one of the one or more parameters received by the processing device from the virtual reality display is an interpupillary distance and the value is modified to be different than an actual interpupillary distance between eyes of the user.
- the frame of virtual reality media can be processed by the processing device by setting an inter-camera distance between left image and right image of the frame of virtual reality media to be less than or greater than the actual interpupillary distance.
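A minimal sketch of applying an inter-camera distance, assuming the left and right virtual cameras are offset from a head position along a unit "right" vector; the function name, coordinate convention, and millimeter units are illustrative assumptions:

```python
def camera_positions(head_pos, right_vec, inter_camera_mm):
    """Left/right virtual camera positions separated by the (possibly
    modified) inter-camera distance along a unit 'right' vector."""
    half = inter_camera_mm / 2.0
    left = tuple(p - half * r for p, r in zip(head_pos, right_vec))
    right = tuple(p + half * r for p, r in zip(head_pos, right_vec))
    return left, right

# Head at x = 0 (eye height 1700 mm), with an 80 mm inter-camera distance:
left_cam, right_cam = camera_positions((0.0, 1700.0, 0.0),
                                       (1.0, 0.0, 0.0), 80.0)
```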
- the virtual reality media can be rendered on the virtual reality display on a frame-by-frame basis, and the processing device can receive the interpupillary distance on the frame-by-frame basis from the virtual reality display.
- the processing device can execute the intermediate software to modify the value of the interpupillary distance on the frame-by-frame basis after the interpupillary distance is stored in memory and before each frame is processed by the processing device to render each frame on the virtual reality display.
- the processing device can execute the intermediate software to modify the value of the interpupillary distance to be identical for each frame, to be different for each frame, and/or to be based on a pattern or profile defined by the intermediate software (e.g., a step profile, a ramp profile, and/or a sweep profile), or a combination thereof.
- FIG. 1 shows an exemplary virtual reality system in accordance with exemplary embodiments of the present disclosure.
- FIG. 2 is a block diagram of an exemplary embodiment of the virtual reality display shown in FIG. 1.
- FIG. 3 is a block diagram of an exemplary embodiment of the computing system shown in FIG. 1.
- FIG. 4 illustrates a user’s eyes in an unmodified virtual reality environment in which virtual reality media developed for the general population is rendered.
- FIG. 5 illustrates the user’s eyes in a modified virtual reality environment in which virtual reality media developed for the general population is rendered in accordance with embodiments of the present disclosure.
- FIG. 6 is a flowchart illustrating an information flow in a virtual reality environment for a single frame according to embodiments of the present disclosure.
- FIG. 7 is an exemplary graphical user interface for the intermediate software in accordance with embodiments of the present disclosure.
- FIG. 8 is a flowchart illustrating an exemplary process for modifying the manner in which VR media developed for the general public is rendered on a virtual reality display in accordance with embodiments of the present disclosure.
- Exemplary embodiments of the present disclosure can interface with the underlying controller software executing on a processing device that handles communication between the processing device and the virtual reality (VR) display to render the VR media on the VR display.
- the manner in which the VR media is rendered by exemplary embodiments is modified to increase the disparities between the images presented to the user within the virtual environment. This increase in disparity causes an increase in the convergence demand required for the user to maintain a single and clear image of the scene, prompting the user to rotate his/her eyes in a controlled manner.
- FIG. 1 shows an exemplary virtual reality system 100 in accordance with exemplary embodiments of the present disclosure.
- the virtual reality system 100 can include a computing system 110 and a virtual reality display 150.
- the virtual reality display 150 and the computing system 110 can be communicatively coupled to each other via wireless or wired communications such that the virtual reality display 150 and the computing system 110 can interact with each other to implement a virtual reality environment.
- the computing system 110 and the virtual reality display 150 can be integrally formed or housed within the same form factor (e.g., tablet, smart phone, etc.).
- the virtual reality display 150 can communicate with the computing system 110 via a communications network such that the computing system 110 can be remotely located relative to the virtual reality display 150.
- the virtual reality system 100 can be configured to provide a three-dimensional virtual reality gaming environment.
- the computing system 110 can receive and/or store virtual reality media 105, virtual media software 112, controller software 114, and an intermediate software application 116.
- the virtual reality media 105 can be a virtual reality game, movie, scenes, images, virtual worlds, and/or any other content to be rendered using the virtual reality display 150.
- the virtual reality media software 112 can be developed for the general population (as opposed to for vision therapy) such that, under normal circumstances and conditions, the virtual reality media software 112 renders data of the virtual reality media 105 in a manner that reduces vergence-accommodation conflict of a user to provide a comfortable viewing environment for a user.
- the controller software 114 can be executed by the computing system 110 to provide an interface between the virtual media software 112 and the virtual reality display and/or to control communication between the virtual reality media software 112 and the virtual reality display 150 to render images from the virtual reality media 105.
- the computing system 110 can reserve and utilize a memory space 118 in memory 120 of the computing system 110 when processing the virtual reality media 105 and interfacing with the virtual reality display 150.
- the computing system 110 can reserve and utilize the memory space 118 to store data 122 from the virtual reality media 105 (e.g., to be rendered on the virtual reality display), parameters 124 from the virtual reality media 105, and/or parameters 126 received from the virtual reality display 150.
- the computing system 110 can generate and/or utilize the memory space 118 in the memory 120 or other storage device to store an in-memory version of the virtual reality data 122, or portions thereof, to be transmitted to, and rendered by, the virtual reality display 150.
- the intermediate software application 116 can be executed by the computing system 110 to adjust/modify the data 122 from the virtual reality media software 112, parameters 124, and/or the parameters 126 subsequent to processing by the controller software 114 being executed by the computing system 110 and prior to transmission of the data 122 to the virtual reality display 150.
- the computing system 110 can access the memory space 118 and can modify the data 122 from the virtual reality media 105 and/or virtual reality media software 112 (e.g., to be rendered on the virtual reality display), parameters 124 from the virtual reality media 105 and/or virtual media software 112, and/or the parameters 126 received from the virtual reality display 150 to change the manner in which the data 122 is rendered by the virtual reality display 150 as described herein. At least some of the parameters 124 and/or 126 can be updated on a frame-by-frame basis.
- the virtual reality display 150 can communicate with the computing system 110 on a frame-by-frame basis to change or update the parameters 126 associated with the virtual reality display 150 and/or the user interacting with the virtual reality display 150 in the memory 120.
- the virtual reality display 150 can be configured to render the data 122 related to the virtual reality media 105 using the stereoscopic effect with separate images for the right and left eyes to create a visual perception of depth.
- the virtual reality display 150 can include circuitry for interfacing with the computing system 110 to communicate/transmit information associated with the virtual reality display 150 and/or with the user interacting with the virtual reality display.
- the virtual reality display 150 can transmit the parameters 126 including information related to a type of display, a resolution of the display, a refresh or frame rate of the display, and/or any other parameters that may be used by the virtual reality media software executing on the computing system 110 when processing the virtual reality media 105.
- the virtual reality display 150 can transmit the parameters 126 including information related to an inter-pupil distance (IPD) of the user as well as other information related to the user of the virtual reality display.
- the virtual reality media software 112 executing on the computing system 110 can process the virtual reality media 105 and transmit the data 122 related to the virtual reality media 105 to the virtual reality display 150, which can render virtual reality scenes or images to be viewed by one or more users of the display.
- the scenes or images can be rendered by the virtual reality display 150 on a frame-by-frame basis.
- the intermediate software 116 can be executed by the computing system 110 in parallel to the controller software 114 to modify the manner in which the scenes or images of the virtual reality media 105 are rendered on the virtual reality display 150 prior to transmission of the data 122 from the computing system 110 to the virtual reality display 150.
- the intermediate software 116 can modify/change the IPD received from the virtual reality display 150 on a frame-by-frame basis to increase the effects of the vergence-accommodation conflict of the user viewing the virtual reality display 150.
- the intermediate software 116 can be programmed to modify the IPD with a static fixed value that is the same for each frame, to modify the IPD to change on a frame-by-frame basis, and/or to modify the IPD according to one or more patterns or specified profiles on an intra-frame basis or an inter-frame basis to change the inter-camera distance based on a setting in the intermediate software 116.
- the patterns or profiles that can be used by the intermediate software 116 can include a step pattern where the received/reported IPD is changed in steps of a specified increment, a ramp pattern where the received/reported IPD is changed gradually in an increasing or decreasing manner, a sweep pattern where the received/reported IPD is changed across a spectrum of values including received/reported IPD values that are smaller than and larger than the received/reported IPD of a user, a step-ramp pattern that combines the step and ramp patterns, and/or a ramp-step pattern that combines the ramp and step patterns.
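The static, step, ramp, and sweep profiles described above could be generated per frame along the following lines. The function name, the 20 mm span, the 2 mm step, and the step-every-10-frames cadence are illustrative assumptions, not values from the disclosure:

```python
def ipd_profile(reported_ipd, pattern, n_frames, span=20.0, step=2.0):
    """Per-frame modified IPD values (mm) for a few example profiles."""
    values = []
    for i in range(n_frames):
        t = i / max(n_frames - 1, 1)  # normalized frame position, 0..1
        if pattern == "static":
            values.append(reported_ipd + span)               # same every frame
        elif pattern == "step":
            values.append(reported_ipd + step * (i // 10))   # step every 10 frames
        elif pattern == "ramp":
            values.append(reported_ipd + span * t)           # gradual increase
        elif pattern == "sweep":
            # sweep from below the reported IPD to above it
            values.append(reported_ipd - span + 2 * span * t)
    return values

ramp = ipd_profile(63.0, "ramp", 5)    # 63.0 rising to 83.0
sweep = ipd_profile(63.0, "sweep", 3)  # 43.0, 63.0, 83.0
```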
- FIG. 2 shows an embodiment of the virtual reality display 150 in accordance with exemplary embodiments of the present disclosure.
- the virtual reality display 150 can be a head mounted display that can be communicatively coupled to the computing system 110 via wireless or wired communications.
- the virtual reality display 150 can include circuitry disposed within a housing 250.
- the circuitry can include a right eye display 222, a left eye display 224, one or more right eye image capturing devices 226, one or more left eye image capturing devices 228, one or more right eye light emitting diodes 230, one or more left eye light emitting diodes 232, a right eye controller 234, a left eye controller 236, one or more display controllers 238, and one or more hardware interfaces 240.
- the right and left eye displays 222 and 224 can be disposed within the housing 250 such that the right eye display 222 is positioned in front of the right eye of the user when the housing 250 is mounted on the user’s head and the left eye display 224 is positioned in front of the left eye of the user when the housing 250 is mounted on the user’s head.
- the right eye display 222 and the left eye display 224 can be controlled by the one or more display controllers 238 to render images on the right and left eye displays 222 and 224 to induce a stereoscopic effect, which can be used to generate three-dimensional images, where objects in the images can be perceived by the user’s vision system as being at different depths while maintaining constant focal length between the user’s right eye and the right eye display 222 and between the user’s left eye and the left eye display 224.
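The constant focal length noted above is what gives rise to the vergence-accommodation conflict: accommodation stays locked to the display's fixed focal distance while vergence follows the rendered fixation distance. A small illustrative calculation (the 2 m and 0.5 m distances are example values, not from the disclosure) expresses the mismatch in diopters:

```python
def va_conflict_diopters(focal_distance_m, fixation_distance_m):
    """Vergence-accommodation mismatch in diopters: accommodation is
    locked to the display's fixed focal distance while vergence follows
    the rendered fixation distance."""
    return abs(1.0 / focal_distance_m - 1.0 / fixation_distance_m)

# Display focused at 2 m, object rendered to appear at 0.5 m:
conflict = va_conflict_diopters(2.0, 0.5)  # 1.5 D mismatch
```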
- the right eye display 222 and/or the left eye display 224 can be implemented as a light emitting diode display, an organic light emitting diode (OLED) display (e.g., passive-matrix (PMOLED) display, active-matrix (AMOLED) display), and/or any suitable display.
- the one or more right eye image capturing devices 226 can be disposed in the housing 250 relative to the right eye display 222 so that the one or more right eye image capturing devices 226 can be positioned and oriented to capture images of the user’s right eye as the user views the right eye display 222.
- the one or more left eye image capturing devices 228 can be disposed in the housing 250 relative to the left eye display 224 so that the one or more left eye image capturing devices 228 can be positioned and oriented to capture images of the user’s left eye as the user views the left eye display 224.
- the one or more right and left eye image capturing devices 226 and 228 can be infrared (IR) cameras configured to have a particular sensitivity to IR light (e.g., to capture images of IR radiation).
- the one or more right eye light emitting diodes 230 can be disposed in the housing 250 relative to the right eye display 222 so that the one or more right eye light emitting diodes 230 can be positioned and oriented to emit light towards the user’s right eye as the user views the right eye display 222.
- the one or more left eye light emitting diodes 232 can be disposed in the housing 250 relative to the left eye display 224 so that the one or more left eye light emitting diodes 232 can be positioned and oriented to emit light towards the user’s left eye as the user views the left eye display 224.
- the one or more right and left eye light emitting diodes 230 and 232 can be infrared (IR) light emitting diodes configured to emit IR light.
- the light emitting diodes project infrared light into the eye at about ten percent (10%) of the safety limit.
- the right eye controller 234 can be operatively coupled to the one or more right eye image capturing devices 226 to control an operation of the one or more right eye image capturing devices 226 and/or to process the images of the right eye captured by the one or more right eye image capturing devices 226.
- the left eye controller 236 can be operatively coupled to the one or more left eye image capturing devices 228 to control an operation of the one or more left eye image capturing devices 228 and/or to process the images of the left eye captured by the one or more left eye image capturing devices 228.
- the right and left eye controllers 234 and 236 can be configured to control a shutter, aperture, refresh rate, discharge rate, and the like of the one or more right and left eye image capturing devices 226 and 228, respectively.
- the right and left eye controllers 234 and 236 can monitor and/or track the movement of the user’s right and left eyes as the user views the right and left eye displays 222 and 224, respectively, which can be utilized by exemplary embodiments to effect vision therapy of the user for binocular dysfunctions.
- exemplary embodiments of the present disclosure can be implemented with a single integrated controller to control and interface with the right and left eye image capturing devices 226 and 228.
- the one or more display controllers 238 can be operatively coupled to right and left eye displays 222 and 224 to control an operation of the right and left eye displays 222 and 224 in response to input (e.g., the data 122) received from the computing system 110.
- the one or more display controllers 238 can be configured to render images on the right and left eye displays of the same scene and/or objects, where images of the scene and/or objects are rendered at slightly different angles or points-of-view to facilitate the stereoscopic effect.
- the one or more display controllers 238 can include graphical processing units.
- the one or more hardware interfaces 240 can facilitate communication between the head mounted virtual reality display 150 and the computing system 110.
- the virtual reality display 150 can be configured to transmit the parameters 126 to the computing system 110 and to receive data 122 from the computing system 110 via the one or more hardware interfaces 240.
- the one or more hardware interfaces 240 can be configured to receive the data 122 from the computing system 110 corresponding to images and can be configured to transmit the data 122 to the one or more display controllers 238, which can render the images on the right and left eye displays 222 and 224 to provide a virtual reality environment in three-dimensions (e.g., as a result of the stereoscopic effect).
- the housing 250 can include a mounting structure 252 and a display structure 254.
- the mounting structure 252 allows a user to wear the head mounted virtual reality display 150 on his/her head and to position the display structure over his/her eyes to facilitate viewing of the right and left eye displays 222 and 224 by the right and left eyes of the user, respectively.
- the mounting structure can be configured to generally mount the head mounted virtual reality display 150 on a user’s head in a secure and stable manner. As such, the head mounted virtual reality display 150 generally remains fixed with respect to the user’s head such that when the user moves his/her head left, right, up, and down, the head mounted virtual reality display 150 generally moves with the user’s head.
- the display structure 254 can be contoured to fit snugly against a user’s face to cover the user’s eyes and to generally prevent light from the environment surrounding the user from reaching the user’s eyes.
- the display structure 254 can include a right eye portal 256 and a left eye portal 258 formed therein.
- a right eye lens 260a can be disposed over the right eye portal and a left eye lens 260b can be disposed over the left eye portal.
- the right eye display 222, the one or more right eye image capturing devices 226, and the one or more right eye light emitting diodes 230 can be disposed within the display structure 254 behind the lens 260a covering the right eye portal 256 such that the lens 260a is disposed between the user’s right eye and each of the right eye display 222, the one or more right eye image capturing devices 226, and the one or more right eye light emitting diodes 230.
- the left eye display 224, the one or more left eye image capturing devices 228, and the one or more left eye light emitting diodes 232 can be disposed within the display structure 254 behind the lens 260 covering the left eye portal 258 such that the lens 260 is disposed between the user’s left eye and each of the left eye display 224, the one or more left eye image capturing devices 228, and the one or more left eye light emitting diodes 232.
- while the one or more right eye image capturing devices 226 and the one or more right eye light emitting diodes 230 are described as being disposed behind the lens 260 covering the right eye portal as an example embodiment, in exemplary embodiments of the present disclosure the one or more right eye image capturing devices 226 and/or the one or more right eye light emitting diodes 230 can be disposed in front of and/or around the lens 260 covering the right eye portal such that the lens 260 is not positioned between the user's right eye and the one or more right eye image capturing devices 226 and/or the one or more right eye light emitting diodes 230.
- while the one or more left eye image capturing devices 228 and the one or more left eye light emitting diodes 232 are described as being disposed behind the lens 260 covering the left eye portal as an example embodiment, in exemplary embodiments of the present disclosure the one or more left eye image capturing devices 228 and/or the one or more left eye light emitting diodes 232 can be disposed in front of and/or around the lens 260 covering the left eye portal such that the lens 260 is not positioned between the user's left eye and the one or more left eye image capturing devices 228 and/or the one or more left eye light emitting diodes 232.
- the computing system 110 transmits the data 122, along with information for rendering the data 122, to the head mounted display 150.
- the data 122 can include right and left images to be rendered by the right and left eye displays 222 and 224 and the information can include camera positions from which the images are rendered based on the parameters 124 and/or 126 (e.g. the parameter for the inter-pupil distance).
- the user’s visual system can attempt to perceive the right and left images as a single image in three-dimensional space (e.g., using the stereoscopic effect).
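The stereoscopic rendering described above can be sketched as follows. This is an illustrative example only, not from the patent; the function name and coordinate convention are assumptions. The two virtual cameras are placed symmetrically about the head position, offset along the head's right-pointing axis by half the inter-pupil distance parameter.

```python
# Hypothetical sketch: placing two virtual cameras so that the
# inter-camera distance matches a given IPD (in meters).

def stereo_camera_positions(head_position, right_axis, ipd_m):
    """Return (left, right) camera positions offset by half the IPD
    along the head's right-pointing unit axis."""
    half = ipd_m / 2.0
    left = tuple(p - half * a for p, a in zip(head_position, right_axis))
    right = tuple(p + half * a for p, a in zip(head_position, right_axis))
    return left, right

# A head 1.6 m up, facing forward, with a 63 mm IPD.
left_cam, right_cam = stereo_camera_positions((0.0, 1.6, 0.0), (1.0, 0.0, 0.0), 0.063)
```

Each display then renders the scene from its camera's position, and the offset between the two viewpoints produces the stereoscopic depth cue.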
- exemplary embodiments of the present disclosure can be configured such that the head mounted display includes the computing system 110 and/or is configured to perform the functions and operations of the computing system 110 such that the head mounted virtual reality display 150 is a self-contained, stand-alone device or system.
- while the virtual reality display 150 is shown as having two displays, exemplary embodiments of the virtual reality display can be formed by a single display that is divided into a right eye portion and a left eye portion.
- a mobile device, such as a smart phone, can be the virtual reality display.
- FIG. 3 is a block diagram of an exemplary embodiment of the computing system 110.
- the computing system 110 can be a gaming console configured to execute virtual reality games to be rendered through embodiments of the head mounted display 150 and/or can be any system configured to and/or programmed to process virtual reality media to be rendered on a display.
- the computing system 110 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing exemplary embodiments, such as the virtual reality media software 112, the controller software 114, and/or the intermediate software 116.
- the non-transitory computer-readable media may include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives), and the like.
- memory 120 included in the computing system 110 may store computer-readable and computer-executable instructions or software for implementing exemplary embodiments.
- the computing system 110 also includes processor 302 and associated core 304, and optionally, one or more additional processor(s) 302’ and associated core(s) 304’ (for example, in the case of computer systems having multiple processors/cores), for executing instances of computer-readable and computer-executable instructions or software stored in the memory 120 and other programs for controlling system hardware.
- the processor(s) 302, 302' can execute the virtual reality media software 112, the controller software 114, and/or the intermediate software 116 to process the virtual reality media 105 and render data from the virtual reality media on the virtual reality display 150.
- Processor 302 and processor(s) 302’ may each be a single core processor or multiple core (304 and 304’) processor and may be central processing units, graphical processing units, and the like.
- Virtualization may be employed in the computing system 110 so that infrastructure and resources in the computing device may be shared dynamically.
- a virtual machine 314 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
- Memory 120 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 120 may include other types of memory as well, or combinations thereof.
- a user may interact with the computing system 110 through an embodiment of the virtual reality display 150, which can display one or more images of the virtual reality media 112 in accordance with exemplary embodiments.
- the computing system 110 may include other I/O devices for receiving input from a user, for example, a keyboard or any suitable multi-point touch interface 308, a pointing device 310 (e.g., a mouse or joystick).
- the computing system 110 may include other suitable conventional I/O peripherals.
- the computing system 110 may also include one or more storage devices 324, such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that process the virtual reality media 112 including, for example, the controller software 114 and the intermediate application 116.
- Exemplary storage device 324 may also store one or more databases for storing any suitable information required to implement exemplary embodiments.
- exemplary storage device 324 can store one or more databases 328 for storing information, the data 122, the parameters 124, and/or the parameters 126, and the like.
- the databases may be updated at any suitable time to add, delete, and/or update one or more items in the databases.
- the computing system 110 can include a network interface 312 configured to interface via one or more network devices 322 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above.
- the network interface 312 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing system 110 to any type of network capable of communication and performing the operations described herein.
- the computing system 110 may be any computer system, such as a workstation, desktop computer, server, laptop, handheld computer, tablet computer (e.g., the iPad™ tablet computer), mobile computing or communication device (e.g., the iPhone™ communication device), or other form of computing or telecommunications device that is capable of communication and that has sufficient processor power and memory capacity to perform the operations described herein.
- the computing system 110 may run any operating system 316, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, Microsoft® Xbox operating systems for Xbox gaming systems, Playstation operating systems for PlayStation gaming systems, Wii operating systems for Nintendo® Wii gaming systems, Switch operating system for Nintendo® Switch gaming systems, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the computing device and performing the operations described herein.
- the operating system 316 may be run in native mode or emulated mode.
- the operating system 316 may be run on one or more cloud machine instances.
- FIG. 4 illustrates a user's eyes 402, 404 in a conventional virtual reality environment 400 in which virtual reality media developed for the general population (e.g., the virtual reality media 112) is rendered.
- the virtual reality media processed by the controller software is designed to be rendered so that the distance between the two cameras 406, 408 in the virtual world, or inter-camera distance (ICD) 410, should match the distance between the user’s eyes 402, 404, or interpupillary distance (IPD) 412, in the real world.
- FIG. 5 illustrates the user's eyes 402, 404 in a virtual reality environment 500 in accordance with embodiments of the present disclosure in which the virtual reality media from FIG. 4 (e.g., the virtual reality media 112) is rendered on the display after being modified by the intermediate application.
- the virtual reality media processed by the controller software is modified prior to transmission to the virtual reality display so that the distance between the two cameras 406, 408 in the virtual world, or inter-camera distance (ICD) 510, is different than (e.g., greater than or less than) the distance between the user’s eyes 402, 404, or interpupillary distance (IPD) 512, in the real world.
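The relationship between the reported IPD and the modified inter-camera distance can be sketched as a simple scaling, as in FIG. 5 where ICD 510 differs from IPD 512. The scale factor and clamping bounds below are assumptions for illustration, not values from the patent.

```python
# Hypothetical sketch: deriving a modified inter-camera distance (ICD)
# from the IPD reported by the display. Values are in meters.

def modified_icd(reported_ipd_m, scale, min_icd_m=0.0, max_icd_m=0.12):
    """Scale the reported IPD to obtain an ICD that is greater than
    (scale > 1) or less than (scale < 1) the user's real IPD,
    clamped to an assumed safe range."""
    icd = reported_ipd_m * scale
    return max(min_icd_m, min(max_icd_m, icd))

wider = modified_icd(0.063, 1.2)     # ICD greater than the real IPD
narrower = modified_icd(0.063, 0.5)  # ICD less than the real IPD
```

Rendering with `wider` or `narrower` in place of the true IPD changes the disparity between the left and right images, which is the mechanism the disclosure uses to adjust the vergence demand on the viewer.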
- FIG. 6 is a flowchart 600 illustrating an information flow in a virtual reality environment for a single frame according to embodiments of the present disclosure.
- the virtual reality media software 112 requests recent pose and IPD information from the controller software 114.
- the controller software 114 looks for this information from the virtual reality display 150 in the memory space.
- the controller software 114 informs the virtual reality media software 112 of any changes to pose or IPD.
- the virtual reality media software 112 sends the final frame render to the virtual reality display.
- the intermediate software 116 disrupts the flow of information from the controller software to the virtual reality media, changing the reported IPD values from the virtual reality display to modify the inter-camera distance (ICD) to be less than or greater than the reported IPD and to modify the manner in which the virtual reality data is rendered; thereby providing a therapeutic effect from virtual reality media.
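The interception described above can be sketched as a wrapper around the controller's per-frame report. The class names and the `get_pose_and_ipd` method are hypothetical stand-ins for the patent's software components, not an actual API.

```python
# Minimal sketch (assumed API): intermediate software sits between the
# controller software and the media software, rewriting the IPD reported
# by the display before the frame is rendered.

class ControllerSoftware:
    """Stand-in for controller software 114: reports pose and IPD."""
    def get_pose_and_ipd(self):
        return {"pose": (0.0, 1.6, 0.0), "ipd": 0.063}

class IntermediateSoftware:
    """Stand-in for intermediate software 116: intercepts the report
    and replaces the IPD so the resulting ICD differs from the real IPD."""
    def __init__(self, controller, icd_scale):
        self.controller = controller
        self.icd_scale = icd_scale

    def get_pose_and_ipd(self):
        report = self.controller.get_pose_and_ipd()
        report["ipd"] = report["ipd"] * self.icd_scale  # modified ICD
        return report

# The media software queries the wrapper instead of the controller directly.
wrapped = IntermediateSoftware(ControllerSoftware(), icd_scale=1.5)
frame_info = wrapped.get_pose_and_ipd()
```

Because the media software is unaware of the interception, off-the-shelf virtual reality content renders with the therapeutic camera separation without modification.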
- FIG. 7 is an exemplary graphical user interface (GUI) 700 for the intermediate software in accordance with embodiments of the present disclosure.
- the GUI 700 can include input fields for controlling the operation of the intermediate software 116 to modify the manner in which the data from the virtual reality media is rendered on the virtual reality display.
- the GUI 700 can allow a user to select various specified patterns or profiles (e.g., setting) by which the intermediate software controls the IPD value received from the virtual reality display and utilized by the computing system to render the data in a manner that increases and/or decreases the effects of the vergence-accommodation conflict of the user viewing the virtual reality display according to the patterns or profiles (e.g., by adjusting the inter-camera distance).
- the inputs can allow a user to create patterns or profiles using inputs to specify, for example, step, ramp, sweep, step-ramp, and/or ramp-step patterns with different values, which once created, can be added to the available specified patterns or profiles for subsequent use.
- an input can be provided that allows users to remove a specified pattern or profile.
- Outputs can be provided to give the user information regarding the selected pattern or profile being implemented by the intermediate software, as well as when the pattern or profile was started and a time remaining for the pattern or profile to run.
- the GUI 700 can allow a user to start, stop, and/or pause the implementation of the pattern or profile.
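The step, ramp, and sweep patterns named above can be sketched as time-varying scale profiles applied to the IPD value. The parameterization below (times in seconds, dimensionless scale factors) is an assumption for illustration; the patent does not specify these signatures.

```python
# Hedged sketch of step / ramp / sweep patterns: each returns an
# ICD scale factor as a function of elapsed time t.
import math

def step_pattern(t, step_time, before=1.0, after=1.5):
    """Jump from one scale to another at step_time."""
    return after if t >= step_time else before

def ramp_pattern(t, duration, start=1.0, end=1.5):
    """Linearly interpolate from start to end over duration, then hold."""
    frac = min(max(t / duration, 0.0), 1.0)
    return start + (end - start) * frac

def sweep_pattern(t, period, center=1.0, amplitude=0.5):
    """Oscillate around a center scale with the given period."""
    return center + amplitude * math.sin(2.0 * math.pi * t / period)
```

A step-ramp or ramp-step profile could then be composed by chaining these functions over consecutive time windows, matching the composite patterns the GUI allows users to create.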
- FIG. 8 is a flowchart illustrating an exemplary process 800 for modifying the manner in which VR media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user.
- the process 800 can be performed for each frame of virtual reality data to be rendered on a virtual reality display, can be performed once for all of the frames of virtual reality data to be rendered on the virtual reality display, and/or can be performed on some frames, but not others, of the virtual reality data to be rendered on the virtual reality display.
- a computing system 110 executes virtual reality media software 112 to process virtual reality media 105 to be rendered on a virtual reality display 150.
- intermediate software 116 and controller software 114 are executed on the computing system.
- the controller software 114 interfaces and communicates with the virtual reality display to configure the virtual reality environment.
- the controller software 114 can receive parameters 126 from the virtual reality display (e.g., such as the IPD) that can be used by the virtual reality media software to generate (left and right images of) frames to be rendered on the virtual reality display.
- the intermediate software application can modify the parameters (e.g., the reported and stored IPD) based on one or more specified programs or profiles (e.g., setting in the intermediate software).
- the virtual reality data to be rendered on the virtual reality display is altered according to the modified parameters.
- the intermediate software can modify the reported IPD received from the virtual reality display to change an inter-camera distance associated with the rendering of the data of the virtual reality media (e.g., set the inter-camera distance to be less than or greater than the reported IPD). This modification can increase the effects of the vergence-accommodation conflict of the user viewing the virtual reality display.
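The per-frame portion of process 800 can be summarized in a single sketch: receive the display's parameters, apply a profile's modification, and derive the per-eye camera placement from the altered inter-camera distance. The function and key names are hypothetical.

```python
# Illustrative end-to-end sketch of the per-frame rendering step under
# assumed names; distances are in meters.

def render_frame(reported_params, profile_scale):
    """Return hypothetical per-eye camera x-offsets for one frame,
    computed from the modified inter-camera distance."""
    icd = reported_params["ipd"] * profile_scale  # modified parameter
    half = icd / 2.0
    return {"left_x": -half, "right_x": half, "icd": icd}

frame = render_frame({"ipd": 0.063}, profile_scale=1.5)
```

Running this once per frame (or once for all frames, as the disclosure permits) yields camera separations that deliberately diverge from the user's real IPD.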
- Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods.
- One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Ophthalmology & Optometry (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Biophysics (AREA)
- Processing Or Creating Images (AREA)
Abstract
Systems, methods, and non-transitory computer-readable media are disclosed for modifying the manner in which virtual reality media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user. Virtual reality media to be rendered on a virtual reality display is processed, and one or more parameters are received from the virtual reality display. The parameters specify the manner in which the virtual reality media is to be rendered on the virtual reality display. At least one of the parameters is modified to alter the manner in which the virtual reality media is rendered on the virtual reality display, after the parameters are received and before the virtual reality media is transmitted to the virtual reality display. The virtual reality media is rendered on the virtual reality display based on the at least one modified parameter to increase an effect of the vergence-accommodation conflict on the user viewing the virtual reality display.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862778766P | 2018-12-12 | 2018-12-12 | |
| US62/778,766 | 2018-12-12 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020123841A1 true WO2020123841A1 (fr) | 2020-06-18 |
Family
ID=71077090
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2019/066038 Ceased WO2020123841A1 (fr) | 2018-12-12 | 2019-12-12 | Systèmes et procédés de rendu d'environnements généraux de réalité virtuelle pour la thérapie de la vision |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2020123841A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115002438A (zh) * | 2022-05-27 | 2022-09-02 | 厦门雅基软件有限公司 | Xr应用的开发预览方法、装置、电子设备及可读存储介质 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101352385A (zh) * | 2008-08-26 | 2009-01-28 | 北京航空航天大学 | 弱视治疗仪的瞳距调整机构 |
| US20160007849A1 (en) * | 2014-07-08 | 2016-01-14 | Krueger Wesley W O | Systems and methods for the measurement of vestibulo-ocular reflex to improve human performance in an occupational environment |
| CN106233328A (zh) * | 2014-02-19 | 2016-12-14 | 埃弗加泽公司 | 用于改进、提高或增强视觉的设备和方法 |
| CN106461939A (zh) * | 2015-05-29 | 2017-02-22 | 深圳市柔宇科技有限公司 | 自适配显示调节的方法及头戴式显示设备 |
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101352385A (zh) * | 2008-08-26 | 2009-01-28 | 北京航空航天大学 | 弱视治疗仪的瞳距调整机构 |
| CN106233328A (zh) * | 2014-02-19 | 2016-12-14 | 埃弗加泽公司 | 用于改进、提高或增强视觉的设备和方法 |
| US20160007849A1 (en) * | 2014-07-08 | 2016-01-14 | Krueger Wesley W O | Systems and methods for the measurement of vestibulo-ocular reflex to improve human performance in an occupational environment |
| CN106461939A (zh) * | 2015-05-29 | 2017-02-22 | 深圳市柔宇科技有限公司 | 自适配显示调节的方法及头戴式显示设备 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115002438A (zh) * | 2022-05-27 | 2022-09-02 | 厦门雅基软件有限公司 | Xr应用的开发预览方法、装置、电子设备及可读存储介质 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CA3023488C (fr) | Systeme et procede de generation d'une representation progressive associee a des donnees d'image de realites virtuelle et physique en correspondance surjective | |
| CA2825563C (fr) | Systeme d'affichage de realite virtuelle | |
| JP2023037626A (ja) | Hmd環境での高速中心窩レンダリングのための予測及びgpuに対する最新の更新を伴う視線追跡 | |
| US10438418B2 (en) | Information processing method for displaying a virtual screen and system for executing the information processing method | |
| US10029176B2 (en) | Data processing apparatus and method of controlling display | |
| CN106484116B (zh) | 媒体文件的处理方法和装置 | |
| US20180136723A1 (en) | Immersive displays | |
| US11237413B1 (en) | Multi-focal display based on polarization switches and geometric phase lenses | |
| KR20200056658A (ko) | 클라우드 기반의 가상현실 서비스를 위한 버퍼 관리 방법 및 장치 | |
| CN117480469A (zh) | 眼睛模型注册 | |
| WO2020123841A1 (fr) | Systèmes et procédés de rendu d'environnements généraux de réalité virtuelle pour la thérapie de la vision | |
| US20210286701A1 (en) | View-Based Breakpoints For A Display System | |
| JP6996450B2 (ja) | 画像処理装置、画像処理方法、およびプログラム | |
| JP6416338B1 (ja) | 情報処理方法、情報処理プログラム、情報処理システムおよび情報処理装置 | |
| US10659755B2 (en) | Information processing device, information processing method, and program | |
| CN113101158A (zh) | 一种基于vr的双眼视像融合训练方法和装置 | |
| WO2017022303A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, et programme | |
| JP7300569B2 (ja) | 情報処理装置、情報処理方法およびプログラム | |
| JP6934374B2 (ja) | プロセッサを備えるコンピュータにより実行される方法 | |
| JP2024040528A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
| KR20210133058A (ko) | 6dof 온라인 멀티 플레이어 vr 관광체험 시스템 | |
| JP7467748B1 (ja) | 表示制御装置、表示システム及びプログラム | |
| US20250244827A1 (en) | Information display system | |
| KR102286517B1 (ko) | 컨트롤러 입력에 따른 회전 구동 제어 방법 및 이를 이용한 헤드 마운티드 디스플레이 시스템 | |
| CN106484114B (zh) | 基于虚拟现实的交互控制方法及装置 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19895046; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19895046; Country of ref document: EP; Kind code of ref document: A1 |