
WO2020123841A1 - Systems and methods for rendering general virtual reality environments for vision therapy - Google Patents


Info

Publication number
WO2020123841A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual reality
processing device
frame
value
reality display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2019/066038
Other languages
French (fr)
Inventor
John Vito D'ANTONIO-BERTAGNOLLI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Jersey Institute of Technology
Original Assignee
New Jersey Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Jersey Institute of Technology filed Critical New Jersey Institute of Technology
Publication of WO2020123841A1 publication Critical patent/WO2020123841A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02 Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/08 Subjective types, i.e. testing apparatus requiring the active assistance of the patient, for testing binocular or stereoscopic vision, e.g. strabismus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for measuring interpupillary distance or diameter of pupils
    • A61B3/111 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for measuring interpupillary distance
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/003 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, using specific devices not provided for in groups G09G3/02 - G09G3/36, to produce spatial visual effects

Definitions

  • Binocular dysfunctions are present in between 4% and 8% of the general population and between 40% and 50% of the brain injury population. Some examples of binocular dysfunctions include, but are not limited to, nystagmus, strabismus, convergence insufficiency (CI), convergence excess, divergence insufficiency, and divergence excess.
  • The visual symptoms of binocular dysfunctions can be exacerbated by, for example, extensive use of handheld electronic devices (e.g., smartphones, tablets, etc.) as well as by any near visual tasks (e.g., reading, computer work, etc.), adversely impacting the occupational and recreational activities of those suffering from binocular dysfunctions.
  • Asthenopic symptoms associated with binocular dysfunctions can include, but are not limited to, double/blurred vision, eyestrain, visual fatigue, and headaches, all of which negatively impact activities of daily living.
  • Vision therapy is one therapeutic intervention that is commonly used to treat binocular dysfunctions.
  • Exemplary embodiments of the present disclosure relate to systems, methods, apparatus, and computer-readable media for rendering virtual reality environments.
  • The systems, methods, apparatus, and computer-readable media disclosed herein can be used to render the virtual reality environments for vision therapy and/or for other purposes.
  • Using third-party virtual reality media (e.g., games, videos, images, etc.), exemplary embodiments of the present disclosure can dynamically modify and/or adjust the manner in which the virtual reality media is rendered on a virtual reality display to affect the vergence-accommodation conflict of the user's visual system.
  • Embodiments of the present disclosure negate the need for a full-time game development team to develop customized virtual reality media specifically for vision therapy, and allow users and practitioners to leverage the full content library available on various virtual reality platforms developed for the general public.
  • Embodiments of the present disclosure change the game that the user is currently playing from virtual reality media developed for the general public into a therapeutic experience.
  • A virtual reality game is designed so that the distance between the two cameras in the virtual world typically matches the distance between the user's eyes, or interpupillary distance (IPD), in the real world. This setting minimizes the effects of the visual system's vergence-accommodation conflict to improve the visual comfort of the gamer.
  • Embodiments of the present disclosure interface with the underlying software and memory space in the computing device that handles communication between the virtual reality display and the game software. Embodiments of the present disclosure can intervene in the configurations and/or communications between the computing system processing the virtual reality game software and the display to render the virtual reality game software with increased disparities (differences).
  • A virtual reality game designed for the general population can be modified on a frame-by-frame basis to render the virtual reality game with increased disparities between specific objects within the rendered virtual reality game that are seen by the left and right eyes in a controlled manner within the virtual environment.
  • This increase in disparity causes an increase in the convergence demand required for the user to maintain a single and clear image of the objects being rendered on the display, prompting the user to rotate his/her eyes in a controlled manner.
  • This stimulation of the extraocular muscles and engagement of the neural system, when done repeatedly with increasing difficulty, can lead to an improvement in that person's convergence system.
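The geometry behind this convergence demand can be illustrated with a short calculation. Assuming the standard vergence relation (the disclosure does not state this formula explicitly), the angle through which the eyes must converge to fuse a point at distance d, given an effective inter-camera separation s, is 2·arctan(s/(2d)); widening s beyond the true IPD therefore raises the demand. The function name and numbers below are illustrative, not from the disclosure:

```python
import math

def vergence_demand_deg(inter_camera_m: float, object_distance_m: float) -> float:
    """Vergence angle (degrees) needed to fuse a point at the given distance
    when the stereo cameras are separated by inter_camera_m."""
    return math.degrees(2.0 * math.atan(inter_camera_m / (2.0 * object_distance_m)))

# A typical adult IPD of ~63 mm, object rendered 0.5 m away:
baseline = vergence_demand_deg(0.063, 0.5)
# Widening the effective inter-camera distance to 80 mm increases the
# convergence demand, prompting the eyes to rotate inward further:
modified = vergence_demand_deg(0.080, 0.5)
assert modified > baseline
```

With these example values, the demand rises from roughly 7.2 to roughly 9.1 degrees, which is the kind of controlled increase the therapy relies on.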
  • A system, method, and non-transitory computer-readable medium are disclosed.
  • Embodiments of the systems, methods, and non-transitory computer-readable media are provided for modifying the manner in which virtual reality media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user.
  • Virtual reality media to be rendered on a virtual reality display is processed, and one or more parameters are received from the virtual reality display.
  • The one or more parameters specify the manner in which the virtual reality media is to be rendered on the virtual reality display.
  • At least one of the one or more parameters is modified to alter the manner in which the virtual reality media is rendered on the virtual reality display after the one or more parameters are received and prior to transmission of the virtual reality media to the virtual reality display.
  • The virtual reality media is rendered on the virtual reality display based on the at least one of the one or more parameters that has been modified, to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
  • The method includes receiving, by a processing device, one or more parameters from the virtual reality display; storing the one or more parameters in memory by the processing device; and modifying, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display. The value is modified based on a setting in the intermediate software.
  • The method also includes retrieving, by the processing device, the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display; and processing, by the processing device, the frame of virtual reality media to be rendered by the virtual reality display.
  • A manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters.
  • The method also includes rendering the frame of the virtual reality media on the virtual reality display as modified by the value of the at least one of the one or more parameters.
  • A non-transitory computer-readable medium comprising instructions is disclosed. Execution of the instructions by a processing device causes the processing device to receive one or more parameters from the virtual reality display; store the one or more parameters in memory; and modify, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display. The value is modified based on a setting in the intermediate software.
  • Execution of the instructions by the processing device also causes the processing device to retrieve the one or more parameters, including the value of the at least one of the one or more parameters, and a frame of the virtual reality media to be rendered on the virtual reality display, and to process the frame of virtual reality media to be rendered by a virtual reality display.
  • A manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters.
  • Execution of the instructions by the processing device also causes the processing device to transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
  • A system for modifying the manner in which virtual reality media is rendered on a virtual reality display includes a virtual reality display, a memory, and a processing device.
  • The processing device is operatively coupled to the memory and the virtual reality display, and is configured to receive one or more parameters from the virtual reality display, store the one or more parameters in memory, and modify, in response to execution of intermediate software, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display.
  • The value is modified based on a setting in the intermediate software.
  • The processing device is also configured to retrieve the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display, and to process the frame of virtual reality media to be rendered by the virtual reality display.
  • A manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters.
  • The processing device is also configured to transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
  • The at least one of the one or more parameters received by the processing device from the virtual reality display can be an interpupillary distance, and the value can be modified to be different than the actual interpupillary distance between the eyes of the user.
  • The frame of virtual reality media can be processed by the processing device by setting an inter-camera distance between the left image and the right image of the frame of virtual reality media to be less than or greater than the actual interpupillary distance.
  • The virtual reality media can be rendered on the virtual reality display on a frame-by-frame basis, and the processing device can receive the interpupillary distance on the frame-by-frame basis from the virtual reality display.
  • The processing device can execute the intermediate software to modify the value of the interpupillary distance on the frame-by-frame basis after the interpupillary distance is stored in memory and before each frame is processed by the processing device to render each frame on the virtual reality display.
  • The processing device can execute the intermediate software to modify the value of the interpupillary distance to be identical for each frame, to be different for each frame, and/or to be based on a pattern or profile (or combination thereof) defined by the intermediate software, such as a step profile, a ramp profile, and/or a sweep profile.
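The frame-by-frame interception described above can be sketched as follows. This is a hypothetical illustration only: the function and variable names are invented, and a real implementation would modify the stored IPD in the memory space shared between the controller software and the display rather than operate on a Python list.

```python
from typing import Callable, Iterable, List

def intercept_frames(reported_ipds: Iterable[float],
                     modifier: Callable[[int, float], float]) -> List[float]:
    """Per frame: take the IPD reported by the display, pass it through the
    intermediate-software modifier, and return the value actually used to set
    the inter-camera distance when that frame is rendered."""
    return [modifier(frame, ipd) for frame, ipd in enumerate(reported_ipds)]

# Example setting: widen the effective IPD by a fixed 10 mm on every frame.
widen = lambda _frame, ipd: ipd + 0.010
modified = intercept_frames([0.063, 0.063, 0.063], widen)
assert all(m > 0.063 for m in modified)  # every frame renders with extra disparity
```

Because the modifier receives the frame index, the same hook supports per-frame schedules (step, ramp, sweep) as well as a static fixed value.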
  • FIG. 1 shows an exemplary virtual reality system in accordance with exemplary embodiments of the present disclosure.
  • FIG. 2 is a block diagram of an exemplary embodiment of the virtual reality display shown in FIG. 1.
  • FIG. 3 is a block diagram of an exemplary embodiment of the computing system shown in FIG. 1.
  • FIG. 4 illustrates a user's eyes in an unmodified virtual reality environment in which virtual reality media developed for the general population is rendered.
  • FIG. 5 illustrates the user's eyes in a modified virtual reality environment in which virtual reality media developed for the general population is rendered in accordance with embodiments of the present disclosure.
  • FIG. 6 is a flowchart illustrating an information flow in a virtual reality environment for a single frame according to embodiments of the present disclosure.
  • FIG. 7 is an exemplary graphical user interface for the intermediate software in accordance with embodiments of the present disclosure.
  • FIG. 8 is a flowchart illustrating an exemplary process for modifying the manner in which VR media developed for the general public is rendered on a virtual reality display in accordance with embodiments of the present disclosure.
  • Exemplary embodiments of the present disclosure can interface with the underlying controller software executing on a processing device that handles communication between the processing device and the virtual reality (VR) display to render the VR media on the VR display.
  • The manner in which the VR media is rendered by exemplary embodiments is modified to increase the disparities between the images presented to the user within the virtual environment. This increase in disparity causes an increase in the convergence demand required for the user to maintain a single and clear image of the scene, prompting the user to rotate his/her eyes in a controlled manner.
  • FIG. 1 shows an exemplary virtual reality system 100 in accordance with exemplary embodiments of the present disclosure.
  • the virtual reality system 100 can include a computing system 110 and a virtual reality display 150.
  • The virtual reality display 150 and the computing system 110 can be communicatively coupled to each other via wireless or wired communications such that the virtual reality display 150 and the computing system 110 can interact with each other to implement a virtual reality environment.
  • The computing system 110 and the virtual reality display can be integrally formed or housed within the same form factor (e.g., tablet, smartphone, etc.).
  • The virtual reality display 150 can communicate with the computing system 110 via a communications network such that the computing system 110 can be remotely located relative to the virtual reality display 150.
  • The virtual reality system 100 can be configured to provide a three-dimensional virtual reality gaming environment.
  • The computing system 110 can receive and/or store virtual reality media 105, virtual reality media software 112, controller software 114, and an intermediate software application 116.
  • The virtual reality media 105 can be a virtual reality game, movie, scenes, images, virtual worlds, and/or any other content to be rendered using the virtual reality display 150.
  • The virtual reality media software 112 can be developed for the general population (as opposed to for vision therapy) such that, under normal circumstances and conditions, the virtual reality media software 112 renders data of the virtual reality media 105 in a manner that reduces the vergence-accommodation conflict of a user to provide a comfortable viewing environment.
  • The controller software 114 can be executed by the computing system 110 to provide an interface between the virtual reality media software 112 and the virtual reality display and/or to control communication between the virtual reality media software 112 and the virtual reality display 150 to render images from the virtual reality media 105.
  • The computing system 110 can reserve and utilize a memory space 118 in memory 120 of the computing system 110 when processing the virtual reality media 105 and interfacing with the virtual reality display 150.
  • The computing system 110 can reserve and utilize the memory space 118 to store data 122 from the virtual reality media 105 (e.g., to be rendered on the virtual reality display), parameters 124 from the virtual reality media 105, and/or parameters 126 received from the virtual reality display 150.
  • The computing system 110 can generate and/or utilize the memory space 118 in the memory 120 or other storage device to store an in-memory version of the virtual reality data 122, or portions thereof, to be transmitted to, and rendered by, the virtual reality display 150.
  • The intermediate software application 116 can be executed by the computing system 110 to adjust/modify the data 122 from the virtual reality media software 112, the parameters 124, and/or the parameters 126 subsequent to processing by the controller software 114 being executed by the computing system 110 and prior to transmission of the data 122 to the virtual reality display 150.
  • The computing system 110 can access the memory space 118 and can modify the data 122 from the virtual reality media 105 and/or virtual reality media software 112 (e.g., to be rendered on the virtual reality display), the parameters 124 from the virtual reality media 105 and/or virtual reality media software 112, and/or the parameters 126 received from the virtual reality display 150 to change the manner in which the data 122 is rendered by the virtual reality display 150 as described herein. At least some of the parameters 124 and/or 126 can be updated on a frame-by-frame basis.
  • The virtual reality display 150 can communicate with the computing system 110 on a frame-by-frame basis to change or update, in the memory 120, the parameters 126 associated with the virtual reality display 150 and/or the user interacting with the virtual reality display 150.
  • The virtual reality display 150 can be configured to render the data 122 related to the virtual reality media 105 using the stereoscopic effect, with separate images for the right and left eyes, to create a visual perception of depth.
  • The virtual reality display 150 can include circuitry for interfacing with the computing system 110 to communicate/transmit information associated with the virtual reality display 150 and/or with the user interacting with the virtual reality display.
  • The virtual reality display 150 can transmit the parameters 126, including information related to a type of display, a resolution of the display, a refresh or frame rate of the display, and/or any other parameters that may be used by the virtual reality media software executing on the computing system 110 when processing the virtual reality media 105.
  • The virtual reality display 150 can transmit the parameters 126, including information related to an interpupillary distance (IPD) of the user as well as other information related to the user of the virtual reality display.
  • The virtual reality media software 112 executing on the computing system 110 can process the virtual reality media 105 and transmit the data 122 related to the virtual reality media 105 to the virtual reality display 150, which can render virtual reality scenes or images to be viewed by one or more users of the display.
  • The scenes or images can be rendered by the virtual reality display 150 on a frame-by-frame basis.
  • The intermediate software 116 can be executed by the computing system 110 in parallel with the controller software 114 to modify the manner in which the scenes or images of the virtual reality media 105 are rendered on the virtual reality display 150 prior to transmission of the data 122 from the computing system 110 to the virtual reality display 150.
  • The intermediate software 116 can modify/change the IPD received from the virtual reality display 150 on a frame-by-frame basis to increase the effects of the vergence-accommodation conflict of the user viewing the virtual reality display 150.
  • The intermediate software 116 can be programmed to modify the IPD with a static fixed value that is the same for each frame, to modify the IPD to change on a frame-by-frame basis, or to modify the IPD according to one or more patterns or specified profiles on an intra-frame basis or an inter-frame basis to change the inter-camera distance based on a setting in the intermediate software 116.
  • The patterns or profiles that can be used by the intermediate software 116 can include a step pattern, where the received/reported IPD is changed in steps of a specified increment; a ramp pattern, where the received/reported IPD is changed gradually in an increasing or decreasing manner; a sweep pattern, where the received/reported IPD is changed across a spectrum of values including values that are smaller than and larger than the received/reported IPD of the user; a step-ramp pattern that combines the step and ramp patterns; and/or a ramp-step pattern that combines the ramp and step patterns.
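The step, ramp, and sweep patterns named above can be sketched as simple per-frame value generators. This is an illustrative interpretation only; the disclosure does not specify these formulas, and the function names and numbers below are hypothetical:

```python
def step_profile(base_ipd: float, increment: float,
                 frames_per_step: int, n_frames: int) -> list:
    """Step pattern: hold the modified IPD constant for frames_per_step
    frames, then jump by a fixed increment."""
    return [base_ipd + increment * (f // frames_per_step) for f in range(n_frames)]

def ramp_profile(base_ipd: float, rate_per_frame: float, n_frames: int) -> list:
    """Ramp pattern: change the IPD gradually (negative rate for decreasing)."""
    return [base_ipd + rate_per_frame * f for f in range(n_frames)]

def sweep_profile(base_ipd: float, half_range: float, n_frames: int) -> list:
    """Sweep pattern: move linearly across a spectrum of values from below
    the reported IPD to above it."""
    lo, hi = base_ipd - half_range, base_ipd + half_range
    return [lo + (hi - lo) * f / (n_frames - 1) for f in range(n_frames)]
```

The step-ramp and ramp-step combinations would simply concatenate or compose these generators, feeding one modified IPD value to the renderer per frame.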
  • FIG. 2 shows an embodiment of the virtual reality display 150 in accordance with exemplary embodiments of the present disclosure.
  • The virtual reality display 150 can be a head mounted display that can be communicatively coupled to the computing system 110 via wireless or wired communications.
  • The virtual reality display 150 can include circuitry disposed within a housing 250.
  • The circuitry can include a right eye display 222, a left eye display 224, one or more right eye image capturing devices 226, one or more left eye image capturing devices 228, one or more right eye light emitting diodes 230, one or more left eye light emitting diodes 232, a right eye controller 234, a left eye controller 236, one or more display controllers 238, and one or more hardware interfaces 240.
  • The right and left eye displays 222 and 224 can be disposed within the housing 250 such that the right eye display 222 is positioned in front of the right eye of the user when the housing 250 is mounted on the user's head and the left eye display 224 is positioned in front of the left eye of the user when the housing 250 is mounted on the user's head.
  • The right eye display 222 and the left eye display 224 can be controlled by the one or more display controllers 238 to render images on the right and left eye displays 222 and 224 to induce a stereoscopic effect, which can be used to generate three-dimensional images, where objects in the images can be perceived by the user's vision system as being at different depths while maintaining a constant focal length between the user's right eye and the right eye display 222 and between the user's left eye and the left eye display 224.
  • The right eye display 222 and/or the left eye display 224 can be implemented as a light emitting diode display, an organic light emitting diode (OLED) display (e.g., a passive-matrix (PMOLED) display or an active-matrix (AMOLED) display), and/or any suitable display.
  • The one or more right eye image capturing devices 226 can be disposed in the housing 250 relative to the right eye display 222 so that the one or more right eye image capturing devices 226 can be positioned and oriented to capture images of the user's right eye as the user views the right eye display 222.
  • The one or more left eye image capturing devices 228 can be disposed in the housing 250 relative to the left eye display 224 so that the one or more left eye image capturing devices 228 can be positioned and oriented to capture images of the user's left eye as the user views the left eye display 224.
  • The one or more right and left eye image capturing devices 226 and 228 can be infrared (IR) cameras configured to have a particular sensitivity to IR light (e.g., to capture images of IR radiation).
  • The one or more right eye light emitting diodes 230 can be disposed in the housing 250 relative to the right eye display 222 so that the one or more right eye light emitting diodes 230 can be positioned and oriented to emit light towards the user's right eye as the user views the right eye display 222.
  • The one or more left eye light emitting diodes 232 can be disposed in the housing 250 relative to the left eye display 224 so that the one or more left eye light emitting diodes 232 can be positioned and oriented to emit light towards the user's left eye as the user views the left eye display 224.
  • The one or more right and left eye light emitting diodes 230 and 232 can be infrared (IR) light emitting diodes configured to emit IR light.
  • The light emitting diodes can project infrared light into the eye at about ten percent (10%) of the safety limit.
  • The right eye controller 234 can be operatively coupled to the one or more right eye image capturing devices 226 to control an operation of the one or more right eye image capturing devices 226 and/or to process the images of the right eye captured by the one or more right eye image capturing devices 226.
  • The left eye controller 236 can be operatively coupled to the one or more left eye image capturing devices 228 to control an operation of the one or more left eye image capturing devices 228 and/or to process the images of the left eye captured by the one or more left eye image capturing devices 228.
  • The right and left eye controllers 234 and 236 can be configured to control a shutter, aperture, refresh rate, discharge rate, and the like of the one or more right and left eye image capturing devices 226 and 228, respectively.
  • The right and left eye controllers 234 and 236 can monitor and/or track the movement of the user's right and left eyes as the user views the right and left eye displays 222 and 224, respectively, which can be utilized by exemplary embodiments to effect vision therapy of the user for binocular dysfunctions.
  • Exemplary embodiments of the present disclosure can alternatively be implemented with a single integrated controller to control and interface with the right and left eye image capturing devices 226 and 228.
  • The one or more display controllers 238 can be operatively coupled to the right and left eye displays 222 and 224 to control an operation of the right and left eye displays 222 and 224 in response to input (e.g., the data 122) received from the computing system 110.
  • The one or more display controllers 238 can be configured to render images of the same scene and/or objects on the right and left eye displays, where the images of the scene and/or objects are rendered at slightly different angles or points-of-view to facilitate the stereoscopic effect.
  • The one or more display controllers 238 can include graphical processing units.
  • The one or more hardware interfaces 240 can facilitate communication between the head mounted virtual reality display 150 and the computing system 110.
  • The virtual reality display 150 can be configured to transmit the parameters 126 to the computing system 110 and to receive the data 122 from the computing system 110 via the one or more hardware interfaces 240.
  • The one or more hardware interfaces 240 can be configured to receive the data 122 from the computing system 110 corresponding to images and can be configured to transmit the data 122 to the one or more display controllers 238, which can render the images on the right and left eye displays 222 and 224 to provide a virtual reality environment in three dimensions (e.g., as a result of the stereoscopic effect).
  • the housing 250 can include a mounting structure 252 and a display structure 254.
  • the mounting structure 252 allows a user to wear the head mounted virtual reality display 150 on his/her head and to position the display structure over his/her eyes to facilitate viewing of the right and left eye displays 222 and 224 by the right and left eyes of the user, respectively.
  • the mounting structure can be configured to generally mount the head mounted virtual reality display 150 on a user’ s head in a secure and stable manner. As such, the head mounted virtual reality display 150 generally remains generally fixed with respect to the user’s head such that when the user moves his/her head left, right, up, and down, the head mounted virtual reality display 150 generally moves with the user’s head.
  • the display structure 254 can be contoured to fit snugly against a user's face to cover the user's eyes and to generally prevent light from the environment surrounding the user from reaching the user's eyes.
  • the display structure 254 can include a right eye portal 256 and a left eye portal 258 formed therein.
  • a right eye lens 260a can be disposed over the right eye portal and a left eye lens 260b can be disposed over the left eye portal.
  • the right eye display 222, the one or more right eye image capturing devices 226, and the one or more right eye light emitting diodes 230 can be disposed within the display structure 254 behind the lens 260a covering the right eye portal 256 such that the lens 260a is disposed between the user's right eye and each of the right eye display 222, the one or more right eye image capturing devices 226, and the one or more right eye light emitting diodes 230.
  • the left eye display 224, the one or more left eye image capturing devices 228, and the one or more left eye light emitting diodes 232 can be disposed within the display structure 254 behind the lens 260b covering the left eye portal 258 such that the lens 260b is disposed between the user's left eye and each of the left eye display 224, the one or more left eye image capturing devices 228, and the one or more left eye light emitting diodes 232.
  • although the one or more right eye image capturing devices 226 and the one or more right eye light emitting diodes 230 are described as being disposed behind the lens 260a covering the right eye portal as an example embodiment, in exemplary embodiments of the present disclosure the one or more right eye image capturing devices 226 and/or the one or more right eye light emitting diodes 230 can be disposed in front of and/or around the lens 260a covering the right eye portal such that the lens 260a is not positioned between the user's right eye and the one or more right eye image capturing devices 226 and/or the one or more right eye light emitting diodes 230.
  • although the one or more left eye image capturing devices 228 and the one or more left eye light emitting diodes 232 are described as being disposed behind the lens 260b covering the left eye portal as an example embodiment, in exemplary embodiments of the present disclosure the one or more left eye image capturing devices 228 and/or the one or more left eye light emitting diodes 232 can be disposed in front of and/or around the lens 260b covering the left eye portal such that the lens 260b is not positioned between the user's left eye and the one or more left eye image capturing devices 228 and/or the one or more left eye light emitting diodes 232.
  • the computing system 110 transmits the data 122, and information for rendering the data 122, to the head mounted display 150.
  • the data 122 can include right and left images to be rendered by the right and left eye displays 222 and 224, and the information can include camera positions from which the images are rendered based on the parameters 124 and/or 126 (e.g., the parameter for the inter-pupil distance).
  • the user’s visual system can attempt to perceive the right and left images as a single image in three-dimensional space (e.g., using the stereoscopic effect).
  • exemplary embodiments of the present disclosure can be configured such that the head mounted display includes the computing system 110 and/or is configured to perform the functions and operations of the computing system 110 such that the head mounted virtual reality display 150 is a self-contained, stand-alone device or system.
  • although the virtual reality display 150 is shown as having two displays, exemplary embodiments of the virtual reality display can be formed by a single display that is divided into a right eye portion and a left eye portion.
  • a mobile device such as a smart phone, can be the virtual reality display.
  • FIG. 3 is a block diagram of an exemplary embodiment of the computing system 110.
  • the computing system 110 can be a gaming console configured to execute virtual reality games to be rendered through embodiments of the head mounted display 150 and/or can be any system configured to and/or programmed to process virtual reality media to be rendered on a display.
  • the computing system 110 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing exemplary embodiments, such as the virtual reality media software 112, the controller software 114, and/or the intermediate software 116.
  • the non-transitory computer-readable media may include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives), and the like.
  • memory 120 included in the computing system 110 may store computer-readable and computer-executable instructions or software for implementing exemplary embodiments.
  • the computing system 110 also includes processor 302 and associated core 304, and optionally, one or more additional processor(s) 302’ and associated core(s) 304’ (for example, in the case of computer systems having multiple processors/cores), for executing instances of computer-readable and computer-executable instructions or software stored in the memory 120 and other programs for controlling system hardware.
  • the processor(s) 302, 302’ can execute the virtual reality media software 112, the controller software 114, and/or the intermediate software 116 to process the virtual reality media 105 and render data from the virtual reality media on the virtual reality display 150.
  • Processor 302 and processor(s) 302’ may each be a single core processor or multiple core (304 and 304’) processor and may be central processing units, graphical processing units, and the like.
  • Virtualization may be employed in the computing system 110 so that infrastructure and resources in the computing device may be shared dynamically.
  • a virtual machine 314 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
  • Memory 120 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 120 may include other types of memory as well, or combinations thereof.
  • a user may interact with the computing system 110 through an embodiment of the virtual reality display 150, which can display one or more images of the virtual reality media 105 in accordance with exemplary embodiments.
  • the computing system 110 may include other I/O devices for receiving input from a user, for example, a keyboard or any suitable multi-point touch interface 308, a pointing device 310 (e.g., a mouse or joystick).
  • the computing system 110 may include other suitable conventional I/O peripherals.
  • the computing system 110 may also include one or more storage devices 324, such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that process the virtual reality media 105 including, for example, the controller software 114 and the intermediate application 116.
  • Exemplary storage device 324 may also store one or more databases for storing any suitable information required to implement exemplary embodiments.
  • exemplary storage device 324 can store one or more databases 328 for storing information, the data 122, the parameters 124, and/or the parameters 126, and the like.
  • the databases may be updated at any suitable time to add, delete, and/or update one or more items in the databases.
  • the computing system 110 can include a network interface 312 configured to interface via one or more network devices 322 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above.
  • the network interface 312 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing system 110 to any type of network capable of communication and performing the operations described herein.
  • the computing system 110 may be any computer system, such as a workstation, desktop computer, server, laptop, handheld computer, tablet computer (e.g., the iPad™ tablet computer), mobile computing or communication device (e.g., the iPhone™ communication device), or other form of computing or telecommunications device that is capable of communication and that has sufficient processor power and memory capacity to perform the operations described herein.
  • the computing system 110 may run any operating system 316, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, Microsoft® Xbox operating systems for Xbox gaming systems, Playstation operating systems for PlayStation gaming systems, Wii operating systems for Nintendo® Wii gaming systems, Switch operating system for Nintendo® Switch gaming systems, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the computing device and performing the operations described herein.
  • the operating system 316 may be run in native mode or emulated mode.
  • the operating system 316 may be run on one or more cloud machine instances.
  • FIG. 4 illustrates a user's eyes 402, 404 in a conventional virtual reality environment 400 in which virtual reality media developed for the general population (e.g., the virtual reality media 112) is rendered.
  • the virtual reality media processed by the controller software is designed to be rendered so that the distance between the two cameras 406, 408 in the virtual world, or inter-camera distance (ICD) 410, matches the distance between the user's eyes 402, 404, or interpupillary distance (IPD) 412, in the real world.
  • FIG. 5 illustrates the user's eyes 402, 404 in a virtual reality environment 500 in accordance with embodiments of the present disclosure in which the virtual reality media from FIG. 4 (e.g., the virtual reality media 112) is rendered on the display as modified by the intermediate application.
  • the virtual reality media processed by the controller software is modified prior to transmission to the virtual reality display so that the distance between the two cameras 406, 408 in the virtual world, or inter-camera distance (ICD) 510, is different than (e.g., greater than or less than) the distance between the user’s eyes 402, 404, or interpupillary distance (IPD) 512, in the real world.
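The ICD/IPD relationship in FIGS. 4 and 5 can be sketched in code. The fragment below is purely illustrative (the function name and millimeter units are assumptions, not from the patent): it places the two virtual cameras symmetrically about the head center at a chosen inter-camera distance, which conventionally equals the user's IPD but is deliberately widened or narrowed in the therapeutic case.

```python
def camera_positions(head_center_x, icd_mm):
    """Return the x-coordinates of the left and right virtual cameras.

    In conventional rendering (FIG. 4) icd_mm equals the user's IPD;
    therapeutic rendering (FIG. 5) sets it higher or lower to change
    the convergence demand.
    """
    half = icd_mm / 2.0
    return head_center_x - half, head_center_x + half

# Conventional rendering: ICD matches a 63 mm IPD.
left, right = camera_positions(0.0, 63.0)
assert right - left == 63.0

# Therapeutic rendering: ICD widened beyond the real IPD,
# increasing the disparity between the left and right images.
left, right = camera_positions(0.0, 75.0)
assert right - left == 75.0
```

The symmetric placement about the head center is itself an assumption; a renderer could equally apply the full offset to one camera.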
  • FIG. 6 is a flowchart 600 illustrating an information flow in a virtual reality environment for a single frame according to embodiments of the present disclosure.
  • the virtual reality media software 112 requests recent pose and IPD information from the controller software 114.
  • the controller software 114 looks for this information from the virtual reality display 150 in the memory space.
  • the controller software 114 informs the virtual reality media software 112 of any changes to pose or IPD.
  • the virtual reality media software 112 sends the final frame render to the virtual reality display.
  • the intermediate software 116 disrupts the flow of information from the controller software 114 to the virtual reality media software 112, changing the reported IPD values from the virtual reality display to modify the inter-camera distance (ICD) to be less than or greater than the reported IPD and to modify the manner in which the virtual reality data is rendered; thereby providing a therapeutic effect from virtual reality media.
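The interception described for FIG. 6 can be illustrated with a minimal sketch. The class names and method signatures below are hypothetical stand-ins, not an actual VR runtime API; they only show where an intermediate layer could substitute a modified value for the IPD that the media software requests each frame.

```python
class ControllerSoftware:
    """Stands in for the driver that reads pose/IPD from the headset."""
    def __init__(self, reported_ipd_mm):
        self.reported_ipd_mm = reported_ipd_mm

    def get_ipd(self):
        return self.reported_ipd_mm


class IntermediateSoftware:
    """Wraps the controller and substitutes a modified value."""
    def __init__(self, controller, icd_offset_mm):
        self.controller = controller
        self.icd_offset_mm = icd_offset_mm

    def get_ipd(self):
        # The media software believes this is the user's IPD, but it
        # is really the therapeutic inter-camera distance (ICD).
        return self.controller.get_ipd() + self.icd_offset_mm


controller = ControllerSoftware(reported_ipd_mm=63.0)
intercepted = IntermediateSoftware(controller, icd_offset_mm=8.0)

# The game queries the "IPD" each frame and receives the modified value.
assert intercepted.get_ipd() == 71.0
```

In the patent's architecture the modification happens in the shared memory space between the controller software and the media software rather than through an explicit wrapper object; the wrapper is used here only to make the substitution point visible.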
  • FIG. 7 is an exemplary graphical user interface (GUI) 700 for the intermediate software in accordance with embodiments of the present disclosure.
  • the GUI 700 can include input fields for controlling the operation of the intermediate software 116 to modify the manner in which the data from the virtual reality media is rendered on the virtual reality display.
  • the GUI 700 can allow a user to select various specified patterns or profiles (e.g., settings) by which the intermediate software controls the IPD value received from the virtual reality display and utilized by the computing system to render the data in a manner that increases and/or decreases the effects of the vergence-accommodation conflict of the user viewing the virtual reality display according to the patterns or profiles (e.g., by adjusting the inter-camera distance).
  • the inputs can allow a user to create patterns or profiles using inputs to specify, for example, step, ramp, sweep, step-ramp, and/or ramp-step patterns with different values, which once created, can be added to the available specified patterns or profiles for subsequent use.
  • an input can be provided that allows users to remove a specified pattern or profile.
  • Outputs can provide the user with information regarding the selected pattern or profile being implemented by the intermediate software, as well as when the pattern or profile was started and the time remaining for the pattern or profile to run.
  • the GUI 700 can allow a user to start, stop, and/or pause the implementation of the pattern or profile.
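The step, ramp, and sweep patterns named above could be generated as simple functions of elapsed time. The shapes below are assumptions for illustration; the patent does not specify the exact profile equations or units.

```python
import math

def step_profile(t, t_step, baseline_mm, target_mm):
    """Jump from the baseline to the target ICD offset at time t_step."""
    return target_mm if t >= t_step else baseline_mm

def ramp_profile(t, duration, baseline_mm, target_mm):
    """Linearly increase the ICD offset from baseline to target."""
    if t >= duration:
        return target_mm
    return baseline_mm + (target_mm - baseline_mm) * (t / duration)

def sweep_profile(t, period, baseline_mm, amplitude_mm):
    """Oscillate the ICD offset sinusoidally around the baseline."""
    return baseline_mm + amplitude_mm * math.sin(2 * math.pi * t / period)

# Before the step time the offset stays at baseline; afterwards it jumps.
assert step_profile(0.0, 5.0, 0.0, 10.0) == 0.0
assert step_profile(5.0, 5.0, 0.0, 10.0) == 10.0
# Halfway through a 10 s ramp the offset is halfway to the target.
assert ramp_profile(5.0, 10.0, 0.0, 10.0) == 5.0
```

The step-ramp and ramp-step patterns mentioned in the GUI description would then be compositions of these primitives, e.g. a step followed by a ramp from the stepped value.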
  • FIG. 8 is a flowchart illustrating an exemplary process 800 for modifying the manner in which VR media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user.
  • the process 800 can be performed for each frame of virtual reality data to be rendered on a virtual reality display, can be performed once for all of the frames of virtual reality data to be rendered on the virtual reality display, and/or can be performed on some frames, but not others, of the virtual reality data to be rendered on the virtual reality display.
  • a computing system 110 executes virtual reality media software 112 to process virtual reality media 105 to be rendered on a virtual reality display 150.
  • intermediate software 116 and controller software 114 are executed on the computing system.
  • the controller software 114 interfaces and communicates with the virtual reality display to configure the virtual reality environment.
  • the controller software 114 can receive parameters 126 from the virtual reality display (e.g., such as the IPD) that can be used by the virtual reality media software to generate (left and right images of) frames to be rendered on the virtual reality display.
  • the intermediate software application can modify the parameters (e.g., the reported and stored IPD) based on one or more specified programs or profiles (e.g., setting in the intermediate software).
  • the virtual reality data to be rendered on the virtual reality display is altered according to the modified parameters.
  • the intermediate software can modify the reported IPD received from the virtual reality display to change an inter-camera distance associated with the rendering of the data of the virtual reality media (e.g., set the inter-camera distance to be less than or greater than the reported IPD). This modification can increase the effects of the vergence-accommodation conflict of the user viewing the virtual reality display.
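The per-frame steps of process 800 (receive the parameter from the display, overwrite it in memory, render with the modified value) can be sketched as a loop. All names here are hypothetical; the sketch only returns the ICD value each frame would be rendered with.

```python
def run_frames(reported_ipd_mm, icd_offsets_mm):
    """Return the ICD value used to render each frame.

    reported_ipd_mm: the IPD the headset reports every frame.
    icd_offsets_mm: one offset per frame, e.g. produced by a
        step/ramp/sweep profile.
    """
    rendered_with = []
    for offset in icd_offsets_mm:
        # Step 1: the IPD reported by the display is stored in memory.
        stored_value = reported_ipd_mm
        # Step 2: the intermediate software overwrites the stored value.
        stored_value = stored_value + offset
        # Step 3: the frame is rendered using the modified value.
        rendered_with.append(stored_value)
    return rendered_with

# A short ramp of offsets applied across five frames.
assert run_frames(63.0, [0.0, 2.0, 4.0, 6.0, 8.0]) == [63.0, 65.0, 67.0, 69.0, 71.0]
```

Because the modification is re-applied every frame, the same loop also covers the cases described above where the process runs once for all frames (a constant offset list) or only on some frames (zero offsets elsewhere).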
  • Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods.
  • One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.


Abstract

Systems, methods, and non-transitory computer-readable media are provided for modifying the manner in which virtual reality media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user. Virtual reality media to be rendered on a virtual reality display is processed and one or more parameters are received from the virtual reality display. The one or more parameters specify the manner in which the virtual reality media is to be rendered on the virtual reality display. At least one of the one or more parameters is modified to alter the manner in which the virtual reality media is rendered on the virtual reality display after the one or more parameters are received and prior to transmission of the virtual reality media to the virtual reality display. The virtual reality media is rendered on the virtual reality display based on the at least one of the one or more parameters that have been modified to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.

Description

SYSTEMS AND METHODS FOR RENDERING GENERAL VIRTUAL REALITY ENVIRONMENTS FOR VISION THERAPY
RELATED APPLICATIONS
[0001] The present application claims the benefit of and priority to U.S. Provisional Application No. 62/778,766, filed on December 12, 2018, the disclosure of which is incorporated by reference herein in its entirety.
BACKGROUND
[0002] Binocular dysfunctions are present in between 4% to 8% of the general population and between 40% and 50% of the brain injury population. Some examples of binocular dysfunctions include, but are not limited to, nystagmus, strabismus, convergence insufficiency (CI), convergence excess, divergence insufficiency, and divergence excess. The visual symptoms of binocular dysfunctions can be exacerbated by, for example, extensive use of hand held electronic devices (e.g., smart phones, tablets, etc.) as well as by any near visual tasks (e.g., reading, computer work, etc.) - adversely impacting occupational and recreational activities of those suffering from binocular dysfunctions. When engaged in reading or other near work, asthenopic symptoms associated with binocular dysfunctions can include, but are not limited to, double/blurred vision, eyestrain, visual fatigue, and headaches, which all negatively impact activities of daily living. Vision therapy is one therapeutic intervention that is commonly used to treat binocular dysfunctions.
[0003] Patients with convergence insufficiency, convergence excess, divergence insufficiency, divergence excess, amblyopia, strabismus, and/or nystagmus have an impaired ability to reduce the disparity (the error or difference) between the left and right eye image and report seeing double or having blurred vision. Vision therapy aims to improve a patient's ability to reduce disparity by enhancing the brain's visual system to increase the speed, accuracy and precision of vergence eye movements, which are critical for everyday activities such as reading.
[0004] In recent years, virtual reality systems have been used to provide therapeutic treatment for vision. Conventionally, the use of these virtual reality systems for vision therapy has been limited. One reason may be that developers are typically forced to create their own library of games specifically designed for vision therapy, a time-consuming and capital-intensive process.
SUMMARY
[0005] Exemplary embodiments of the present disclosure relate to systems, methods, apparatus, and computer-readable media for rendering virtual reality environments. In exemplary embodiments, the systems, methods, apparatus, and computer-readable media disclosed herein can be used to render the virtual reality environments for vision therapy and/or for other purposes. Third party virtual reality media (e.g., games, videos, images, etc.) developed for the general public can be utilized by exemplary embodiments of the present disclosure. For example, exemplary embodiments of the present disclosure can dynamically modify and/or adjust the manner in which the virtual reality media is rendered on a virtual reality display to affect the visual system vergence-accommodation conflict of a user.
[0006] Embodiments of the present disclosure negate the need for a full-time game development team to develop customized virtual reality media specifically for vision therapy, and allow users and practitioners to leverage the full content library available on various virtual reality platforms developed for the general public. By interfacing with virtual reality displays, embodiments of the present disclosure change the game that the user is currently playing from virtual reality media developed for the general public into a therapeutic experience.
[0007] Normally, in conventional virtual reality platforms, a virtual reality game is designed so that the distance between the two cameras in the virtual world typically matches the distance between the user's eyes, or interpupillary distance (IPD), in the real world. This setting minimizes the effects of the visual system's vergence-accommodation conflict to improve visual comfort of the gamer. Embodiments of the present disclosure interface with the underlying software and memory space in the computing device that handles communication between the virtual reality display and the game software. Embodiments of the present disclosure can intervene with the configurations and/or communications between the computing system processing the virtual reality game software and the display to render the virtual game software with increased disparities (differences). For example, a virtual reality game designed for the general population can be modified on a frame-by-frame basis to render the virtual reality game with increased disparities between specific objects within the rendered virtual reality game that are seen by the left and right eyes in a controlled manner within the virtual environment. This increase in disparity causes an increase in the convergence demand required for the user to maintain a single and clear image of the objects being rendered on the display, prompting the user to rotate his/her eyes in a controlled manner. This stimulation of the extraocular muscles and engagement of the neural system, when done repeatedly with increased difficulty, can lead to an improvement in that person's convergence system. By creating software that interfaces with the virtual reality display itself and modifies the way scenes are rendered on the virtual reality display, embodiments of the present disclosure can elicit increased convergence demands from any game supported by that virtual reality display.
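The link between a wider effective inter-camera distance and a higher convergence demand follows from standard vergence geometry, sketched below. This calculation is background optics rather than an equation from the patent, and the 63 mm/75 mm values are illustrative assumptions.

```python
import math

def vergence_angle_deg(icd_mm, target_distance_mm):
    """Full convergence angle, in degrees, for a centered target.

    Each eye (or virtual camera) rotates through atan((icd/2)/d);
    the total convergence angle is twice that.
    """
    return 2.0 * math.degrees(math.atan((icd_mm / 2.0) / target_distance_mm))

# A target 40 cm away, viewed with a normal 63 mm separation versus a
# widened 75 mm inter-camera distance.
normal = vergence_angle_deg(63.0, 400.0)
therapeutic = vergence_angle_deg(75.0, 400.0)

# The widened ICD demands a larger rotation of the eyes to fuse the
# left and right images into a single percept.
assert therapeutic > normal
```

This is why setting the ICD greater than the real IPD, as the embodiments describe, raises the convergence demand; setting it smaller relaxes the demand instead.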
[0008] In accordance with embodiments of the present disclosure, a system, method, and non-transitory computer-readable medium are disclosed. Embodiments of the systems, methods, and non-transitory computer-readable media are provided for modifying the manner in which virtual reality media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user. Virtual reality media to be rendered on a virtual reality display is processed and one or more parameters are received from the virtual reality display. The one or more parameters specify the manner in which the virtual reality media is to be rendered on the virtual reality display. At least one of the one or more parameters is modified to alter the manner in which the virtual reality media is rendered on the virtual reality display after the one or more parameters are received and prior to transmission of the virtual reality media to the virtual reality display. The virtual reality media is rendered on the virtual reality display based on the at least one of the one or more parameters that have been modified to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
[0009] In accordance with embodiments of the present disclosure, a method for modifying a manner in which virtual reality media is rendered on a virtual reality display to a user is disclosed. The method includes receiving, by a processing device, one or more parameters from the virtual reality display, storing the one or more parameters in memory by the processing device, and modifying, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display. The value is modified based on a setting in the intermediate software.
The method also includes retrieving, by the processing device, the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display, and processing, by the processing device, the frame of virtual reality media to be rendered by the virtual reality display. A manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters. The method also includes rendering the frame of the virtual reality media on the virtual reality display as modified by the value of the at least one of the one or more parameters.
[0010] In accordance with embodiments of the present disclosure, a non-transitory computer-readable medium comprising instructions is disclosed. Execution of the instructions by a processing device causes the processing device to receive one or more parameters from the virtual reality display, store the one or more parameters in memory by the processing device, and modify, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display. The value is modified based on a setting in the intermediate software. Execution of the instructions by the processing device also causes the processing device to retrieve the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display, and process the frame of virtual reality media to be rendered by a virtual reality display. A manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters. Execution of the instructions by the processing device also causes the processing device to transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
[0011] In accordance with embodiments of the present disclosure, a system for modifying the manner in which virtual reality media is rendered on a virtual reality display is disclosed. The system includes a virtual reality display, a memory, and a processing device. The processing device is operatively coupled to the memory and the virtual reality display, and is configured to receive one or more parameters from the virtual reality display, store the one or more parameters in memory, and modify, in response to execution of intermediate software, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display. The value is modified based on a setting in the intermediate software. The processing device is also configured to retrieve the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display, and process the frame of virtual reality media to be rendered by the virtual reality display. A manner in which the frame of virtual reality media is rendered is determined at least in part by the value of the at least one of the one or more parameters.
[0012] In accordance with embodiments of the present disclosure, the processing device is also configured to transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
[0013] In accordance with embodiments of the present disclosure, the at least one of the one or more parameters received by the processing device from the virtual reality display is an interpupillary distance and the value is modified to be different than an actual interpupillary distance between eyes of the user. The frame of virtual reality media can be processed by the processing device by setting an inter-camera distance between left image and right image of the frame of virtual reality media to be less than or greater than the actual interpupillary distance. The virtual reality media can be rendered on the virtual reality display on a frame-by-frame basis, and the processing device can receive the interpupillary distance on the frame-by-frame basis from the virtual reality display. The processing device can execute the intermediate software to modify the value of the interpupillary distance on the frame-by-frame basis after the interpupillary distance is stored in memory and before each frame is processed by the processing device to render each frame on the virtual reality display. The processing device can execute the intermediate software to modify the value of the interpupillary distance to be identical for each frame, to be different for each frame, and/or to be based on a pattern or profile, or combination thereof, defined by the intermediate software, such as a step profile, a ramp profile, and/or a sweep profile.
[0014] Any combination and/or permutation of embodiments is envisioned. Other objects and features will become apparent from the following detailed description considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed as an illustration only and not as a definition of the limits of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] So that those having ordinary skill in the art will have a better understanding of how to make and use the disclosed systems and methods, reference is made to the accompanying figures wherein:
[0016] FIG. 1 shows an exemplary virtual reality system in accordance with exemplary embodiments of the present disclosure.
[0017] FIG. 2 is a block diagram of an exemplary embodiment of the virtual reality display shown in FIG. 1.
[0018] FIG. 3 is a block diagram of an exemplary embodiment of the computing system shown in FIG. 1.
[0019] FIG. 4 illustrates a user's eyes in an unmodified virtual reality environment in which virtual reality media developed for the general population is rendered.
[0020] FIG. 5 illustrates the user's eyes in a modified virtual reality environment in which virtual reality media developed for the general population is rendered in accordance with embodiments of the present disclosure.
[0021] FIG. 6 is a flowchart illustrating an information flow in a virtual reality environment for a single frame according to embodiments of the present disclosure.
[0022] FIG. 7 is an exemplary graphical user interface for the intermediate software in accordance with embodiments of the present disclosure.
[0023] FIG. 8 is a flowchart illustrating an exemplary process for modifying the manner in which VR media developed for the general public is rendered on a virtual reality display in accordance with embodiments of the present disclosure.
DETAILED DESCRIPTION
[0024] Exemplary embodiments of the present disclosure can interface with the underlying controller software executing on a processing device that handles communication between the processing device and the virtual reality (VR) display to render the VR media on the VR display. By dynamically modifying and/or adjusting parameters the virtual reality media software uses to calculate the user’s IPD during processing of the VR media by the computing system, the manner in which the VR media is rendered by exemplary embodiments is modified to increase the disparities between the images presented to the user within the virtual environment. This increase in disparity causes an increase in the convergence demand required for the user to maintain a single and clear image of the scene, prompting the user to rotate his/her eyes in a controlled manner. This stimulation of the extraocular muscles and engagement of the neural system, when done repeatedly with increased difficulty, can lead to an improvement in that person’s convergence system. By executing intermediate software that interfaces with the VR display itself, exemplary embodiments of the present disclosure can elicit increased convergence demands from any VR media supported by that VR display including VR media developed for the general public. [0025] FIG. 1 shows an exemplary virtual reality system 100 in accordance with exemplary embodiments of the present disclosure. The virtual reality system 100 can include a computing system 110 and a virtual reality display 150. The virtual reality display 150 and the computing system 110 can be communicatively coupled to each other via wireless or wired communications such that the virtual reality display 150 and the computing system 110 can interact with each other to implement a virtual reality environment. 
In some embodiments, the computing system 110 and the virtual reality display 150 can be integrally formed or housed within the same form factor (e.g., tablet, smart phone, etc.). In some embodiments, the virtual reality display 150 can communicate with the computing system 110 via a communications network such that the computing system 110 can be remotely located relative to the virtual reality display 150. As one non-limiting embodiment, the virtual reality system 100 can be configured to provide a three-dimensional virtual reality gaming environment.
[0026] The computing system 110 can receive and/or store virtual reality media 105, virtual reality media software 112, controller software 114, and an intermediate software application 116. The virtual reality media 105 can be a virtual reality game, movie, scenes, images, virtual worlds, and/or any other content to be rendered using the virtual reality display 150. The virtual reality media 105 can be developed for the general population (as opposed to for vision therapy) such that, under normal circumstances and conditions, the virtual reality media software 112 renders data of the virtual reality media 105 in a manner that reduces vergence-accommodation conflict and provides a comfortable viewing environment for the user.
[0027] The controller software 114 can be executed by the computing system 110 to provide an interface between the virtual reality media software 112 and the virtual reality display 150 and/or to control communication between the virtual reality media software 112 and the virtual reality display 150 to render images from the virtual reality media 105. When the controller software 114 is executed, the computing system 110 can reserve and utilize a memory space 118 in memory 120 of the computing system 110 when processing the virtual reality media 105 and interfacing with the virtual reality display 150. The computing system 110 can reserve and utilize the memory space 118 to store data 122 from the virtual reality media 105 (e.g., to be rendered on the virtual reality display), parameters 124 from the virtual reality media 105, and/or parameters 126 received from the virtual reality display 150. When the virtual reality media software 112 is executed by the computing system 110 to process the virtual reality media 105, the computing system 110 can generate and/or utilize the memory space 118 in the memory 120 or other storage device to store an in-memory version of the virtual reality data 122, or portions thereof, to be transmitted to, and rendered by, the virtual reality display 150.
[0028] The intermediate software application 116 can be executed by the computing system 110 to adjust/modify the data 122 from the virtual reality media software 112, the parameters 124, and/or the parameters 126 subsequent to processing by the controller software 114 being executed by the computing system 110 and prior to transmission of the data 122 to the virtual reality display 150. In response to execution of the intermediate software application 116, the computing system 110 can access the memory space 118 and can modify the data 122 from the virtual reality media 105 and/or virtual reality media software 112 (e.g., to be rendered on the virtual reality display), the parameters 124 from the virtual reality media 105 and/or virtual reality media software 112, and/or the parameters 126 received from the virtual reality display 150 to change the manner in which the data 122 is rendered by the virtual reality display 150 as described herein. At least some of the parameters 124 and/or 126 can be updated on a frame-by-frame basis. For example, the virtual reality display 150 can communicate with the computing system 110 on a frame-by-frame basis to change or update the parameters 126 associated with the virtual reality display 150 and/or the user interacting with the virtual reality display 150 in the memory 120.
[0029] The virtual reality display 150 can be configured to render the data 122 related to the virtual reality media 105 using the stereoscopic effect with separate images for the right and left eyes to create a visual perception of depth. The virtual reality display 150 can include circuitry for interfacing with the computing system 110 to communicate/transmit information associated with the virtual reality display 150 and/or with the user interacting with the virtual reality display. As one example, the virtual reality display 150 can transmit the parameters 126 including information related to a type of display, a resolution of the display, a refresh or frame rate of the display, and/or any other parameters that may be used by the virtual reality media software 112 executing on the computing system 110 when processing the virtual reality media 105. As another example, the virtual reality display 150 can transmit the parameters 126 including information related to an inter-pupil distance (IPD) of the user as well as other information related to the user of the virtual reality display. Based on the information received and processed by the controller software 114, the virtual reality media software 112 executing on the computing system 110 can process the virtual reality media 105 and transmit the data 122 related to the virtual reality media 105 to the virtual reality display 150, which can render virtual reality scenes or images to be viewed by one or more users of the display. The scenes or images can be rendered by the virtual reality display 150 on a frame-by-frame basis. The intermediate software 116 can be executed by the computing system 110 in parallel to the controller software 114 to modify the manner in which the scenes or images of the virtual reality media 105 are rendered on the virtual reality display 150 prior to transmission of the data 122 from the computing system 110 to the virtual reality display 150.
For example, the intermediate software 116 can modify/change the IPD received from the virtual reality display 150 on a frame-by-frame basis to increase the effects of the vergence-accommodation conflict of the user viewing the virtual reality display 150. The intermediate software 116 can be programmed to modify the IPD with a static fixed value that is the same for each frame, to modify the IPD to change on a frame-by-frame basis, and/or to modify the IPD according to one or more patterns or specified profiles on an intra-frame basis or an inter-frame basis to change the inter-camera distance based on a setting in the intermediate software 116. The patterns or profiles that can be used by the intermediate software 116 can include a step pattern where the received/reported IPD is changed in steps of a specified increment, a ramp pattern where the received/reported IPD is changed gradually in an increasing or decreasing manner, a sweep pattern where the received/reported IPD is changed across a spectrum of values including values that are smaller than and larger than the received/reported IPD of the user, a step-ramp pattern that combines the step and ramp patterns, and/or a ramp-step pattern that combines the ramp and step patterns.
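The step, ramp, and sweep patterns described above can be sketched as a per-frame function that maps the reported IPD to a modified value. This is an illustrative sketch only: the function name, parameter names, and all numeric defaults (increments, frame counts, sweep range) are assumptions, not values specified by the disclosure.

```python
import math

def modified_ipd(reported_ipd_mm, frame_index, profile="static",
                 step_mm=2.0, frames_per_step=100,
                 ramp_mm_per_frame=0.01,
                 sweep_range_mm=10.0, sweep_period_frames=600):
    """Return a modified IPD value for a single frame.

    The modified value stands in for the IPD reported by the display,
    so the media software computes an inter-camera distance different
    from the user's actual IPD. All numeric defaults are hypothetical.
    """
    if profile == "step":
        # Change the reported IPD in discrete increments of a specified size.
        return reported_ipd_mm + step_mm * (frame_index // frames_per_step)
    if profile == "ramp":
        # Change the reported IPD gradually, frame by frame.
        return reported_ipd_mm + ramp_mm_per_frame * frame_index
    if profile == "sweep":
        # Sweep across values both smaller and larger than the reported IPD.
        phase = 2.0 * math.pi * frame_index / sweep_period_frames
        return reported_ipd_mm + (sweep_range_mm / 2.0) * math.sin(phase)
    return reported_ipd_mm  # static: leave the reported value unchanged
```

A step-ramp or ramp-step profile would chain two of these branches in sequence over successive frame ranges.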
[0030] FIG. 2 shows an embodiment of the virtual reality display 150 in accordance with exemplary embodiments of the present disclosure. The virtual reality display 150 can be a head mounted display that can be communicatively coupled to the computing system 110 via wireless or wired communications. The virtual reality display 150 can include circuitry disposed within a housing 250. The circuitry can include a right eye display 222, a left eye display 224, one or more right eye image capturing devices 226, one or more left eye image capturing devices 228, one or more right eye light emitting diodes 230, one or more left eye light emitting diodes 232, a right eye controller 234, a left eye controller 236, one or more display controllers 238, and one or more hardware interfaces 240.
[0031] The right and left eye displays 222 and 224 can be disposed within the housing 250 such that the right eye display 222 is positioned in front of the right eye of the user when the housing 250 is mounted on the user's head and the left eye display 224 is positioned in front of the left eye of the user when the housing 250 is mounted on the user's head. In this configuration, the right eye display 222 and the left eye display 224 can be controlled by the one or more display controllers 238 to render images on the right and left eye displays 222 and 224 to induce a stereoscopic effect, which can be used to generate three-dimensional images, where objects in the images can be perceived by the user's vision system as being at different depths while maintaining a constant focal length between the user's right eye and the right eye display 222 and between the user's left eye and the left eye display 224. In exemplary embodiments, the right eye display 222 and/or the left eye display 224 can be implemented as a light emitting diode display, an organic light emitting diode (OLED) display (e.g., a passive-matrix (PMOLED) display, an active-matrix (AMOLED) display), and/or any suitable display.
[0032] The one or more right eye image capturing devices 226 can be disposed in the housing 250 relative to the right eye display 222 so that the one or more right eye image capturing devices 226 can be positioned and oriented to capture images of the user's right eye as the user views the right eye display 222. Likewise, the one or more left eye image capturing devices 228 can be disposed in the housing 250 relative to the left eye display 224 so that the one or more left eye image capturing devices 228 can be positioned and oriented to capture images of the user's left eye as the user views the left eye display 224. In exemplary embodiments, the one or more right and left eye image capturing devices 226 and 228 can be infrared (IR) cameras configured to have a particular sensitivity to IR light (e.g., to capture images of IR radiation).
[0033] The one or more right eye light emitting diodes 230 can be disposed in the housing 250 relative to the right eye display 222 so that the one or more right eye light emitting diodes 230 can be positioned and oriented to emit light towards the user's right eye as the user views the right eye display 222. Likewise, the one or more left eye light emitting diodes 232 can be disposed in the housing 250 relative to the left eye display 224 so that the one or more left eye light emitting diodes 232 can be positioned and oriented to emit light towards the user's left eye as the user views the left eye display 224. In exemplary embodiments, the one or more right and left eye light emitting diodes 230 and 232 can be infrared (IR) light emitting diodes configured to emit IR light. In some embodiments, the light emitting diodes project infrared light into the eye at about ten percent (10%) of the safety limit.
[0034] The right eye controller 234 can be operatively coupled to the one or more right eye image capturing devices 226 to control an operation of the one or more right eye image capturing devices 226 and/or to process the images of the right eye captured by the one or more right eye image capturing devices 226. Likewise, the left eye controller 236 can be operatively coupled to the one or more left eye image capturing devices 228 to control an operation of the one or more left eye image capturing devices 228 and/or to process the images of the left eye captured by the one or more left eye image capturing devices 228. As one non-limiting example, the right and left eye controllers 234 and 236 can be configured to control a shutter, aperture, refresh rate, discharge rate, and the like of the one or more right and left eye image capturing devices 226 and 228, respectively. As another non-limiting example, the right and left eye controllers 234 and 236 can monitor and/or track the movement of the user's right and left eyes as the user views the right and left eye displays 222 and 224, respectively, which can be utilized by exemplary embodiments to effect vision therapy of the user for binocular dysfunctions. While separate controllers in the form of the right and left eye controllers 234 and 236 are utilized to control and interface with the right and left eye image capturing devices 226 and 228, exemplary embodiments of the present disclosure can be implemented with a single integrated controller to control and interface with the right and left eye image capturing devices 226 and 228.
[0035] The one or more display controllers 238 can be operatively coupled to the right and left eye displays 222 and 224 to control an operation of the right and left eye displays 222 and 224 in response to input (e.g., the data 122) received from the computing system 110. In exemplary embodiments, the one or more display controllers 238 can be configured to render images of the same scene and/or objects on the right and left eye displays, where the images of the scene and/or objects are rendered at slightly different angles or points-of-view to facilitate the stereoscopic effect. In exemplary embodiments, the one or more display controllers 238 can include graphical processing units.
[0036] The one or more hardware interfaces 240 can facilitate communication between the head mounted virtual reality display 150 and the computing system 110. The virtual reality display 150 can be configured to transmit the parameters 126 to the computing system 110 and to receive data 122 from the computing system 110 via the one or more hardware interfaces 240. As one example, the one or more hardware interfaces 240 can be configured to receive the data 122 from the computing system 110 corresponding to images and can be configured to transmit the data 122 to the one or more display controllers 238, which can render the images on the right and left eye displays 222 and 224 to provide a virtual reality environment in three-dimensions (e.g., as a result of the stereoscopic effect).
[0037] The housing 250 can include a mounting structure 252 and a display structure 254. The mounting structure 252 allows a user to wear the head mounted virtual reality display 150 on his/her head and to position the display structure over his/her eyes to facilitate viewing of the right and left eye displays 222 and 224 by the right and left eyes of the user, respectively. The mounting structure can be configured to mount the head mounted virtual reality display 150 on a user's head in a secure and stable manner. As such, the head mounted virtual reality display 150 generally remains fixed with respect to the user's head such that when the user moves his/her head left, right, up, and down, the head mounted virtual reality display 150 generally moves with the user's head.
[0038] The display structure 254 can be contoured to fit snugly against a user's face to cover the user's eyes and to generally prevent light from the environment surrounding the user from reaching the user's eyes. The display structure 254 can include a right eye portal 256 and a left eye portal 258 formed therein. A right eye lens 260a can be disposed over the right eye portal and a left eye lens 260b can be disposed over the left eye portal. The right eye display 222, the one or more right eye image capturing devices 226, and the one or more right eye light emitting diodes 230 can be disposed within the display structure 254 behind the lens 260a covering the right eye portal 256 such that the lens 260a is disposed between the user's right eye and each of the right eye display 222, the one or more right eye image capturing devices 226, and the one or more right eye light emitting diodes 230. The left eye display 224, the one or more left eye image capturing devices 228, and the one or more left eye light emitting diodes 232 can be disposed within the display structure 254 behind the lens 260b covering the left eye portal 258 such that the lens 260b is disposed between the user's left eye and each of the left eye display 224, the one or more left eye image capturing devices 228, and the one or more left eye light emitting diodes 232.
[0039] While the one or more right eye image capturing devices 226 and the one or more right eye light emitting diodes 230 are described as being disposed behind the lens 260a covering the right eye portal as an example embodiment, in exemplary embodiments of the present disclosure the one or more right eye image capturing devices 226 and/or the one or more right eye light emitting diodes 230 can be disposed in front of and/or around the lens 260a covering the right eye portal such that the lens 260a is not positioned between the user's right eye and the one or more right eye image capturing devices 226 and/or the one or more right eye light emitting diodes 230. Likewise, while the one or more left eye image capturing devices 228 and the one or more left eye light emitting diodes 232 are described as being disposed behind the lens 260b covering the left eye portal as an example embodiment, in exemplary embodiments of the present disclosure the one or more left eye image capturing devices 228 and/or the one or more left eye light emitting diodes 232 can be disposed in front of and/or around the lens 260b covering the left eye portal such that the lens 260b is not positioned between the user's left eye and the one or more left eye image capturing devices 228 and/or the one or more left eye light emitting diodes 232.
[0040] To facilitate rendering the data 122 from the virtual reality media 105, the computing system 110 transmits the data 122 and information for rendering the data 122 to the head mounted display 150. The data 122 can include right and left images to be rendered by the right and left eye displays 222 and 224, and the information can include the camera positions from which the images are rendered based on the parameters 124 and/or 126 (e.g., the parameter for the inter-pupil distance). In response to rendering the right and left images, the user's visual system can attempt to perceive the right and left images as a single image in three-dimensional space (e.g., using the stereoscopic effect).
[0041] While an example embodiment has been illustrated including a head mounted virtual reality display 150 and a computing system 110, exemplary embodiments of the present disclosure can be configured such that the head mounted display includes the computing system 110 and/or is configured to perform the functions and operations of the computing system 110 such that the head mounted virtual reality display 150 is a self-contained, stand-alone device or system. While an example embodiment of the virtual reality display 150 is shown as having two displays, exemplary embodiments of the virtual reality display can be formed of a single display that is divided into a right eye portion and a left eye portion. In some embodiments, a mobile device, such as a smart phone, can be the virtual reality display.
[0042] FIG. 3 is a block diagram of an exemplary embodiment of the computing system 110. In some embodiments, the computing system 110 can be a gaming console configured to execute virtual reality games to be rendered through embodiments of the head mounted display 150 and/or can be any system configured to and/or programmed to process virtual reality media to be rendered on a display. The computing system 110 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing exemplary embodiments, such as the virtual reality media software 112, the controller software 114, and/or the intermediate software 116. The non-transitory computer-readable media may include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives), and the like. For example, memory 120 included in the computing system 110 may store computer-readable and computer-executable instructions or software for implementing exemplary embodiments. The computing system 110 also includes processor 302 and associated core 304, and optionally, one or more additional processor(s) 302' and associated core(s) 304' (for example, in the case of computer systems having multiple processors/cores), for executing instances of computer-readable and computer-executable instructions or software stored in the memory 120 and other programs for controlling system hardware. For example, the processor(s) 302, 302' can execute the virtual reality media software 112, the controller software 114, and/or the intermediate software 116 to process the virtual reality media 105 and render data from the virtual reality media on the virtual reality display 150.
Processor 302 and processor(s) 302’ may each be a single core processor or multiple core (304 and 304’) processor and may be central processing units, graphical processing units, and the like. [0043] Virtualization may be employed in the computing system 110 so that infrastructure and resources in the computing device may be shared dynamically. A virtual machine 314 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
[0044] Memory 120 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 120 may include other types of memory as well, or combinations thereof.
[0045] A user may interact with the computing system 110 through an embodiment of the virtual reality display 150, which can display one or more images of the virtual reality media 105 in accordance with exemplary embodiments. The computing system 110 may include other I/O devices for receiving input from a user, for example, a keyboard or any suitable multi-point touch interface 308, or a pointing device 310 (e.g., a mouse or joystick). The computing system 110 may include other suitable conventional I/O peripherals.
[0046] The computing system 110 may also include one or more storage devices 324, such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that process the virtual reality media 105 including, for example, the controller software 114 and the intermediate application 116. Exemplary storage device 324 may also store one or more databases for storing any suitable information required to implement exemplary embodiments. For example, exemplary storage device 324 can store one or more databases 328 for storing information, the data 122, the parameters 124, and/or the parameters 126, and the like. The databases may be updated at any suitable time to add, delete, and/or update one or more items in the databases.
[0047] The computing system 110 can include a network interface 312 configured to interface via one or more network devices 322 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above. The network interface 312 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing system 110 to any type of network capable of communication and performing the operations described herein. Moreover, the computing system 110 may be any computer system, such as a workstation, desktop computer, server, laptop, handheld computer, tablet computer (e.g., the iPad™ tablet computer), mobile computing or communication device (e.g., the iPhone™ communication device), or other form of computing or telecommunications device that is capable of communication and that has sufficient processor power and memory capacity to perform the operations described herein.
[0048] The computing system 110 may run any operating system 316, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, Microsoft® Xbox operating systems for Xbox gaming systems, Playstation operating systems for PlayStation gaming systems, Wii operating systems for Nintendo® Wii gaming systems, Switch operating system for Nintendo® Switch gaming systems, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the computing device and performing the operations described herein. In exemplary embodiments, the operating system 316 may be run in native mode or emulated mode. In an exemplary embodiment, the operating system 316 may be run on one or more cloud machine instances.
[0049] FIG. 4 illustrates a user's eyes 402, 404 in a conventional virtual reality environment 400 in which virtual reality media developed for the general population (e.g., the virtual reality media 105) is rendered. Under normal conditions, the virtual reality media processed by the controller software is rendered so that the distance between the two cameras 406, 408 in the virtual world, or inter-camera distance (ICD) 410, matches the distance between the user's eyes 402, 404, or interpupillary distance (IPD) 412, in the real world. This configuration minimizes the effects of the vergence-accommodation conflict of the user viewing the virtual reality display.
[0050] FIG. 5 illustrates the user's eyes 402, 404 in a virtual reality environment 500 in accordance with embodiments of the present disclosure in which the manner in which the virtual reality media from FIG. 4 (e.g., the virtual reality media 105) is rendered on the display is modified by the intermediate application. Under modified conditions, the virtual reality media processed by the controller software is modified prior to transmission to the virtual reality display so that the distance between the two cameras 406, 408 in the virtual world, or inter-camera distance (ICD) 510, is different than (e.g., greater than or less than) the distance between the user's eyes 402, 404, or interpupillary distance (IPD) 512, in the real world. This configuration increases the effects of the vergence-accommodation conflict of the user viewing the virtual reality display.
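The geometric effect of rendering with an ICD different from the IPD can be illustrated with a simple symmetric-convergence calculation. The formula below is standard stereoscopic geometry, and all numeric values are illustrative assumptions, not values from the disclosure:

```python
import math

def vergence_angle_deg(separation_mm, distance_mm):
    """Convergence angle for a point straight ahead at distance_mm,
    viewed by two eyes (or rendered from two cameras) separated by
    separation_mm (symmetric convergence)."""
    return math.degrees(2.0 * math.atan((separation_mm / 2.0) / distance_mm))

ipd_mm = 63.0      # hypothetical actual interpupillary distance
icd_mm = 75.0      # hypothetical modified inter-camera distance (ICD > IPD)
target_mm = 500.0  # hypothetical distance to a virtual object

matched = vergence_angle_deg(ipd_mm, target_mm)   # ICD == IPD, as in FIG. 4
modified = vergence_angle_deg(icd_mm, target_mm)  # ICD > IPD, as in FIG. 5

# Rendering with ICD > IPD increases image disparity, and hence the
# convergence demand on the user, while the focal distance to the
# displays stays constant -- the vergence-accommodation conflict grows.
assert modified > matched
```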
[0051] FIG. 6 is a flowchart 600 illustrating an information flow in a virtual reality environment for a single frame according to embodiments of the present disclosure. The virtual reality media software 112 requests recent pose and IPD information from the controller software 114. The controller software 114 looks for this information from the virtual reality display 150 in the memory space. Upon reception of the requested information, the controller software 114 informs the virtual reality media software 112 of any changes to pose or IPD. The virtual reality media software 112 sends the final frame render to the virtual reality display. The intermediate software 116 disrupts the flow of information from the controller software to the virtual reality media software, changing the reported IPD values from the virtual reality display to modify the inter-camera distance (ICD) to be less than or greater than the reported IPD and to modify the manner in which the virtual reality data is rendered, thereby providing a therapeutic effect from virtual reality media.
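The interception in the flow above can be sketched as a thin layer between the media software and the controller software. The class and method names below are hypothetical stand-ins for the actual runtime components, not a real VR API:

```python
class ControllerStub:
    """Hypothetical stand-in for the controller software 114, which in
    practice reads the pose and IPD reported by the display from the
    shared memory space."""
    def get_pose_and_ipd(self):
        return {"pose": (0.0, 0.0, 0.0), "ipd_mm": 63.0}

class IntermediateLayer:
    """Hypothetical stand-in for the intermediate software 116: it sits
    between the media software and the controller, rewriting the reported
    IPD before the media software uses it to set the inter-camera distance."""
    def __init__(self, controller, modify_ipd):
        self.controller = controller
        self.modify_ipd = modify_ipd  # maps reported IPD -> modified IPD

    def get_pose_and_ipd(self):
        info = self.controller.get_pose_and_ipd()
        info["ipd_mm"] = self.modify_ipd(info["ipd_mm"])
        return info

# Per-frame flow: the media software's query is answered with a modified
# IPD, so the frame is rendered with an ICD larger than the actual IPD.
layer = IntermediateLayer(ControllerStub(), lambda ipd: ipd + 10.0)
frame_info = layer.get_pose_and_ipd()
assert frame_info["ipd_mm"] == 73.0
```

Passing a different `modify_ipd` function per frame would implement the step, ramp, or sweep profiles described earlier.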
[0052] FIG. 7 is an exemplary graphical user interface (GUI) 700 for the intermediate software in accordance with embodiments of the present disclosure. The GUI 700 can include input fields for controlling the operation of the intermediate software 116 to modify the manner in which the data from the virtual reality media is rendered on the virtual reality display. For example, the GUI 700 can allow a user to select various specified patterns or profiles (e.g., settings) by which the intermediate software controls the IPD value received from the virtual reality display and utilized by the computing system to render the data in a manner that increases and/or decreases the effects of the vergence-accommodation conflict of the user viewing the virtual reality display according to the patterns or profiles (e.g., by adjusting the inter-camera distance). The inputs can allow a user to create patterns or profiles by specifying, for example, step, ramp, sweep, step-ramp, and/or ramp-step patterns with different values, which, once created, can be added to the available specified patterns or profiles for subsequent use. Likewise, an input can be provided that allows users to remove a specified pattern or profile. Outputs can provide the user with information regarding the selected pattern or profile being implemented by the intermediate software as well as when the pattern or profile was started and a time remaining for the pattern or profile to run. The GUI 700 can allow a user to start, stop, and/or pause the implementation of the pattern or profile.
[0053] FIG. 8 is a flowchart illustrating an exemplary process 800 for modifying the manner in which VR media developed for the general public is rendered on a virtual reality display to provide vision therapy to a user. The process 800 can be performed for each frame of virtual reality data to be rendered on a virtual reality display, can be performed once for all of the frames of virtual reality data to be rendered on the virtual reality display, and/or can be performed on some frames, but not others, of the virtual reality data to be rendered on the virtual reality display. In operation 802, a computing system 110 executes virtual reality media software 112 to process virtual reality media 105 to be rendered on a virtual reality display 150. At operation 804, intermediate software 116 and controller software 114 are executed on the computing system. In operation 806, the controller software 114 interfaces and communicates with the virtual reality display to configure the virtual reality environment. For example, the controller software 114 can receive parameters 126 from the virtual reality display (e.g., such as the IPD) that can be used by the virtual reality media software to generate (left and right images of) frames to be rendered on the virtual reality display. In operation 808, after the controller software receives, processes, and stores this information in a specified memory space/location in memory accessed by the computing system 110, the intermediate software application can modify the parameters (e.g., the reported and stored IPD) based on one or more specified programs or profiles (e.g., settings in the intermediate software). In operation 810, in response to the modified parameters (e.g., the modified value of the reported IPD), the virtual reality data to be rendered on the virtual reality display is altered according to the modified parameters.
For example, the intermediate software can modify the reported IPD received from the virtual reality display to change an inter-camera distance associated with the rendering of the data of the virtual reality media (e.g., set the inter-camera distance to be less than or greater than the reported IPD). This modification can increase the effects of the vergence-accommodation conflict of the user viewing the virtual reality display.
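Process 800 can be sketched end to end as a per-frame loop that receives the reported IPD, applies the selected profile as an offset, and renders with the resulting inter-camera distance. This is a hedged sketch under assumed names (`run_session`, `frame_rate_hz`); it is not the actual implementation.

```python
# Hypothetical per-frame loop for process 800: operation 808 modifies
# the stored parameter, operation 810 renders with the modified value.
def run_session(frames, reported_ipd_mm, profile, frame_rate_hz=90.0):
    rendered = []
    for i, frame in enumerate(frames):
        t = i / frame_rate_hz               # elapsed session time in seconds
        icd = reported_ipd_mm + profile(t)  # operation 808: modify the stored IPD
        rendered.append((frame, icd))       # operation 810: render with the new ICD
    return rendered


# Example: a ramp that widens the ICD by 0.1 mm per second, capped at 4 mm.
ramp = lambda t: min(0.1 * t, 4.0)
out = run_session(range(3), 63.0, ramp)
```

Running the profile per frame rather than once per session is what allows the same mechanism to cover all three cases described above: every frame, one setting for all frames (a constant profile), or only some frames (a profile that returns zero elsewhere).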
[0054] In describing exemplary embodiments, specific terminology is used for the sake of clarity. For purposes of description, each specific term is intended to at least include all technical and functional equivalents that operate in a similar manner to accomplish a similar purpose. Additionally, in some instances where a particular exemplary embodiment includes a plurality of system elements, device components or method steps, those elements, components or steps may be replaced with a single element, component or step. Likewise, a single element, component or step may be replaced with a plurality of elements, components or steps that serve the same purpose. Moreover, while exemplary embodiments have been shown and described with references to particular embodiments thereof, those of ordinary skill in the art will understand that various substitutions and alterations in form and detail may be made therein without departing from the scope of the invention. Further still, other aspects, functions and advantages are also within the scope of the invention.
[0055] Exemplary flowcharts are provided herein for illustrative purposes and are non-limiting examples of methods. One of ordinary skill in the art will recognize that exemplary methods may include more or fewer steps than those illustrated in the exemplary flowcharts, and that the steps in the exemplary flowcharts may be performed in a different order than the order shown in the illustrative flowcharts.

Claims:
1. A method for modifying a manner in which virtual reality media is rendered on a virtual reality display to a user, the method comprising:
receiving, by a processing device, one or more parameters from the virtual reality display;
storing the one or more parameters in memory by the processing device;
modifying, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display, the value being modified based on a setting in the intermediate software;
retrieving, by the processing device, the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display;
processing, by the processing device, the frame of virtual reality media to be rendered by the virtual reality display, a manner in which the frame of virtual reality media is rendered being determined at least in part by the value of the at least one of the one or more parameters; and
rendering the frame of the virtual reality media on the virtual reality display as modified by the value of the at least one of the one or more parameters.
2. The method of claim 1, wherein the at least one of the one or more parameters received by the processing device from the virtual reality display is an interpupillary distance and the value is modified to be different than an actual interpupillary distance between eyes of the user.
3. The method of claim 2, wherein processing the frame of virtual reality media by the processing device includes setting an inter-camera distance between left image and right image of the frame of virtual reality media to be less than or greater than the actual interpupillary distance.
4. The method of claim 2, wherein the virtual reality media is rendered on the virtual reality display frame-by-frame, the processing device receives the interpupillary distance on a frame-by-frame basis from the virtual reality display, and the processing device executes the intermediate software to modify the value of the interpupillary distance on a frame-by-frame basis after the interpupillary distance is stored in memory and before each frame is processed by the processing device to render each frame on the virtual reality display.
5. The method of claim 4, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance to be identical for each frame.
6. The method of claim 4, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance to be different for each frame.
7. The method of claim 4, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a pattern or profile defined by the intermediate software.
8. The method of claim 4, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a step profile.
9. The method of claim 4, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a ramp profile.
10. The method of claim 4, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a sweep profile.
11. The method of claim 4, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a combination of a step profile and a ramp profile.
12. A system for modifying the manner in which virtual reality media is rendered on a virtual reality display, the system comprising:
a virtual reality display;
a memory; and
a processing device operatively coupled to the memory and the virtual reality display, the processing device being configured to: receive one or more parameters from the virtual reality display;
store the one or more parameters in memory;
modify, in response to execution of intermediate software, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display, the value being modified based on a setting in the intermediate software;
retrieve the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display;
process the frame of virtual reality media to be rendered by the virtual reality display, a manner in which the frame of virtual reality media is rendered being determined at least in part by the value of the at least one of the one or more parameters; and
transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
13. The system of claim 12, wherein the at least one of the one or more parameters received by the processing device from the virtual reality display is an interpupillary distance and the value is modified to be different than an actual interpupillary distance between eyes of the user.
14. The system of claim 13, wherein the processing device is configured to set an inter-camera distance between left image and right image of the frame of virtual reality media to be less than or greater than the actual interpupillary distance.
15. The system of claim 13, wherein the virtual reality media is rendered on the virtual reality display frame-by-frame, the processing device is configured to receive the interpupillary distance on a frame-by-frame basis from the virtual reality display, and the processing device is configured to execute the intermediate software to modify the value of the interpupillary distance on the frame-by-frame basis after the interpupillary distance is stored in memory and before each frame is processed by the processing device to render each frame on the virtual reality display.
16. The system of claim 15, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance to be identical for each frame.
17. The system of claim 15, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance to be different for each frame.
18. The system of claim 15, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a pattern or profile defined by the intermediate software.
19. The system of claim 15, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a step profile.
20. The system of claim 15, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a ramp profile.
21. The system of claim 15, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a sweep profile.
22. The system of claim 15, wherein the processing device executes the intermediate software to modify the value of the interpupillary distance based on a combination of a step profile and a ramp profile.
23. A non-transitory computer-readable medium comprising instructions, wherein execution of the instructions by a processing device causes the processing device to:
receive one or more parameters from a virtual reality display;
store the one or more parameters in memory;
modify, in response to execution of intermediate software by the processing device, at least one of the one or more parameters in memory to have a value that is different than that which was received from the virtual reality display, the value being modified based on a setting in the intermediate software;
retrieve the one or more parameters including the value of the at least one of the one or more parameters and a frame of the virtual reality media to be rendered on the virtual reality display;
process the frame of virtual reality media to be rendered by the virtual reality display, a manner in which the frame of virtual reality media is rendered being determined at least in part by the value of the at least one of the one or more parameters; and
transmit the frame of the virtual reality media to be rendered on the virtual reality display as modified by the at least one of the one or more parameters to increase an effect of the vergence-accommodation conflict of the user viewing the virtual reality display.
24. The medium of claim 23, wherein the at least one of the one or more parameters received by the processing device from the virtual reality display is an interpupillary distance and the value is modified to be different than an actual interpupillary distance between eyes of the user.
25. The medium of claim 24, wherein execution of the instructions by the processing device causes the processing device to set an inter-camera distance between left image and right image of the frame of virtual reality media to be less than or greater than the actual interpupillary distance.
26. The medium of claim 24, wherein the virtual reality media is rendered on the virtual reality display frame-by-frame, and execution of the instructions by the processing device causes the processing device to:
receive the interpupillary distance on a frame-by-frame basis from the virtual reality display; and
modify the value of the interpupillary distance on the frame-by-frame basis after the interpupillary distance is stored in memory and before each frame is processed by the processing device to render each frame on the virtual reality display.
27. The medium of claim 26, wherein execution of the instructions by the processing device causes the processing device to modify the value of the interpupillary distance to be identical for each frame.
28. The medium of claim 26, wherein execution of the instructions by the processing device causes the processing device to modify the value of the interpupillary distance to be different for each frame.
29. The medium of claim 26, wherein execution of the instructions by the processing device causes the processing device to modify the value of the interpupillary distance based on a pattern or profile defined by the intermediate software.
30. The medium of claim 26, wherein execution of the instructions by the processing device causes the processing device to modify the value of the interpupillary distance based on at least one of a step profile, a ramp profile, or a sweep profile.
PCT/US2019/066038 2018-12-12 2019-12-12 Systems and methods for rendering general virtual reality environments for vision therapy Ceased WO2020123841A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862778766P 2018-12-12 2018-12-12
US62/778,766 2018-12-12

Publications (1)

Publication Number Publication Date
WO2020123841A1 true WO2020123841A1 (en) 2020-06-18

Family

ID=71077090

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2019/066038 Ceased WO2020123841A1 (en) 2018-12-12 2019-12-12 Systems and methods for rendering general virtual reality environments for vision therapy

Country Status (1)

Country Link
WO (1) WO2020123841A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115002438A (en) * 2022-05-27 2022-09-02 厦门雅基软件有限公司 Development preview method, device, electronic device and readable storage medium for XR application

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101352385A (en) * 2008-08-26 2009-01-28 北京航空航天大学 Interpupillary distance adjustment mechanism of amblyopia therapeutic apparatus
US20160007849A1 (en) * 2014-07-08 2016-01-14 Krueger Wesley W O Systems and methods for the measurement of vestibulo-ocular reflex to improve human performance in an occupational environment
CN106233328A (en) * 2014-02-19 2016-12-14 埃弗加泽公司 For improving, improve or strengthen equipment and the method for vision
CN106461939A (en) * 2015-05-29 2017-02-22 深圳市柔宇科技有限公司 Adaptive Display Adjustment Method And Head-Mounted Display Device



Similar Documents

Publication Publication Date Title
CA3023488C (en) System and method for generating a progressive representation associated with surjectively mapped virtual and physical reality image data
CA2825563C (en) Virtual reality display system
JP2023037626A (en) Eye tracking with prediction and late update to gpu for fast foveated rendering in hmd environment
US10438418B2 (en) Information processing method for displaying a virtual screen and system for executing the information processing method
US10029176B2 (en) Data processing apparatus and method of controlling display
CN106484116B (en) Method and device for processing media files
US20180136723A1 (en) Immersive displays
US11237413B1 (en) Multi-focal display based on polarization switches and geometric phase lenses
KR20200056658A (en) Method and apparatus for buffer management in cloud based virtual reallity services
CN117480469A (en) Eye model registration
WO2020123841A1 (en) Systems and methods for rendering general virtual reality environments for vision therapy
US20210286701A1 (en) View-Based Breakpoints For A Display System
JP6996450B2 (en) Image processing equipment, image processing methods, and programs
JP6416338B1 (en) Information processing method, information processing program, information processing system, and information processing apparatus
US10659755B2 (en) Information processing device, information processing method, and program
CN113101158A (en) A VR-based binocular vision fusion training method and device
WO2017022303A1 (en) Information processing device, information processing method, and program
JP7300569B2 (en) Information processing device, information processing method and program
JP6934374B2 (en) How it is performed by a computer with a processor
JP2024040528A (en) Information processing device, information processing method, and program
KR20210133058A (en) Tourism experience system with 60DOF online multi-player
JP7467748B1 (en) Display control device, display system and program
US20250244827A1 (en) Information display system
KR102286517B1 (en) Control method of rotating drive dependiong on controller input and head-mounted display using the same
CN106484114B (en) Interaction control method and device based on virtual reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19895046; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19895046; Country of ref document: EP; Kind code of ref document: A1)