US20170323482A1 - Systems and methods for generating stereoscopic, augmented, and virtual reality images - Google Patents
- Publication number
- US20170323482A1 (application number US 15/586,956)
- Authority
- US
- United States
- Prior art keywords
- eyewear
- ride
- display
- world environment
- real world
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63G—MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
- A63G21/00—Chutes; Helter-skelters
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63G—MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
- A63G31/00—Amusement arrangements
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63G—MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
- A63G31/00—Amusement arrangements
- A63G31/16—Amusement arrangements creating illusions of travel
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63G—MERRY-GO-ROUNDS; SWINGS; ROCKING-HORSES; CHUTES; SWITCHBACKS; SIMILAR DEVICES FOR PUBLIC AMUSEMENT
- A63G7/00—Up-and-down hill tracks; Switchbacks
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/22—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
- G02B30/25—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type using polarisation techniques
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/337—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/363—Image reproducers using image projection screens
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
- G02B2027/0134—Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0181—Adaptation to the pilot/driver
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0183—Adaptation to parameters characterising the motion of the vehicle
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/008—Aspects relating to glasses for viewing stereoscopic images
Definitions
- the subject matter disclosed herein relates to amusement park attractions, and more specifically, to providing enhanced thrill factors and components of interest in amusement park attractions.
- Amusement parks and/or theme parks may include various entertainment attractions, restaurants, and rides useful in providing enjoyment to patrons (e.g., families and/or people of all ages) of the amusement park.
- the attractions may include traditional rides for kids such as carousels, as well as traditional rides for thrill seekers such as rollercoasters. It is now recognized that adding components of interest and thrill factors to such attractions can be difficult and limiting.
- the thrill factor of such rollercoasters and/or other similar thrill rides may be limited to the existing course or physical nature of the thrill ride itself. It is now recognized that it is desirable to include components of interest and thrill factors in such attractions in a flexible and efficient manner relative to traditional techniques.
- a ride system includes eyewear configured to be worn by the user, wherein the eyewear comprises a display having a stereoscopic feature configured to permit viewing of externally generated stereoscopically displayed images.
- the ride system includes a computer graphics generation system communicatively coupled to the eyewear, and configured to generate streaming media of a real world environment based on image data captured via the camera of the eyewear, generate one or more virtual augmentations superimposed on the streaming media of the real world environment, transmit the streaming media of the real world environment along with the one or more superimposed virtual augmentations to be displayed on the display of the eyewear, and project stereoscopic images into the real world environment.
- in a second embodiment, a wearable electronic device includes a frame comprising a frame front; a left eye display lens and a right eye display lens coupled to the frame front; a first filter on the left eye display lens; a second filter on the right eye display lens, wherein the first filter is different from the second filter; and processing circuitry configured to: receive a signal from a computer graphics generation system, wherein the signal comprises a video stream of a virtualization of a real world environment along with at least one augmented reality (AR) image or at least one virtual reality (VR) image included in the video stream; and cause the left eye display lens and the right eye display lens to display the video stream.
- a method includes receiving or accessing environmental image data via a computer graphics generation system, generating a virtualization of a real world environment of the amusement park based on the environmental image data; overlaying an augmented reality (AR) image or a virtual reality (VR) image onto the virtualization of the real world environment; transmitting the overlaid AR image or the VR image along with the virtualization of the real world environment to the eyewear during the cycle of the amusement park ride; transmitting a signal to the eyewear to permit viewing through displays of the eyewear; projecting stereoscopic images onto a surface of the real-world environment after transmitting the signal; and causing the stereoscopic images to be reflected through filters in the eyewear into a left and right eye of a user to generate an illusion of a 3D image.
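The ordering of the claimed method's steps can be sketched as a short, hypothetical pipeline. The function name, the string stand-ins for image data, and the stubbed projection step are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch of the claimed method's step ordering; the data model
# and names here are assumptions, not taken from the patent.

def run_ride_cycle(environment_image, ar_image):
    # 1. Receive or access environmental image data (passed in here) and
    # 2. generate a virtualization of the real-world environment from it.
    layers = ["virtualized:" + environment_image]
    # 3. Overlay an AR or VR image onto the virtualization.
    layers.append("overlay:" + ar_image)
    # 4. Transmit the composite to the eyewear displays during the ride cycle
    #    (stubbed as the returned list).
    transmitted = list(layers)
    # 5. Signal the eyewear to permit see-through viewing, then
    # 6. project stereoscopic images onto a real-world surface; per-eye
    #    filters reflect them into the left and right eyes (stubbed labels).
    stereo = {"left": "left parallax image", "right": "right parallax image"}
    return transmitted, stereo
```

The point of the sketch is only the sequencing: the virtualization is composited and transmitted first, and the projected stereoscopic images are separated by the eyewear's filters afterward.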
- FIG. 1 illustrates an embodiment of an amusement park including one or more attractions in accordance with the present embodiments
- FIG. 2 is an illustration of an embodiment of stereoscopic augmented reality (AR) or virtual reality (VR) eyewear and a computer graphics generation system in accordance with present embodiments;
- FIG. 3 is an illustration of an embodiment of stereoscopic augmented reality (AR) or virtual reality (VR) eyewear;
- FIG. 4 is a perspective view of a thrill ride of FIG. 1 including various AR and VR images provided by way of the stereoscopic AR/VR eyewear of FIG. 2 , in accordance with present embodiments;
- FIG. 5 is a flowchart illustrating an embodiment of a process useful in creating stereoscopic images within an AR experience, a VR experience, or a mixed reality experience during a ride by using the computer graphics generation system of FIG. 2 , in accordance with present embodiments.
- Present embodiments relate to systems and methods of providing a stereoscopic mixed or augmented reality (AR) experience, a virtual reality (VR) experience, or a combination thereof, as part of an attraction, such as a thrill ride, in an amusement park or theme park.
- each ride passenger (e.g., first passenger, second passenger, etc.) may be provided eyewear, such as a pair of electronic goggles or eyeglasses, to be worn during a cycle of the thrill ride.
- the eyewear may facilitate an AR experience, a VR experience, or a combination of both experiences.
- the eyewear may be referred to as stereoscopic eyewear or stereoscopic AR/VR eyewear.
- the stereoscopic AR/VR eyewear provides the capability of viewing stereoscopic images, which generate the illusion of 3D images.
- the stereoscopic AR/VR eyewear is configured for displaying augmented or virtual reality images overlaid on an image of the user's real-world environment, which generates the illusion that the overlaid image is part of the real world environment.
- the stereoscopic AR/VR eyewear is implemented with display lenses that are capable of displaying the overlaid AR/VR images transmitted from a central controller as well as being capable of permitting the user to view the real-world environment, including any stereoscopically displayed images.
- the display lenses may be implemented with a polarizing layer, active shutters, color shifting capability, or other technology that permits stereoscopic viewing and that is compatible with the AR/VR capability of the eyewear.
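As one concrete instance of the color-shifting option named above (an assumed example, not the patent's stated implementation), an anaglyph composite routes the left-eye parallax image through one color channel and the right-eye image through the others, so matching color filters on the two lenses separate the views:

```python
def anaglyph(left, right):
    """Combine left/right parallax views into one red/cyan composite.

    Each image is a list of rows of (r, g, b) tuples.  The red channel of the
    composite comes from the left-eye view and the green/blue channels from
    the right-eye view; red/cyan filters on the two display lenses then route
    each parallax image to the correct eye.
    """
    return [
        [(lp[0], rp[1], rp[2]) for lp, rp in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]
```

Polarizing layers and active shutters achieve the same left/right separation by polarization state or by time multiplexing rather than by color.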
- a single eyewear device may be used within an environment to render a variety of different types of visual experiences.
- the eyewear may also permit unaided, non-stereoscopic, or unaugmented viewing in certain instances, e.g., at the start of a theme park ride to permit the users to acclimatize themselves to the environment.
- the stereoscopic AR/VR eyewear is capable of acting as a display for images that are created to reflect the real-world environment with augmented images.
- the users view an image displayed on the lenses of the eyewear in a manner that creates the illusion that the augmented image is the real-world environment viewed in real time.
- the images of the real-world environment may be recorded ahead of time, e.g., may be stored in a memory of the system, or, in certain embodiments, may be collected in real-time by a user.
- the eyewear includes at least two cameras, which may respectively correspond to the respective points of view (e.g., right and left eye views) of the ride passengers, and may be used to capture real-time video data (e.g., video captured during live use and transmitted in substantially real-time) of the real-world environment (e.g., aspects of the physical amusement park) of the ride passengers and/or the thrill ride.
- the eyewear may also include a display.
- the eyewear may include at least two displays respectively corresponding to each eye of a ride passenger using the eyewear.
- a computer graphics generation system may also be provided.
- the computer graphics generation system may receive the real-time video data (e.g., live video that is transmitted in substantially real-time) from the eyewear, and may render a video stream of the real-world environment along with various AR, VR, or combined AR and VR (AR/VR) graphical images to the respective displays of the respective eyewear of the ride passengers during a cycle of the ride.
- the computer graphics generation system may render the AR/VR graphical images to the eyewear based on, for example, the position or location of a ride passenger vehicle along the tracks of a rollercoaster during a cycle of a thrill ride, a predetermined distance traveled by the passenger ride vehicle during a cycle of the thrill ride, or after a predetermined lapse of time in the cycle of the thrill ride.
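The three trigger conditions named above (track position, distance traveled, elapsed ride time) can be sketched as a small dispatch function; the dictionary keys and field names are illustrative assumptions, since the patent names the conditions but not a data model:

```python
def should_render_augmentation(trigger, state):
    """Decide whether to render an AR/VR image for the current ride state.

    `trigger` describes one of the conditions named in the disclosure:
    a position along the tracks, a distance traveled, or an elapsed time
    in the ride cycle.  `state` holds the vehicle's current readings.
    """
    if trigger["kind"] == "position":
        return state["track_position_m"] >= trigger["at_m"]
    if trigger["kind"] == "distance":
        return state["distance_traveled_m"] >= trigger["after_m"]
    if trigger["kind"] == "elapsed":
        return state["ride_time_s"] >= trigger["after_s"]
    return False
```

In practice a ride would carry a list of such triggers, each paired with the augmentation to render when its condition first becomes true.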
- the eyewear and the computer graphics generation system may enhance the thrill factor of the thrill ride, and, by extension, may enhance the experience of the ride passengers as they ride the thrill ride.
- the techniques described herein may not be limited to thrill rides and/or amusement park attraction applications, but may also be extended to any of various applications such as, for example, medical applications (e.g., image-guided surgery, noninvasive imaging analysis), engineering design applications (e.g., engineering model development), manufacturing, construction, and maintenance applications (e.g., products manufacturing, new building construction, automobile repairs), academic and/or vocational training applications, exercise applications (e.g., bodybuilding and weight loss models), television (TV) applications (e.g., weather and news), and the like.
- the amusement park 10 may include a thrill ride 12 , a mall of amusement park facilities 14 (e.g., restaurants, souvenir shops, and so forth), and additional amusement attractions 16 (e.g., Ferris Wheel, dark ride, or other attraction).
- the thrill ride 12 may include a rollercoaster or other similar thrill ride, and may thus further include a closed-loop track or a system of closed-loop tracks 18 (e.g., miles of tracks 18 ).
- the tracks 18 may be provided as an infrastructure on which a passenger ride vehicle 20 may traverse, for example, as ride passengers 22 , 24 , 26 , 28 ride the thrill ride 12 .
- the tracks 18 may thus define the motion of the ride vehicle 20 .
- the tracks 18 may be replaced by a controlled path, in which the movement of the ride vehicle 20 may be controlled via an electronic system, a magnetic system, or other similar system infrastructure other than the tracks 18 .
- the passenger ride vehicle 20 may be illustrated as a 4-passenger vehicle, in other embodiments, the passenger ride vehicle 20 may include any number of passenger spaces (e.g., 1, 2, 4, 8, 10, 20, or more spaces) to accommodate a single or multiple groups of ride passengers 22 , 24 , 26 , 28 . It should be understood that, while the thrill ride 12 is described in the context of the ride vehicle 20 , other embodiments are contemplated (e.g., a seated theater environment, a walking or free movement arena environment, etc.) and may be used in conjunction with the disclosed embodiments.
- the ride passengers 22 , 24 , 26 , 28 may be provided a moving tour of the scenery (e.g., facilities 14 , additional amusement attractions 16 , and so forth) in an area around or nearby the thrill ride 12 .
- this may include the environment surrounding the thrill ride 12 (e.g., a building that fully or partially houses the thrill ride 12 ).
- While the ride passengers 22 , 24 , 26 , 28 may find the thrill ride 12 to be a very enjoyable experience, in certain embodiments, it may be useful to enhance the experience of the ride passengers 22 , 24 , 26 , 28 as the ride passengers 22 , 24 , 26 , 28 ride the thrill ride 12 by enhancing, for example, the thrill factor of the thrill ride 12 .
- each of the ride passengers 22 , 24 , 26 , 28 may be provided a pair of stereoscopic AR/VR eyewear 34 , which may, in certain embodiments, include eyeglasses.
- the stereoscopic AR/VR eyewear 34 may be included as part of a helmet, a visor, a headband, a pair of blinders, one or more eyepatches, and/or other headwear or eyewear that may be worn by each of the ride passengers 22 , 24 , 26 , 28 .
- the stereoscopic AR/VR eyewear 34 may be communicatively coupled to a computer graphics generation system 32 (e.g., within the amusement park 10 ) via a wireless network 48 (e.g., wireless local area networks [WLAN], wireless wide area networks [WWAN], near field communication [NFC]).
- the stereoscopic AR/VR eyewear 34 may be used to create a surreal environment 30 , which may include an AR experience, a VR experience, a mixed reality experience, a combination of AR and VR experiences, a computer-mediated reality experience, a combination thereof, or other similar surreal environment for the ride passengers 22 , 24 , 26 , 28 as the ride passengers 22 , 24 , 26 , 28 ride the thrill ride 12 .
- the stereoscopic AR/VR eyewear 34 may be worn by the ride passengers 22 , 24 , 26 , 28 throughout the duration of the ride, such that ride passengers 22 , 24 , 26 , 28 may feel completely encompassed by the environment 30 and may perceive the environment 30 to be a real-world physical environment.
- the environment 30 may be a real-time video including real-world images 44 that the ride passengers 22 , 24 , 26 , 28 would see even when not wearing the stereoscopic AR/VR eyewear 34 , electronically merged with one or more AR or VR images 45 (e.g., virtual augmentations).
- the term “real-time” indicates that the images are obtained and/or provided in a timeframe substantially close to the time of actual observation. In alternative embodiments, the obtained images may be historical images of the environment.
- the stereoscopic AR/VR eyewear 34 may be any of various wearable electronic devices that may be useful in creating an AR experience, a VR experience, and/or other computer-mediated experience to enhance the thrill factor of the thrill ride 12 , and, by extension, the experience of the ride passengers 22 , 24 , 26 , 28 while on the thrill ride 12 .
- the eyeglasses embodiment of the stereoscopic AR/VR eyewear 34 as discussed herein may be distinct from, and may provide many advantages over traditional devices such as head-mounted displays (HMDs) and/or heads-up displays (HUDs).
- the stereoscopic AR/VR eyewear 34 may include a number of orientation and position sensors (e.g., accelerometers, magnetometers, gyroscopes, Global Positioning System [GPS] receivers) that may be used to track the position, orientation, and motion of the ride passengers 22 , 24 , 26 , 28 during a cycle of the thrill ride 12 .
- features of the stereoscopic AR/VR eyewear 34 may be monitored by a monitoring system (e.g., one or more cameras) to determine position, location, orientation, and so forth of the stereoscopic AR/VR eyewear 34 and, in turn, that of the wearer.
- the ride passengers 22 , 24 , 26 , 28 may be monitored by a monitoring system 33 (e.g., a camera), which may be communicatively coupled to the computer graphics generation system 32 and used to identify position, location, orientation, and so forth of the ride passengers 22 , 24 , 26 , 28 .
- the ride vehicle 20 may also include one or more sensors (e.g., weight sensors, mass sensors, motion sensors, ultrasonic sensors) that may be useful in monitoring the respective ride passengers 22 , 24 , 26 , 28 for the graphics generation system 32 to determine the point of view of the respective ride passengers 22 , 24 , 26 , 28 .
- because the stereoscopic AR/VR eyewear 34 may include individual cameras (e.g., cameras 40 and 42 ) and individual displays (e.g., displays 37 and 38 ), data with respect to the respective points of view of each eye of the ride passengers 22 , 24 , 26 , 28 may be captured by the stereoscopic AR/VR eyewear 34 . All of these advantages may be unavailable using devices such as traditional HMDs and/or HUDs.
- the stereoscopic AR/VR eyewear 34 may include processing circuitry, such as a processor 35 and a memory 36 .
- the processor 35 may be operatively coupled to the memory 36 to execute instructions for carrying out the presently disclosed techniques of generating real-world images 44 merged with one or more AR/VR images 45 to enhance the thrill factor of the thrill ride 12 , and, by extension, the experience of the ride passengers 22 , 24 , 26 , 28 while on the thrill ride 12 .
- These instructions may be encoded in programs or code stored in a tangible non-transitory computer-readable medium, such as the memory 36 and/or other storage.
- the processor 35 may be a general-purpose processor, system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration.
- the processor 35 and the memory 36 may be provided as an auxiliary pack carried by the user (e.g., clipped at the waist or carried in a pocket), either wired to or in wireless communication with the stereoscopic AR/VR eyewear 34 .
- the stereoscopic AR/VR eyewear 34 communicates wirelessly with the computer graphics generation system 32 and does not perform on-board image processing.
- the stereoscopic AR/VR eyewear 34 may also include the pair of displays 37 and 38 (e.g., which may be provided in the frame front 39 of the stereoscopic AR/VR eyewear 34 where eyeglass lenses would otherwise appear) respectively corresponding to each eye of the ride passengers 22 , 24 , 26 , 28 .
- a unified display may be employed.
- the respective displays 37 and 38 may each include a display that covers at least a portion of the viewing surface.
- the displays 37 , 38 may be an opaque liquid crystal display (LCD), an opaque organic light emitting diode (OLED) display, or other similar display useful in displaying the real-world images 44 and the AR/VR graphical images 45 to the ride passengers 22 , 24 , 26 , 28 .
- the respective displays 37 and 38 may each include a see-through LCD or a see-through OLED display useful in allowing, for example, the ride passengers 22 , 24 , 26 , 28 to view the real-world images 44 and the AR/VR graphical images 45 appearing on the respective displays 37 and 38 while preserving the ability to see through the respective displays 37 and 38 to the actual and physical real world environment (e.g., the amusement park 10 ).
- the displays 37 and 38 permit viewing of stereoscopic images 43 .
- the displays 37 , 38 may also include light field displays. In certain embodiments, the displays 37 , 38 may toggle between opaque and transparent configurations, depending on the desired visual environment.
- the cameras 40 and 42 may respectively correspond to the respective points of view of the ride passengers 22 , 24 , 26 , 28 , and may be used to capture real-time video data (e.g., live video) of the real-world environment. In some embodiments, a single camera may be employed. Specifically, in the illustrated embodiment, the cameras 40 , 42 of the stereoscopic AR/VR eyewear 34 may be used to capture real-time images of the real-world physical environment (e.g., the physical amusement park 10 ) perceived by the respective ride passengers 22 , 24 , 26 , 28 from the point of view of the respective ride passengers 22 , 24 , 26 , 28 . As will be further appreciated, the stereoscopic AR/VR eyewear 34 may then transmit (e.g., wirelessly via the wireless network 48 ) the captured real-time video data to the computer graphics generation system 32 .
- the real-time video data captured via the respective cameras 40 and 42 may be processed on the stereoscopic AR/VR eyewear 34 via the processor 35 .
- the stereoscopic AR/VR eyewear 34 may also transmit orientation data, position data, point of view data (e.g., focal length, orientation, pose, and so forth), motion tracking data, and so forth obtained and/or derived based on data obtained via orientation and position sensors (e.g., accelerometers, magnetometers, gyroscopes, Global Positioning System [GPS] receivers, and so forth) motion tracking sensors (e.g., electromagnetic and solid-state motion tracking sensors), and so forth, that may be included in the stereoscopic AR/VR eyewear 34 .
- the stereoscopic AR/VR eyewear may be implemented without the cameras 40 and 42 .
- the computer graphics generation system 32 , which may also include processing circuitry, such as a processor 46 (e.g., general purpose processor or other processor) and a memory 47 , may process the real-time video data (e.g., live video) and orientation and position data and/or point of view data received from the stereoscopic AR/VR eyewear 34 or the monitoring system 33 . Specifically, the computer graphics generation system 32 may use this data to generate a frame of reference to register the real-time video data with the generated real-world images 44 and the AR/VR graphical images 45 .
- the graphics generation system 32 may then render a view of the real-world images 44 that is temporally and spatially commensurate with what the respective ride passengers 22 , 24 , 26 , 28 would perceive if not wearing the stereoscopic AR/VR eyewear 34 .
- the graphics generation system 32 may constantly update (e.g., in real-time) the rendering of the real-world images to reflect change in respective orientation, position, and/or motion of the respective ride passengers 22 , 24 , 26 , 28 .
- the graphics generation system 32 may render images (e.g., real world images 44 and AR/VR images 45 ) at a real-time rate greater than or equal to approximately 20 frames per second (FPS), greater than or equal to approximately 30 FPS, greater than or equal to approximately 40 FPS, greater than or equal to approximately 50 FPS, greater than or equal to approximately 60 FPS, greater than or equal to approximately 90 FPS, or greater than or equal to approximately 120 FPS.
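The real-time rate targets above can be illustrated with a simple frame-pacing loop. This is a minimal sketch under stated assumptions, not the patented implementation; `run_render_loop` and its `render_frame` callback are hypothetical names:

```python
import time

def run_render_loop(render_frame, target_fps=60, duration_s=0.1):
    """Pace calls to render_frame() so the loop does not exceed target_fps.

    render_frame: hypothetical callback that draws one stereo frame pair.
    Returns the number of frames rendered within duration_s seconds.
    """
    frame_budget = 1.0 / target_fps  # seconds available per frame
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        frame_start = time.perf_counter()
        render_frame()
        frames += 1
        # Sleep off whatever remains of this frame's time budget.
        elapsed = time.perf_counter() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
    return frames
```

Sleeping off the remainder of each frame's budget caps the rate near `target_fps`; a production renderer would more likely synchronize to the display's vertical blank instead.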
- the graphics generation system 32 may generate the real-world images 44 for each of the respective stereoscopic AR/VR eyewear 34 worn by the respective ride passengers 22 , 24 , 26 , 28 (e.g., adjusted for the respective orientation, position, and point of view of the respective ride passengers 22 , 24 , 26 , and 28 ).
- the computer graphics generation system 32 may also generate and render one or more AR/VR graphical images 45 superimposed on the real-world images 44 to create a complete AR experience, VR experience, mixed reality, and/or other computer-mediated experience for the ride passengers 22 , 24 , 26 , 28 .
- the computer graphics generation system 32 may utilize one or more of the discussed video merging and/or optical merging techniques to superimpose the AR/VR graphical images 45 onto the real-world images 44 , such that the ride passengers 22 , 24 , 26 , 28 perceive the real-world physical environment of the amusement park 10 (e.g., provided as rendered video data via the respective displays 37 and 38 ) along with an AR/VR graphical image 45 (e.g., virtual augmentations) as the passenger ride vehicle 20 traverses the tracks 18 .
- the graphics generation system 32 may render a view of the AR/VR graphical images 45 that is temporally and spatially commensurate with the real-world images 44 , such that the real-world images 44 may appear as a background overlaid with the AR/VR graphical images 45 .
- a model may provide computer generated images for any available viewpoint and specific images may be provided to the stereoscopic AR/VR eyewear 34 for display based on a detected orientation of the stereoscopic AR/VR eyewear 34 .
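Selecting a prerendered image from such a model based on the detected orientation of the eyewear might be sketched as follows, assuming orientation is reduced to a single yaw angle; all names and the four-view model are hypothetical:

```python
def select_view(yaw_deg, views):
    """Pick the prerendered viewpoint whose yaw is closest to the detected
    eyewear orientation. `views` maps yaw angle (degrees) -> image id.
    Yaw wraps at 360 degrees."""
    def angular_distance(a, b):
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)
    return min(views, key=lambda v: angular_distance(yaw_deg, v))

# Hypothetical model holding a computer-generated image per viewpoint.
views = {0: "north_view", 90: "east_view", 180: "south_view", 270: "west_view"}
```

A full implementation would interpolate between views and use pitch and roll as well; the nearest-yaw lookup only shows the orientation-to-image mapping.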
- the graphics generation system 32 may also generate one or more brightness, lighting, or shading models, and/or other photorealistic rendering models, such that the real-world images 44 and the AR/VR graphical images 45 are adjusted to accurately reflect contrast and brightness of the real-world physical environment (e.g., sunny day, partly cloudy day, cloudy day, evening, night).
- the graphics generation system 32 may, in some embodiments, receive weather related data from one or more weather forecast and/or prediction systems (e.g., Global Forecast System, Doppler radars, and so forth). The graphics generation system 32 may then use the weather related data or other similar data to adjust the contrast, brightness, and/or other lighting effects of the real-world images 44 and/or the AR/VR graphical images 45 .
- the graphics generation system 32 may adjust the contrast, brightness, and/or other lighting effects of the real-world images 44 and/or the AR/VR graphical images 45 based on lighting detected from one or more light sensors included in the stereoscopic AR/VR eyewear 34 or based on the real-time video data captured by the cameras 40 , 42 .
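A sensor-driven brightness adjustment like the one described might be sketched as below. The logarithmic lux-to-brightness mapping and its thresholds are illustrative assumptions, not values from the disclosure:

```python
import math

def adjust_pixel(value, ambient_lux):
    """Scale an 8-bit pixel channel so rendered imagery tracks ambient light.
    Bright daylight (~10000 lux) leaves the pixel near full strength;
    night (~1 lux) dims it toward 30%. Thresholds are illustrative only."""
    # Map lux logarithmically onto a 0.3..1.0 brightness factor.
    factor = 0.3 + 0.7 * min(math.log10(max(ambient_lux, 1.0)) / 4.0, 1.0)
    return max(0, min(255, round(value * factor)))
```

In practice the same factor would be applied per frame to the whole rendered image, or folded into the renderer's tone-mapping stage.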
- the graphics generation system 32 may constantly update (e.g., in real-time) the rendering of the AR/VR graphical images 45 to reflect change in respective orientations, positions, points of view, and/or motion of the respective ride passengers 22 , 24 , 26 , 28 .
- the graphics generation system 32 may render the AR/VR graphical images 45 on the respective displays 37 and 38 of each of the respective stereoscopic AR/VR eyewear 34 worn by the respective ride passengers 22 , 24 , 26 , 28 , adjusted for the variable respective positions, points of view, and motions of the respective ride passengers 22 , 24 , 26 , and 28 .
- the graphics generation system 32 may also generate the AR/VR graphical images 45 at a time in which the passenger ride vehicle 20 crosses a predetermined point along the tracks 18 .
- the graphics generation system 32 may use the received position data, point of view data, motion data along with GPS data or geographical informational systems (GIS) data to derive an illumination map of, for example, the thrill ride 12 and tracks 18 , as well as the immediate environment surrounding the thrill ride 12 for the entire cycle of the thrill ride 12 .
- the graphics generation system 32 may then use the map to introduce the AR/VR graphical images 45 at certain predetermined points (e.g., points based on location, distance, or time) as the passenger ride vehicle 20 traverses the tracks 18 .
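The map-based trigger logic might be sketched as follows; the trigger distances, image ids, and function name are hypothetical, and only the distance-based case is shown:

```python
def due_triggers(distance_m, last_distance_m, trigger_map):
    """Return the AR/VR image ids whose trigger point lies between the
    vehicle's previous and current distance along the track.
    trigger_map: list of (distance_along_track_m, image_id) pairs."""
    return [image_id for point_m, image_id in trigger_map
            if last_distance_m < point_m <= distance_m]

# Hypothetical trigger points derived from the illumination map.
triggers = [(120.0, "character_50"), (340.5, "breach_52"),
            (500.0, "facilities_49")]
```

Comparing against the previous distance rather than testing exact equality ensures a trigger is not missed when position updates arrive at discrete intervals.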
- the video or image data captured via the cameras 40 , 42 may be used by the graphics generation system 32 to determine the points of location of the ride vehicle 20 and when to introduce the AR/VR graphical images 45 .
- the graphics generation system 32 may perform one or more geometric recognition algorithms (e.g., shape or object recognition) or photometric recognition algorithms (e.g., face recognition or specific object recognition) to determine the position or location of the ride vehicle 20 as well as the viewing position of the ride passengers 22 , 24 , 26 , 28 .
- FIG. 3 is an illustration of the stereoscopic AR/VR eyewear 34 showing an embodiment in which the stereoscopic AR/VR eyewear 34 includes features that also permit viewing of externally projected stereoscopic images.
- the displays 37 and 38 may include a polarization feature such as a polarized coating or layer to permit the user to resolve stereoscopically projected images as being in 3D.
- the polarization feature may be coated on an outer surface 57 of the display 37 and an outer surface of the display 38 .
- the polarization feature may be formed within, embedded in, or formed on an opposing surface of the displays 37 and 38 .
- the polarization feature on the right eye display 37 has different polarization characteristics than the polarization feature on the left eye display 38 to permit each respective display 37 and 38 to act as a filtered lens that only permits polarized light having the appropriate characteristics to pass through. In this manner, two images projected superimposed onto a screen may be viewed stereoscopically.
- the polarization feature in the respective displays may be linear polarization filters orthogonally oriented relative to one another.
- the polarization filters of the displays 37 and 38 may be circular polarization filters of opposite handedness relative to one another.
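The filtering behavior of the orthogonal linear polarizers described above follows Malus's law, a standard optics result rather than anything specific to this disclosure. A small numeric sketch:

```python
import math

def linear_transmission(source_angle_deg, filter_angle_deg):
    """Malus's law: fraction of linearly polarized light intensity passing
    a linear polarizer at the given relative angle."""
    theta = math.radians(source_angle_deg - filter_angle_deg)
    return math.cos(theta) ** 2

# Orthogonal linear filters: the left-eye image (projected with 90-degree
# polarization) is blocked by the right-eye lens (filter at 0 degrees),
# and vice versa, so each eye sees only its own projected image.
```

Circular filters of opposite handedness achieve the same per-eye separation while tolerating head tilt, which is why they are common in cinema 3D systems.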
- the stereoscopic AR/VR eyewear 34 has color-shifting filters, such that the respective displays 37 and 38 include color filters that filter different wavelengths relative to one another.
- the stereoscopic AR/VR eyewear 34 may be implemented with Inficolor 3D technology or with Infitec® 3D technology (Infitec GmbH, Baden Wuerttemberg, Germany).
- the stereoscopic AR/VR eyewear 34 may have active stereoscopic capabilities, such as active shutters that cycle each display 37 and 38 on and off alternately. It is contemplated that changing the shutter rates may be used to provide individualized content between different users. For example, a first user and a second user, both with respective eyewear 34 , may have different assembled content if their active shutters are controlled at different rates. The control may be based on signals received from the system 32 , including signals embedded within the displayed frames. In other embodiments, the shutter control may be preset on the device.
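The time-slicing idea behind individualized active-shutter content can be sketched by modeling which projected frames each viewer's shutter admits. The frame labels, phases, and two-viewer interleave below are illustrative assumptions:

```python
def frames_seen(frame_stream, phase, period):
    """Which projected frames a viewer sees through an active shutter that
    opens once every `period` frames at offset `phase`.
    frame_stream: ordered list of frame labels as projected."""
    return [frame for i, frame in enumerate(frame_stream) if i % period == phase]

# Two viewers time-sliced against one projector: interleave each viewer's
# left/right content and drive each viewer's shutters at matching phases.
stream = ["A_left", "A_right", "B_left", "B_right"] * 2
```

Viewer A's eyewear would open its left shutter at phase 0 and right shutter at phase 1; viewer B's at phases 2 and 3, so each assembles only their own content from the shared projection.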
- Active stereoscopic implementations may be advantageous in darker rides, because the lack of color or polarizing filters may permit more light to pass through the displays 37 and 38 when they are acting as lenses for stereoscopic viewing.
- the displays 37 and 38 may be used to generate an internal 3D or stereoscopic image. That is, in certain embodiments, the user views a transmitted image or a video stream that may be implemented stereoscopically.
- the left eye display 38 may display a separate video channel from the right eye display 37 . Based on the perspective differences or slight differences in the displayed images or video stream between the left-eye and right-eye views, similar to those generated on projected stereoscopic images, a 3D illusion may be internally generated in the displayed content.
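As a crude illustration of how a left/right perspective difference yields depth, one row of pixels can be shifted horizontally in opposite directions for the two eyes. Real stereoscopic rendering would use two camera projections rather than a uniform shift; this sketch and its names are assumptions:

```python
def stereo_pair(row, disparity_px):
    """Produce left/right scanlines from one row of pixels by shifting the
    content horizontally in opposite directions, a toy approximation of the
    perspective difference between the two eyes. Edge pixels are repeated."""
    left = row[disparity_px:] + row[-1:] * disparity_px
    right = row[:1] * disparity_px + row[:len(row) - disparity_px]
    return left, right
```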
- FIG. 4 illustrates various examples of AR/VR images 45 that may be generated by the graphics generation system 32 , or in other embodiments, that may be generated via the stereoscopic AR/VR eyewear 34 .
- the graphics generation system 32 may render stereoscopic images 43 , the real-world images 44 , as well as various AR/VR graphical images 45 through the respective stereoscopic AR/VR eyewear 34 (e.g., via the respective displays 37 and 38 ) of the ride passengers 22 , 24 , 26 , 28 .
- the graphics generation system 32 may be used in conjunction with stereoscopic projectors 53 .
- the real-world images 44 may include rendered images of, for example, the tracks 18 , the facilities 14 , and/or other patrons or objects that the ride passengers 22 , 24 , 26 , 28 would see while riding the thrill ride 12 , including the other passengers 22 , 24 , 26 , 28 , even if the stereoscopic AR/VR eyewear 34 were not being worn by the ride passengers 22 , 24 , 26 , 28 .
- the graphics generation system 32 may render AR/VR graphical images 45 (illustrated via the dashed lines) that may include, for example, an AR/VR image of a second mall of amusement park facilities 49 , an AR/VR image of one or more fictional characters 50 , an AR/VR image of a breach 52 of the tracks 18 , and/or additional AR/VR image 54 , 56 , and 58 .
- the AR/VR image 50 may include an image of a monster or other similar fictional character appearing (e.g., from the point of view of the ride passengers 22 , 24 , 26 , 28 while wearing the stereoscopic AR/VR eyewear 34 ) to be obstructing a portion of the tracks 18 as the passenger ride vehicle 20 traverses the tracks 18 .
- the graphics generation system 32 may also render certain AR/VR graphical images 45 that include a deletion of one or more real-world physical objects that no longer appear while the ride passengers 22 , 24 , 26 , 28 are wearing the stereoscopic AR/VR eyewear 34 .
- the AR/VR image of the facilities 49 may appear at a place in which the attraction 16 is placed in the real-world environment.
- the graphics generation system 32 may render the AR/VR graphical images 45 based on, for example, the position or location of the passenger ride vehicle 20 along the tracks 18 at any given time during a cycle of the thrill ride 12 , a predetermined distance traveled by the passenger ride vehicle 20 during a cycle of the thrill ride 12 , or after a predetermined lapse of time.
- the AR/VR image of the fictional character 50 may appear to the ride passengers 22 , 24 , 26 , 28 , via the stereoscopic AR/VR eyewear 34 , as obstructing a place on the tracks 18 not yet traversed by the passenger ride vehicle 20 during a given cycle of the thrill ride 12 .
- the AR/VR image of the breach 52 of the tracks 18 may appear to the ride passengers 22 , 24 , 26 , 28 , via the stereoscopic AR/VR eyewear 34 , as though the passenger ride vehicle 20 will encounter a place in which there are no supporting tracks 18 .
- the graphics generation system 32 may render the AR/VR graphical images 45 based on the identity of the individual users of the eyewear 34 .
- Each eyewear 34 may be associated with an RFID tag or other identification element that transmits an identification signal to the graphics generation system 32 .
- the system 32 may select the overlaid image from among several options stored in the memory 47 based on the identity of the ride passenger (e.g., ride passengers 22 , 24 , 26 , 28 ). In this manner, each passenger in a ride vehicle 20 may receive customized content that is different from that received by the other passengers in the ride vehicle 20 . For example, in a ride that includes character content, certain passengers wearing particular eyewear 34 may be associated with particular characters. In such embodiments, the overlaid AR/VR image may be associated with the particular character.
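The identity-based content selection described above might be sketched as a pair of lookups from RFID tag to character to stored overlay asset; all ids and mappings below are hypothetical:

```python
def content_for_tag(tag_id, assignments, library, default="generic_scene"):
    """Choose the overlay content for one eyewear unit.
    assignments: RFID tag id -> character name associated with that eyewear.
    library: character name -> overlay asset id held in memory."""
    character = assignments.get(tag_id)
    return library.get(character, default)

# Hypothetical per-eyewear character assignments and stored overlays.
assignments = {"tag_01": "wizard", "tag_02": "pirate"}
library = {"wizard": "overlay_wizard_45", "pirate": "overlay_pirate_45"}
```

Falling back to a default asset keeps the ride functional for eyewear whose tag is unregistered or unreadable.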
- the ride passengers (e.g., ride passengers 22 , 24 , 26 , 28 ) may have individualized interactive content displayed via the eyewear 34 that is based on previous park experiences, rewards, characters, passenger age or interests, passenger profile information acquired from a central server, etc.
- a guest in an interactive arena may see a particular overlaid image displayed only if they successfully perform a physical action (e.g., punch a block or open a door).
- the illumination map generated by the graphics generation system 32 may allow the graphics generation system 32 to include one or more detection and/or trigger points (e.g., trigger point for which to introduce the AR/VR images 45 ) at every mile of the tracks 18 , every yard of the tracks 18 , every foot of the tracks 18 , every inch of the tracks 18 , every centimeter of the tracks 18 , or every millimeter of the tracks 18 .
- the graphics generation system 32 may detect when to begin rendering of the AR/VR graphical images 45 based on position or location, distance traveled, and/or time elapsed during a cycle of the thrill ride 12 with sufficient accuracy and efficiency.
- certain images 54 , 56 illustrate that one or more of the AR/VR graphical images 45 may appear to the ride passengers 22 , 24 , 26 , 28 as interacting with each other (e.g., overlapping or touching).
- the AR/VR image 58 illustrates an example of AR/VR graphical images 45 that may appear outside the line of sight or the point of view (e.g., blind spot) of the ride passengers 22 , 24 , 26 , 28 , but that may nevertheless be perceived by the ride passengers 22 , 24 , 26 , 28 should any of them look in the direction of the AR/VR image 58 .
- completely different images may also be provided to different ride passengers 22 , 24 , 26 , 28 such that one or more of the ride passengers 22 , 24 , 26 , 28 have partially or completely different ride experiences or even ride themes.
- because the graphics generation system 32 may render the real-world images 44 and the AR/VR images 45 to each of the respective displays 37 and 38 of the stereoscopic AR/VR eyewear 34 worn by each of the respective ride passengers 22 , 24 , 26 , and 28 , the ride passengers 22 , 24 , 26 , 28 may each perceive the real-world images 44 (e.g., facilities 14 , thrill ride 12 , and so forth) and the AR/VR images 45 (e.g., AR/VR images or virtual augmentations 49 , 50 , 52 , 54 , 56 , and 58 ) temporally and spatially commensurate with their respective points of view, thus creating a photorealistic effect as the passenger ride vehicle 20 traverses the tracks 18 .
- the graphics generation system 32 may also trigger one or more sound effects, haptic feedback effects, scented effects, and so forth that may coincide with the appearances of the AR/VR images 45 on the stereoscopic AR/VR eyewear 34 .
- the graphics generation system 32 is integral with the stereoscopic AR/VR eyewear 34 .
- the stereoscopic AR/VR eyewear 34 and the graphics generation system 32 may enhance the thrill factor of the thrill ride 12 , and, by extension, the experience of the ride passengers 22 , 24 , 26 , 28 while on the thrill ride 12 .
- the ride passengers 22 , 24 , 26 , 28 may be provided with greater freedom of movement, as well as a more photorealistic experience.
- each of the ride passengers 22 , 24 , 26 , 28 may be able to see each of the other ride passengers 22 , 24 , 26 , 28 , as well as the passenger ride vehicle 20 itself, even when wearing the stereoscopic AR/VR eyewear 34 .
- because the stereoscopic AR/VR eyewear 34 may include individual cameras 40 , 42 and individual displays 37 , 38 , data with respect to the respective points of view of each eye of the ride passengers 22 , 24 , 26 , 28 may be captured by the stereoscopic AR/VR eyewear 34 .
- the graphics generation system 32 may render real-world images 44 and AR/VR images 45 on the displays 37 , 38 of the stereoscopic AR/VR eyewear 34 that are consistent with the respective points of view of the ride passengers 22 , 24 , 26 , 28 .
- Such advantages may be unavailable using devices such as traditional HMDs.
- the system 32 may use audio watermarking to synchronize AR content within the ride 12 , e.g., to synchronize played media to AR images.
- FIG. 5 is a flow diagram illustrating an embodiment of a process 80 useful in creating a stereoscopic experience, an AR experience, a VR experience, and/or other computer-mediated experience during a thrill ride using, for example, the computer graphics generation system 32 depicted in FIG. 2 .
- the process 80 may be representative of initiated code or instructions stored in a non-transitory computer-readable medium (e.g., the memory 47 ) and executed, for example, by the processor 46 included in the computer graphics generation system 32 .
- the process 80 may begin with the processor 46 receiving (block 82 ) position information for a user wearing the eyewear 34 . As discussed, the eyewear position may be assessed by RFID tags on each device, by cameras, GPS, etc.
- the system 32 may determine that the user wearing the eyewear 34 is positioned in the proximity of a desired stereoscopic event (block 84 ). Accordingly, the system 32 may initiate or maintain projection of stereoscopic images for display and viewing by the user (block 86 ).
- the method 80 may receive updated position information (block 88 ) to reflect that the user has moved to a new location associated with a desired mixed or AR/VR effect (block 90 ).
- the method may access pre-scanned image data or receive real-time captured image data (block 92 ).
- the processor 46 may receive real-time video data (e.g., live video) captured via cameras 40 , 42 of the stereoscopic AR/VR eyewear 34 .
- the process 80 may then continue with the processor 46 generating a visualization of the real-world environment based on the real-world image data.
- the processor 46 may generate a video data stream of the real-world environment (e.g., the amusement park 10 ) to be displayed on the displays 37 , 38 of the stereoscopic AR/VR eyewear 34 .
- the process 80 may then continue with the processor 46 overlaying (block 92 ) or superimposing one or more augmented or virtual reality images onto the generated visualization of the real-world environment.
- the processor 46 may generate a video data stream of the real-world images 44 (e.g., facilities 14 , thrill ride 12 ), and overlay or superimpose the AR/VR images 45 (e.g., AR/VR images or virtual augmentations 49 , 50 , 52 , 54 , 56 , and 58 ) onto the real-world images 44 using one or more video merging and/or optical merging techniques.
- the processor 46 of the graphics generation system 32 may render the AR/VR graphical images 45 based on, for example, the position or location of the passenger ride vehicle 20 along the tracks 18 at any given time during a cycle of the thrill ride 12 , a predetermined distance traveled by the passenger ride vehicle 20 during a cycle of the thrill ride 12 , or after a predetermined lapse of time.
- the graphics generation system 32 may perform one or more geometric or photometric recognition algorithms on the video or image data captured via the cameras 40 , 42 to determine the points of location of the ride vehicle 20 and when to introduce the AR/VR graphical images 45 .
- the process 80 may then conclude with the processor 46 transmitting (block 94 ) the overlaid augmented or virtual reality image data (e.g., AR/VR images 45 ) along with the real-world environment data (e.g., real-world images 44 ) to be displayed on the displays 37 , 38 of the stereoscopic AR/VR eyewear 34 to enhance the thrill factor of the thrill ride 12 , and, by extension, the experience of the ride passengers 22 , 24 , 26 , 28 while on the thrill ride 12 .
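The steps of process 80 can be sketched as one position-driven pass of a pipeline. The zone names and the callables standing in for capture, overlay, and transmission are hypothetical placeholders for the blocks described above:

```python
def process_frame(position, stereo_zones, ar_zones, capture, overlay, transmit):
    """One pass of a process-80 style pipeline (hypothetical callables):
    decide the effect for the user's position, then build and send the frame."""
    if position in stereo_zones:
        # Blocks 84/86: near a stereoscopic event, project external imagery.
        return "project_stereoscopic"
    if position in ar_zones:
        frame = capture()       # block 92: pre-scanned or live image data
        frame = overlay(frame)  # superimpose AR/VR images on the visualization
        transmit(frame)         # block 94: send to the eyewear displays
        return "ar_frame_sent"
    return "pass_through"
```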
- the system 32 is configured to permit the eyewear 34 to switch between different viewing modes, e.g., AR/VR, stereoscopic, and real world (e.g., no effects).
- the switch may be based on the time or position of the user within the ride 12 and may be mediated by a control signal from the system 32 .
- the system 32 may also receive user input, e.g., via an input button or switch on the eyewear. For example, certain users may be sensitive to stereoscopic image display. Such users may have the option of turning off the 3D stereoscopic viewing and the system 32 may provide alternative video data in the proximity of stereoscopic effects.
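The mode-switching behavior, including the opt-out for users sensitive to stereoscopic display, might be resolved as follows. The mode names are illustrative labels for the viewing modes described above:

```python
def select_mode(requested_mode, user_opt_out_3d):
    """Resolve the eyewear viewing mode from a ride control signal, honoring
    a user who has switched off stereoscopic display on the device.
    Illustrative modes: 'ar_vr', 'stereoscopic', 'real_world'."""
    if requested_mode == "stereoscopic" and user_opt_out_3d:
        # Sensitive users receive substitute flat video near stereoscopic effects.
        return "alternative_video"
    return requested_mode
```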
- each ride passenger may be provided with eyewear (e.g., stereoscopic AR/VR eyewear 34 that is configured to be used as AR/VR eyewear) to be worn during a cycle of the thrill ride.
- the eyewear is both AR/VR capable as well as being capable of facilitating the viewing of projected stereoscopic images.
- the eyewear may be configured to display virtual images overlaid over a real-world representation.
- the eyewear may include at least two cameras, which may respectively correspond to the respective points of view of the ride passengers, and may be used to capture real-time video data (e.g., live video) of the real-world environment (e.g., the physical amusement park) of the ride passengers and/or the thrill ride.
- the eyewear may also include at least two displays respectively corresponding to each eye of the ride passengers.
- a computer graphics generation system may also be provided. The computer graphics generation system may render a video stream of the real-world environment along with various AR/VR graphical images to the respective displays of the respective stereoscopic eyewear of the ride passengers during a cycle of the thrill ride.
- the graphics generation system 32 may render the AR/VR graphical images to the eyewear based on, for example, the position or location of the passenger ride vehicle along the tracks at any given time during a cycle of the thrill ride, a predetermined distance traveled by the passenger ride vehicle during a cycle of the thrill ride, or after a predetermined lapse of time.
- the eyewear and the computer graphics generation system may enhance the thrill factor of the thrill ride, and, by extension, may enhance the experience of the ride passengers as they ride the thrill ride.
Abstract
Description
- The present application claims the benefit of U.S. Provisional Application No. 62/332,299, entitled “SYSTEMS AND METHODS FOR GENERATING STEREOSCOPIC, AUGMENTED, AND VIRTUAL REALITY IMAGES” and filed May 5, 2016, the disclosure of which is incorporated herein by reference for all purposes.
- The subject matter disclosed herein relates to amusement park attractions, and more specifically, to providing enhanced thrill factors and components of interest in amusement park attractions.
- Amusement parks and/or theme parks may include various entertainment attractions, restaurants, and rides useful in providing enjoyment to patrons (e.g., families and/or people of all ages) of the amusement park. For example, the attractions may include traditional rides for kids such as carousels, as well as traditional rides for thrill seekers such as rollercoasters. It is now recognized that adding components of interest and thrill factors to such attractions can be difficult and limiting. Traditionally, for example, outside of providing an increasingly complex system of steep, twisting, and winding rollercoaster tracks, the thrill factor of such rollercoasters and/or other similar thrill rides may be limited to the existing course or physical nature of the thrill ride itself. It is now recognized that it is desirable to include components of interest and thrill factors in such attractions in a flexible and efficient manner relative to traditional techniques.
- Certain embodiments commensurate in scope with the present disclosure are summarized below. These embodiments are not intended to limit the scope of the disclosure, but rather these embodiments are intended only to provide a brief summary of possible forms of present embodiments. Indeed, present embodiments may encompass a variety of forms that may be similar to or different from the embodiments set forth below.
- In one embodiment, a ride system includes eyewear configured to be worn by the user, wherein the eyewear comprises a display having a stereoscopic feature configured to permit viewing of externally generated stereoscopically displayed images. The ride system includes a computer graphics generation system communicatively coupled to the eyewear, and configured to generate streaming media of a real world environment based on image data captured via the camera of the eyewear, generate one or more virtual augmentations superimposed on the streaming media of the real world environment, transmit the streaming media of the real world environment along with the one or more superimposed virtual augmentations to be displayed on the display of the eyewear, and project stereoscopic images into the real world environment.
- In a second embodiment, a wearable electronic device includes a frame comprising a frame front; a left eye display lens and a right eye display lens coupled to the frame front; a first filter on the left eye display lens; a second filter on the right eye display lens, wherein the first filter is different than the second filter; and processing circuitry configured to: receive a signal from the computer graphics generation system, wherein the signal comprises a video stream of a virtualization of a real world environment along with at least one augmented reality (AR) image or at least one virtual reality (VR) image included in the video stream; and cause the left eye display and the right eye display to display the video stream.
- In a third embodiment, a method includes receiving or accessing environmental image data via a computer graphics generation system, generating a virtualization of a real world environment of the amusement park based on the environmental image data; overlaying an augmented reality (AR) image or a virtual reality (VR) image onto the virtualization of the real world environment; transmitting the overlaid AR image or the VR image along with the virtualization of the real world environment to the eyewear during the cycle of the amusement park ride; transmitting a signal to the eyewear to permit viewing through displays of the eyewear; projecting stereoscopic images onto a surface of the real-world environment after transmitting the signal; and causing the stereoscopic images to be reflected through filters in the eyewear into a left and right eye of a user to generate an illusion of a 3D image.
- These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
- FIG. 1 illustrates an embodiment of an amusement park including one or more attractions in accordance with the present embodiments;
- FIG. 2 is an illustration of an embodiment of stereoscopic augmented reality (AR) or virtual reality (VR) eyewear and a computer graphics generation system in accordance with present embodiments;
- FIG. 3 is an illustration of an embodiment of stereoscopic augmented reality (AR) or virtual reality (VR) eyewear;
- FIG. 4 is a perspective view of a thrill ride of FIG. 1 including various AR and VR images provided by way of the stereoscopic AR/VR eyewear of FIG. 2 , in accordance with present embodiments; and
- FIG. 5 is a flowchart illustrating an embodiment of a process useful in creating stereoscopic images within an AR experience, a VR experience, or a mixed reality experience during a ride by using the computer graphics generation system of FIG. 2 , in accordance with present embodiments.
- One or more specific embodiments of the present disclosure will be described below. In an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
- Present embodiments relate to systems and methods of providing a stereoscopic mixed or augmented reality (AR) experience, a virtual reality (VR) experience, or a combination thereof, as part of an attraction, such as a thrill ride, in an amusement park or theme park. In certain embodiments, each ride passenger (e.g., first passenger, second passenger, etc.) may be provided eyewear, such as a pair of electronic goggles or eyeglasses to be worn during a cycle of the thrill ride. The eyewear may facilitate an AR experience, a VR experience, or a combination of both experiences. Thus, the eyewear may be referred to as stereoscopic eyewear or stereoscopic AR/VR eyewear. The stereoscopic AR/VR eyewear provides the capability of viewing stereoscopic images, which generate the illusion of 3D images. In addition, the stereoscopic AR/VR eyewear is configured for displaying augmented or virtual reality images overlaid on an image of the user's real-world environment, which generates the illusion that the overlaid image is part of the real world environment. Accordingly, the stereoscopic AR/VR eyewear is implemented with display lenses that are capable of displaying the overlaid AR/VR images transmitted from a central controller as well as being capable of permitting the user to view the real-world environment, including any stereoscopically displayed images. For example, the display lenses may be implemented with a polarizing layer, active shutters, color shifting capability, or other technology that permits stereoscopic viewing and that is compatible with the AR/VR capability of the eyewear. In this manner, a single eyewear device may be used within an environment to render a variety of different types of visual experiences. At the base level, the eyewear may also permit unaided, non-stereoscopic, or unaugmented viewing in certain instances, e.g., at the start of a theme park ride to permit the users to acclimatize themselves to the environment.
- The stereoscopic AR/VR eyewear is capable of acting as a display for images that are created to reflect the real-world environment with augmented images. In such embodiments, the users view a displayed image that is displayed on the lenses of the eyewear in a manner that create the illusion that the augmented image is the real-world environment viewed in real time. The images of the real-world environment may be recorded ahead of time, e.g., may be stored in a memory of the system, or, in certain embodiments, may be collected in real-time by a user. Specifically, in one embodiment, the eyewear includes at least two cameras, which may respectively correspond to the respective points of view (e.g., right and left eye views) of the ride passengers, and may be used to capture real-time video data (e.g., video captured during live use and transmitted in substantially real-time) of the real-world environment (e.g., aspects of the physical amusement park) of the ride passengers and/or the thrill ride. The eyewear may also include a display. For example, the eyewear may include at least two displays respectively corresponding to each eye of a ride passenger using the eyewear.
- In certain embodiments, a computer graphics generation system may also be provided. The computer graphics generation system may receive the real-time video data (e.g., live video that is transmitted in substantially real-time) from the eyewear, and may render a video stream of the real-world environment along with various AR, VR, or combined AR and VR (AR/VR) graphical images to the respective displays of the respective eyewear of the ride passengers during a cycle of the ride. For example, in one embodiment, the computer graphics generation system may render the AR/VR graphical images to the eyewear based on, for example, the position or location of a ride passenger vehicle along the tracks of a rollercoaster during a cycle of a thrill ride, a predetermined distance traveled by the passenger ride vehicle during a cycle of the thrill ride, or after a predetermined lapse of time in the cycle of the thrill ride. In this way, by using the eyewear and the graphics generation system to create an AR experience, a VR experience, or mixed reality experience, the eyewear and the computer graphics generation system may enhance the thrill factor of the thrill ride, and, by extension, may enhance the experience of the ride passengers as they ride the thrill ride. However, it should be appreciated that the techniques described herein may not be limited to thrill rides and/or amusement park attraction applications, but may also be extended to any of various applications such as, for example, medical applications (e.g., image-guided surgery, noninvasive imaging analysis), engineering design applications (e.g., engineering model development), manufacturing, construction, and maintenance applications (e.g., products manufacturing, new building construction, automobile repairs), academic and/or vocational training applications, exercise applications (e.g., bodybuilding and weight loss models), television (TV) applications (e.g., weather and news), and the like.
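The position-, distance-, and time-based triggering described above can be sketched as a simple lookup against a schedule of trigger points along the track. The trigger distances and image names below are hypothetical placeholders, not values from the disclosure:

```python
# Hypothetical schedule: (trigger distance along the track in meters,
# AR/VR image to introduce at that point).
TRIGGERS = [
    (120.0, "fictional_character_50"),
    (340.0, "track_breach_52"),
    (410.0, "second_mall_49"),
]

def images_to_introduce(prev_distance, curr_distance, triggers=TRIGGERS):
    """Return the AR/VR images whose trigger points the ride vehicle
    crossed between two successive position updates."""
    return [image for point, image in triggers
            if prev_distance < point <= curr_distance]
```

An elapsed-time variant would look identical with timestamps in place of track distances; the disclosure contemplates either basis for the trigger.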
- With the foregoing in mind, it may be useful to describe an embodiment of an amusement park, such as an
example amusement park 10 as depicted in FIG. 1. As illustrated, the amusement park 10 may include a thrill ride 12, a mall of amusement park facilities 14 (e.g., restaurants, souvenir shops, and so forth), and additional amusement attractions 16 (e.g., Ferris Wheel, dark ride, or other attraction). In certain embodiments, the thrill ride 12 may include a rollercoaster or other similar thrill ride, and may thus further include a closed-loop track or a system of closed-loop tracks 18 (e.g., miles of tracks 18). The tracks 18 may be provided as an infrastructure on which a passenger ride vehicle 20 may traverse, for example, as ride passengers 22, 24, 26, 28 ride the thrill ride 12. The tracks 18 may thus define the motion of the ride vehicle 20. However, in another embodiment, for example, the tracks 18 may be replaced by a controlled path, in which the movement of the ride vehicle 20 may be controlled via an electronic system, a magnetic system, or other similar system infrastructure other than the tracks 18. It should be appreciated that while the passenger ride vehicle 20 may be illustrated as a 4-passenger vehicle, in other embodiments, the passenger ride vehicle 20 may include any number of passenger spaces (e.g., 1, 2, 4, 8, 10, 20, or more spaces) to accommodate a single or multiple groups of ride passengers 22, 24, 26, 28. It should be understood that, while the thrill ride 12 is described in the context of the ride vehicle 20, other embodiments are contemplated (e.g., a seated theater environment, a walking or free movement arena environment, etc.) and may be used in conjunction with the disclosed embodiments. - As the
passenger ride vehicle 20 traverses the tracks 18, the ride passengers 22, 24, 26, 28 may be provided a moving tour of the scenery (e.g., facilities 14, additional amusement attractions 16, and so forth) in an area around or nearby the thrill ride 12. For example, this may include the environment surrounding the thrill ride 12 (e.g., a building that fully or partially houses the thrill ride 12). While the ride passengers 22, 24, 26, 28 may find the thrill ride 12 to be a very enjoyable experience, in certain embodiments, it may be useful to enhance their experience as they ride the thrill ride 12 by enhancing, for example, the thrill factor of the thrill ride 12. Specifically, instead of having a physical view of only the facilities 14 (e.g., restaurants, souvenir shops, and so forth), additional amusement attractions 16 (e.g., Ferris Wheel or other attractions), or other patrons or pedestrians within the amusement park 10, it may be useful to provide the ride passengers 22, 24, 26, 28 with an augmented reality (AR) experience or a virtual reality (VR) experience as the ride vehicle 20 traverses the tracks 18. - For example, turning now to
FIG. 2, each of the ride passengers 22, 24, 26, 28 may be provided a pair of stereoscopic AR/VR eyewear 34, which may, in certain embodiments, include eyeglasses. In other embodiments, the stereoscopic AR/VR eyewear 34 may be included as part of a helmet, a visor, a headband, a pair of blinders, one or more eyepatches, and/or other headwear or eyewear that may be worn by each of the ride passengers 22, 24, 26, 28. As depicted, the stereoscopic AR/VR eyewear 34 may be communicatively coupled to a computer graphics generation system 32 (e.g., within the amusement park 10) via a wireless network 48 (e.g., wireless local area networks [WLAN], wireless wide area networks [WWAN], near field communication [NFC]). The stereoscopic AR/VR eyewear 34 may be used to create a surreal environment 30, which may include an AR experience, a VR experience, a mixed reality experience, a combination of AR and VR experiences, a computer-mediated reality experience, a combination thereof, or other similar surreal environment for the ride passengers 22, 24, 26, 28 as they ride the thrill ride 12. Specifically, the stereoscopic AR/VR eyewear 34 may be worn by the ride passengers 22, 24, 26, 28 throughout the duration of the ride, such that the ride passengers 22, 24, 26, 28 may feel completely encompassed by the environment 30 and may perceive the environment 30 to be a real-world physical environment. Specifically, as will be further appreciated, the environment 30 may be a real-time video including real-world images 44 that the ride passengers 22, 24, 26, 28 would see, even when not wearing the stereoscopic AR/VR eyewear 34, electronically merged with one or more AR or VR images 45 (e.g., virtual augmentations). The term "real-time" indicates that the images are obtained and/or provided in a timeframe substantially close to the time of actual observation. In alternative embodiments, the obtained images may be historical images of the environment.
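The electronic merging of the real-world images 44 with the AR/VR images 45 can be sketched as a per-pixel alpha blend of an overlay onto the captured camera frame. This is a minimal illustration assuming 8-bit RGB pixels; the function name is invented, and the disclosure's actual merging technique is not specified at this level of detail:

```python
def composite(camera_pixel, overlay_pixel, alpha):
    """Blend one AR/VR overlay pixel onto the corresponding real-world
    camera pixel; alpha=0 leaves the real world untouched, alpha=1
    fully replaces it with the augmentation."""
    return tuple(
        round(alpha * o + (1.0 - alpha) * c)
        for c, o in zip(camera_pixel, overlay_pixel)
    )

# e.g., a half-transparent red augmentation over a mid-grey pixel
blended = composite((128, 128, 128), (255, 0, 0), 0.5)
```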
- In certain embodiments, the stereoscopic AR/
VR eyewear 34 may be any of various wearable electronic devices that may be useful in creating an AR experience, a VR experience, and/or other computer-mediated experience to enhance the thrill factor of the thrill ride 12, and, by extension, the experience of the ride passengers 22, 24, 26, 28 while on the thrill ride 12. It should be appreciated that the eyeglasses embodiment of the stereoscopic AR/VR eyewear 34 as discussed herein may be distinct from, and may provide many advantages over, traditional devices such as head-mounted displays (HMDs) and/or heads-up displays (HUDs). For example, as will be further appreciated, the stereoscopic AR/VR eyewear 34 may include a number of orientation and position sensors (e.g., accelerometers, magnetometers, gyroscopes, Global Positioning System [GPS] receivers) that may be used to track the position, orientation, and motion of the ride passengers 22, 24, 26, 28 during a cycle of the thrill ride 12. - Similarly, features of the stereoscopic AR/VR eyewear 34 (e.g., geometric aspects or markings) may be monitored by a monitoring system (e.g., one or more cameras) to determine position, location, orientation, and so forth of the stereoscopic AR/
VR eyewear 34 and, in turn, that of the wearer. Still, the ride passengers 22, 24, 26, 28 may be monitored by a monitoring system 33 (e.g., a camera), which may be communicatively coupled to the computer graphics generation system 32 and used to identify position, location, orientation, and so forth of the ride passengers 22, 24, 26, 28. The ride vehicle 20 may also include one or more sensors (e.g., weight sensors, mass sensors, motion sensors, ultrasonic sensors) that may be useful in monitoring the respective ride passengers 22, 24, 26, 28 for the graphics generation system 32 to determine the point of view of the respective ride passengers 22, 24, 26, 28. Moreover, as will be further appreciated, because the stereoscopic AR/VR eyewear 34 may include individual cameras (e.g., cameras 40 and 42) and individual displays (e.g., displays 37 and 38), data with respect to the respective points of view of each eye of the ride passengers 22, 24, 26, 28 may be captured by the stereoscopic AR/VR eyewear 34. All of these advantages may be unavailable using devices such as traditional HMDs and/or HUDs. - In certain embodiments, to support the creation of the
environment 30, the stereoscopic AR/VR eyewear 34 may include processing circuitry, such as a processor 35 and a memory 36. The processor 35 may be operatively coupled to the memory 36 to execute instructions for carrying out the presently disclosed techniques of generating real-world images 44 merged with one or more AR/VR images 45 to enhance the thrill factor of the thrill ride 12, and, by extension, the experience of the ride passengers 22, 24, 26, 28 while on the thrill ride 12. These instructions may be encoded in programs or code stored in a tangible non-transitory computer-readable medium, such as the memory 36 and/or other storage. The processor 35 may be a general-purpose processor, a system-on-chip (SoC) device, an application-specific integrated circuit (ASIC), or some other similar processor configuration. In alternative embodiments, the processor 35 and the memory 36 may be provided as an auxiliary pack carried by the user (e.g., clipped at the waist or carried in a pocket), either wired to or in wireless communication with the stereoscopic AR/VR eyewear 34. In other embodiments, the stereoscopic AR/VR eyewear 34 communicates wirelessly with the computer graphics generation system 32 and does not perform on-board image processing. - In certain embodiments, as further illustrated, the stereoscopic AR/
VR eyewear 34 may also include the pair of displays 37 and 38 (e.g., which may be provided in the frame front 39 of the stereoscopic AR/VR eyewear 34 where eyeglass lenses would otherwise appear) respectively corresponding to each eye of the ride passengers 22, 24, 26, 28. In other embodiments, a unified display may be employed. The respective displays 37 and 38 may each include a display that covers at least part or only some of the viewing surface. The displays 37, 38 may be an opaque liquid crystal display (LCD), an opaque organic light emitting diode (OLED) display, or other similar display useful in displaying the real-world images 44 and the AR/VR graphical images 45 to the ride passengers 22, 24, 26, 28. In another embodiment, the respective displays 37 and 38 may each include a see-through LCD or a see-through OLED display useful in allowing, for example, the ride passengers 22, 24, 26, 28 to view the real-world images 44 and the AR/VR graphical images 45 appearing on the respective displays 37 and 38 while preserving the ability to see through the respective displays 37 and 38 to the actual and physical real-world environment (e.g., the amusement park 10). In yet another embodiment, the respective displays 37 and 38 permit viewing of stereoscopic images 43. The displays 37, 38 may also include light field displays. In certain embodiments, the displays 37, 38 may toggle between opaque and transparent configurations, depending on the desired visual environment. - The
cameras 40 and 42 may respectively correspond to the points of view of the ride passengers 22, 24, 26, 28, and may be used to capture real-time video data (e.g., live video) of the real-world environment. In some embodiments, a single camera may be employed. Specifically, in the illustrated embodiment, the cameras 40, 42 of the stereoscopic AR/VR eyewear 34 may be used to capture real-time images of the real-world physical environment (e.g., the physical amusement park 10) perceived by the respective ride passengers 22, 24, 26, 28 from their respective points of view. As will be further appreciated, the stereoscopic AR/VR eyewear 34 may then transmit (e.g., wirelessly via one or more communications interfaces included in the stereoscopic AR/VR eyewear 34) real-time video data captured via the respective cameras 40 and 42 to a computer graphics generation system 32 for processing. However, in other embodiments, the real-time video data captured via the respective cameras 40 and 42 may be processed on the stereoscopic AR/VR eyewear 34 via the processor 35. Additionally, the stereoscopic AR/VR eyewear 34 may also transmit orientation data, position data, point of view data (e.g., focal length, orientation, pose, and so forth), motion tracking data, and so forth obtained and/or derived based on data obtained via orientation and position sensors (e.g., accelerometers, magnetometers, gyroscopes, Global Positioning System [GPS] receivers, and so forth), motion tracking sensors (e.g., electromagnetic and solid-state motion tracking sensors), and so forth, that may be included in the stereoscopic AR/VR eyewear 34. Further, in embodiments in which the real-world image data of the environment (e.g., the ride 12) is previously acquired and accessed, the stereoscopic AR/VR eyewear 34 may be implemented without the cameras 40 and 42. - In certain embodiments, as previously noted, the computer
graphics generation system 32, which may also include processing circuitry, such as a processor 46 (e.g., general purpose processor or other processor) and a memory 47, may process the real-time video data (e.g., live video) and orientation and position data and/or point of view data received from the stereoscopic AR/VR eyewear 34 or the monitoring system 33. Specifically, the computer graphics generation system 32 may use this data to generate a frame of reference to register the real-time video data with the generated real-world images 44 and the AR/VR graphical images 45. Specifically, using the frame of reference generated based on the orientation data, position data, point of view data, motion tracking data, and so forth, the graphics generation system 32 may then render a view of the real-world images 44 that is temporally and spatially commensurate with what the respective ride passengers 22, 24, 26, 28 would perceive if not wearing the stereoscopic AR/VR eyewear 34. The graphics generation system 32 may constantly update (e.g., in real-time) the rendering of the real-world images to reflect changes in the respective orientation, position, and/or motion of the respective ride passengers 22, 24, 26, 28. - For example, in certain embodiments, the
graphics generation system 32 may render images (e.g., real-world images 44 and AR/VR images 45) at a real-time rate greater than or equal to approximately 20 frames per second (FPS), greater than or equal to approximately 30 FPS, greater than or equal to approximately 40 FPS, greater than or equal to approximately 50 FPS, greater than or equal to approximately 60 FPS, greater than or equal to approximately 90 FPS, or greater than or equal to approximately 120 FPS. Furthermore, the graphics generation system 32 may generate the real-world images 44 for each of the respective stereoscopic AR/VR eyewear 34 worn by the respective ride passengers 22, 24, 26, 28 (e.g., adjusted for the respective orientation, position, and point of view of the respective ride passengers 22, 24, 26, and 28). - In certain embodiments, as previously discussed, the computer
graphics generation system 32 may also generate and render one or more AR/VR graphical images 45 superimposed on the real-world images 44 to create a complete AR experience, VR experience, mixed reality experience, and/or other computer-mediated experience for the ride passengers 22, 24, 26, 28. For example, in certain embodiments, the computer graphics generation system 32 may utilize one or more of the discussed video merging and/or optical merging techniques to superimpose the AR/VR graphical images 45 onto the real-world images 44, such that the ride passengers 22, 24, 26, 28 perceive the real-world physical environment of the amusement park 10 (e.g., provided as rendered video data via the respective displays 37 and 38) along with an AR/VR graphical image 45 (e.g., virtual augmentations) as the passenger ride vehicle 20 traverses the tracks 18. Specifically, as discussed above with respect to the rendering of the real-world images 44, the graphics generation system 32 may render a view of the AR/VR graphical images 45 that is temporally and spatially commensurate with the real-world images 44, such that the real-world images 44 may appear as a background overlaid with the AR/VR graphical images 45. Indeed, a model may provide computer generated images for any available viewpoint, and specific images may be provided to the stereoscopic AR/VR eyewear 34 for display based on a detected orientation of the stereoscopic AR/VR eyewear 34. - In certain embodiments, the
graphics generation system 32 may also generate one or more brightness, lighting, or shading models, and/or other photorealistic rendering models to adjust the real-world images 44 and the AR/VR graphical images 45 to accurately reflect the contrast and brightness of the real-world physical environment (e.g., sunny day, partly cloudy day, cloudy day, evening, night). For example, to increase the photorealism of the real-world images 44 and the AR/VR graphical images 45, the graphics generation system 32 may, in some embodiments, receive weather-related data from one or more weather forecast and/or prediction systems (e.g., Global Forecast System, Doppler radars, and so forth). The graphics generation system 32 may then use the weather-related data or other similar data to adjust the contrast, brightness, and/or other lighting effects of the real-world images 44 and/or the AR/VR graphical images 45. - In other embodiments, the
graphics generation system 32 may adjust the contrast, brightness, and/or other lighting effects of the real-world images 44 and/or the AR/VR graphical images 45 based on lighting detected from one or more light sensors included in the stereoscopic AR/VR eyewear 34 or based on the real-time video data captured by the cameras 40, 42. Furthermore, as previously noted, the graphics generation system 32 may constantly update (e.g., in real-time) the rendering of the AR/VR graphical images 45 to reflect changes in the respective orientations, positions, points of view, and/or motion of the respective ride passengers 22, 24, 26, 28. For example, as will be further appreciated with respect to FIG. 3, the graphics generation system 32 may render the AR/VR graphical images 45 on the respective displays 37 and 38 of each of the respective stereoscopic AR/VR eyewear 34 worn by the respective ride passengers 22, 24, 26, 28, adjusted for the variable respective positions, points of view, and motions of the respective ride passengers 22, 24, 26, and 28. - As will be further appreciated, the
graphics generation system 32 may also generate the AR/VR graphical images 45 at a time at which the passenger ride vehicle 20 crosses a predetermined point along the tracks 18. Thus, in certain embodiments, the graphics generation system 32 may use the received position data, point of view data, and motion data along with GPS data or geographical information systems (GIS) data to derive an illumination map of, for example, the thrill ride 12 and tracks 18, as well as the immediate environment surrounding the thrill ride 12, for the entire cycle of the thrill ride 12. The graphics generation system 32 may then use the map to introduce the AR/VR graphical images 45 at certain predetermined points (e.g., points based on location, distance, or time) as the passenger ride vehicle 20 traverses the tracks 18. Furthermore, in certain embodiments, the video or image data captured via the cameras 40, 42 may be used by the graphics generation system 32 to determine the points of location of the ride vehicle 20 and when to introduce the AR/VR graphical images 45. For example, the graphics generation system 32 may perform one or more geometric recognition algorithms (e.g., shape or object recognition) or photometric recognition algorithms (e.g., face recognition or specific object recognition) to determine the position or location of the ride vehicle 20 as well as the viewing position of the ride passengers 22, 24, 26, 28. -
FIG. 3 is an illustration of the stereoscopic AR/VR eyewear 34 showing an embodiment in which the stereoscopic AR/VR eyewear 34 includes features that also permit viewing of externally projected stereoscopic images. For example, the displays 37 and 38 may include a polarization feature, such as a polarized coating or layer, to permit the user to resolve stereoscopically projected images as being in 3D. The polarization feature may be coated on an outer surface 57 of the display 37 and an outer surface of the display 38. Alternatively, the polarization feature may be formed within, embedded in, or formed on an opposing surface of the displays 37 and 38. The polarization feature on the right eye display 37 has different polarization characteristics than the polarization feature on the left eye display 38 to permit each display 37 and 38 to act as a filtered lens that only permits polarized light having the appropriate characteristics to pass through. In this manner, two images projected superimposed onto a screen may be viewed stereoscopically. In certain embodiments, the polarization features in the respective displays may be linear polarization filters orthogonally oriented relative to one another. In another embodiment, the polarization filters of the respective displays 37 and 38 may be circular polarization filters of opposite handedness relative to one another. In another embodiment, the stereoscopic AR/VR eyewear 34 has color-shifting filters, such that the respective displays 37 and 38 include color filters that filter different wavelengths relative to one another. In a specific embodiment, the stereoscopic AR/VR eyewear 34 may be implemented with Inficolor 3D technology or with Infitec® 3D technology (Infitec GmbH, Baden Wuerttemberg, Germany).
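The channel separation performed by the orthogonal linear polarization filters described above follows Malus's law, I = I0 · cos²θ, where θ is the angle between the light's polarization and the filter axis. A small sketch (the function name is an assumption for illustration):

```python
import math

def transmitted_intensity(incident, filter_angle_deg, light_angle_deg):
    """Malus's law: intensity of linearly polarized light after passing
    a linear polarizing filter, I = I0 * cos^2(theta)."""
    theta = math.radians(filter_angle_deg - light_angle_deg)
    return incident * math.cos(theta) ** 2

# Orthogonal filters (0 degrees on the right lens, 90 degrees on the
# left) separate two superimposed projections polarized at 0 and 90:
right_sees_right = transmitted_intensity(1.0, 0, 0)   # essentially all passes
right_sees_left = transmitted_intensity(1.0, 0, 90)   # essentially all blocked
```

Circular polarization of opposite handedness achieves the same separation while tolerating head tilt, which is one reason it is often preferred in practice.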
VR eyewear 34 may have active stereoscopic capabilities, such as active shutters that cycle each 37 and 38 on and off alternately. It is contemplated that changing the shutter rates may be used to provide individualized content between different users. For example, a first user and a second user, both withdisplay respective eyewear 34, may have different assembled content if their active shutters are controlled at different rates. The control may be based on signals received from thesystem 32, including signals embedded within the displayed frames. In other embodiments, the shutter control may be preset on the device. Active stereoscopic implementations may be advantageous in darker rides, because the lack of color or polarizing filters may permit more light to pass through the 37 and 38 when they are acting as lenses for stereoscopic viewing. It should also be understood that when the stereoscopic AR/displays VR eyewear 34 is being used in the AR/VR mode, the 37 and 38 may be used to generate an internal 3D or stereoscopic image. That is, in certain embodiments, the user views a transmitted image or a video stream that may be implemented stereoscopically. For example, thedisplays left eye display 38 may display a separate video channel than theright eye display 37. Based on the perspective differences or slight differences in the displayed images or video stream between the left eye/righteye view, similar to those generated on projected stereoscopic images, a 3D illusion may be internally generated in the displayed content. -
FIG. 4 illustrates various examples of AR/VR images 45 that may be generated by the graphics generation system 32, or in other embodiments, that may be generated via the stereoscopic AR/VR eyewear 34. Specifically, as illustrated in FIG. 4, during a cycle of the thrill ride 12, the graphics generation system 32 may render stereoscopic images 43, the real-world images 44, as well as various AR/VR graphical images 45 through the respective stereoscopic AR/VR eyewear 34 (e.g., via the respective displays 37 and 38) of the ride passengers 22, 24, 26, 28. For rendering stereoscopic images, the graphics generation system 32 may be used in conjunction with stereoscopic projectors 53. The real-world images 44 may include rendered images of, for example, the tracks 18, the facilities 14, and/or other patrons or objects that the ride passengers 22, 24, 26, 28 would see while riding the thrill ride 12, including the other ride passengers 22, 24, 26, 28, even if the stereoscopic AR/VR eyewear 34 were not being worn by the ride passengers 22, 24, 26, 28. However, as previously discussed with respect to FIG. 2, in certain embodiments, it may be useful to enhance the thrill factor of the thrill ride 12 by rendering various AR/VR graphical images 45 to the respective displays 37 and 38 of the respective stereoscopic AR/VR eyewear 34 of the ride passengers 22, 24, 26, and 28. - For example, as further depicted in
FIG. 4, the graphics generation system 32 may render AR/VR graphical images 45 (illustrated via the dashed lines) that may include, for example, an AR/VR image of a second mall of amusement park facilities 49, an AR/VR image of one or more fictional characters 50, an AR/VR image of a breach 52 of the tracks 18, and/or additional AR/VR images 54, 56, and 58. In one embodiment, as illustrated in FIG. 4, the AR/VR image 50 may include an image of a monster or other similar fictional character appearing (e.g., from the point of view of the ride passengers 22, 24, 26, 28 while wearing the stereoscopic AR/VR eyewear 34) to be obstructing a portion of the tracks 18 as the passenger ride vehicle 20 traverses the tracks 18. It should be appreciated that, in addition to AR/VR graphical images 45 (e.g., virtual augmentations) that include an added image, the graphics generation system 32 may also render certain AR/VR graphical images 45 that include a deletion of one or more real-world physical objects, which no longer appear while the ride passengers 22, 24, 26, 28 are wearing the stereoscopic AR/VR eyewear 34. For example, the AR/VR image of the facilities 49 may appear at a place in which the attraction 16 is placed in the real-world environment. - As previously discussed, in certain embodiments, the
graphics generation system 32 may render the AR/VR graphical images 45 based on, for example, the position or location of the passenger ride vehicle 20 along the tracks 18 at any given time during a cycle of the thrill ride 12, a predetermined distance traveled by the passenger ride vehicle 20 during a cycle of the thrill ride 12, or after a predetermined lapse of time. For example, in one embodiment, once the passenger ride vehicle travels to a point 60 (e.g., defined by a certain distance 62 or location on the tracks 18), the AR/VR image of the fictional character 50 may appear to the ride passengers 22, 24, 26, 28, via the stereoscopic AR/VR eyewear 34, as obstructing a place on the tracks 18 not yet traversed by the passenger ride vehicle 20 during a given cycle of the thrill ride 12. Similarly, once the passenger ride vehicle 20 travels to a point 62 (e.g., defined by a certain distance 62 or location on the tracks 18), the AR/VR image of the breach 52 of the tracks 18 (e.g., appearance of a broken track) may appear to the ride passengers 22, 24, 26, 28, via the stereoscopic AR/VR eyewear 34, as though the passenger ride vehicle 20 will encounter a place in which there are no supporting tracks 18. The graphics generation system 32 may render the AR/VR graphical images 45 based on the identity of the individual users of the eyewear 34. Each eyewear 34 may be associated with an RFID tag or other identification element that transmits an identification signal to the graphics generation system 32. The system 32 may select the overlaid image from among several options stored in the memory 47 based on the identity of the ride passenger (e.g., ride passengers 22, 24, 26, 28). In this manner, each passenger in a ride vehicle 20 may receive customized content that is different from that received by the other passengers in the ride vehicle 20. For example, in a ride that includes character content, certain passengers wearing particular eyewear 34 may be associated with particular characters.
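The identity-based selection just described can be sketched as a lookup from the eyewear's transmitted tag ID to a stored content option. All tag IDs, profile fields, and overlay names here are invented for illustration and are not part of the disclosure:

```python
# Hypothetical content catalog and passenger profiles.
CONTENT_BY_CHARACTER = {
    "knight": "overlay_dragon",
    "wizard": "overlay_spellbook",
}

PROFILES = {
    "rfid-0001": {"character": "knight", "age": 12},
    "rfid-0002": {"character": "wizard", "age": 34},
}

def select_overlay(rfid_tag, default="overlay_generic"):
    """Choose the AR/VR overlay for a passenger based on the
    identification signal transmitted by their eyewear's RFID tag,
    falling back to generic content for unrecognized tags."""
    profile = PROFILES.get(rfid_tag)
    if profile is None:
        return default
    return CONTENT_BY_CHARACTER.get(profile["character"], default)
```

The same lookup could key off previous park experiences, rewards, or age instead of an assigned character; the profile dictionary above stands in for whatever a central server would supply.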
In such embodiments, the overlaid AR/VR image may be associated with the particular character. The ride passengers (e.g., ride passengers 22, 24, 26, 28) may have individualized interactive content displayed via the eyewear 34 that is based on previous park experiences, rewards, characters, passenger age or interests, passenger profile information acquired from a central server, etc. In one embodiment, a guest in an interactive arena may see a particular overlaid image displayed only if they successfully perform a physical action (e.g., punch a block or open a door). - Furthermore, in certain embodiments, the illumination map generated by the
graphics generation system 32 may allow the graphics generation system 32 to include one or more detection and/or trigger points (e.g., trigger points at which to introduce the AR/VR images 45) at every mile of the tracks 18, every yard of the tracks 18, every foot of the tracks 18, every inch of the tracks 18, every centimeter of the tracks 18, or every millimeter of the tracks 18. In this way, the graphics generation system 32 may detect when to begin rendering the AR/VR graphical images 45 based on position or location, distance traveled, and/or time elapsed during a cycle of the thrill ride 12 with sufficient accuracy and efficiency. Furthermore, certain images 54, 56 illustrate that one or more of the AR/VR graphical images 45 may appear to the ride passengers 22, 24, 26, 28 as interacting with each other (e.g., overlapping or touching). In one embodiment, the images (e.g., images 54A and 54B) may be stereoscopic images. Similarly, the AR/VR image 58 illustrates an example of AR/VR graphical images 45 that may appear outside the line of sight or the point of view (e.g., blind spot) of the ride passengers 22, 24, 26, 28, but that may nevertheless be perceived by the ride passengers 22, 24, 26, 28 should any of them look in the direction of the AR/VR image 58. It should be noted that completely different images may also be provided to different ride passengers 22, 24, 26, 28, such that one or more of the ride passengers 22, 24, 26, 28 have partially or completely different ride experiences or even ride themes. - In certain embodiments, as discussed above with respect to
FIG. 2 , because the graphics generation system 32 may render the real-world images 44 and the AR/VR images 45 to each of the respective displays 37 and 38 of the stereoscopic AR/VR eyewear 34 worn by each of the respective ride passengers 22, 24, 26, and 28, the ride passengers 22, 24, 26, 28 may each perceive the real-world images 44 (e.g., the facilities 14, the thrill ride 12, and so forth) and the AR/VR images 45 (e.g., the AR/VR images or virtual augmentations 49, 50, 52, 54, 56, and 58) temporally and spatially commensurate with their respective points of view, thus creating a photorealistic effect as the passenger ride vehicle 20 traverses the tracks 18. Furthermore, in other embodiments, in addition to the AR/VR images 45 (e.g., the AR/VR images or virtual augmentations 49, 50, 52, 54, 56, and 58), the graphics generation system 32 may also trigger one or more sound effects, haptic feedback effects, scented effects, and so forth that may coincide with the appearances of the AR/VR images 45 on the stereoscopic AR/VR eyewear 34. In some embodiments, the graphics generation system 32 is integral with the stereoscopic AR/VR eyewear 34. - In this way, by providing the stereoscopic AR/
VR eyewear 34 and the graphics generation system 32 to create an AR experience, a VR experience, and/or another computer-mediated reality experience, the stereoscopic AR/VR eyewear 34 and the graphics generation system 32 may enhance the thrill factor of the thrill ride 12 and, by extension, the experience of the ride passengers 22, 24, 26, 28 while on the thrill ride 12. Moreover, by providing the stereoscopic AR/VR eyewear 34 as AR/VR eyeglasses, as opposed to bulkier and more cumbersome devices such as traditional head-mounted displays (HMDs), the ride passengers 22, 24, 26, 28 may be provided with greater freedom of movement, as well as a more photorealistic experience. For example, each of the ride passengers 22, 24, 26, 28 may be able to see each other ride passenger 22, 24, 26, 28, as well as the passenger ride vehicle 20 itself, even when wearing the stereoscopic AR/VR eyewear 34. Moreover, because the stereoscopic AR/VR eyewear 34 may include individual cameras 40, 42 and individual displays 37, 38, data with respect to the respective points of view of each eye of the ride passengers 22, 24, 26, 28 may be captured by the stereoscopic AR/VR eyewear 34. Thus, the graphics generation system 32 may render real-world images 44 and AR/VR images 45 on the displays 37, 38 of the stereoscopic AR/VR eyewear 34 that are consistent with the respective points of view of the ride passengers 22, 24, 26, 28. Such advantages may be unavailable using devices such as traditional HMDs. In other embodiments, the system 32 may use audio watermarking to synchronize AR content within the ride 12, e.g., to synchronize played media to AR images. - Turning now to
FIG. 5 , a flow diagram is presented, illustrating an embodiment of a process 80 useful in creating a stereoscopic experience, an AR experience, a VR experience, and/or another computer-mediated experience during a thrill ride using, for example, the computer graphics generation system 32 depicted in FIG. 2 . The process 80 may be representative of initiated code or instructions stored in a non-transitory computer-readable medium (e.g., the memory 47) and executed, for example, by the processor 46 included in the computer graphics generation system 32. The process 80 may begin with the processor 46 receiving (block 82) position information for a user wearing the eyewear 34. As discussed, the eyewear position may be assessed by RFID tags on each device, by cameras, by GPS, and so forth. Based on the position, the system 32 may determine that the user wearing the eyewear 34 is positioned in the proximity of a desired stereoscopic event (block 84). Accordingly, the system 32 may initiate or maintain projection of stereoscopic images for display and viewing by the user (block 86). - If the user of the
eyewear 34 is a passenger on a ride vehicle (see FIG. 4 ) or otherwise moving relative to the environment, the process 80 may receive updated position information (block 88) to reflect that the user has moved to a new location associated with a desired mixed or AR/VR effect (block 90). To generate the AR/VR effect, the process 80 may access pre-scanned image data or receive real-time captured image data (block 92). For example, the processor 46 may receive real-time video data (e.g., live video) captured via the cameras 40, 42 of the stereoscopic AR/VR eyewear 34. The process 80 may then continue with the processor 46 generating a visualization of the real-world environment based on the real-world image data. For example, the processor 46 may generate a video data stream of the real-world environment (e.g., the amusement park 10) to be displayed on the displays 37, 38 of the stereoscopic AR/VR eyewear 34. - The process 80 may then continue with the
processor 46 overlaying (block 92) or superimposing one or more augmented or virtual reality images onto the generated visualization of the real-world environment. For example, the processor 46 may generate a video data stream of the real-world images 44 (e.g., the facilities 14, the thrill ride 12), and overlay or superimpose the AR/VR images 45 (e.g., the AR/VR images or virtual augmentations 49, 50, 52, 54, 56, and 58) onto the real-world images 44 using one or more video merging and/or optical merging techniques. As previously discussed, in certain embodiments, the processor 46 of the graphics generation system 32 may render the AR/VR graphical images 45 based on, for example, the position or location of the passenger ride vehicle 20 along the tracks 18 at any given time during a cycle of the thrill ride 12, a predetermined distance traveled by the passenger ride vehicle 20 during a cycle of the thrill ride 12, or after a predetermined lapse of time. In other embodiments, the graphics generation system 32 may perform one or more geometric or photometric recognition algorithms on the video or image data captured via the cameras 40, 42 to determine the points of location of the ride vehicle 20 and when to introduce the AR/VR graphical images 45. The process 80 may then conclude with the processor 46 transmitting (block 94) the overlaid augmented or virtual reality image data (e.g., the AR/VR images 45) along with the real-world environment data (e.g., the real-world images 44) to be displayed on the displays 37, 38 of the stereoscopic AR/VR eyewear 34 to enhance the thrill factor of the thrill ride 12 and, by extension, the experience of the ride passengers 22, 24, 26, 28 while on the thrill ride 12. The system 32 is configured to permit the eyewear 34 to switch between different viewing modes, e.g., AR/VR, stereoscopic, and real world (e.g., no effects).
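As an illustration of the overlay step, one simple video-merging technique of the kind mentioned above is per-pixel alpha blending of the AR/VR content onto the real-world frame. The sketch below is not from the patent; the frame representation and function names are assumptions for illustration only.

```python
# Illustrative sketch (assumed representation): superimposing AR/VR pixels
# onto a real-world frame by alpha blending. Frames are flat lists of
# (r, g, b) tuples; None in the overlay means "no AR/VR content here".
def alpha_blend(real_px, overlay_px, alpha):
    # Weighted mix of one overlay pixel onto one real-world pixel.
    return tuple(round(alpha * o + (1 - alpha) * r)
                 for r, o in zip(real_px, overlay_px))

def superimpose(real_frame, overlay_frame, alpha=0.6):
    """Blend the overlay onto the real-world frame pixel by pixel; pixels
    with no overlay content pass through untouched."""
    return [real_px if overlay_px is None
            else alpha_blend(real_px, overlay_px, alpha)
            for real_px, overlay_px in zip(real_frame, overlay_frame)]

real = [(100, 100, 100), (100, 100, 100)]
overlay = [None, (200, 0, 0)]  # AR content only on the second pixel
print(superimpose(real, overlay))  # [(100, 100, 100), (160, 40, 40)]
```

A production renderer would do this on the GPU per display 37, 38, but the pass-through-versus-blend decision is the essence of superimposing the AR/VR images 45 onto the real-world images 44.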
The switch may be based on the time or position of the user within the ride 12 and may be mediated by a control signal from the system 32. The system 32 may also receive user input, e.g., via an input button or switch on the eyewear. For example, certain users may be sensitive to stereoscopic image display. Such users may have the option of turning off the 3D stereoscopic viewing, and the system 32 may provide alternative video data in the proximity of stereoscopic effects. - Technical effects of the present embodiments relate to systems and methods of providing an augmented reality (AR) experience, a virtual reality (VR) experience, a mixed reality (e.g., a combination of AR and VR) experience, or a combination thereof, as part of a thrill ride in an amusement park or theme park. In certain embodiments, each ride passenger may be provided with eyewear (e.g., stereoscopic AR/
VR eyewear 34 that is configured to be used as AR/VR eyewear) to be worn during a cycle of the thrill ride. In one embodiment, the eyewear is both AR/VR capable and capable of facilitating the viewing of projected stereoscopic images. To facilitate an AR/VR or mixed reality experience, the eyewear may be configured to display virtual images overlaid over a real-world representation. To that end, the eyewear may include at least two cameras, which may respectively correspond to the respective points of view of the ride passengers, and may be used to capture real-time video data (e.g., live video) of the real-world environment (e.g., the physical amusement park) of the ride passengers and/or the thrill ride. The eyewear may also include at least two displays respectively corresponding to each eye of the ride passengers. In certain embodiments, a computer graphics generation system may also be provided. The computer graphics generation system may render a video stream of the real-world environment, along with various AR/VR graphical images, to the respective displays of the respective stereoscopic eyewear of the ride passengers during a cycle of the thrill ride. For example, in one embodiment, the graphics generation system 32 may render the AR/VR graphical images to the eyewear based on, for example, the position or location of the passenger ride vehicle along the tracks at any given time during a cycle of the thrill ride, a predetermined distance traveled by the passenger ride vehicle during a cycle of the thrill ride, or after a predetermined lapse of time. In this way, by using the eyewear and the graphics generation system to create an AR experience, a VR experience, and/or a mixed reality experience, the eyewear and the computer graphics generation system may enhance the thrill factor of the thrill ride and, by extension, may enhance the experience of the ride passengers as they ride the thrill ride.
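The position-, distance-, and time-based rendering described above can be pictured as a table of trigger points that fire once per ride cycle. The sketch below is a minimal illustration under assumed names and thresholds; it is not the patent's implementation.

```python
# Illustrative sketch (assumed names): trigger points along the tracks 18
# at which rendering of AR/VR graphical images 45 begins, keyed by distance
# traveled or elapsed time during a ride cycle. Each trigger fires once.
def due_triggers(distance_m, elapsed_s, triggers, fired):
    """Return ids of triggers whose distance OR time threshold has been
    reached and which have not yet fired this cycle."""
    due = []
    for tid, (at_distance_m, at_elapsed_s) in triggers.items():
        if tid in fired:
            continue  # already introduced earlier in this ride cycle
        if distance_m >= at_distance_m or elapsed_s >= at_elapsed_s:
            due.append(tid)
            fired.add(tid)
    return due

triggers = {"tunnel_monster": (120.0, 30.0), "finale": (900.0, 180.0)}
fired = set()
print(due_triggers(100.0, 31.0, triggers, fired))   # time threshold reached
print(due_triggers(950.0, 200.0, triggers, fired))  # remaining trigger fires once
```

Resetting `fired` at the start of each ride cycle gives the once-per-cycle behavior implied by introducing images at fixed points along the tracks.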
- While only certain features of the present embodiments have been illustrated and described herein, many modifications and changes will occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the present disclosure. Further, it should be understood that certain elements of the disclosed embodiments may be combined or exchanged with one another.
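The process 80 described above can be sketched end to end as a single position-driven decision. Every function, event name, and threshold below is an assumption introduced for illustration; the sketch only mirrors the block structure of FIG. 5, not any claimed implementation.

```python
# End-to-end sketch of process 80 (names assumed): receive a position
# (block 82), test proximity to a registered event (block 84), then either
# project stereoscopic images (block 86) or overlay and transmit an AR/VR
# frame (blocks 88-94); otherwise show the unmodified real-world view.
import math

EVENTS = {
    "waterfall_3d": {"xy": (10.0, 4.0), "radius_m": 5.0, "kind": "stereoscopic"},
    "dragon_ar": {"xy": (80.0, 0.0), "radius_m": 6.0, "kind": "ar_vr"},
}

def step(user_xy):
    """One cycle of the process: return the action taken for a user of the
    eyewear at position user_xy."""
    for name, ev in EVENTS.items():
        if math.dist(user_xy, ev["xy"]) <= ev["radius_m"]:  # block 84
            if ev["kind"] == "stereoscopic":
                return f"project stereoscopic images for {name}"   # block 86
            return f"overlay and transmit AR/VR frame for {name}"  # blocks 92-94
    return "display real-world view only"

print(step((12.0, 4.0)))   # near the stereoscopic waterfall event
print(step((81.0, 1.0)))   # near the AR event
print(step((40.0, 40.0)))  # no event in proximity
```

In practice the proximity test would come from RFID, camera, or GPS position data as discussed above, and the returned action would drive the eyewear's viewing-mode switch.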
Claims (25)
Priority Applications (13)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/586,956 US20170323482A1 (en) | 2016-05-05 | 2017-05-04 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| SG11201809219XA SG11201809219XA (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| CN201780027854.0A CN109069935B (en) | 2016-05-05 | 2017-05-05 | System and method for generating stereoscopic, augmented and virtual reality images |
| JP2018558202A JP7079207B2 (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic augmented reality and virtual reality images |
| MYPI2018001780A MY194042A (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| PCT/US2017/031371 WO2017193043A1 (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| KR1020187034763A KR102488332B1 (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic, augmented and virtual reality images |
| RU2018142013A RU2735458C2 (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic images of augmented and virtual reality |
| ES17723901T ES2858320T3 (en) | 2016-05-05 | 2017-05-05 | Systems and procedures for generating stereoscopic, augmented reality and virtual reality images |
| EP17723901.9A EP3452192B1 (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| CA3021561A CA3021561A1 (en) | 2016-05-05 | 2017-05-05 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| US16/984,845 US11670054B2 (en) | 2016-05-05 | 2020-08-04 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| US18/306,902 US12469230B2 (en) | 2016-05-05 | 2023-04-25 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201662332299P | 2016-05-05 | 2016-05-05 | |
| US15/586,956 US20170323482A1 (en) | 2016-05-05 | 2017-05-04 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/984,845 Continuation US11670054B2 (en) | 2016-05-05 | 2020-08-04 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170323482A1 true US20170323482A1 (en) | 2017-11-09 |
Family
ID=58709641
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/586,956 Abandoned US20170323482A1 (en) | 2016-05-05 | 2017-05-04 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| US16/984,845 Active US11670054B2 (en) | 2016-05-05 | 2020-08-04 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| US18/306,902 Active 2037-05-24 US12469230B2 (en) | 2016-05-05 | 2023-04-25 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/984,845 Active US11670054B2 (en) | 2016-05-05 | 2020-08-04 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
| US18/306,902 Active 2037-05-24 US12469230B2 (en) | 2016-05-05 | 2023-04-25 | Systems and methods for generating stereoscopic, augmented, and virtual reality images |
Country Status (11)
| Country | Link |
|---|---|
| US (3) | US20170323482A1 (en) |
| EP (1) | EP3452192B1 (en) |
| JP (1) | JP7079207B2 (en) |
| KR (1) | KR102488332B1 (en) |
| CN (1) | CN109069935B (en) |
| CA (1) | CA3021561A1 (en) |
| ES (1) | ES2858320T3 (en) |
| MY (1) | MY194042A (en) |
| RU (1) | RU2735458C2 (en) |
| SG (1) | SG11201809219XA (en) |
| WO (1) | WO2017193043A1 (en) |
Cited By (30)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106775566A (en) * | 2016-12-30 | 2017-05-31 | 维沃移动通信有限公司 | The data processing method and virtual reality terminal of a kind of virtual reality terminal |
| US10151927B2 (en) * | 2016-05-31 | 2018-12-11 | Falcon's Treehouse, Llc | Virtual reality and augmented reality head set for ride vehicle |
| US10289194B2 (en) | 2017-03-06 | 2019-05-14 | Universal City Studios Llc | Gameplay ride vehicle systems and methods |
| CN110399035A (en) * | 2018-04-25 | 2019-11-01 | 国际商业机器公司 | In computing system with the delivery of the reality environment of time correlation |
| WO2019221924A1 (en) * | 2018-05-15 | 2019-11-21 | Universal City Studios Llc | Systems and methods for dynamic ride profiles |
| US10528132B1 (en) * | 2018-07-09 | 2020-01-07 | Ford Global Technologies, Llc | Gaze detection of occupants for vehicle displays |
| CN110910507A (en) * | 2018-09-17 | 2020-03-24 | 脸谱科技有限责任公司 | Computer-implemented method, computer-readable medium, and system for mixed reality |
| WO2020065497A1 (en) * | 2018-09-24 | 2020-04-02 | Cae Inc. | Camera based display method and system for simulators |
| US10685492B2 (en) * | 2016-12-22 | 2020-06-16 | Choi Enterprise, LLC | Switchable virtual reality and augmented/mixed reality display device, and light field methods |
| WO2020141945A1 (en) * | 2019-01-03 | 2020-07-09 | Samsung Electronics Co., Ltd. | Electronic device for changing characteristics of display according to external light and method therefor |
| US10777012B2 (en) * | 2018-09-27 | 2020-09-15 | Universal City Studios Llc | Display systems in an entertainment environment |
| CN112114428A (en) * | 2019-06-21 | 2020-12-22 | 海信视像科技股份有限公司 | AR or MR glasses |
| JP2021510853A (en) * | 2018-02-26 | 2021-04-30 | グーグル エルエルシーGoogle LLC | Augmented Reality Light Field Head Mounted Display |
| WO2021175727A1 (en) * | 2020-03-02 | 2021-09-10 | Carl Zeiss Meditec Ag | Head-mounted visualisation system |
| US11143874B2 (en) * | 2019-03-29 | 2021-10-12 | Sony Interactive Entertainment Inc. | Image processing apparatus, head-mounted display, and image displaying method |
| US11182976B2 (en) * | 2016-06-06 | 2021-11-23 | Devar Entertainment Limited | Device for influencing virtual objects of augmented reality |
| JP2021183169A (en) * | 2018-01-23 | 2021-12-02 | ユニバーサル シティ スタジオズ リミテッド ライアビリティ カンパニー | Interactive tower attraction systems and methods |
| US11200656B2 (en) * | 2019-01-11 | 2021-12-14 | Universal City Studios Llc | Drop detection systems and methods |
| US11206505B2 (en) * | 2019-05-06 | 2021-12-21 | Universal City Studios Llc | Systems and methods for dynamically loading area-based augmented reality content |
| JP2022012273A (en) * | 2020-07-01 | 2022-01-17 | 凸版印刷株式会社 | Projection system for play equipment |
| US20220026723A1 (en) * | 2020-07-24 | 2022-01-27 | Universal City Studios Llc | Electromagnetic coupling systems and methods for visualization device |
| US11391956B2 (en) | 2019-12-30 | 2022-07-19 | Samsung Electronics Co., Ltd. | Method and apparatus for providing augmented reality (AR) object to user |
| US20220358687A1 (en) * | 2021-05-10 | 2022-11-10 | Cerence Operating Company | Touring circuitry for an immersive tour through a touring theater |
| US11562539B2 (en) | 2018-09-25 | 2023-01-24 | Universal City Studios Llc | Modular augmented and virtual reality ride attraction |
| US11774770B2 (en) | 2020-06-03 | 2023-10-03 | Universal City Studios Llc | Interface device with three-dimensional (3-D) viewing functionality |
| US11794121B1 (en) * | 2020-12-09 | 2023-10-24 | Falcon's Beyond Brands, Llc | Theme or amusement park attraction using high frame rate active shutter technology |
| US20230377277A1 (en) * | 2020-11-05 | 2023-11-23 | Crsc Communication & Information Group Company Ltd. | Video patrol method and device, electronic device, and readable medium |
| US20240054693A1 (en) * | 2022-08-15 | 2024-02-15 | Universal City Studios Llc | Show effect system for amusement park attraction system |
| WO2024039679A1 (en) * | 2022-08-15 | 2024-02-22 | Universal City Studios Llc | Show effect system for amusement park attraction system |
| US12175602B2 (en) | 2022-08-19 | 2024-12-24 | Meta Platforms Technologies, Llc | Method of generating a virtual environment by scanning a real-world environment with a first device and displaying the virtual environment on a second device |
Families Citing this family (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10685491B2 (en) * | 2017-07-18 | 2020-06-16 | Universal City Studios Llc | Systems and methods for virtual reality and augmented reality path management |
| CN108079574B (en) * | 2017-12-22 | 2022-01-18 | 深圳华侨城卡乐技术有限公司 | Virtual reality game platform based on carousel equipment and control method thereof |
| US10603564B2 (en) * | 2018-01-03 | 2020-03-31 | Universal City Studios Llc | Interactive component for an amusement park |
| IT201800003194A1 (en) * | 2018-03-01 | 2019-09-01 | Image Studio Consulting S R L | CAROUSEL INTERACTIONS |
| ES2837433T3 (en) * | 2018-03-16 | 2021-06-30 | Vr Coaster Gmbh & Co Kg | Synchronization device with a base station for synchronizing head-mounted displays with a virtual world in an amusement ride, amusement ride with such a synchronization device, and method of operating such an amusement ride |
| KR20210034585A (en) | 2018-07-25 | 2021-03-30 | 라이트 필드 랩 인코포레이티드 | Amusement park equipment based on light field display system |
| DE102018121258A1 (en) * | 2018-08-30 | 2020-03-05 | Vr Coaster Gmbh & Co. Kg | Head-mounted display and amusement facility with such a head-mounted display |
| JP7326740B2 (en) * | 2018-12-28 | 2023-08-16 | トヨタ紡織株式会社 | Spatial provision system |
| US11318607B2 (en) * | 2019-01-04 | 2022-05-03 | Universal City Studios Llc | Extended reality ride test assembly for amusement park system |
| CN109876474A (en) * | 2019-02-20 | 2019-06-14 | 北京当红齐天国际文化发展集团有限公司 | A kind of rail mounted viewing system and its control method |
| US10767997B1 (en) * | 2019-02-25 | 2020-09-08 | Qualcomm Incorporated | Systems and methods for providing immersive extended reality experiences on moving platforms |
| US11265487B2 (en) * | 2019-06-05 | 2022-03-01 | Mediatek Inc. | Camera view synthesis on head-mounted display for virtual reality and augmented reality |
| US10828576B1 (en) * | 2019-07-29 | 2020-11-10 | Universal City Studios Llc | Motion exaggerating virtual reality ride systems and methods |
| US10976818B2 (en) * | 2019-08-21 | 2021-04-13 | Universal City Studios Llc | Interactive attraction system and method for object and user association |
| JP2023511407A (en) | 2020-01-22 | 2023-03-17 | フォトニック メディカル インク. | Open-field multi-mode depth-sensing calibrated digital loupe |
| KR102506312B1 (en) * | 2020-01-22 | 2023-03-03 | 신화현 | Roller-coaster system with virtual reality content and method for providing service thereof |
| KR20220136422A (en) | 2020-02-06 | 2022-10-07 | 밸브 코포레이션 | Field-of-view-based optical correction using spatially varying polarizers |
| JP7556773B2 (en) * | 2020-12-21 | 2024-09-26 | トヨタ自動車株式会社 | Display system, display device, and program |
| JP7473489B2 (en) * | 2021-01-15 | 2024-04-23 | トヨタ紡織株式会社 | Output control device, output control system, and control method |
| JP2023136239A (en) * | 2022-03-16 | 2023-09-29 | 株式会社リコー | Information processing device, information processing system, supporting system, and information processing method |
| US12079384B2 (en) | 2022-04-21 | 2024-09-03 | Universal City Studios Llc | Artificial intelligence (AI)-assisted and dynamic ride profile head tracking systems and methods |
| WO2023205092A1 (en) * | 2022-04-21 | 2023-10-26 | Universal City Studios Llc | Artificial intelligence (ai)-assisted and dynamic ride profile head tracking systems and methods |
| KR20230165984A (en) | 2022-05-27 | 2023-12-06 | 삼성디스플레이 주식회사 | Polarizing film and Display apparutus employing the same |
| US12217372B2 (en) | 2022-10-17 | 2025-02-04 | T-Mobile Usa, Inc. | Generating mixed reality content based on data from a wireless device |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100079585A1 (en) * | 2008-09-29 | 2010-04-01 | Disney Enterprises, Inc. | Interactive theater with audience participation |
| US20130169924A1 (en) * | 2012-01-04 | 2013-07-04 | Hayward LAMPLEY, JR. | Apparatus and system for data storage and retrieval |
| US20150301592A1 (en) * | 2014-04-18 | 2015-10-22 | Magic Leap, Inc. | Utilizing totems for augmented or virtual reality systems |
| US20160011422A1 (en) * | 2014-03-10 | 2016-01-14 | Ion Virtual Technology Corporation | Method and system for reducing motion blur when experiencing virtual or augmented reality environments |
| US20160167672A1 (en) * | 2010-05-14 | 2016-06-16 | Wesley W. O. Krueger | Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment |
| US20160373731A1 (en) * | 2015-06-18 | 2016-12-22 | Disney Enterprises, Inc. | Three dimensional (3d) stereo display systems for creating 3d effects for viewers wearing 3d glasses |
| US9766462B1 (en) * | 2013-09-16 | 2017-09-19 | Amazon Technologies, Inc. | Controlling display layers of a head-mounted display (HMD) system |
| US20180033199A9 (en) * | 2016-02-12 | 2018-02-01 | Disney Enterprises, Inc. | Method for motion-synchronized ar or vr entertainment experience |
Family Cites Families (31)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2000035200A1 (en) | 1998-12-07 | 2000-06-15 | Universal City Studios, Inc. | Image correction method to compensate for point of view image distortion |
| RU51241U1 (en) * | 2005-07-13 | 2006-01-27 | Евгений Борисович Гаскевич | STEREO IMAGE FORMATION SYSTEM |
| US20110141246A1 (en) | 2009-12-15 | 2011-06-16 | Justin Michael Schwartz | System and Method for Producing Stereoscopic Images |
| US20130278631A1 (en) * | 2010-02-28 | 2013-10-24 | Osterhout Group, Inc. | 3d positioning of augmented reality information |
| CN103080983A (en) | 2010-09-06 | 2013-05-01 | 国立大学法人东京大学 | Vehicle system |
| JP5423716B2 (en) | 2011-03-30 | 2014-02-19 | ブラザー工業株式会社 | Head mounted display |
| CN202036738U (en) * | 2011-05-09 | 2011-11-16 | 西安灵境科技有限公司 | Man-machine interaction virtual roaming body-building device |
| US20130002559A1 (en) | 2011-07-03 | 2013-01-03 | Vardi Nachum | Desktop computer user interface |
| KR20130035457A (en) * | 2011-09-30 | 2013-04-09 | 삼성전자주식회사 | Display apparatus and image processing method |
| US20130083008A1 (en) * | 2011-09-30 | 2013-04-04 | Kevin A. Geisner | Enriched experience using personal a/v system |
| KR101209854B1 (en) * | 2012-09-11 | 2012-12-11 | 신동호 | 3D exercise device for simulator |
| WO2014056000A1 (en) | 2012-10-01 | 2014-04-10 | Coggins Guy | Augmented reality biofeedback display |
| US20140282911A1 (en) * | 2013-03-15 | 2014-09-18 | Huntington Ingalls, Inc. | System and Method for Providing Secure Data for Display Using Augmented Reality |
| KR20140115637A (en) * | 2013-03-21 | 2014-10-01 | 한국전자통신연구원 | System of providing stereoscopic image for multiple users and method thereof |
| US9442294B2 (en) * | 2013-06-27 | 2016-09-13 | Koc Universitesi | Image display device in the form of a pair of eye glasses comprising micro reflectors |
| KR102309257B1 (en) * | 2013-09-04 | 2021-10-06 | 에씰로 앙터나시오날 | Methods and systems for augmented reality |
| KR102277893B1 (en) * | 2013-09-04 | 2021-07-15 | 에씰로 앙터나시오날 | Methods and systems for augmented reality |
| CN203894474U (en) * | 2014-04-15 | 2014-10-22 | 王傲立 | Virtual reality (VR) glasses |
| US20150358539A1 (en) * | 2014-06-06 | 2015-12-10 | Jacob Catt | Mobile Virtual Reality Camera, Method, And System |
| WO2016025962A1 (en) | 2014-08-15 | 2016-02-18 | The University Of Akron | Device and method for three-dimensional video communication |
| US9690375B2 (en) * | 2014-08-18 | 2017-06-27 | Universal City Studios Llc | Systems and methods for generating augmented and virtual reality images |
| US20160114222A1 (en) * | 2014-10-24 | 2016-04-28 | Acushnet Company | Method of making a golf ball with composite construction |
| US10154239B2 (en) * | 2014-12-30 | 2018-12-11 | Onpoint Medical, Inc. | Image-guided surgery with surface reconstruction and augmented reality visualization |
| US10018844B2 (en) * | 2015-02-09 | 2018-07-10 | Microsoft Technology Licensing, Llc | Wearable image display system |
| US10222619B2 (en) * | 2015-07-12 | 2019-03-05 | Steven Sounyoung Yu | Head-worn image display apparatus for stereoscopic microsurgery |
| CN204952289U (en) * | 2015-08-31 | 2016-01-13 | 深圳市维骏文化旅游科技有限公司 | Novel shadow system is seen in interactive lift of virtual reality |
| EP3151554A1 (en) * | 2015-09-30 | 2017-04-05 | Calay Venture S.a.r.l. | Presence camera |
| US20170171534A1 (en) * | 2015-11-12 | 2017-06-15 | Samsung Electronics Co., Ltd. | Method and apparatus to display stereoscopic image in 3d display system |
| US20170272732A1 (en) * | 2016-03-17 | 2017-09-21 | Disney Enterprises, Inc. | Wavelength multiplexing visualization using discrete pixels |
| EP3443407B1 (en) * | 2016-04-10 | 2023-05-17 | Everysight Ltd. | Binocular wide field of view (wfov) wearable optical display system |
| KR101687174B1 (en) * | 2016-04-29 | 2016-12-16 | 주식회사 조이펀 | A message display method on virtual reality device according to event occurrence and the message display apparatus by using the same |
-
2017
- 2017-05-04 US US15/586,956 patent/US20170323482A1/en not_active Abandoned
- 2017-05-05 CA CA3021561A patent/CA3021561A1/en active Pending
- 2017-05-05 ES ES17723901T patent/ES2858320T3/en active Active
- 2017-05-05 RU RU2018142013A patent/RU2735458C2/en active
- 2017-05-05 KR KR1020187034763A patent/KR102488332B1/en active Active
- 2017-05-05 CN CN201780027854.0A patent/CN109069935B/en active Active
- 2017-05-05 SG SG11201809219XA patent/SG11201809219XA/en unknown
- 2017-05-05 EP EP17723901.9A patent/EP3452192B1/en active Active
- 2017-05-05 JP JP2018558202A patent/JP7079207B2/en active Active
- 2017-05-05 MY MYPI2018001780A patent/MY194042A/en unknown
- 2017-05-05 WO PCT/US2017/031371 patent/WO2017193043A1/en not_active Ceased
-
2020
- 2020-08-04 US US16/984,845 patent/US11670054B2/en active Active
-
2023
- 2023-04-25 US US18/306,902 patent/US12469230B2/en active Active
Patent Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100079585A1 (en) * | 2008-09-29 | 2010-04-01 | Disney Enterprises, Inc. | Interactive theater with audience participation |
| US20160167672A1 (en) * | 2010-05-14 | 2016-06-16 | Wesley W. O. Krueger | Systems and methods for controlling a vehicle or device in response to a measured human response to a provocative environment |
| US20130169924A1 (en) * | 2012-01-04 | 2013-07-04 | Hayward LAMPLEY, JR. | Apparatus and system for data storage and retrieval |
| US9766462B1 (en) * | 2013-09-16 | 2017-09-19 | Amazon Technologies, Inc. | Controlling display layers of a head-mounted display (HMD) system |
| US20160011422A1 (en) * | 2014-03-10 | 2016-01-14 | Ion Virtual Technology Corporation | Method and system for reducing motion blur when experiencing virtual or augmented reality environments |
| US20150301592A1 (en) * | 2014-04-18 | 2015-10-22 | Magic Leap, Inc. | Utilizing totems for augmented or virtual reality systems |
| US20160373731A1 (en) * | 2015-06-18 | 2016-12-22 | Disney Enterprises, Inc. | Three dimensional (3d) stereo display systems for creating 3d effects for viewers wearing 3d glasses |
| US20180033199A9 (en) * | 2016-02-12 | 2018-02-01 | Disney Enterprises, Inc. | Method for motion-synchronized ar or vr entertainment experience |
Cited By (57)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10151927B2 (en) * | 2016-05-31 | 2018-12-11 | Falcon's Treehouse, Llc | Virtual reality and augmented reality head set for ride vehicle |
| US11182976B2 (en) * | 2016-06-06 | 2021-11-23 | Devar Entertainment Limited | Device for influencing virtual objects of augmented reality |
| US10685492B2 (en) * | 2016-12-22 | 2020-06-16 | Choi Enterprise, LLC | Switchable virtual reality and augmented/mixed reality display device, and light field methods |
| CN106775566A (en) * | 2016-12-30 | 2017-05-31 | 维沃移动通信有限公司 | The data processing method and virtual reality terminal of a kind of virtual reality terminal |
| US10289194B2 (en) | 2017-03-06 | 2019-05-14 | Universal City Studios Llc | Gameplay ride vehicle systems and methods |
| US12153723B2 (en) | 2017-03-06 | 2024-11-26 | Universal City Studios Llc | Systems and methods for layered virtual features in an amusement park environment |
| US10528123B2 (en) | 2017-03-06 | 2020-01-07 | Universal City Studios Llc | Augmented ride system and method |
| US10572000B2 (en) | 2017-03-06 | 2020-02-25 | Universal City Studios Llc | Mixed reality viewer system and method |
| US11666833B2 (en) * | 2018-01-23 | 2023-06-06 | Universal City Studios Llc | Interactive tower attraction systems and methods |
| JP2023085297A (en) * | 2018-01-23 | 2023-06-20 | Universal City Studios LLC | Interactive tower attraction system and method |
| US20220080327A1 (en) * | 2018-01-23 | 2022-03-17 | Universal City Studios Llc | Interactive tower attraction systems and methods |
| JP7245297B2 (en) | 2018-01-23 | 2023-03-23 | Universal City Studios LLC | Interactive tower attraction system and method |
| JP2021183169A (en) * | 2018-01-23 | 2021-12-02 | Universal City Studios LLC | Interactive tower attraction systems and methods |
| JP7621402B2 (en) | 2018-01-23 | 2025-01-24 | Universal City Studios LLC | Interactive tower attraction system and method |
| JP7024105B2 (en) | 2018-02-26 | 2022-02-22 | Google LLC | Augmented reality light field head-mounted display |
| JP2021510853A (en) * | 2018-02-26 | 2021-04-30 | Google LLC | Augmented reality light field head-mounted display |
| US11022806B2 (en) | 2018-02-26 | 2021-06-01 | Google Llc | Augmented reality light field head-mounted displays |
| CN110399035A (en) * | 2018-04-25 | 2019-11-01 | International Business Machines Corporation | Time-correlated delivery of a virtual reality environment in a computing system |
| WO2019221924A1 (en) * | 2018-05-15 | 2019-11-21 | Universal City Studios Llc | Systems and methods for dynamic ride profiles |
| KR20210006990A (en) * | 2018-05-15 | 2021-01-19 | Universal City Studios LLC | Systems and methods for dynamic ride profiles |
| KR102844184B1 (en) | 2018-05-15 | 2025-08-07 | Universal City Studios LLC | Systems and methods for dynamic ride profiles |
| US10528132B1 (en) * | 2018-07-09 | 2020-01-07 | Ford Global Technologies, Llc | Gaze detection of occupants for vehicle displays |
| CN110910507A (en) * | 2018-09-17 | 2020-03-24 | Facebook Technologies, LLC | Computer-implemented method, computer-readable medium, and system for mixed reality |
| WO2020065497A1 (en) * | 2018-09-24 | 2020-04-02 | Cae Inc. | Camera based display method and system for simulators |
| US11741680B2 (en) | 2018-09-25 | 2023-08-29 | Universal City Studios Llc | Modular augmented and virtual reality ride attraction |
| US11562539B2 (en) | 2018-09-25 | 2023-01-24 | Universal City Studios Llc | Modular augmented and virtual reality ride attraction |
| JP7331090B2 (en) | 2018-09-27 | 2023-08-22 | Universal City Studios LLC | Display systems in an entertainment environment |
| US10777012B2 (en) * | 2018-09-27 | 2020-09-15 | Universal City Studios Llc | Display systems in an entertainment environment |
| JP2022502699A (en) * | 2018-09-27 | 2022-01-11 | Universal City Studios LLC | Display systems in an entertainment environment |
| CN113272888A (en) * | 2019-01-03 | 2021-08-17 | Samsung Electronics Co., Ltd. | Electronic device for changing characteristics of display according to external light and method therefor |
| US10971053B2 (en) * | 2019-01-03 | 2021-04-06 | Samsung Electronics Co., Ltd. | Electronic device for changing characteristics of display according to external light and method therefor |
| WO2020141945A1 (en) * | 2019-01-03 | 2020-07-09 | Samsung Electronics Co., Ltd. | Electronic device for changing characteristics of display according to external light and method therefor |
| US11210772B2 (en) | 2019-01-11 | 2021-12-28 | Universal City Studios Llc | Wearable visualization device systems and methods |
| US11200656B2 (en) * | 2019-01-11 | 2021-12-14 | Universal City Studios Llc | Drop detection systems and methods |
| US11200655B2 (en) | 2019-01-11 | 2021-12-14 | Universal City Studios Llc | Wearable visualization system and method |
| US11143874B2 (en) * | 2019-03-29 | 2021-10-12 | Sony Interactive Entertainment Inc. | Image processing apparatus, head-mounted display, and image displaying method |
| US11614627B2 (en) | 2019-03-29 | 2023-03-28 | Sony Interactive Entertainment Inc. | Image processing apparatus, head-mounted display, and image displaying method |
| US12096307B2 (en) | 2019-05-06 | 2024-09-17 | Universal City Studios Llc | Systems and methods for dynamically loading area-based augmented reality content |
| US11206505B2 (en) * | 2019-05-06 | 2021-12-21 | Universal City Studios Llc | Systems and methods for dynamically loading area-based augmented reality content |
| CN112114428A (en) * | 2019-06-21 | 2020-12-22 | Hisense Visual Technology Co., Ltd. | AR or MR glasses |
| US11391956B2 (en) | 2019-12-30 | 2022-07-19 | Samsung Electronics Co., Ltd. | Method and apparatus for providing augmented reality (AR) object to user |
| US11953687B2 (en) | 2020-03-02 | 2024-04-09 | Carl Zeiss Meditec Ag | Head-mounted visualization unit and visualization system comprising light-transmissive optical system |
| US12153220B2 (en) | 2020-03-02 | 2024-11-26 | Carl Zeiss Meditec Ag | Head-mounted visualization system |
| WO2021175776A1 (en) * | 2020-03-02 | 2021-09-10 | Carl Zeiss Meditec Ag | Head-mounted visualisation unit and visualisation system |
| WO2021175727A1 (en) * | 2020-03-02 | 2021-09-10 | Carl Zeiss Meditec Ag | Head-mounted visualisation system |
| US11774770B2 (en) | 2020-06-03 | 2023-10-03 | Universal City Studios Llc | Interface device with three-dimensional (3-D) viewing functionality |
| JP7592990B2 (en) | 2020-07-01 | 2024-12-03 | Toppan Holdings Inc. | Projection system for amusement equipment |
| JP2022012273A (en) * | 2020-07-01 | 2022-01-17 | Toppan Printing Co., Ltd. | Projection system for amusement equipment |
| US20220026723A1 (en) * | 2020-07-24 | 2022-01-27 | Universal City Studios Llc | Electromagnetic coupling systems and methods for visualization device |
| US12379604B2 (en) * | 2020-07-24 | 2025-08-05 | Universal City Studios Llc | Electromagnetic coupling systems and methods for visualization device |
| US20230377277A1 (en) * | 2020-11-05 | 2023-11-23 | CRSC Communication & Information Group Company Ltd. | Video patrol method and device, electronic device, and readable medium |
| US11794121B1 (en) * | 2020-12-09 | 2023-10-24 | Falcon's Beyond Brands, Llc | Theme or amusement park attraction using high frame rate active shutter technology |
| US12175560B2 (en) * | 2021-05-10 | 2024-12-24 | Cerence Operating Company | Touring circuitry for an immersive tour through a touring theater |
| US20220358687A1 (en) * | 2021-05-10 | 2022-11-10 | Cerence Operating Company | Touring circuitry for an immersive tour through a touring theater |
| WO2024039679A1 (en) * | 2022-08-15 | 2024-02-22 | Universal City Studios Llc | Show effect system for amusement park attraction system |
| US20240054693A1 (en) * | 2022-08-15 | 2024-02-15 | Universal City Studios Llc | Show effect system for amusement park attraction system |
| US12175602B2 (en) | 2022-08-19 | 2024-12-24 | Meta Platforms Technologies, Llc | Method of generating a virtual environment by scanning a real-world environment with a first device and displaying the virtual environment on a second device |
Also Published As
| Publication number | Publication date |
|---|---|
| RU2018142013A3 (en) | 2020-06-08 |
| EP3452192A1 (en) | 2019-03-13 |
| US11670054B2 (en) | 2023-06-06 |
| US20230260230A1 (en) | 2023-08-17 |
| RU2735458C2 (en) | 2020-11-02 |
| MY194042A (en) | 2022-11-09 |
| KR102488332B1 (en) | 2023-01-13 |
| CN109069935B (en) | 2021-05-11 |
| JP7079207B2 (en) | 2022-06-01 |
| US12469230B2 (en) | 2025-11-11 |
| ES2858320T3 (en) | 2021-09-30 |
| CA3021561A1 (en) | 2017-11-09 |
| US20200364940A1 (en) | 2020-11-19 |
| JP2019515749A (en) | 2019-06-13 |
| KR20190005906A (en) | 2019-01-16 |
| EP3452192B1 (en) | 2020-12-09 |
| RU2018142013A (en) | 2020-06-05 |
| WO2017193043A1 (en) | 2017-11-09 |
| CN109069935A (en) | 2018-12-21 |
| SG11201809219XA (en) | 2018-11-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12469230B2 (en) | Systems and methods for generating stereoscopic, augmented, and virtual reality images | |
| US12147594B2 (en) | Systems and methods for generating augmented and virtual reality images | |
| HK40001050B (en) | Systems and methods for generating stereoscopic, augmented, and virtual reality images | |
| HK40001050A (en) | Systems and methods for generating stereoscopic, augmented, and virtual reality images | |
| HK1242806A1 (en) | Systems and methods for generating augmented and virtual reality images |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: UNIVERSAL CITY STUDIOS LLC, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: COUP, THIERRY; MCQUILLIAN, BRIAN; SCHWARTZ, JUSTIN; REEL/FRAME: 042245/0475; Effective date: 20170503 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |