HK40016431A - Gameplay ride vehicle systems and methods - Google Patents
Gameplay ride vehicle systems and methods
- Publication number
- HK40016431A (application HK62020005815.9A)
- Authority
- HK
- Hong Kong
- Prior art keywords
- ride
- ride vehicle
- user input
- vehicle
- movement
Description
Cross Reference to Related Applications
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 62/467,817, entitled "SYSTEMS AND METHODS FOR DIGITAL OVERLAY IN AN AMUSEMENT PARK ENVIRONMENT," filed March 6, 2017, which is hereby incorporated by reference in its entirety for all purposes.
Background
The subject matter disclosed herein relates to amusement park attractions, and more particularly, to providing an interactive experience in amusement park ride attractions.
An amusement or theme park may include various entertainment attractions that provide enjoyment to patrons of the park. For example, the attractions may include ride attractions (e.g., closed loop tracks, dark rides, thriller rides, or other similar rides). In such ride attractions, the motion of the ride vehicle consists of programmed movements, or the ride vehicle may include features (e.g., various buttons and knobs) that provide the rider with varying levels of control over the ride vehicle. It is now recognized that there is a need for improved ride attractions that provide enhanced rider control of a ride vehicle to create a more interactive ride experience.
Disclosure of Invention
The following outlines certain embodiments commensurate with the scope of the disclosure. These embodiments are not intended to limit the scope of the present disclosure, but rather these embodiments are intended only to provide a brief summary of possible forms of the present embodiments. Indeed, the present embodiments may encompass a variety of forms that may be similar to or different from the embodiments set forth below.
In an embodiment, a ride system for an amusement park includes: a ride vehicle configured to accommodate a rider and configured to travel along a ride path; a head mounted display configured to be worn by the rider; and a control system. The control system is configured to display a virtual command to the rider via the head mounted display, receive a signal from a user input device associated with the rider, determine that the signal received from the user input device corresponds to the virtual command, and trigger movement of the ride vehicle in response to determining that the signal corresponds to the virtual command.
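By way of a purely illustrative, non-limiting sketch, the command-matching logic described in this embodiment may be expressed as follows. The identifiers (`VirtualCommand`, `process_user_input`, the signal strings) are hypothetical and form no part of the disclosed embodiments:

```python
from dataclasses import dataclass

@dataclass
class VirtualCommand:
    """A command displayed to the rider on the head mounted display."""
    expected_input: str   # input the rider is asked to provide, e.g. "steer_left"
    movement: str         # ride-vehicle movement triggered on a match

def process_user_input(command, signal):
    """Return the movement to trigger when the received signal
    corresponds to the displayed virtual command, else None."""
    if signal == command.expected_input:
        return command.movement
    return None

# Example: a rider who follows the displayed command triggers the movement.
cmd = VirtualCommand(expected_input="steer_left", movement="bank_left")
```

A non-matching signal simply triggers no movement, which is one way a control system could realize the "determine that the signal corresponds to the virtual command" step.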
In an embodiment, a method of providing entertainment in an amusement park includes: generating, via a computer graphics generation system communicatively coupled to a control system of a ride vehicle, a gaming environment, wherein the gaming environment includes a plurality of Augmented Reality (AR) objects, and wherein the ride vehicle is configured to carry one or more riders and travel along a ride path; displaying, via one or more head mounted displays associated with each of the one or more riders, the gaming environment; receiving, via the control system, one or more signals from one or more user input devices associated with each of the one or more riders; and triggering movement of the ride vehicle via the control system based at least in part on the signals received from the one or more user input devices associated with each of the one or more riders, wherein the one or more user input devices comprise at least one steering device and at least one marking device.
In an embodiment, a ride and game control system is integrated with a ride vehicle configured to carry a rider along a ride track, the ride and game control system comprising a game controller configured to: receive a first input signal from a steering user input device associated with the ride vehicle; receive a second input signal from at least one marking user input device associated with the ride vehicle; determine that the first input signal corresponds to a virtual command; output a first control signal in response to determining that the first input signal corresponds to the virtual command, the first control signal indicating an instruction to cause a first movement of the ride vehicle; determine that the second input signal corresponds to an interaction with an Augmented Reality (AR) object; and output a second control signal in response to determining that the second input signal corresponds to the interaction with the AR object, the second control signal indicating an instruction to cause a second movement of the ride vehicle. The ride and game control system also includes a ride controller communicatively coupled to the game controller and configured to receive the first control signal and the second control signal and to control movement of the ride vehicle based at least in part on the first control signal and the second control signal.
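The two decision paths of such a game controller (a steering input checked against the current virtual command, and a marking input checked against AR objects) may be sketched, in a purely illustrative and non-limiting manner, as follows. All names and signal values are hypothetical:

```python
def game_controller_step(first_signal, second_signal, virtual_command, ar_objects):
    """Evaluate both input paths and return the control signals to pass
    to the ride controller: the steering input is compared against the
    current virtual command, the marking input against the AR objects."""
    control_signals = []
    if first_signal == virtual_command:
        control_signals.append("first_movement")    # e.g. a triggered turn
    if second_signal in ar_objects:
        control_signals.append("second_movement")   # e.g. a shake on a hit
    return control_signals
```

In this sketch, either path alone, or both together, can contribute control signals, mirroring how the ride controller is described as acting on the first and second control signals.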
Drawings
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIG. 1 is a schematic diagram of an embodiment of an amusement park including ride attractions having a ride and game control system according to the present embodiment;
FIG. 2 is a block diagram of an embodiment of a ride and game control system according to the present embodiments that may be utilized within the ride attraction of FIG. 1;
FIG. 3 is a perspective view of an embodiment of a head mounted display that may be utilized within the ride and game control system of FIG. 2, according to an embodiment;
FIG. 4 is a schematic diagram illustrating an embodiment of driver control of a ride vehicle of the ride attraction of FIG. 1 and showing example Augmented Reality (AR) instructions, according to the present embodiments;
FIG. 5 is a schematic diagram illustrating an embodiment of occupant control of a ride vehicle at the ride attraction of FIG. 1, according to an embodiment;
FIG. 6 is a flow diagram of an embodiment of a method for controlling movement of a ride vehicle based at least in part on performance of a driver of the ride vehicle, using the ride and game control system of FIG. 2, according to an embodiment; and
FIG. 7 is a flow diagram of an embodiment of a method for controlling movement of a ride vehicle based at least in part on performance of one or more riders of the ride vehicle, using the ride and game control system of FIG. 2, according to an embodiment.
Detailed Description
One or more specific embodiments of the present disclosure will be described below. In an effort to provide a concise description of these embodiments, all features of an actual implementation may not be described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
The present embodiments relate to systems and methods for providing an enhanced experience for riders at a ride attraction, such as a ride with a closed loop track, a dark ride, or another similar ride. A ride and game control system associated with a ride vehicle at the ride attraction may provide an Augmented Reality (AR) environment through a head mounted display or other suitable display worn by the rider. The AR environment may be combined with other off-board displays and effects to create an entire ride environment. The AR environment may be presented as, or as part of, a gaming or amusement environment with which riders may interact throughout the ride. To provide a more interactive ride experience and to provide a varying ride experience between subsequent visits, the ride and game control system may provide divided (e.g., branching) control of the movement of the ride vehicle by the riders. That is, movement of the ride vehicle along and around a track (e.g., path) of the ride attraction may be controlled based at least in part on direct input from one or more user input devices, such as steering of the ride vehicle via a steering wheel user input device and/or changing a speed of the ride vehicle by pressing one or more pedals (e.g., an accelerator pedal, a brake pedal). Movement of the ride vehicle may also be controlled by the ride and game control system based at least in part on the riders' performance in interacting with the AR environment (e.g., as part of the game). For example, driving instructions to steer the ride vehicle may be presented to the rider via the AR environment, and compliance or non-compliance with the driving instructions may result in different resulting movements of the ride vehicle.
As another example, a rider's interaction with an object in the AR environment via a user input device, such as releasing (e.g., shooting, throwing, or ejecting) an AR projectile (e.g., a shell, ball, or other item) to mark or impact a target object, may result in certain movement of the ride vehicle. Thus, the movement of the ride vehicle may be controlled at times by the driver, at times by the other riders, or at times by both the driver and the other riders. The ride vehicle may enable the driver to drive the ride vehicle using the steering wheel user input device at certain times, and the ride and game control system may then override or supplement the driver's control to affect movement of the ride vehicle based on the driver's performance (e.g., following instructions) and/or based on the other riders' performance (e.g., marking objects of the AR environment with AR projectiles) at other times. Further, the driver may maintain steering control of the ride vehicle throughout some or all of the ride, but the speed of the ride vehicle may be varied during at least some portions of the ride based on the driver's performance and/or based on the riders' performance. Rider control of ride vehicle movement and the division of such control may provide a more interactive and varied ride experience. Additionally, in embodiments, the ride and game control system may provide varying ride experiences via a weighting process based on the riders' scores or estimated skill levels, as determined from the riders' driving of the ride vehicle and/or interaction with the objects of the AR environment.
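The weighting process mentioned above admits many realizations; one minimal, non-limiting sketch (with hypothetical names and an assumed linear weighting) scales a speed change by the rider's normalized score:

```python
def weighted_speed(base_speed, score, max_score, boost=0.5):
    """Scale a speed change by the rider's normalized score so that
    higher-scoring (or more skilled) riders experience a different
    ride profile; the normalized score is clamped to [0, 1]."""
    weight = max(0.0, min(1.0, score / max_score))
    return base_speed * (1.0 + boost * weight)
```

A rider with a maximal score would here travel 50% faster than the base profile, while a rider with no score would follow the base profile unchanged.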
With the above in mind, FIG. 1 illustrates a schematic view of an amusement park 10 including a ride attraction 12. As illustrated, the amusement park may also include theme attractions 14 (e.g., fixtures, architectural layouts, props, ornaments, and so forth corresponding to a theme) and other amusement park attractions 16 (e.g., a Ferris wheel or other attraction). The ride attraction 12 may include a track or path 18 (e.g., a closed loop track or a system of closed loop tracks), which track or path 18 may provide the infrastructure along which a ride vehicle 20 may travel. The ride attraction 12 may include the ride vehicle 20, which ride vehicle 20 may house and carry one or more riders 22 through the ride attraction 12. Although the ride vehicle 20 is shown to accommodate four riders 22, one or more ride vehicles 20 may each accommodate any number of riders (e.g., 1, 2, 3, 5, 6, or more). It should also be appreciated that although the ride attraction 12 may be illustrated with one ride vehicle 20, the ride attraction 12 may include any number of ride vehicles 20 (e.g., 1, 2, 4, 8, 10, or more).
The track 18 may be a simple track or a controlled path in which movement of the ride vehicle 20 may be limited or controlled via an electronic system, a magnetic system, or other similar system. The track 18 may define linear movement of the ride vehicle 20 as the ride vehicle 20 moves along the track 18. Movement of ride vehicle 20 may be controlled by a control system of ride vehicle 20 (e.g., ride and game control system 24), which may include multiple control systems. As well as causing linear movement of ride vehicle 20 along track 18, ride and game control system 24 may cause other motions of ride vehicle 20, such as rotation, sway, spin, vibration, pivoting, and other similar motions. In addition, the ride and game control system 24 may provide an Augmented Reality (AR) environment 25 of AR graphics, the AR environment 25 including AR objects 26 presented to the rider 22 via a head mounted display 28 worn by the rider 22 throughout the ride attraction 12. The ride and game control system 24 may also coordinate the presentation of the AR objects 26 via the head mounted display 28 and/or coordinate the movement of the ride vehicle 20 with other off-board effects, such as visual and/or sound presentations, provided by a show effects system 30, which may include projection gaming computers, display devices (e.g., projection display devices, digital display devices), lighting systems, and sound effects devices (e.g., speakers) disposed along the track 18.
In operation, the ride attraction 12 may be presented as a game or play interaction between the riders 22 of the ride vehicle 20, the AR environment 25 (e.g., a gaming environment) including the AR objects 26, and/or one or more other ride vehicles 20 of the ride attraction 12. The AR objects 26 may include objects, characters, and instructions for the riders. In some embodiments, the game of the ride attraction 12 may be presented as a competition between ride vehicles 20 and/or between characters presented via the AR environment 25. As ride vehicle 20 moves along track 18, one rider 22 may control some direct movement of ride vehicle 20, such as steering and turning ride vehicle 20, via a user input device 32. The user input device 32 may be communicatively coupled to the ride and game control system 24, which may cause movement of the ride vehicle 20 based at least in part on signals received from the user input device 32. Various AR objects 26 (including AR instructions) may be presented to the riders 22 via the ride and game control system 24 and the head mounted displays 28 throughout the ride attraction 12. Each rider may have control of one or more user input devices 32, and the user input devices 32 may include a steering user input device 34, a marking user input device 36, or other user input devices 38, such as an accelerator user input device. The riders 22 may interact with the AR objects 26 presented via the head mounted displays 28, such as by following instructions presented via the AR objects 26, or by interacting with the AR objects 26 via the marking user input device 36 (e.g., marking an AR object 26 with an AR projectile). Such interaction of the riders 22 with the AR environment 25 may also trigger or affect movement of the ride vehicle 20 caused by the ride and game control system 24.
As such, the ride and game control system 24 may provide divided control of movement of the ride vehicle 20, triggered by direct user input from the riders 22 (e.g., via a steering wheel) and by virtual interaction of the riders 22 with the AR environment 25 (e.g., a game environment), as discussed in more detail with reference to FIGS. 2, 4, and 5. Thus, the ride and game control system 24 allows riders 22 to directly and indirectly alter the movement of their respective ride vehicles 20 to provide a more interactive ride experience that may change during subsequent rides at the ride attraction 12.
Additionally, the ride vehicle 20 may include a plurality of user input devices 32 such that each rider 22 is provided with one or more user input devices 32. For example, each rider 22 may be provided with a user input device 32 for driving the ride vehicle 20, such as a steering user input device 34 and/or other user input devices (e.g., accelerator user input devices), and a separate user input device 32 for interacting with the AR objects 26, such as a marking user input device 36 that may be used to simulate marking an AR object 26 with an AR projectile, or a user input device for simulating grabbing an AR reward object. The ride and game control system 24 may activate and deactivate control of the user input devices 32 throughout the duration of the ride attraction 12. In this way, each rider 22 may have an opportunity to operate a different user input device 32 to interact with the ride attraction 12 and the AR environment 25 during each operation of the ride attraction 12. Further, ride and game control system 24 may alternate control of certain user input devices 32 between riders 22 of the ride vehicle 20. For example, each rider 22 may be provided with a steering user input device 34 and a marking user input device 36. The ride and game control system 24 may activate the steering user input device 34 for one rider 22 and the marking user input devices 36 for the other riders 22 in the ride vehicle 20. After a predetermined amount of time, a certain track length, a certain number of interactions, or another metric, the ride and game control system 24 may rotate steering control to another rider 22, thereby deactivating the steering user input device 34 for the previously driving rider 22 and activating the marking user input device 36 for that rider 22.
Such rotation of control of the various types of user input devices 32 throughout the duration of the ride attraction 12 may further provide a more interactive ride experience that may change during subsequent rides of the ride attraction 12.
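The rotation of device control described above may be sketched, purely for illustration, as a cyclic shift of role assignments among riders; the seat names and role strings below are hypothetical:

```python
def rotate_roles(assignments):
    """Shift every rider's active device to the next rider so that,
    over the course of the ride, each rider takes a turn driving."""
    riders = list(assignments)
    roles = [assignments[r] for r in riders]
    rotated = roles[-1:] + roles[:-1]   # last rider's role wraps to the first
    return dict(zip(riders, rotated))

seats = {"rider1": "steering", "rider2": "marking", "rider3": "marking"}
```

Applying `rotate_roles` once hands steering from rider1 to rider2 while exactly one steering device remains active, consistent with activating the steering device for one rider at a time.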
As previously discussed, ride and game control system 24 may provide riders 22 with divided control of the movement of ride vehicle 20 based at least in part on signals received from one or more user input devices 32 regarding the steering of ride vehicle 20 and based at least in part on the riders' 22 interaction with the AR environment 25, which may be presented as a game setting. FIG. 2 is a block diagram illustrating various components of the ride and game control system 24. Ride and game control system 24 may be a dedicated system disposed on ride vehicle 20 or integrated with ride vehicle 20. The ride and game control system 24 may include a communication network 40 (e.g., wired and/or wireless communication networks, such as a wireless local area network [WLAN], a wireless wide area network [WWAN], and near field communication [NFC]), a ride controller 42, a game controller 44, a monitoring system 46, and one or more game systems 48. The communication network 40 may communicatively couple the ride and game control system 24 to other systems, such as the show effects system 30 and/or other ride attraction systems. The show effects system 30 may include a memory 50 and processor 52 disposed along the track 18, a projection game computer, one or more display devices 54, one or more lighting systems 56, and other devices (e.g., sound systems, speakers). The communication network 40 may also communicatively couple the ride vehicle 20 and various on-board components of the ride and game control system 24 (e.g., ride controller 42, game controller 44, monitoring system 46, one or more game systems 48, head mounted display 28) to one another, as shown in the illustrated embodiment.
The ride controller 42 of the ride and game control system 24 may be a Programmable Logic Controller (PLC) or other suitable control device. The ride controller 42 may include a processor 58 (e.g., a general purpose processor, a system on a chip (SoC) device, an Application Specific Integrated Circuit (ASIC), or some other similar processor configuration), the processor 58 operatively coupled to a memory 60 (e.g., a tangible, non-transitory computer-readable medium and/or other storage device) to execute instructions for tracking operational parameters of the ride vehicle 20 and instructions for causing movement of the ride vehicle 20. As such, ride controller 42 may be configured to track operational parameters of ride vehicle 20 (including, but not limited to, position, yaw, pitch, roll, and speed of ride vehicle 20) and input control states (e.g., inputs provided by one or more riders 22 to steer and/or drive ride vehicle 20). In addition, ride controller 42 may be configured to control or change the physical operation of ride vehicle 20 (e.g., change the position, yaw, pitch, roll, and speed of ride vehicle 20) based on input signals received from user input devices 32 and/or game controller 44. Based at least in part on input signals received from user input device 32 and/or game controller 44, ride controller 42 may output one or more control signals indicative of instructions to a motor 62, brake 64, and/or steering system 65 of ride vehicle 20 to perform the input movement, such as turning ride vehicle 20, changing the speed of ride vehicle 20, rotating ride vehicle 20, or other suitable movement.
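The ride controller's output stage, mapping a validated movement request to actuator commands for the motor, brake, and steering system, may be sketched in a non-limiting way as a lookup table. The signal names and normalized command values are illustrative assumptions only:

```python
def actuator_commands(input_signal):
    """Map a validated input movement to normalized commands for the
    motor, brake, and steering system; unknown signals yield None."""
    table = {
        "accelerate": {"motor": 1.0, "brake": 0.0, "steering": 0.0},
        "brake":      {"motor": 0.0, "brake": 1.0, "steering": 0.0},
        "turn_left":  {"motor": 0.0, "brake": 0.0, "steering": -1.0},
        "turn_right": {"motor": 0.0, "brake": 0.0, "steering": 1.0},
    }
    return table.get(input_signal)
```

A real controller (e.g., a PLC as described) would of course apply safety limits and ramping rather than direct table lookups; the sketch only shows the signal-to-actuator mapping.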
The game controller 44 of the ride and game control system 24 may be a Programmable Logic Controller (PLC) or other suitable control device. The game controller 44 may include a processor 66 (e.g., a general purpose processor, a system on a chip (SoC) device, an Application Specific Integrated Circuit (ASIC), or some other similar processor configuration) operatively coupled to a memory 68 (e.g., a tangible, non-transitory computer-readable medium and/or other storage device) to execute instructions stored in the memory 68. Game controller 44 may be configured to provide operating parameters or information regarding ride vehicle 20 to one or more game systems 48 and/or ride controller 42. The operating parameters or information may include, but are not limited to, position, yaw, pitch, roll, and speed of ride vehicle 20. Additionally, the game controller 44 may be configured to determine and output signals to one or more game systems 48 and/or ride controller 42 that indicate how input signals received from the user input devices 32 should affect movement of the ride vehicle 20. Accordingly, the game controller 44 may output command signals to the ride controller 42 that indicate particular movements associated with input signals received via the user input devices 32 (e.g., the steering user input device 34, the one or more marking user input devices 36, and the other user input devices 38). Game controller 44 may also coordinate the movement instructions output to the ride controller 42 with the control signals and parameters output to one or more game systems 48 indicating how ride vehicle 20 is to be moved, in order to coordinate movement of ride vehicle 20 with the AR environment 25 presented to the riders 22 via the head mounted displays 28.
Monitoring system 46 may include any suitable sensors and/or computing systems disposed on ride vehicle 20 or integrated with ride vehicle 20 to track the position, location, orientation, presence, etc. of the riders 22 and/or the position, location, or orientation of ride vehicle 20. Such sensors may include orientation and position sensors (e.g., accelerometers, magnetometers, gyroscopes, global positioning system [GPS] receivers), motion tracking sensors (e.g., electromagnetic and solid state motion tracking sensors), Inertial Measurement Units (IMUs), presence sensors, and so forth. Information obtained by the monitoring system 46 may be provided to the game controller 44, one or more game systems 48, and/or ride controller 42 for determining a gaze direction, viewing perspective, field of view, interaction with the game, etc. for each rider. In an embodiment, the monitoring system 46 may also receive data obtained by the head mounted display 28 (e.g., position and orientation data of the head mounted display 28) indicative of the gaze direction, viewing perspective, field of view, interaction with the game, and so forth of the respective rider.
The one or more game systems 48 may be a Central Processing Unit (CPU) or other suitable system, and may generally be configured to render virtual or enhanced graphics for overlay on a view of the real world environment. As such, the one or more game systems 48 may generate the AR environment 25 with which the riders 22 may interact in the attraction setting. The one or more game systems 48 may also be responsible for the game logic and may run simulations of the real-world ride vehicle and stage geometry for placement of virtual objects in real space. In certain embodiments, the one or more game systems 48 are configured to provide an AR and/or gameplay experience to the riders 22 via the head mounted displays 28. Specifically, each seat or position of the ride vehicle 20 may include a dedicated game system 48. In an embodiment, the one or more game systems 48 may be communicatively coupled to each other such that the riders may participate in a shared game (e.g., a game with multiple players). The one or more game systems 48 may be communicatively coupled (directly or indirectly) to the ride controller 42, game controller 44, monitoring system 46, and show effects system 30. Each of the one or more game systems 48 may include a user input device 32 or a set of multiple user input devices 32 and a computer graphics generation system 70. The user input devices 32 may be communicatively coupled to the computer graphics generation system 70, and the computer graphics generation system 70 may be communicatively coupled to a respective head mounted display 28 (e.g., via the communication network 40).
The one or more user input devices 32 may include one or more devices (e.g., hand-held controllers, joysticks, buttons) disposed on the ride vehicle 20 to enable the respective rider 22 to provide inputs to the ride controller 42, game controller 44, and/or one or more game systems 48 for gameplay and for controlling movement of the ride vehicle 20, such as changing the speed and/or direction of travel of the ride vehicle 20. For example, as previously discussed, the steering user input device 34 may include a steering wheel or other device to allow the rider 22 to steer the ride vehicle 20. The one or more user input devices 32 may also include the marking user input device 36 to allow the rider 22 to interact with the AR objects 26 of the AR environment 25. The user input devices 32 may include any other type of input device configured to allow the rider 22 to interact with the AR environment 25 (e.g., a gaming environment) and/or directly control the operation of the ride vehicle 20, such as an accelerator user input device. Additionally, the one or more user input devices 32 may be configured to allow different actions and/or effects to be applied in the AR environment 25. For example, the one or more user input devices 32 may allow the rider 22 to control an AR object 26 (e.g., a character or object) of the AR environment 25 in different directions (e.g., up, down, left, right). In an embodiment, the one or more user input devices 32 may also include a display screen and/or a touch screen to enable ride and game related information to be communicated to the riders 22, such as information regarding which user input device(s) 32 are currently activated for each rider 22 and/or game instructions.
The computer graphics generation system 70 may generate and transmit AR graphics (e.g., the AR environment 25 including the AR objects 26) to be displayed on the respective head mounted display 28 so that the respective rider 22 may visualize the AR environment 25 (e.g., a gaming environment). The computer graphics generation system 70 includes processing circuitry, such as a processor 72 (e.g., a general purpose processor or other processor) and a memory 74, and may process data useful in generating the AR environment 25 for the respective rider 22. The data useful in generating the AR environment 25 may include, but is not limited to, real-time data received from the respective head mounted display 28, the one or more user input devices 32, and the game controller 44 (e.g., including data from the ride controller 42 and monitoring system 46), and data stored in the memory 74.
The computer graphics generation system 70 may use such data to generate a reference frame to register the AR environment 25 to the real environment of the ride attraction 12 (e.g., to generated real world imagery or to the actual physical environment). In particular, in certain embodiments, using the reference frame generated based on orientation data, position data, viewpoint data, motion tracking data, and so forth, the computer graphics generation system 70 may render a view of the AR environment 25 in a manner that is commensurate in time and space with what the respective rider 22 would perceive if not wearing the head mounted display 28. The computer graphics generation system 70 may store a model of the ride attraction 12 built using spatial information including real-world physical features of the ride attraction 12 (e.g., the physical scene of the ride attraction 12). The model may be used, together with other inputs, such as inputs from the ride controller 42, the game controller 44, the monitoring system 46, and/or the head mounted display 28, to locate the respective rider 22 and determine the rider's gaze direction and/or field of view. The model may be used to provide display signals to the head mounted display 28 that are dynamically updated as the rider 22 travels along the track 18.
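The registration step described above, expressing a world-anchored AR object in the moving vehicle's reference frame, can be illustrated with a simplified two-dimensional sketch (a non-limiting stand-in for a full 3-D pose transform; all names are hypothetical):

```python
import math

def world_to_vehicle(point, vehicle_pos, vehicle_yaw):
    """Translate and rotate a world-space AR object position into the
    vehicle's reference frame (2-D for brevity); a graphics system
    could then render the object at the returned coordinates."""
    dx = point[0] - vehicle_pos[0]
    dy = point[1] - vehicle_pos[1]
    c, s = math.cos(-vehicle_yaw), math.sin(-vehicle_yaw)
    return (c * dx - s * dy, s * dx + c * dy)
```

With zero yaw the transform is a pure translation; as the vehicle yaws along the track, the same world point moves through the rider's field of view accordingly.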
For example, the computer graphics generation system 70 may selectively generate AR graphics of the AR environment 25 (e.g., AR objects 26 including instruction objects) to reflect changes in the respective rider's orientation, position, gaze direction, field of view, motion, and so forth. The computer graphics generation system 70 may selectively generate the AR environment 25 based on data received from the monitoring system 46, game controller 44, and/or ride controller 42 indicative of the position, yaw, and speed of ride vehicle 20, and/or other operating parameters. The computer graphics generation system 70 may also selectively generate AR graphics to reflect changes in the inputs provided by the respective rider using the one or more user input devices 32. Further, the computer graphics generation system 70 may generate AR graphics based on simulated interactions that may cause the AR objects 26 to be affected according to certain predetermined or modeled responses stored by the computer graphics generation system 70 (e.g., in the memory 74). As an example, the predetermined or modeled responses may be implemented by a physics engine or similar module of, or as part of, the computer graphics generation system 70. In some embodiments, the computer graphics generation system 70 may track the information or data set forth above for multiple riders 22 in a shared game, so that the riders 22 in the shared game may see the game effects applied by the other riders 22 (e.g., players) in the shared game.
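One minimal, non-limiting example of such a predetermined or modeled response, here an assumed health-and-marking rule for an AR target hit by an AR projectile, could look as follows (all field names are illustrative):

```python
def apply_hit(ar_object, hit):
    """Apply a modeled response to a marking interaction: each hit
    reduces the object's health, and a depleted object is marked."""
    if hit:
        ar_object["health"] -= 1
        if ar_object["health"] <= 0:
            ar_object["marked"] = True
    return ar_object
```

In a shared game, broadcasting the updated object state to each rider's game system 48 would let every player see the same marked target.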
Additionally, computer graphics generation system 70 may receive input signals from ride controller 42 and/or game controller 44 indicative of movement of ride vehicle 20, such that computer graphics generation system 70 may generate the AR environment 25 based at least in part on how ride vehicle 20 is moving along or around track 18. That is, the game controller 44 may determine, at least in part, how the ride vehicle 20 should move in response to the inputs received from the user input devices 32 (e.g., the steering user input device 34, the marking user input device 36, and the other user input devices 38) indicative of the riders' 22 interaction with the AR environment 25, and the game controller 44 may output a signal to the computer graphics generation system 70 indicative of how the ride vehicle 20 will move in response to that interaction. As such, the computer graphics generation system 70 may generate the AR environment 25 based at least in part on how the riders 22 are causing the ride vehicle 20 to move. Further, computer graphics generation system 70 may receive signals from ride controller 42 indicative of certain direct user inputs received from the steering user input device 34 (such as steering of ride vehicle 20), such that computer graphics generation system 70 may further generate the AR environment 25 based at least in part on how the riders 22 are causing the ride vehicle 20 to move.
Fig. 3 is an illustration of an embodiment of the head mounted display 28 that may be worn by the occupant 22 during the ride attraction 12. In an embodiment, the head mounted display 28 may be coupled (e.g., tethered via a cable or wire) to the ride vehicle 20. In an embodiment, the occupant 22 of the ride vehicle 20 may purchase or otherwise be provided with the head mounted display 28. The head mounted display 28 may include electronic glasses 80 (e.g., AR glasses, goggles) and a wearable portion 82 configured to receive at least a portion of the electronic glasses 80. The head mounted display 28 may be used alone or in combination with other features to create a surreal environment 84 (e.g., including the AR environment 25), which may include an AR experience or other similar surreal environment for the respective occupant 22. In particular, the head mounted display 28 may be worn by the occupant 22 throughout the ride attraction 12.
The head mounted display 28 may include a processor 86 and a memory 88 (e.g., a tangible, non-transitory computer readable medium). The processor 86 and memory 88 may be configured to allow the head mounted display 28 to function as a display (e.g., to receive signals from the computer graphics generation system 70 that ultimately drives the display). The processor 86 may be a general purpose processor, a system on a chip (SoC) device, an Application Specific Integrated Circuit (ASIC), or some other similar processor configuration.
The head mounted display 28 may include a tracking system 90, which tracking system 90 may include orientation and/or position sensors (such as accelerometers, magnetometers, gyroscopes, GPS receivers), motion tracking sensors, electromagnetic and solid state motion tracking sensors, IMUs, presence sensors, and the like. The tracking system 90 may collect real-time data indicative of the occupant's position, orientation, focus, gaze direction, field of view, motion, or any combination thereof. The head mounted display 28 may include a communication interface 92 (e.g., including a wireless transceiver), which communication interface 92 may transmit real-time data captured via the tracking system 90 to the processor 86 and/or the computer graphics generation system 70 for processing. The communication interface 92 may also allow the head mounted display 28 to receive display signals transmitted by the computer graphics generation system 70.
The electronic glasses 80 of the head mounted display 28 may include one or more displays 94. The one or more displays 94 may include a see-through display surface onto which images are projected, such as a see-through Liquid Crystal Display (LCD), a see-through Organic Light Emitting Diode (OLED) display, or other similar display useful in displaying real world and AR graphical images to the occupant 22. For example, the occupant 22 may view AR graphics that appear to be superimposed onto the actual and physical real world environment on the respective display 94. According to the present embodiment, the head mounted display 28 may receive display signals (e.g., AR graphics along with corresponding overlay information, such as spatial and/or temporal information about the one or more displays 94) via the communication interface 92 such that the head mounted display 28 may process the AR graphics via the processor 86 and overlay the AR graphics on the one or more displays 94 such that the occupant 22 perceives the AR graphics of the AR environment 25 as being integrated into the real world environment. In an embodiment, the head mounted display 28 may include one or more sound devices (e.g., headphones, speakers, microphones).
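As a rough sketch of the overlay step described above, in which AR graphics are drawn onto a see-through display at positions carried in the display signal while the real world remains visible elsewhere, consider the following. The sparse-frame representation and all names here are illustrative assumptions, not the patented implementation.

```python
def overlay_ar_graphics(display_frame: dict, ar_graphics: dict, overlay_info: dict) -> dict:
    """Write AR pixels into a sparse display frame at the signalled position.

    display_frame: dict mapping (x, y) -> pixel value. Cells left empty stay
    transparent, so the physical real-world environment shows through the
    see-through display there, and the AR graphics appear superimposed on it.
    """
    ox, oy = overlay_info["position"]  # spatial overlay information from the display signal
    for (dx, dy), pixel in ar_graphics.items():
        display_frame[(ox + dx, oy + dy)] = pixel
    return display_frame
```

A real pipeline would also use the temporal overlay information to schedule when each graphic is drawn, but the spatial placement step is the core of the superimposition.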
To illustrate driver control over movement of the ride vehicle 20, fig. 4 is a schematic diagram showing a driving occupant 100 (e.g., an occupant 22 having control of an activated steering user input device 34 for steering the ride vehicle 20) steering the ride vehicle 20 via the steering user input device 34. As previously mentioned, the ride and game control system 24 of the ride vehicle 20 may allow the occupants 22 divided control of the movement of the ride vehicle 20. As such, the driving occupant 100 may affect movement of the ride vehicle 20 in two ways: via direct input, such as steering of the ride vehicle 20 using the steering user input device 34; and via indirect input, such as by following one or more AR instruction objects 104 presented to the driving occupant 100. That is, the driving occupant 100 may steer the ride vehicle 20 via the steering user input device 34. Additionally, the driving occupant 100 may affect the speed of the ride vehicle 20 via their driving performance in following the presented instruction objects 104.
In the illustrated embodiment, the ride vehicle 20 is traveling along the track 18 in the direction 106. The occupants 22 may be viewing the AR environment 25 (e.g., a gaming environment) combined with the themed attraction 108 (e.g., the real-world scenery of the ride attraction 12) via the head mounted displays 28. At some point during the ride attraction 12, one occupant 22 may be the driving occupant 100. The driving occupant 100 may be the occupant 22 whose respective steering user input device 34 is currently activated. As previously discussed, the ride and game control system 24 may toggle control of the steering user input devices 34 between occupants 22 throughout the ride attraction 12 so that each occupant 22 of the ride vehicle 20 may have an opportunity to steer the ride vehicle 20. The steering user input device 34 may output signals to the ride and game control system 24 (e.g., to the ride controller 42, the game controller 44, and/or the respective game systems 48) to directly control movement of the ride vehicle 20. While traveling along the track 18, the driving occupant 100 may be presented with an AR object 26 or a physical object (i.e., the instruction object 104) via the respective head mounted display 28. The instruction object 104 may present driving or gaming instructions to the driving occupant 100, such as instructions to turn in a certain direction, increase or decrease speed, follow a certain track at a track fork, or hit or avoid an AR object 26 of the AR environment 25. The instruction object 104 indicating the driving instruction may be presented only to the driving occupant 100 or may be presented to all occupants 22 of the ride vehicle 20. Additionally, although the instruction object 104 is shown as an arrow in the illustrated embodiment, the instruction object 104 may be text, a symbol, a graphic, a character, or any other object that may be used to indicate to the driving occupant 100 a direction in which to steer the ride vehicle 20 or otherwise control the ride vehicle 20.
As an example, in the illustrated embodiment, the driving occupant 100 controlling the steering user input device 34 is presented, via the respective head mounted display 28, with the instruction object 104 combined with the themed attraction 108 within its field of view 110. In the illustrated embodiment, the instruction object 104 is presented as an arrow indicating the direction of the turn to follow at an upcoming track fork 112. The track 18 diverges at the track fork 112, such that the driving occupant 100 may continue on the current path by staying on the track 114, or may turn the ride vehicle 20 using the steering user input device 34 to follow the track 116 as indicated by the instruction object 104. If the driving occupant 100 uses the steering user input device 34 to turn the ride vehicle 20 toward the track 116, the ride vehicle 20 may be controlled to turn in the direction of the track 116 by the ride controller 42 and/or the game controller 44 of the ride and game control system 24. In this way, the driving occupant 100 is able to directly control movement of the ride vehicle 20 via the steering user input device 34.
Additionally, the driving occupant 100 may control movement of the ride vehicle 20 by following the instruction object 104. As such, an indication of proper compliance with the presented instruction object 104 may result in additional movement of the ride vehicle 20, and/or may add to the score of the driving occupant 100 or the score of the ride vehicle 20, which in turn may result in additional movement of the ride vehicle 20 triggered by the ride and game control system 24. For example, in the illustrated embodiment, if the driving occupant 100 follows the instruction object 104 by turning the ride vehicle 20 toward the track 116 using the steering user input device 34, not only will the ride vehicle 20 be caused to turn toward the track 116 based at least in part on the signal received from the steering user input device 34, but additional movement of the ride vehicle 20 may be triggered by properly following the instruction object 104 and/or increasing the driving score. The movement triggered by compliance with the instruction object 104 may include any movement in addition to the direct movement initiated by the input signal from the steering user input device 34, such as a change in speed of the ride vehicle 20 (e.g., from a first non-zero speed to a second non-zero speed), starting the ride vehicle 20, stopping the ride vehicle 20, or other suitable movement. For example, the movement triggered by compliance with the instruction object 104 may be causing the ride vehicle 20 to move faster to provide a more exciting ride, or causing the ride vehicle 20 to move slower to allow the other occupants 22 more time to interact with the AR objects 26, thereby allowing the other occupants 22 and/or the ride vehicle 20 to collect more points. In this way, the driving occupant 100 is able to indirectly control movement of the ride vehicle 20 by following the presented instruction object 104.
Additionally, compliance with the instruction object 104 may cause the AR environment 25 to change such that other occupants 22 may be presented with more AR objects 26.
In embodiments, in addition to triggering movement of the ride vehicle 20 by complying with the instruction object 104 and/or increasing the score of the driving occupant 100 or the score of the ride vehicle 20, the driving occupant 100 may also cause movement of the ride vehicle 20 by not complying, or incorrectly complying, with the instruction object 104. Such movement may include spinning of the ride vehicle 20, a decrease in speed of the ride vehicle 20, or other suitable movement. For example, in the illustrated embodiment, if the driving occupant 100 does not comply with the instruction object 104 indicating to steer the ride vehicle 20 toward the track 116 (e.g., within a time period after the instruction is displayed and/or before the track fork 112), the ride and game control system 24 may cause the ride vehicle 20 to spin and/or change speed based at least in part on the missed instruction object 104. For example, the movement triggered by non-compliance with, or missing of, the instruction object 104 may be causing the ride vehicle 20 to move faster to make it more difficult for the other occupants 22 to earn points, or to move slower to make the game easier. As such, in an embodiment, the driving occupant 100 may also indirectly cause movement of the ride vehicle 20 by not following the presented instruction object 104. Additionally, missing the instruction object 104 may cause the AR environment 25 to change such that other occupants 22 may be presented with fewer or different AR objects 26. In an embodiment, a missed instruction object 104 may result in a decrease in the occupant score or the ride vehicle score, which in turn may also trigger movement of the ride vehicle 20.
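The compliance logic described in the two passages above maps a single observation (instruction followed or missed) to a triggered movement and a score change. A toy sketch of that mapping follows; the multipliers, score deltas, and function name are illustrative assumptions, not values from the disclosure.

```python
def movement_for_instruction(followed: bool, base_speed: float) -> dict:
    """Map compliance with an instruction object to a triggered vehicle movement.

    Returns an illustrative movement command: a new (non-zero) speed, whether
    to spin the vehicle, and a score adjustment. All numbers are placeholders.
    """
    if followed:
        # Proper compliance: reward with a speed change from a first non-zero
        # speed to a second non-zero speed, and increase the score.
        return {"speed": base_speed * 1.5, "spin": False, "score_delta": +10}
    # Missed or incorrect compliance: trigger a different movement, e.g. a
    # spin and a speed reduction, and decrease the score.
    return {"speed": base_speed * 0.5, "spin": True, "score_delta": -5}
```

The same structure extends naturally to the score-threshold variant, where the accumulated `score_delta` values, rather than each individual outcome, decide when an extra movement fires.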
In an embodiment, the game controller 44 and/or the respective game system 48 of the ride and game control system 24 may include a weighting system that may result in the presentation, to each occupant 22 or ride vehicle 20, of particular AR objects (including the instruction object 104) corresponding to the skill level of the occupant 22 or ride vehicle 20 as determined by the weighting system. The ride and game control system 24 may monitor the driving of each occupant 22 and/or ride vehicle 20, and/or the interaction with the AR environment 25 (including the instruction objects 104), for a determined period of time at the beginning of the ride attraction 12. Based at least in part on the monitored driving and/or interaction with the AR environment 25, the ride and game control system 24 may determine an initial skill level for the occupant 22 or the ride vehicle 20. Subsequent scenes or predetermined interactions may then be generated by the computer graphics generation system 70 and presented to one or more occupants 22 based on the determined skill level.
As such, each occupant 22 or ride vehicle 20 may be presented with an AR environment 25 (including the instruction object 104) corresponding to the determined skill level, such that each occupant 22 or ride vehicle 20 may be presented with different AR objects at the same interaction area along the ride attraction 12. The weighting system of the ride and game control system 24 may determine and update the skill level of the occupant 22 and/or the ride vehicle 20 at each predetermined interaction area, thereby allowing the AR environment 25 to correspond to the skill level of the occupant 22, and/or of all occupants 22 of the ride vehicle 20 generally, as the ride vehicle 20 travels through the ride attraction 12. Thus, the instruction object 104 presented to the driving occupant 100 may be based at least in part on the currently determined skill level of the driving occupant 100 and/or the currently determined skill level of the occupants 22 of the ride vehicle 20 as a whole. Additionally, the types of movements of the ride vehicle 20 that are directly and indirectly triggered by the input received from the steering user input device 34 may be based at least in part on the currently determined skill level of the driving occupant 100 and/or the currently determined skill level of the occupants 22 of the ride vehicle 20 as a whole. For example, if the driving occupant 100 is determined to have a relatively high skill level, the instruction object 104 may appear closer to the track fork 112 than it would for a lower skill level, and/or proper compliance with the instruction may result in a greater change in speed of the ride vehicle 20.
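One simple way a weighting system like the one described could estimate and apply a skill level is sketched below. The recency-weighted average and the distance formula are purely illustrative assumptions; the disclosure does not specify a particular weighting scheme.

```python
def update_skill_level(history: list, window: int = 5) -> float:
    """Recency-weighted skill estimate from recent interaction outcomes.

    history: list of outcomes, 1.0 for a followed instruction / hit,
    0.0 for a miss. Later outcomes weigh more, so the estimate tracks
    the occupant's current performance at each interaction area.
    """
    recent = history[-window:]
    if not recent:
        return 0.0
    weights = range(1, len(recent) + 1)
    return sum(w * r for w, r in zip(weights, recent)) / sum(weights)

def select_instruction_distance(skill: float, near: float = 5.0, far: float = 20.0) -> float:
    """Higher skill -> the instruction object appears closer to the track fork,
    giving the driving occupant less reaction time (distances in meters,
    illustrative)."""
    return far - skill * (far - near)
```

Updating the estimate at every predetermined interaction area, as the text describes, then amounts to appending each outcome to `history` and recomputing before the next instruction object is generated.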
Similarly, fig. 5 is a schematic diagram illustrating a marking occupant 130 having control of a marking user input device 36 for interacting with the AR objects 26 of the AR environment 25. Several occupants 22 may control the marking user input devices 36 at the same time, and, as previously discussed, control of the different types of user input devices 32 may be toggled between occupants 22 of the ride vehicle 20 throughout the ride attraction 12. As previously mentioned, the ride and game control system 24 of the ride vehicle 20 may allow the occupants 22 divided control of the movement of the ride vehicle 20. As such, the marking occupant 130 may affect movement of the ride vehicle 20 through interaction with the AR objects 26 of the AR environment 25 (e.g., a gaming environment). Although described in the illustrated embodiment as a marking user input device, the user input devices 32 other than the steering user input device 34 may be any other type of user input device 32 that may be used to interact with the AR environment 25, such as a device for grabbing or moving the AR objects 26 or another suitable user input device 32.
In the illustrated embodiment, the ride vehicle 20 is traveling along the track 18 in the direction 106. The occupants 22 may be viewing the AR environment 25 combined with the themed attraction 108 (e.g., the real-world scenery of the ride attraction 12) via the head mounted displays 28. While traveling along the track 18, one or more marking occupants 130 may be presented with AR objects 26, such as objects, rewards, and/or characters, within their fields of view 134. The marking occupant 130 may mark or otherwise interact with an AR object 26 using an AR projectile 136, as in the illustrated embodiment. Such interaction with the AR objects 26 of the AR environment 25 may indirectly affect or trigger movement of the ride vehicle 20 via the ride and game control system 24. Interaction with the AR objects 26 may also increase the occupant score or the ride vehicle score, which in turn may affect or trigger movement of the ride vehicle 20. For example, the marking occupant 130 may mark a presented AR object 26 with the AR projectile 136. If the marking occupant 130 marks or hits the AR object 26, or wins a game with the AR object 26 (e.g., a character), the ride and game control system 24 may trigger certain movements of the ride vehicle 20, such as changing the speed of the ride vehicle 20, spinning the ride vehicle 20, shaking the ride vehicle 20 or a component (e.g., a seat) of the ride vehicle 20, or other suitable movements. Additionally, attaining certain occupant scores or ride vehicle scores may also trigger such movement of the ride vehicle 20. The game controller 44 and/or the respective game system 48 may receive input signals from the marking user input device 36 and, based at least in part on the received signals, may output control signals to the ride controller 42 to cause some movement of the ride vehicle 20. In this way, one or more marking occupants 130, or occupants 22 not currently controlling the steering user input device 34, are able to indirectly control movement of the ride vehicle 20 by interacting with the presented AR objects 26.
As previously discussed, in embodiments, the game controller 44 and/or the respective game system 48 of the ride and game control system 24 may include or act as a weighting system that may result in the presentation, to each occupant 22 or ride vehicle 20, of particular AR objects 26 corresponding to the skill level of the occupant 22 or ride vehicle 20 as determined by the weighting system. The ride and game control system 24 may monitor the driving of each occupant 22 and/or ride vehicle 20, and/or the interaction with the AR environment 25, for a determined period of time at the beginning of the ride attraction 12. Based at least in part on the monitored driving and/or interaction with the AR environment 25, the ride and game control system 24 may determine an initial skill level for the occupant 22 or the ride vehicle 20. Subsequent scenes or predetermined interactions may then be generated by the computer graphics generation system 70 and presented to one or more occupants 22 based on the determined skill level.
As such, each occupant 22 or ride vehicle 20 may be presented with an AR environment 25 corresponding to the determined skill level, such that each occupant 22 or ride vehicle 20 may be presented with different AR objects 26 at the same interaction area along the ride attraction 12. The weighting system of the ride and game control system 24 may determine and update the skill level of the occupant 22 and/or the ride vehicle 20 at each predetermined interaction area, thereby allowing the AR environment 25 to correspond to the skill level of the occupant 22, and/or of all occupants 22 of the ride vehicle 20 generally, as the ride vehicle 20 travels through the ride attraction 12. Accordingly, the AR objects 26 presented to the marking occupant 130 may be based at least in part on the currently determined skill level of the marking occupant 130 and/or the currently determined skill level of the occupants 22 of the ride vehicle 20 as a whole. Additionally, the types of movements of the ride vehicle 20 that are indirectly triggered by the input received from the marking user input device 36 may be based at least in part on the currently determined skill level of the marking occupant 130 and/or the currently determined skill level of the occupants 22 of the ride vehicle 20 as a whole. For example, if the marking occupant 130 is determined to have a relatively high skill level, the difficulty level of the interaction may be relatively high compared to a lower skill level, such that it is more difficult to hit the AR object 26, and hitting or marking the AR object 26 may result in a greater change in the speed of the ride vehicle 20.
It should be understood that both figs. 4 and 5 have been simplified to show the perspective of only one occupant 22 at a time, and that all occupants 22 of the ride vehicle 20 may simultaneously control one or more user input devices 32. As discussed above, the driving performance of all occupants 22 of the ride vehicle 20, and their interactions with the AR objects 26, may affect and cause movement of the ride vehicle 20 via the ride and game control system 24.
As previously discussed, movement of the ride vehicle 20 may be controlled by both the driving occupant 100 and the other occupants (e.g., the marking occupant 130). Steering of the ride vehicle 20 may be controlled by the driving occupant 100 via the steering user input device 34, and, additionally, in embodiments, the speed of the ride vehicle 20 may be controlled by the driving occupant 100 via the other user input devices 38 (such as an accelerator user input device or a brake user input device). Movement of the ride vehicle 20, such as the steering and/or speed of the ride vehicle 20, may also be controlled based on the performance of the driving occupant 100 and/or the other occupants 22 (e.g., the marking occupant 130) relative to the AR environment 25 or the game. With this in mind, fig. 6 is a flow diagram of an embodiment of a method 150 for controlling or triggering movement of the ride vehicle 20, via the ride and game control system 24, based at least in part on direct and indirect movement inputs received from the driving occupant 100. The method 150 may be performed continuously by the ride and game control system 24 as the ride vehicle 20 travels along the track 18 and as the driving controls of the ride vehicle 20 (e.g., the steering user input device 34 and/or accelerator user input device controls) are transferred between occupants 22 throughout the ride attraction 12. The method 150 includes generating and displaying the AR instruction object 104 to present driving instructions to the driving occupant 100 (block 152). As previously discussed, the instruction object 104 may be text, a symbol, a graphic, a character, or any other object that indicates to the driving occupant 100 to steer the ride vehicle 20 or control the ride vehicle 20 in a particular manner. The instruction object 104 may also be an object on the track 18 to be avoided.
A current driving occupant 100 of the ride vehicle 20 may directly steer the ride vehicle 20, via the steering user input device 34, in response to the instruction object 104 generated and displayed to the driving occupant 100 via the game controller 44 and/or the respective game system 48. Based on the steering by the driving occupant 100 via the steering user input device 34, a signal indicative of the steering is received by the ride controller 42 and/or the game controller 44 of the ride and game control system 24 (block 154).
Next, the ride and game control system 24 may trigger direct movement of the ride vehicle 20 based at least in part on the steering inputs received via the ride controller 42 and/or the game controller 44, such as turning the ride vehicle 20 in the direction of the steering inputs (block 156). Next, the ride and game control system 24 may determine, via the game controller 44 and/or the respective game system 48, whether the steering input received from the steering user input device 34 corresponds to the instruction presented to the driving occupant 100 via the instruction object 104 (block 158). For example, the determination may include whether the driving occupant 100 has properly turned the steering user input device 34 and/or operated other inputs, such as an accelerator user input device, within a period of time after the instruction object 104 is presented. If the ride and game control system 24 determines that the received steering input corresponds to the instruction object 104, the ride and game control system 24 may trigger additional movement of the ride vehicle 20, such as a change in the speed of the ride vehicle 20 (block 160). Additionally or alternatively, movement of the ride vehicle 20 may be affected or triggered based at least in part on the occupant score and/or the ride vehicle score, such that movement is triggered when the occupant score and/or the ride vehicle score exceeds a threshold score. In such embodiments, the occupant score and/or the ride vehicle score may be increased in response to the driving occupant 100 correctly complying with the instruction object 104.
If the ride and game control system 24 determines that the received steering input does not correspond to the instruction object 104, the ride and game control system 24 may trigger a different movement of the ride vehicle 20, such as a change in the speed of the ride vehicle 20 and/or a spin of the ride vehicle 20 (block 162). In an embodiment, if the ride and game control system 24 determines that the received steering input does not correspond to the presented instruction object 104, no movement may be triggered. It should be understood that the method 150 may be an iterative or repetitive process that is performed throughout the duration of the ride attraction 12 to trigger movement of the ride vehicle 20.
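One pass through blocks 152–162 of the method 150 can be summarized in code. This is a toy sketch under assumed names and illustrative multipliers, not the claimed control logic; a real implementation would act on ride hardware rather than a dictionary.

```python
def method_150_step(steering_input: str, instruction: str, vehicle: dict) -> dict:
    """One iteration of the method-150 flow (blocks 156-162), simplified.

    steering_input: direction commanded via the steering device
    instruction:    direction indicated by the displayed instruction object
    vehicle:        mutable state with 'heading' and 'speed' (illustrative)
    """
    # Block 156: direct movement - turn the vehicle toward the steering input.
    vehicle["heading"] = steering_input
    # Block 158: does the received input correspond to the instruction?
    if steering_input == instruction:
        # Block 160: additional movement, e.g. a speed change, plus a score bump.
        vehicle["speed"] *= 1.2
        vehicle["score"] = vehicle.get("score", 0) + 1
    else:
        # Block 162: a different movement, e.g. slow down and spin.
        vehicle["speed"] *= 0.8
        vehicle["spin"] = True
    return vehicle
```

Running the step in a loop over successive instruction objects mirrors the iterative nature of the method noted above.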
Fig. 7 is a flow diagram of an embodiment of a method 170 for controlling or triggering movement of the ride vehicle 20, via the ride and game control system 24, based at least in part on the interaction of one or more marking occupants 130 with the AR environment 25. The method 170 may be performed continuously by the ride and game control system 24 as the ride vehicle 20 travels along the track 18, during control of the one or more marking user input devices 36 throughout the ride attraction 12 or as control is transferred between occupants 22 at certain times or portions of the ride attraction 12. The method 170 includes generating and displaying the AR objects 26 of the AR environment 25 to the one or more marking occupants 130 (block 172). The one or more marking occupants 130 may interact with an AR object 26 (e.g., mark the AR object 26) via the one or more marking user input devices 36. Based on the interaction of the one or more current marking occupants 130 with the AR object 26, a signal indicative of the interaction is received by the game controller 44 of the ride and game control system 24 and/or the respective game system 48 (block 174).
Next, the ride and game control system 24 may determine, via the game controller 44 and/or the respective game system 48, whether the received signal indicative of the interaction of the one or more marking occupants 130 with the AR object 26 corresponds to a hit of the AR object 26 or a win of the simulated game (block 176). If the ride and game control system 24 determines that the interaction corresponds to a marking or hit of the AR object 26 or a win of the simulated game, the ride and game control system 24 may trigger movement of the ride vehicle 20, such as a change in the speed of the ride vehicle 20 (block 178). Additionally or alternatively, movement of the ride vehicle 20 based at least in part on the interaction of the one or more marking occupants 130 with the AR environment 25 may be triggered based at least in part on the occupant score and/or the ride vehicle score, such that additional movement is triggered when the occupant score and/or the ride vehicle score exceeds a threshold score. In such embodiments, the occupant score and/or the ride vehicle score may be increased based on the hit or win. If the ride and game control system 24 determines that the interaction does not correspond to a marking or hit of the AR object 26 or a win of the simulated game, the ride and game control system 24 may trigger a movement of the ride vehicle 20, such as changing the speed of the ride vehicle 20, or may not trigger additional movement of the ride vehicle 20.
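A single evaluation of the method-170 flow, from marking interaction to score-threshold-triggered movement, might look like the sketch below. The 2D aim points, hit radius, and score threshold are all illustrative assumptions standing in for whatever hit detection the game system actually uses.

```python
def method_170_step(shot: tuple, target: tuple, vehicle: dict) -> tuple:
    """One iteration of the method-170 flow (blocks 174-178), simplified.

    shot/target: 2D aim points; landing within `radius` of the target counts
    as marking the AR object. Returns (hit, updated vehicle state).
    """
    radius = 1.0  # illustrative hit tolerance
    hit = (shot[0] - target[0]) ** 2 + (shot[1] - target[1]) ** 2 <= radius ** 2
    if hit:
        # Block 178: a hit raises the occupant/ride-vehicle score...
        vehicle["score"] = vehicle.get("score", 0) + 5
        # ...and movement is triggered once the score exceeds a threshold.
        if vehicle["score"] >= 10:
            vehicle["shake"] = True
    return hit, vehicle
```

Evaluating this for every marking occupant on each signal received from a marking user input device gives the continuous, per-occupant behavior the text describes.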
It should be understood that the methods 150 and 170 may be performed independently or together throughout the duration of the ride attraction 12. The methods 150 and 170 may be performed alternately or at different portions of the ride attraction 12. Thus, movement of the ride vehicle 20 may be controlled sometimes by the driving occupant 100, sometimes by the other occupants 22 (e.g., the marking occupant 130), and sometimes by both the driving occupant 100 and the marking occupant 130. Further, as each occupant 22 may control at least one user input device 32 at the same time, the methods 150 and 170 may be performed continuously for each occupant 22 and each user input device 32 throughout the duration of the ride attraction 12 to create divided control of the movement of the ride vehicle 20 among the occupants 22 through their interaction with the AR environment 25 (e.g., a gaming environment).
While certain features of the present embodiments have been illustrated and described herein, many modifications and changes will become apparent to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit of the disclosure. Further, it should be understood that certain elements of the disclosed embodiments may be combined or interchanged with one another.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible, or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as "means for [perform]ing [a function]" or "step for [perform]ing [a function]", it is intended that such elements are to be interpreted under 35 U.S.C. 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. 112(f).
Claims (20)
1. A ride system for an amusement park, comprising:
a ride vehicle configured to accommodate a rider and configured to travel along a ride path;
a head mounted display configured to be worn by the rider; and
a control system configured to:
displaying a virtual instruction to the occupant via the head mounted display;
receiving a signal from a user input device associated with the rider;
determining that the signal received from the user input device corresponds to the virtual instruction; and
triggering movement of the ride vehicle in response to determining that the signal corresponds to the virtual instruction.
2. The ride system of claim 1, wherein the virtual instruction comprises an Augmented Reality (AR) object.
3. The ride system of claim 1, comprising the user input device, wherein the user input device comprises a steering device configured to enable the rider to steer the ride vehicle.
4. The ride system of claim 3, wherein the control system is configured to control a steering system of the ride vehicle to steer the ride vehicle according to the signal received from the steering device, and to trigger a change in a speed of the ride vehicle in response to determining that the signal corresponds to the virtual instruction.
5. The ride system of claim 1, wherein the ride vehicle is configured to accommodate another rider, wherein the ride system comprises another head mounted display configured to be worn by the other rider, and wherein the control system is further configured to:
displaying an Augmented Reality (AR) object to the other rider via the other head mounted display;
receiving a respective signal from another user input device associated with the other rider;
determining that the respective signal received from the other user input device corresponds to an interaction with the AR object; and
triggering another movement of the ride vehicle in response to determining that the respective signal received from the other user input device corresponds to the interaction with the AR object.
6. The ride system of claim 5, comprising the other user input device, wherein the other user input device comprises a marking device, and wherein determining that the respective signal received from the other user input device corresponds to the interaction with the AR object comprises determining that the respective signal received from the other user input device corresponds to a marking of the AR object with a virtual projectile from the marking device.
7. The ride system of claim 5, wherein the control system is configured to determine a skill level of the other rider based on a weighting of previous interactions of the other rider during a portion of the ride prior to the interaction with the AR object.
8. The ride system of claim 1, wherein the control system is configured to trigger another movement of the ride vehicle in response to determining that the signal does not correspond to the virtual instruction.
9. The ride system of claim 1, wherein the movement comprises changing a speed of the ride vehicle.
10. The ride system of claim 1, wherein the ride system is configured to enable a game to be played via the ride vehicle as the ride vehicle travels along the ride track.
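The ride-system claims above describe a control loop in which a steering signal is compared against a displayed virtual instruction and a ride movement (such as a speed change, per claims 4 and 9, or a penalty movement, per claim 8) is triggered depending on the match. A minimal Python sketch of that flow follows; all class, method, and value names are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of the claimed control flow: a steering signal is
# compared against the currently displayed virtual instruction, and a
# ride movement (a speed change) is triggered on a match; a different
# movement is triggered on a mismatch (cf. claim 8).

class RideControlSystem:
    def __init__(self, instruction=None):
        self.current_instruction = instruction  # e.g., "steer_left"
        self.speed = 5.0                        # arbitrary baseline speed

    def on_steering_signal(self, signal):
        """Trigger a speed boost only when the signal matches the
        displayed virtual instruction; otherwise trigger a slowdown."""
        if self.current_instruction is not None and signal == self.current_instruction:
            self.speed += 2.0                        # reward movement
            return "boost"
        self.speed = max(0.0, self.speed - 1.0)      # penalty movement
        return "penalty"

ctrl = RideControlSystem(instruction="steer_left")
print(ctrl.on_steering_signal("steer_left"))  # matching input → "boost"
print(ctrl.speed)                             # 7.0
```

The key design point mirrored from the claims is that the game logic (instruction matching) and the physical response (speed change) are decoupled: the comparison decides *whether* to move, and the vehicle state records *what* moved.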
11. A method of providing entertainment in an amusement park, comprising:
generating, via a computer graphics generation system communicatively coupled to a control system of a ride vehicle, a gaming environment, wherein the gaming environment includes a plurality of Augmented Reality (AR) objects, wherein the ride vehicle is configured to carry one or more riders and travel along a ride path;
displaying, via one or more head mounted displays associated with each of the one or more riders, the gaming environment;
receiving, via the control system, one or more signals from one or more user input devices associated with each of the one or more riders; and
triggering, via the control system, movement of the ride vehicle based at least in part on the one or more signals received from the one or more user input devices associated with each of the one or more riders, wherein the one or more user input devices comprise at least one steering device and at least one marker device.
12. The method of claim 11, wherein the plurality of AR objects includes an instruction object indicating a virtual instruction, and wherein triggering movement of the ride vehicle comprises:
determining that a respective signal received from the at least one steering device corresponds to the virtual instruction; and
triggering the movement of the ride vehicle in response to determining that the respective signal received from the at least one steering device corresponds to the virtual instruction.
13. The method of claim 12, wherein the movement comprises changing a speed of the ride vehicle.
14. The method of claim 12, wherein the movement comprises spinning the ride vehicle relative to the ride path.
15. The method of claim 11, wherein the movement comprises changing a speed of the ride vehicle.
16. The method of claim 11, wherein triggering movement of the ride vehicle comprises:
determining that a respective signal received from the at least one marker device corresponds to a virtual interaction with at least one of the plurality of AR objects; and
triggering the movement of the ride vehicle in response to determining that the respective signal received from the at least one marker device corresponds to the virtual interaction with the at least one of the plurality of AR objects.
17. The method of claim 11, comprising determining, via the control system, a score for the ride vehicle based at least in part on the one or more signals received from the one or more user input devices associated with each of the one or more riders, wherein triggering the movement of the ride vehicle is based at least in part on the determined score of the ride vehicle.
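Claim 17 describes accumulating a per-vehicle score from rider input signals and making the triggered movement depend on that score. A short Python sketch of one way such score-based triggering could work is below; the point values, threshold, and signal format are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of claim 17: the control system accumulates a
# per-vehicle score from rider input signals (e.g., successful AR-object
# hits) and selects a movement once the score crosses a threshold.

def vehicle_score(signals):
    """Sum points for signals that registered a hit on an AR object."""
    return sum(10 for s in signals if s.get("hit_ar_object"))

def movement_for_score(score, threshold=30):
    """Map the vehicle's accumulated score to a triggered movement."""
    return "speed_boost" if score >= threshold else "no_change"

signals = [{"hit_ar_object": True}, {"hit_ar_object": False},
           {"hit_ar_object": True}, {"hit_ar_object": True}]
score = vehicle_score(signals)
print(score, movement_for_score(score))  # 30 speed_boost
```

Because the score aggregates signals "associated with each of the one or more riders," a cooperative multi-rider vehicle naturally falls out of this structure: every rider's hits feed the same vehicle score.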
18. A ride and gaming control system integrated with a ride vehicle configured to carry a rider along a ride track, the ride and gaming control system comprising:
a game controller configured to:
receive a first input signal from a steering user input device associated with the ride vehicle;
receive a second input signal from at least one marker user input device associated with the ride vehicle;
determine that the received first input signal corresponds to a virtual instruction;
output a first control signal indicative of an instruction to cause a first movement of the ride vehicle in response to determining that the received first input signal corresponds to the virtual instruction;
determine that the received second input signal corresponds to an interaction with an Augmented Reality (AR) object; and
output a second control signal indicative of an instruction to cause a second movement of the ride vehicle in response to determining that the received second input signal corresponds to the interaction with the AR object; and
a ride controller communicatively coupled to the game controller and configured to receive the first control signal and the second control signal and configured to control movement of the ride vehicle based at least in part on the first control signal and the second control signal.
19. The ride and game control system of claim 18, comprising a computer graphics generation system communicatively coupled to the game controller and a head mounted display configured to be worn by the rider, wherein the computer graphics generation system is configured to generate and display the virtual instruction and the AR object via the head mounted display.
20. The ride and game control system of claim 18, wherein the ride controller is configured to adjust a speed of the ride vehicle based at least in part on the first control signal or the second control signal.
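Claims 18 through 20 split responsibility between a game controller (which interprets user input and emits control signals) and a ride controller (which consumes those control signals and moves the vehicle). The following Python sketch illustrates that separation; all names and numeric values are hypothetical illustrations of the claimed architecture, not the patent's implementation.

```python
# Hypothetical sketch of the claim 18 architecture: a game controller
# maps steering and marker inputs to control signals, and a separate
# ride controller applies those signals to vehicle motion (claim 20).

class GameController:
    def __init__(self, virtual_instruction, ar_object_id):
        self.virtual_instruction = virtual_instruction
        self.ar_object_id = ar_object_id

    def process(self, steering_signal, marker_signal):
        """Emit control signals for inputs that match game state."""
        controls = []
        if steering_signal == self.virtual_instruction:
            controls.append("first_movement")   # first control signal
        if marker_signal == self.ar_object_id:
            controls.append("second_movement")  # second control signal
        return controls

class RideController:
    def __init__(self):
        self.speed = 5.0  # arbitrary baseline speed

    def apply(self, control_signals):
        """Adjust vehicle speed based on received control signals."""
        for sig in control_signals:
            if sig == "first_movement":
                self.speed += 1.0
            elif sig == "second_movement":
                self.speed += 0.5
        return self.speed

game = GameController("turn_right", "target_42")
ride = RideController()
print(ride.apply(game.process("turn_right", "target_42")))  # 6.5
```

The communicative coupling in claim 18 corresponds here to the plain function-call boundary between the two objects; in a physical attraction this would be a message bus or wired link, but the division of labor is the same: the game controller decides, the ride controller actuates.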
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US62/467817 | 2017-03-06 | ||
| US15/912281 | 2018-03-05 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| HK40016431A true HK40016431A (en) | 2020-09-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10289194B2 (en) | Gameplay ride vehicle systems and methods | |
| JP7454544B2 (en) | Systems and methods for generating augmented reality and virtual reality images | |
| JP6714791B2 (en) | Simulation system and program | |
| JP7629942B2 (en) | Methods of haptic response and interaction | |
| ES2750959T3 (en) | Interactive gaming floor system and method | |
| JP2020513957A5 (en) | ||
| HK40016431A (en) | Gameplay ride vehicle systems and methods | |
| US20230321552A1 (en) | Systems and methods for storing player state caches locally on ride vehicles | |
| JP2025511880A (en) | System and method for storing player state cache locally on a vehicle | |
| JP7689481B2 (en) | Attraction system, attraction method, attraction program, and attraction server | |
| HK40016435A (en) | Augmented ride system and method | |
| HK40016435B (en) | Augmented ride system and method | |
| JP2010284258A (en) | Game device and game program |