
WO2016204617A2 - Game controller - Google Patents

Game controller

Info

Publication number
WO2016204617A2
WO2016204617A2 (PCT/NL2016/050434)
Authority
WO
WIPO (PCT)
Prior art keywords
scene
interactive system
motion controller
previous
professional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/NL2016/050434
Other languages
French (fr)
Other versions
WO2016204617A3 (en)
Inventor
Mark Thomas Gertruda BEUMERS
Jacobus Josephus Adrianus GROEN IN 'T WOUT
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lagotronics Projects BV
Original Assignee
Lagotronics Projects BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from NL2014974A external-priority patent/NL2014974B1/en
Priority claimed from NL2014976A external-priority patent/NL2014976B1/en
Application filed by Lagotronics Projects BV filed Critical Lagotronics Projects BV
Publication of WO2016204617A2 publication Critical patent/WO2016204617A2/en
Publication of WO2016204617A3 publication Critical patent/WO2016204617A3/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/216 Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/32 Interconnection arrangements between game servers and game devices using local area network [LAN] connections
    • A63F13/327 Interconnection arrangements between game servers and game devices using local area network [LAN] connections using wireless networks, e.g. Wi-Fi® or piconet
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact

Definitions

  • the present invention generally relates to an interactive system for interaction of multiple users with multiple objects within a scene.
  • the present invention relates to a motion controller device to be used to shoot virtual projectiles towards an object displayed on the display screen.
  • Interactive systems are typically known as professional entertainment systems present in theme/amusement parks but also widely and increasingly used at other locations such as public entertainment environments, temporary game locations, at exhibitions or even at home or work locations.
  • Most entertainment systems comprise a large display screen and are non-interactive and only display pre-programmed information, which users cannot manipulate or control.
  • a dedicated device known as a game controller typically performs the manipulation.
  • In home use, i.e. small screens and small, static environments such as home entertainment systems, these game controllers are for example gamepads, mice, or a pointing device such as a laser gun shooter.
  • Both home and professional entertainment systems are controlled by a processing device such as a dedicated central processing unit, i.e. a computer.
  • the computer receives the input signals from the input devices, e.g. the gamepads, and processes these signals for controlling manipulation of certain physical objects in the scene or objects of the virtual environment shown on the display.
  • a central computer device, or a cluster of central computer devices, is used to calculate and process all the input and output signals of the system to enable a multi-user interactive environment.
  • Traditional entertainment systems are for example comprised of game controllers in the form of light guns that are able to manipulate objects within the scene, such as a projectile shot from a shooter, by relative movement thereof.
  • a light gun consists of at least one button, e.g. a trigger, to launch a projectile, as well as means for determining the target location on the screen, e.g. via a light signal.
  • Light guns are known from home console systems and are often modelled on a ballistic weapon such as a pistol. Different technical designs are known. For example, a detection method is known which involves drawing frames in which each target is sequentially displayed in the form of white light after a full black frame is shown. Such a light gun is for example known from US4813682A. In the frame after the full black frame, the target area is the only area on the screen that is white, and the rest of the screen remains black. A photodiode inside the light gun detects this change from low to bright light, as well as the duration of the flash, to identify a target and optionally multiple targets on screen.
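  • As an illustration only, such sequential-flash detection could be resolved in software as sketched below: each target is flashed in its own frame after the full black frame, so the index of the first frame in which the photodiode sees bright light identifies the target. The function name, threshold and sample layout are assumptions, not taken from the patent or from US4813682A.

```python
# Hypothetical sketch of sequential-flash light gun target identification.
# Assumes one normalised photodiode brightness sample per displayed frame.

BRIGHT_THRESHOLD = 0.8  # normalised level counted as a "white" flash

def identify_target(photodiode_samples, num_targets):
    """photodiode_samples: one brightness value per frame, starting with the
    full black frame. Returns the index of the target the gun was aimed at,
    or None for a miss."""
    # Frame 0 is the full black frame; frames 1..num_targets each flash one target.
    for frame_index in range(1, num_targets + 1):
        if frame_index < len(photodiode_samples) and \
           photodiode_samples[frame_index] >= BRIGHT_THRESHOLD:
            return frame_index - 1  # the target drawn in this frame was hit
    return None  # the gun was not aimed at any of the flashed targets

# Example: three targets, bright light seen in the second flash frame.
print(identify_target([0.0, 0.1, 0.9, 0.1], num_targets=3))  # -> 1
```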
  • More modern type light guns rely on one or several infrared or laser light emitters placed near the screen and one sensor in the gun.
  • certain values are sent toward the computer, such as the intensity of the infrared or laser light beam. Since the intensity depends on both the distance and the relative angle to the screen, angle sensors are located in the gun.
  • a trigonometric equation system is solved, on the basis of which the point of impact is determined.
  • In other designs, no angle detector is present, but only four light detectors. These are however less accurate, since calculation of the point of impact is difficult.
  • Other variants are also known in which the sensors are located around the screen and emitters are located inside the gun.
  • Multiplayer environments can pose problems for known light gun game systems since the system needs to be able to handle all data from the controllers and be able to distinguish the different controllers from each other.
  • For professional interactive systems such as the systems used in amusement parks, it is not uncommon to have more than 8 or even 16 simultaneous users playing a single game and moving around in a large scene.
  • Home entertainment systems are not capable of handling and processing such high amounts of simultaneous inputs from all users, nor of handling and processing large scenes with a plurality of interactive objects divided over the scene, between which a plurality of users move around.
  • known multiplayer game environments with pointing devices in the form of laser guns have the drawback that the implementation of laser light is not only technically challenging, for example due to calibration, but also subject to strict regulations that must be met due to the use of laser light.
  • the known laser guns have limited freedom since they are mostly mounted at a fixed position in the scene or on the carriage moving through the scene, because the system needs these positions to be a constant factor in order for the game to work correctly.
  • Yet another object of the present invention is to provide an improved professional interactive system that can be used as a professional entertainment or infotainment system capable of processing high amounts of simultaneous users such that they can all interact with objects of the scene with an increased realistic experience for the users and a large degree of freedom of control.
  • a professional interactive system for interaction of a plurality of users with at least one object within a scene comprising:
  • At least one human interface device per user of said plurality of users arranged for generating control signals corresponding to a user input from said at least one user;
  • said at least one human interface devices are motion controller devices, each comprising at least one hand held unit to be held by said at least one user and comprising an orientation determining unit for determining the orientation of the hand held unit, as well as a wireless transmitter unit for transmitting a radio frequency signal throughout said scene, wherein said radio frequency signal comprises identification information for identifying said human interface device from said plurality of human interface devices of said plurality of users, wherein said professional interactive system further comprises a plurality of beacons distributed over said scene, arranged for receiving said radio frequency signal from said wireless transmitter units of said motion controller devices and identifying said motion controller devices as well as determining said positions of said motion controller devices within said scene for interaction with said at least one object within said scene.
  • Known professional interactive systems for interaction of one or, mostly multiple simultaneous users are comprised of at least one computer device, one or more (large) display screens for displaying a virtual environment and at least one human interface device per user for manipulating objects within the virtual environment.
  • the computer device is the central device within the system, which consists of a single computer or a cluster of computers to perform all calculations, process all input control signals from the human interface devices and control the manipulation of the objects of the scene as a response to the input signals.
  • the human interface device or devices are arranged to generate the input control signals for the computer device based on input of the user, and to communicate these control signals towards the computer device which calculates in accordance herewith the movement of the object, for example on the display screen.
  • the human interface device of the present invention is a motion controller device and in particular a motion controller device for which its position within the scene is determined by a radio frequency beacons based localisation principle.
  • the motion controller device is comprised of at least one unit that can be held in the hand of the user, i.e. a hand held unit, and of a controller unit.
  • the system can determine both the actual position of each device within the scene as well as the orientation of each device. Preferably not only position and orientation can be determined but also other movement or position information such as altitude, and acceleration.
  • the orientation of the controller unit, i.e. the human interface device / the (hand held) motion controller device, is preferably determined within the hand held unit by an orientation determining unit.
  • the orientation determining unit is arranged to determine the orientation through any one or more of a (tri-axial) accelerometer, a (tri-axial) gyroscope and a (tri-axial) geomagnetic sensor.
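  • By way of illustration only, a minimal sketch of how such an orientation determining unit might fuse a gyroscope and an accelerometer into pitch and roll estimates with a simple complementary filter; the axis conventions, names and filter gain are assumptions and not the patent's implementation.

```python
import math

ALPHA = 0.98  # filter gain: trust the gyroscope short-term, the accelerometer long-term

def update_orientation(pitch, roll, gyro_rates, accel, dt):
    """One filter step.
    pitch, roll: current estimates in radians.
    gyro_rates: (pitch_rate, roll_rate) in rad/s from the gyroscope.
    accel: (ax, ay, az) in m/s^2 from the accelerometer.
    dt: time step in seconds."""
    # Integrate the gyroscope rates (accurate short-term, drifts long-term).
    pitch_gyro = pitch + gyro_rates[0] * dt
    roll_gyro = roll + gyro_rates[1] * dt
    # The gravity direction from the accelerometer gives an absolute reference.
    ax, ay, az = accel
    pitch_acc = math.atan2(-ax, math.hypot(ay, az))
    roll_acc = math.atan2(ay, az)
    # Blend both sources to suppress gyro drift and accelerometer noise.
    return (ALPHA * pitch_gyro + (1 - ALPHA) * pitch_acc,
            ALPHA * roll_gyro + (1 - ALPHA) * roll_acc)

# Example step: unit held still and level, gravity along +z of the sensor frame.
print(update_orientation(0.0, 0.0, (0.0, 0.0), (0.0, 0.0, 9.81), dt=0.01))  # -> (0.0, 0.0)
```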
  • the orientation (and preferably also acceleration) is communicated, preferably wireless, towards the at least one computer device for processing and manipulation of aspects of the interactive system such as one or more of the object(s) in the scene.
  • the system is, as indicated, not only arranged to determine the orientation but also to determine the position of the hand held unit within the scene.
  • the system is comprised of a plurality of beacons.
  • the beacons are distributed evenly over the scene, or at specific positions along a path of the users, or of the users within the carriages, throughout the scene.
  • the system, in particular the at least one computer device is thus arranged to determine the position of the hand held units in the scene on the basis of both the beacons and a wireless transmitter for transmitting and/or receiving wireless radio frequency signals within the hand held unit.
  • the system is arranged to determine the position preferably on the basis of a difference in the signals from/towards the beacons, wherein the difference in particular is any one or more of a time of flight, two way ranging time of flight, double sided two way ranging, symmetrical double sided two way ranging, time of arrival, time difference of arrival, and signal power.
  • Determining the actual location or position (in real time) with a system based on any of the above mentioned techniques relies on the fact that the distance between the beacon and the hand held unit can be determined if the time between sending and receiving is known, given that the speed of the radio frequency signal is a constant factor, i.e. approximately 299792 km/s, and that the (real world/actual) positions of the beacons are known as well.
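  • A minimal sketch of the two way ranging variant listed above (function and variable names are assumptions): the measured round-trip time, minus the responder's known reply delay, halved and multiplied by the propagation speed, gives the beacon-to-unit distance.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # radio signals propagate at (roughly) the speed of light

def two_way_ranging_distance(t_roundtrip_s, t_reply_delay_s):
    """Distance from a single two way ranging exchange.
    t_roundtrip_s: time between sending the poll and receiving the response.
    t_reply_delay_s: known processing delay inside the responding device."""
    time_of_flight = (t_roundtrip_s - t_reply_delay_s) / 2.0
    return SPEED_OF_LIGHT_M_S * time_of_flight

# Example: a 120 ns round trip with a 53.3 ns reply delay is roughly 10 m.
print(two_way_ranging_distance(120e-9, 53.3e-9))
```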
  • Once the distance between one of the beacons and the hand held device is known, one can conclude that its position is anywhere on the circumference of the circle having a radius corresponding to the determined distance. If accordingly the distance between the hand held device and multiple beacons is determined, multiple circles can be defined, and the position of the hand held device lies in the point in which these circles cross each other. The more beacons, the more distances, the more circles, the more precisely one can determine its position. Moreover, with three beacons one could determine the position, and a fourth or further beacon can be used to determine if errors are made, and also to further increase accuracy or correct for these errors.
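  • One common way to solve this circle-intersection problem is a small least-squares formulation; the sketch below (names and the solver choice are assumptions, not the patent's method) subtracts the first circle equation from the others to obtain a linear system, so a fourth or further beacon simply over-determines the system and averages out errors.

```python
import numpy as np

def locate(beacons, distances):
    """Estimate a 2D position from known beacon positions and measured distances.
    beacons: list of (x, y) tuples, at least three of them.
    distances: measured beacon-to-unit distances, in the same order."""
    (x0, y0), d0 = beacons[0], distances[0]
    a_rows, b_rows = [], []
    # Subtracting the first circle equation from each other one removes the
    # quadratic terms and leaves a linear system A [x, y]^T = b.
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        a_rows.append([2 * (xi - x0), 2 * (yi - y0)])
        b_rows.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    position, *_ = np.linalg.lstsq(np.array(a_rows), np.array(b_rows), rcond=None)
    return position

# Example: hand held unit at (2, 3) m, four beacons at the corners of a 10 m square.
beacons = [(0, 0), (10, 0), (0, 10), (10, 10)]
unit = np.array([2.0, 3.0])
dists = [float(np.linalg.norm(unit - np.array(b))) for b in beacons]
print(locate(beacons, dists))  # -> approximately [2. 3.]
```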
  • the beacons and the hand held device communicate either in a one way direction in which the hand held device is preferably the transmitter and the beacons the receivers, but it could also be the other way round or both could transmit and receive.
  • the first is preferred however, since the hand held unit then only needs to transmit a single radio frequency, RF, pulse and thus less power is consumed.
  • both the beacons and the hand held device have exact internal clocks, e.g. which are synchronised. Alternatively, only the beacon is provided with an exact internal clock.
  • the beacons and the hand held device communicate over an ultra wideband RF frequency, and more preferably one or multiple channels of an ultra wideband RF frequency according to the IEEE standard 802.15.4, more in particular IEEE 802.15.4-2011 or IEEE 802.15.4-2015.
  • These ultra wideband (UWB) frequencies have the advantage of a high bandwidth/throughput, and low sensitivity to interference by and on WiFi signals.
  • UWB is not susceptible to fading by reflections.
  • the motion controller device can in its most simple example be arranged to detect the orientation of the hand held unit, and preferably also the position in respect thereof. Even more preferably, the motion controller device can detect six degrees of freedom, which refers to the freedom of movement of the hand held unit in three-dimensional space. Specifically, the movement of the hand held unit can be detected when moving forward/backward (Z), up/down (X), left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll.
  • Z: forward/backward, X: up/down, Y: left/right.
  • the motion controller device is a device for controlling movement of one or more objects in the scene.
  • the motion controller device determines an orientation of the hand held unit and generates a corresponding control signal thereof, which control signal is sent to the computer device for further processing.
  • This control signal thus at least comprises the orientation of the hand held unit, which corresponds to a direction in which the hand held unit is pointed.
  • a projectile can be fired by a human interface device such as a gun or the like; the moment of firing, i.e. the trigger, and the trajectory of the projectile are determined by the control signal and hence by a trigger of the device and a movement defined by two values in two perpendicular planes.
  • the human interface device is a laser gun
  • the gun emits a particle beam in the form of a non-visible light, e.g. a laser light, which is aimed at the large screen display.
  • One or more sensors or cameras are arranged to sense the non-visible light and determine the position aimed at.
  • the computer determines if the aimed position and the actual position of the target on the screen correspond and if thus the shot was a hit or miss.
  • the invention is based on the insight that the objects in the scene can be manipulated not only by orientation of the human interface device but also by the position of the human interface device in respect of the object.
  • the orientation and position of the human interface device, in this case the motion controller device, and hence of the hand held unit thereof, are used to define a trajectory.
  • the hand held unit is a gun for firing a projectile in a trajectory towards a physical target object in the scene or a virtual target object shown on a large display screen.
  • the position of the gun, hence of the hand held unit, in respect of the object, and the orientation of the gun, thus the direction in which the gun is aimed, determine the trajectory of the projectile and hence whether, upon firing via a fire button or the like, the target object is hit or missed.
  • Since the computer device can determine or calculate the (relative) position of the hand held device in respect of the object, or even the actual or absolute position of the controller within the scene, as well as the orientation of the hand held device, the computer can calculate an extrapolated trajectory thereof and determine if the target object is in that extrapolated trajectory and thus if the shot was a hit or a miss.
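  • For instance, that hit/miss decision could be sketched as a simple ray test: extrapolate a ray from the determined controller position along the determined orientation and check whether the target lies within some tolerance of that ray. The vector maths below is an illustrative assumption, not the patent's algorithm.

```python
import numpy as np

def is_hit(controller_pos, aim_direction, target_pos, tolerance=0.25):
    """True if the target lies within `tolerance` metres of the ray that starts
    at the controller position and points along the aim direction."""
    direction = np.asarray(aim_direction, dtype=float)
    direction = direction / np.linalg.norm(direction)
    to_target = np.asarray(target_pos, dtype=float) - np.asarray(controller_pos, dtype=float)
    along = float(np.dot(to_target, direction))
    if along < 0:
        return False  # the target is behind the controller
    closest_point = np.asarray(controller_pos, dtype=float) + along * direction
    miss_distance = float(np.linalg.norm(np.asarray(target_pos, dtype=float) - closest_point))
    return miss_distance <= tolerance

# Example: controller 4 m away from a target that is almost straight ahead of it.
print(is_hit([0, 0, 0], [1, 0, 0], [4, 0.1, 0]))  # -> True (within 0.25 m)
```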
  • the computer device to which the motion controller device is connected knows or can determine or calculate the position of the motion controller device and knows the position of the object, for example by a predefined object location position and predefined motion controller device position location stored in the computer or remote.
  • one or both of the positions of the motion controller device and the object are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the object and can start the game.
  • the calibration methods of determining one of the motion controller devices and object positions (or both) can be performed on a real-time or near real-time basis.
  • the manipulation of the at least one object within the scene is controlled if the at least one object is located in a trajectory corresponding to the determined orientation and a determined position of the motion controller device in respect of the at least one object.
  • the motion controller device is arranged to manipulate an object in a relative manner, i.e. if the hand held unit of the motion controller device is moved forward, the object moves forward as well for example.
  • This example is in particular useful when using a large display screen on which the object can be shown and manipulated, e.g. moved.
  • the object is however controlled in a different manner more in accordance with a shooter game.
  • the motion controller device can act as a gun.
  • if the gun, i.e. the hand held unit of the motion controller, is aimed at the object, the shot can be counted as a hit.
  • the position of the motion controller device, and in particular the hand held unit is determined by the computer, such in relation to the object.
  • the orientation of the hand held unit is determined and both the position and orientation define a trajectory. If the object is within that trajectory, the shot is counted as a hit.
  • the at least one object within the scene is a physical object, and in another example the at least one object within the scene is a virtual object within a virtual environment displayed on at least one display screen.
  • the manipulation comprises providing any one or more of the group consisting of: a visual signal, an audio signal, change of position of said at least one object, change of appearance of said at least one object.
  • the object can be a physical object located somewhere within the scene, or a virtual object displayed on a display screen.
  • the manipulation of the object is to be understood as for example an audio signal or a visual signal, such as a blinking light by which the object indicates that it has been hit.
  • a virtual object manipulation is to be understood in a broad sense.
  • the virtual object can for example explode, move, fall over or whatever is suitable in the game setting in which it is used.
  • the at least one computer device is arranged to display a projectile on the at least one display screen in a projectile trajectory defined by the determined orientation and position of the at least one hand held device.
  • the object can be a target object and a further object can be displayed on the display screen in the form of a projectile that is launched towards the target object. That projectile is displayed on the screen in a trajectory that corresponds to the orientation and position of the hand held device.
  • the at least one computer device is arranged to display the projectile on the at least one display screen in a projectile trajectory defined by an extrapolation of a trajectory defined by the determined orientation and position and a determined movement of the at least one hand held device.
  • the trajectory of the projectile is only determined by the position and orientation of the hand held device, which corresponds to a motion controller device which points (or aims) towards a target object. If besides the position and orientation also the movement of the hand held device is determined, a trajectory of that movement can be determined and used to define an extrapolation of that trajectory as the extrapolated trajectory of the projectile.
  • the projectile, for example a snowball in a snowball shooter game, can also travel in non-straight lines.
  • a curvature in the trajectory of the hand held device could be of influence on the trajectory of the projectile.
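  • One possible way (assumed here, not specified by the patent) to let that curvature carry over to the projectile is to fit the recent samples of the hand held unit's position and extrapolate the fit forward, so a curved swing yields a curved projectile path; the quadratic fit below is just one illustrative choice.

```python
import numpy as np

def extrapolate_trajectory(sample_times, positions, future_times):
    """Fit a quadratic per axis through recent hand held unit positions and
    evaluate it at future times.
    sample_times: 1D array of timestamps; positions: (N, 3) array of xyz samples."""
    positions = np.asarray(positions, dtype=float)
    predicted = []
    for axis in range(positions.shape[1]):
        coeffs = np.polyfit(sample_times, positions[:, axis], deg=2)
        predicted.append(np.polyval(coeffs, future_times))
    return np.stack(predicted, axis=-1)  # shape (len(future_times), 3)

# Example: samples of a rising, curving swing extrapolated 0.1 s and 0.2 s ahead.
t = np.array([0.0, 0.1, 0.2, 0.3])
p = np.array([[0.0, 0.00, 0], [0.1, 0.02, 0], [0.2, 0.08, 0], [0.3, 0.18, 0]])
print(extrapolate_trajectory(t, p, np.array([0.4, 0.5])))
```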
  • the control signal comprises a trigger signal for triggering manipulation of the at least one object, wherein the trigger signal is in particular generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
  • the control signals comprise the orientation of the hand held unit and the position thereof or of the controller device, such in respect of the object or the large display screen.
  • the control signals can further comprise a trigger signal to trigger the firing of the projectile.
  • a separate trigger unit can do this, or a trigger unit on the hand held device, e.g. a gun trigger, fire button or pull cord.
  • the motion controller device comprises at least two hand held devices to be held by two hands of the at least one user and a single controller device.
  • One single controller device can control multiple hand held devices, for example two devices, one for each hand of a single user, or one for each user.
  • the controller unit is further arranged for detecting at least one of pitch, yaw and roll movement of the at least one hand held unit in respect of the controller unit and for generating the control signals thereon for movement of the at least one object within the virtual environment in a trajectory corresponding to the at least one of pitch, yaw and roll movement.
  • the motion controller device can detect orientation through the orientation determining unit and preferably also motion of the hand held unit wherein the controller unit can detect six degrees of freedom which refers to the freedom of movement of the hand held unit in three-dimensional space. Specifically, the movement of the hand held unit can be detected when moving forward/backward (Z), up/down (X) , left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll.
  • Z: forward/backward, X: up/down, Y: left/right.
  • the at least one computer device is arranged to determine the position of the motion controller device by receiving an identification value comprised in the wireless communication, i.e. RF signal, towards the at least one computer, determining a corresponding pre-defined position of each beacon stored within a memory of the at least one computer device, and calculating the distance between the hand held devices and the beacons and thus the actual position.
  • the system preferably comprises multiple motion controller devices, for example 1 per user, or 1 per 2 users or 1 per 4 or 8 users, etc.
  • the motion controller can determine the orientation of the hand held unit and the computer device can calculate the trajectory of the projectile thereof by determining the position of the motion controller device, for example by a position table stored in a memory of the computer device wherein each motion controller device is identified by a unique identification value and a corresponding position in the scene, i.e. relative to the large screen or the physical object.
  • When the computer device receives control signals from the motion controller devices, it can distinguish each motion controller device by an identification value comprised in the control signals. The computer device can then access the memory to determine which position belongs to that motion controller device.
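  • As a minimal illustration, the pre-defined position table described above can be as simple as a lookup keyed by the unique identification value carried in each control signal; the identifiers, coordinates and signal layout below are invented for the example.

```python
# Hypothetical position table: controller ID -> position in the scene,
# expressed relative to the large screen or the physical object.
CONTROLLER_POSITIONS = {
    0x01: (-2.0, 0.0, 5.0),   # seat at the left of the carriage
    0x02: (-0.7, 0.0, 5.0),
    0x03: (0.7, 0.0, 5.0),
    0x04: (2.0, 0.0, 5.0),    # seat at the right of the carriage
}

def controller_position(control_signal):
    """Extract the identification value from a control signal and look up the
    corresponding stored position; None signals an unknown controller."""
    return CONTROLLER_POSITIONS.get(control_signal["id"])

print(controller_position({"id": 0x03, "orientation": (0.1, -0.2, 0.0)}))
```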
  • the at least one computer device is arranged to determine the position of the motion controller device by a visual and/or RF tracking system.
  • the computer device can also determine the position of each motion controller device in respect of the large display screen or the physical object by visual recognition of the motion controller device or in particular of the hand held unit
  • the computer device to which the motion controllers are connected preferably knows the position of the controller and knows the position of the display or the physical objects, for example by a predefined display location position and predefined controller position location stored in the computer or remotely, mapped according to real world coordinates.
  • one or both of the positions of the controller and the display or objects are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the display and can start the game.
  • the calibration methods of determining one of the controller and display or object positions (or both) can be performed on a real-time or near real-time basis.
  • the system is arranged for stereoscopy, and in particular, wherein the system comprises multiple 3D glasses for the at least one user, and wherein the display screen is arranged to display the virtual environment with a depth perspective.
  • the hand held unit defines a trajectory comprising an x-position, y-position and orientation, hence a two dimensional coordinate system used for manipulation of the object(s) in the virtual environment.
  • more variables can manipulate the movement of the objects. For example, the speed at which the hand held unit is moved can correspond to the speed variables of the objects within the virtual environment.
  • the motion controller device is arranged not only for X, Y, and Z position and movement, i.e. motion, but also for so called six degrees of freedom motion detection, which refers to the freedom of movement of a body part in three-dimensional space.
  • the body is free to move forward/backward (Z), up/down (X), left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll.
  • the system, i.e. the computer, and the display are arranged for 3D.
  • the trajectory of the projectile can then be displayed as a 3D representation of the projectile starting from the hand held unit, in an extrapolated trajectory towards a certain target location on the large screen, all in 3D.
  • 3D display methods are applicable, such as a stereoscopic projector, large screen 3D display, etc.
  • the system can also be arranged for both 2D and 3D representation, by use of a 3D capable display or projector that can switch between 2D and 3D, for example by turning off a single lamp in the projector.
  • the motion controller device is arranged for wireless communication with the at least one computer via one or more of the group comprising Bluetooth, WiFi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN.
  • the computer is in communicative connection with the motion controller device, either by wire or wireless.
  • wired or wireless communication standards are applicable, e.g. peer-to-peer, Ethernet, 802.11 wireless, 802.15 wireless PAN, Bluetooth, IEEE 802.3, Zigbee, RS422, RS485 and CAN.
  • the system comprises at least 2 motion controller devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
  • each of the motion controller devices is arranged to generate the control signals on position and orientation of the hand held unit of at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
  • the virtual environment is a virtual environment of a shooter game, and wherein the at least one object is a projectile.
  • the examples of the present invention can be applied in a plurality of different scenes and virtual environments.
  • the preferred scenes or virtual environments are however game environments, and in particular shooter games.
  • the projectile of such a shooter game is thus the projectile used in the shooter. That could be a bullet of a gun, arrow of a bow, a javelin, a spear, etc.
  • This projectile can also be a snowball from a snowball cannon in a snowball shooter game as available from the applicant of the present invention.
  • a motion controller device arranged to be used as a human interface device in a system for interaction of multiple users with a virtual environment according to any of the previous descriptions.
  • a computer device arranged to be used as a computer device in a system for interaction of multiple users with a virtual environment according to any of the previous descriptions.
  • the definition of computer device is not to be restricted to one single physical computer, but is to be understood as merely an example.
  • the one or more computers could be a physical desktop or preferably server computer, but more preferred, a plurality of physical or virtual servers.
  • an interactive amusement ride comprising a scene and a system for interaction of at least one user with a virtual environment according to any of the previous descriptions, wherein the interactive amusement ride in particular comprises a track and at least one car or carriage for moving users over the track, and wherein the scene, and in particular the car or carriage, comprises a plurality of motion controller devices.
  • the virtual environment is comprised of a large screen display.
  • the invention also applies to static displays, i.e. a scene in which the physical objects are the targets of the shooter game and the system is comprised of one or more computers, one or more motion controller devices and one or more physical target objects positioned somewhere in the scene.
  • the trajectory of the projectile is thus not displayed on a screen, due to the absence of the screen, but the system is still arranged to determine a virtual trajectory as an extrapolation of the trajectory defined by the position and orientation of the hand held unit.
  • the system, i.e. the computer thereof, determines the extrapolation of the trajectory defined by the position and orientation of the hand held unit, and the position of the physical target object, and then calculates whether or not the projectile hits or misses the target object.
  • the user, i.e. player, is informed of a hit or miss by visual and/or audio and/or force feedback information on the motion controller device, e.g. on the hand held unit or on the controller unit, or on an additional device such as a scoreboard, speaker, etc.
  • a professional interactive system for interaction of at least one user with at least one object within a scene comprising:
  • At least one human interface device arranged for generating control signals corresponding to a user input from the at least one user
  • the at least one computer device arranged for receiving and processing the control signals from the at least one human interface device, and for controlling manipulation of the at least one object within the scene
  • the at least one human interface device is an optical motion sensor controller device arranged for determining motion and generating the control signals for controlling the manipulation of the at least one object within the scene in correspondence to the determined motion, and wherein the manipulation of the at least one object within the scene is further controlled by a determined position of the motion controller device in respect of the at least one object.
  • Known professional interactive systems for interaction of one or, mostly a plurality of simultaneous users are comprised of at least one computer device, one or more (large) screen displays for displaying a virtual environment and at least one human interface device per user for manipulating objects within the virtual environment.
  • the computer device is the central device within the system, which consists of a single computer or a cluster of computers to perform all calculations, process all input control signals from the human interface devices and control manipulation of the objects in the scene as a response to the input signals.
  • the human interface device or devices are arranged to generate the input control signals for the computer device based on input of the user and to communicate these control signals towards the computer device which calculates in accordance therewith the movement of the object, for example on the display screen.
  • the human interface device of the present invention is a motion controller device, and in particular an optical motion controller device, which motion controller device is arranged to generate the control signals based on position and movement of at least one body part of the user.
  • the motion controller is arranged to generate a depth map of the image, i.e. the scene with the user(s). This can be done in several manners, such as by imaging, e.g. optical motion detection, or by radio frequency signal reflection, e.g. RF motion detection or laser motion detection.
  • reference is made to an optical motion controller, however only by way of example.
  • the invention is not restricted to such optical motion detection only, but is also applicable for radio frequency or laser motion detection, such in all examples demonstrated below.
  • Optical motion detection can for example be performed by capturing the scene and determining differences in depth within the scene on the basis of differences in colour and/or texture. This way objects can be distinguished from their backgrounds and depths can be determined.
  • An alternative method of distinguishing objects and determining depth perspectives is by transmitting light, e.g. invisible (near-)infrared light (or laser light, or even RF signals), towards the scene and its objects. Then a camera or other type of RF or laser sensor can measure the time-of-flight of the light after it reflects off the objects. Since some objects are further away than others, the time-of-flight of the light reflecting from these objects is larger, hence depth can be determined.
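  • The relationship used here is simply distance = propagation speed times measured time divided by two per measured point (assuming the measured time covers the round trip); a toy per-pixel conversion, with assumed names, could look like this.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_map(roundtrip_times_s):
    """Convert per-pixel round-trip times of the emitted light into depths.
    roundtrip_times_s: 2D list of times in seconds (emitter to object and back),
    one entry per pixel of the depth sensor."""
    return [[SPEED_OF_LIGHT_M_S * t / 2.0 for t in row] for row in roundtrip_times_s]

# Example: two pixels, objects at roughly 1.5 m and 3.0 m.
print(tof_depth_map([[1.0e-8, 2.0e-8]]))
```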
  • the (optical) motion controller can be arranged to transmit a light pattern to the scene, i.e. illuminate the scene with a pattern.
  • This pattern is comprised of elements that can be individually distinguished. This can be done in several manners, as long as the elements of the pattern can be individually distinguished.
  • each element is unique in one way or the other. This uniqueness can be determined by the shape of the element, e.g. unique characters can be used that all have different shapes, or unique colours can be used wherein each element has the same shape but a unique colour, or it can be determined by its orientation or by the positioning of the element in respect of other elements.
  • examples of such patterns are an image having plural unique characters, plural dots in a unique colour, or dots with a "random" but unique position within the picture (such as the stars at night: each star appears in the sky as a dot and most dots are alike, however, due to the position of a star in respect of other stars one can distinguish the individual stars).
  • the scene and its objects therein, e.g. the user(s) (and body parts thereof), other objects and background elements, are all constantly illuminated with this pattern of uniquely identifiable elements such as dots, characters or coloured dots.
  • the difference between the observed and expected element positions can be determined by a camera placed at an offset relative to the light source, i.e. a (near) infrared transmitter. The difference can be used to calculate the depth at each element.
  • the camera can also be a stereoscopic camera setup wherein two cameras are placed at an offset relative to each other and to the light source. Both can distinguish each element transmitted by the infrared transmitter and on the basis of triangular measurement the depth can be calculated for each object of the scene, i.e. for each element that is illuminated thereon.
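  • A minimal sketch of that triangular measurement (assuming a rectified pinhole stereo pair; parameter values are illustrative): the depth of each distinguished pattern element follows from its disparity between the two cameras as depth = focal length * baseline / disparity.

```python
def stereo_depth(x_left_px, x_right_px, focal_length_px, baseline_m):
    """Depth of one pattern element from its horizontal pixel positions in a
    rectified left/right camera pair mounted a known baseline apart."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        return float("inf")  # element effectively at infinite distance
    return focal_length_px * baseline_m / disparity

# Example: a dot seen 8 px further left in the right image, 700 px focal length,
# cameras 10 cm apart -> about 8.75 m away.
print(stereo_depth(412.0, 404.0, focal_length_px=700.0, baseline_m=0.10))
```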
  • the depth information is used to generate a depth map of the scene, which in turn is used to calculate a skeletal model of any objects of interest within the scene, which preferably are the users.
  • Each dot, e.g. illuminated element, of the skeletal model is mapped in a large database to determine a body part and/or to determine if it belongs to the background of the scene.
  • the dots of the skeletal model are grouped, i.e. clustered to these body parts such that body parts of the users can be determined from the scene and motion tracking of these body parts, i.e. movements of the clusters of the skeletal model from one captured frame (video still) to another, can be determined.
  • motion tracking of the user and in particular of one or more body parts of the user within the scene is accomplished. That motion tracking is used to control elements of the scene such as objects physically present therein, or objects shown within a virtual environment on a display screen.
  • the virtual environment can be an interactive shooter game wherein users can shoot a projectile towards a target displayed on the large screen.
  • the motion captured body part movement determines the trajectory of the projectile.
  • the determined body part movement of the user, i.e. the determined gesture, defines a trajectory.
  • the object within the virtual environment then moves within the virtual environment in a corresponding trajectory.
  • such gesture control is only able to control by relative movement, so the projectile always starts from the same fixed position on the display.
  • That position is for example a position at the bottom centre of the display, where an end part of the gun or the like is permanently displayed.
  • the user aims at a target somewhere in the screen and when the fire button or trigger is pushed, the projectile starts its trajectory in a straight line from the bottom centre of the display towards the aimed position on the display, hoping to hit the target if the aimed position and target position correspond.
  • the trajectory of the gesture then defines the direction and trajectory of the projectile on the large screen, however, always starting from the same initial starting point, e.g. from the bottom centre of the large screen wherein the end part of the gun is displayed.
  • the invention is based on the insight that, in order to really satisfy the need for interaction of multiple simultaneous users with a scene or virtual environment, with an increased realistic experience for the users and a large degree of freedom of control, the position of each user in the scene has to be determined and used to define the manipulation of the objects in the scene, and in particular a trajectory of a projectile in the virtual environment starting from the position of the user towards the aimed position.
  • a carriage can ride over a track through a scene wherein the users in the carriage can interact with several objects physically placed within the scene or displayed as virtual objects within the virtual environment displayed on a large screen in the scene. Then each user has a different position within the scene and with respect to the object (physically or virtually).
  • the trajectory defined by the gesture will define the trajectory towards the object, in particular as displayed on the screen; this trajectory does not start from the bottom centre of the screen but, for example, from the left side of the screen, since this is the position of the user relative to the screen.
  • the trajectory of the projectile on the screen is an extrapolation of the trajectory defined by the gesture which comprises preferably the start point of the gesture, the end point of the gesture, the trajectory of the gesture from start to end point and the relative position of the body part making the gesture relative to the large display screen. Since all users have different positions in the scene, i.e. relative to the large display screen, each projectile trajectory on the screen starts from a different point, by which the projectiles from the individual users are manipulated, i.e. moved, in a realistic manner and each user can be distinguished.
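  • To illustrate that idea (the coordinate conventions and names are assumptions, not specified here): the on-screen projectile path for each user can start at that user's position relative to the screen and continue along the extrapolated direction of the gesture, so users at different positions get different starting points.

```python
import numpy as np

def projectile_screen_path(user_pos_rel_screen, gesture_start, gesture_end, steps=20):
    """Sketch of an on-screen projectile path for one user.
    user_pos_rel_screen: (x, y) of the user projected onto the screen plane;
    this becomes the starting point of this user's projectile.
    gesture_start, gesture_end: (x, y) of the gesture in the same plane; the
    gesture direction is extrapolated beyond its end point."""
    start = np.asarray(user_pos_rel_screen, dtype=float)
    direction = np.asarray(gesture_end, dtype=float) - np.asarray(gesture_start, dtype=float)
    direction = direction / np.linalg.norm(direction)
    # Points along the extrapolated trajectory; each user starts from a different
    # point on the screen according to their position in the scene.
    return [start + direction * s for s in np.linspace(0.0, 5.0, steps)]

# Example: a user seated to the left of the screen making an up-and-right swipe.
path = projectile_screen_path((-3.0, 0.5), (0.0, 0.0), (0.3, 0.4))
print(path[0], path[-1])
```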
  • the manipulation of the at least one object within the scene is controlled if the at least one object is located in a trajectory defined as an extrapolation of a trajectory defined by the determined position and movement of the at least one body part of the user.
  • the optical motion controller is arranged to manipulate an object in a relative manner, i.e. if the motion controller determines that the body part of the user is moved forward, the object moves forward as well for example.
  • This example is in particular useful when using a large display screen on which the object can be shown and manipulated, e.g. moved.
  • the object is however controlled in a different manner more in accordance with a shooter game.
  • the body part can act as a gun.
  • if the gun, as determined by the optical motion controller, is aimed at the object, e.g. moved towards the direction of the object, the shot can be counted as a hit.
  • a trajectory is determined. From that trajectory an extrapolated projectile trajectory is determined and the computer calculates if the object is within that projectile trajectory. If that is the case, the shot can be considered a hit.
  • the at least one object within the scene is a physical object.
  • the at least one object within the scene is a virtual object within a virtual environment displayed on at least one display screen.
  • the manipulation comprises providing any one or more of the group consisting of: a visual signal, an audio signal, change of position of the at least one object, change of appearance of the at least one object.
  • the object can be a physical object located somewhere within the scene, or a virtual object displayed on a display screen.
  • the manipulation of the object is to be understood as for example an audio signal or a visual signal, such as a blinking light by which the object indicates that it has been hit.
  • a virtual object manipulation is to be understood in a broad sense.
  • the virtual object can for example explode, move, fall over or whatever is suitable in the game setting in which it is used.
  • the at least one computer device is arranged to display a projectile on the at least one display screen in a projectile trajectory defined by the determined position and movement of the at least one body part of the user.
  • the object can be a target object and a further object can be displayed on the display screen in the form of a projectile that is launched towards the target object. That projectile is displayed on the screen in a trajectory that corresponds to the position and movement of the body part.
  • control signal comprises a trigger signal for triggering manipulation of the at least one object, in particular generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
  • the control signals comprise the movement of the body part and the position of the motion controller in respect of the object or the large display screen.
  • the control signals can further comprise a trigger signal to trigger the firing of the projectile.
  • a separate trigger unit can do this, or a trigger unit on a hand held device, e.g. a gun trigger, fire button or pull cord, or by a particular movement of body or body part that is pre-defined as being the trigger gesture.
  • the object can be a target object and a further object can be displayed on the display screen in the form of a projectile that is launched towards the target object. That projectile is displayed on the screen in a trajectory that corresponds to the movement of the body part.
  • the at least one computer device is arranged to determine the position of the at least one body part by determining a position of the optical motion controller device in respect of the object, by receiving an identification value comprised in the control signals and determining a corresponding pre-defined position stored within a memory of the at least one computer device.
  • the system preferably comprises multiple optical motion controller devices, for example 1 per user, or 1 per 2 users or 1 per 4 or 8 users, etc.
  • the optical motion controller can determine movement of a body part or multiple body parts and the computer device can calculate the trajectory of the projectile thereof by determining the position of the optical motion controller device, for example by a position table stored in a memory of the computer device wherein each motion controller device is identified by a unique identification value and a corresponding position in the scene, i.e. relative to the large screen or the physical object.
  • When the computer device receives control signals from the optical motion controller devices, it can distinguish each optical motion controller device by an identification value comprised in the control signals. The computer device can then access the memory to determine which position belongs to that motion controller device.
  • the at least one computer device is arranged to determine the position of the at least one body part by determining a position of the optical motion controller device in respect of the object and a position of the at least one body part of the user in respect of the optical motion controller device.
  • Since the computer device can determine the relative position of each optical motion controller device in respect of the large display screen or the physical object, and each optical motion controller device can determine the relative position of the body part in respect of the controller, the computer can determine the relative position of the user and the body part in respect of the large display screen or the object on the basis of the sum of both positions.
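  • The "sum of both positions" mentioned above is plain vector addition, sketched below under the assumption that both relative positions are expressed in the same, aligned coordinate frame (names and values are illustrative).

```python
import numpy as np

def body_part_position_rel_screen(controller_pos_rel_screen, body_part_pos_rel_controller):
    """Compose the controller's position relative to the screen with the body
    part's position relative to that controller (same aligned frame assumed)."""
    return np.asarray(controller_pos_rel_screen, dtype=float) + \
           np.asarray(body_part_pos_rel_controller, dtype=float)

# Example: controller 4 m in front of the screen, hand 0.6 m to its right.
print(body_part_position_rel_screen([0.0, 0.0, 4.0], [0.6, 0.2, -0.5]))
```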
  • the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the optical motion controller device.
  • the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the user.
  • the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the body part of the user.
  • the computer device can also determine the position by visual and/or RF recognition of the optical motion controller device or in particular of the user or the body part thereof. This has the advantage that the optical motion controller device does not have to be stationary and can be moved in the scene since the computer device can determine current positions of each motion controller device at any time.
  • the computer device to which the optical motion controllers are connected preferably knows the position of the individual controllers and knows the position of the display or the physical objects, for example by a predefined display location position and predefined controller position location stored in the computer or remote.
  • one or both of the positions of the controller and the display or objects are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the display and can start the game.
  • the calibration methods of determining one of the controller and display or object positions (or both) can be performed on a real-time or near real-time basis.
  • the interactive system is arranged for stereoscopy, and in particular, wherein the system comprises at least one 3D glass for the at least one user, and wherein the display screen is arranged to display virtual environment with a depth perspective.
  • the optical motion controller device is arranged not only for X, Y, and Z position and movement, i.e. motion, but also for so called six degrees of freedom motion detection, which refers to the freedom of movement of the body part in three-dimensional space.
  • the body parts detected are free to move forward/backward (Z), up/down (X), left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll.
  • the system, i.e. the computer, and the display are arranged for 3D.
  • the trajectory of the projectile can then be displayed as a 3D representation of the projectile starting from the body part, in an extrapolated trajectory towards a certain target location on the large screen, all in 3D.
  • 3D display methods are applicable, such as a stereoscopic projector, large screen 3D display, etc.
  • the system can also be arranged for both 2D and 3D representation, by use of a 3D capable display or projector that can switch between 2D and 3D, for example by turning off a single lamp in the projector.
  • the optical motion controller device is arranged for wireless communication with the at least one computer via one or more of the group comprising Bluetooth, WiFi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN.
  • the computer is in communicative connection with the optical motion controller device, either by wire or wireless.
  • wired or wireless communication standards are applicable, e.g. peer-to-peer, Ethernet, 802.11 wireless, 802.15 wireless PAN, Bluetooth, IEEE 802.3, Zigbee, RS422, RS485 and CAN.
  • the interactive system comprises at least 2 optical motion controller devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
  • each of the optical motion controller devices is arranged to generate the control signals on movement of the body part or body parts of at least 1 user, or at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
  • each of the motion controller devices is arranged to generate the control signals on position and movement of at least one body part of at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
  • the scene is a scene of a shooter game, and wherein the at least one object is a target object to be shot with a projectile.
  • the at least one computer device is arranged to communicate with a plurality of optical motion controller devices and wherein each respective position thereof is determined by the at least one computer device.
  • a speed of movement of the at least one body part is determined by the optical motion controller device and wherein a speed of movement of the at least one object on the display screen corresponds to the determined speed of movement of the at least one body part.
  • an optical motion controller device arranged to be used as a human interface device in an interactive system for interaction of multiple users with a scene according to any of the previous descriptions.
  • a computer device arranged to be used as a computer in an interactive system for interaction of multiple users with a scene according to any of the previous descriptions.
  • an interactive amusement ride comprising a scene and an interactive system according to any of the previous descriptions, and wherein the interactive amusement ride in particular comprises a track and at least one car or carriage for moving users over the track, wherein the scene, and in particular the car or carriage comprises a plurality of the optical motion controller devices.
  • the preferred example or embodiment is a virtual environment such as a virtual game environment, and in particular a shooter game.
  • the projectile of such a shooter game is thus the projectile used in the shooter. That could be a bullet of a gun, arrow of a bow, a javelin, a spear, etc.
  • This projectile can also be a snowball from a snowball cannon in a snowball shooter game as available from the applicant of the present invention.
  • the scene comprises a virtual environment having a large screen display.
  • static displays i.e. a scene in which the objects are physical objects, e.g. the targets of the shooter game, and the system is comprised of one or more computers, one or more optical motion controller devices and one or more physical target objects positioned somewhere in the scene.
  • the trajectory of the projectile is thus not displayed on a screen, due to the absence of the screen, but the system is still arranged to determine a virtual trajectory as an extrapolation of the trajectory defined by the gesture.
  • the system, i.e. the computer thereof, determines the extrapolation of the trajectory defined by the gesture, the position of the optical motion controller device and the position of the physical target object, and then calculates whether or not the projectile hits or misses the target object.
  • the user, i.e. player, is informed of a hit or miss by visual and/or audio and/or force feedback information on the optical motion controller device or an additional device such as a scoreboard, speaker, etc.
  • Figure 1 shows a setup of a system, according to a first aspect of the invention, with a computer, large screen display and multiple users each controlling a single hand held unit.
  • Figures 2a and 2b show illustrations, according to a first aspect of the invention, of an example of a hand held unit and a controller unit and its orientation principle as well as the trajectory of shooting the projectile towards the screen.
  • Figure 3 shows a hand held motion controller device, according to a first aspect of the invention, in the embodiment of two hand held units and one controller unit.
  • Figure 4 shows a setup of a system according to a first aspect of the invention with a computer, large screen display and multiple users and optical motion controller devices.
  • Figure 5 shows an illustration according to an example of the invention of a trajectory of a projectile defined by a gesture detected by the optical motion controller device and the corresponding trajectory of the projectile on the large screen display.
  • Figure 6 shows other illustrations according to an example of the invention of trajectories of a projectile defined by detected gestures.
  • a scene 100 is illustrated in accordance with a first aspect of the invention.
  • the scene is in this example an interactive stationary ride in which multiple users 131, 132 can manipulate objects 141, 142, 143 in the virtual environment.
  • the interactive virtual environment illustrated in this example is a shooter game and in particular a game wherein multiple users 131, 132 can each simultaneously operate their own hand held shooter, e.g. the hand held units 151, 152.
  • the game illustrated here is a game wherein projectiles in the form of snowballs are fired from a snowball cannon 151, 152.
  • the first hand held unit is a first snowball cannon 151 and the second hand held unit is a second snowball cannon 152.
  • the cannons are not attached to the car, carriage or other part of the scene but can be held in hand by the users 131, 132.
  • the snowball cannons 151, 152 are just illustrative. Many different shapes and forms of motion controller devices are applicable.
  • the first user 131 operates the snowball cannon 151 by moving the cannon in any direction and aiming it at the target objects 141, 142 on the screen, all according to a spherical coordinate system as determined by the orientation determining unit of the motion controller device.
  • the present invention is based on the insight of the use of a spherical coordinate system wherein the orientation of the hand held units 151, 152 can be determined by the orientation determining unit, e.g. through a gyroscope and/or accelerometer.
  • Fig. 1 further shows a fan 191 which can be activated to create a flow of air, and a computer device 120.
  • the computer can also be comprised of a plurality of computers, for example a cluster of computers, whatever is sufficient to process all the data of the system.
  • the computer device 120 can be attached to an active large screen display 110, for example a large computer screen or large size television.
  • alternatively, the large screen display is a passive display and the actual image of the virtual environment on the screen is generated by a projector arrangement 160.
  • the projector will be connected with the computer device 120 via wired communication 122.
  • the present invention is not restricted to the form of communication, i.e. wired or wireless, and is applicable to both forms and implementations thereof.
  • the scene 100 of Fig. 1 further shows additional speakers 181, 182 which are arranged to add an audio component to the virtual environment.
  • the scene 100 of Fig. 1 shows, by way of example, a two-speaker set-up.
  • the invention, as the skilled person will understand, is not restricted to merely a two speaker stereo setup, and is applicable to all audio set-ups, i.e. mono, stereo, and surround sound set-ups.
  • Fig. 1 further shows a camera system 171 which can for example be used to record the users 131, 132 and to determine for example whether or not a motion controller device 151, 152 should be enabled or disabled from/in the game if a user 131, 132 is detected who is operating the pointing device.
  • the camera system 171 can further be used to for example trigger the start of the game, upon detection of movement of any person within the scene, or to extract images of the users such that these images of the users can be used in the virtual environment, e.g. as avatars.
  • the example shown in Fig. 1 is an example wherein the scene 100 comprises a large screen display 110 connected to the computer device 120.
  • the objects 141, 142, 143 are shown as virtual objects within a virtual environment.
  • the invention however also relates to a scene wherein the objects are physical objects within the scene 100.
  • These objects can be target objects towards which the users should aim the projectiles. If the projectile hits the target object, the object interacts by for example an audio signal, a visual signal or the like.
  • the interaction can also be in the form of an additional score board or other device from which the user can determine if the shot was a hit or miss.
  • Fig. 2a shows an illustration of the functioning of the spherical coordinate system on the motion controller device 151.
  • the centre 156 of the hand held unit 151 corresponds to the origin 210 of the spherical coordinate system.
  • the orientation determining units within the hand held devices 151, 152 of the users 131, 132 are arranged to determine, e.g. by a tri-axial gyroscope and/or accelerometer, the exact orientation of the devices 151, 152. Together with a plurality of beacons (not shown in this figure) distributed over the scene, the exact orientation and the position of the devices 151, 152 can be determined.
  • the vector 260 corresponds to the orientation of the hand-held unit 151.
  • the exact orientation in a three-dimensional coordinate system is further determined by an azimuth 230 and an elevation or altitude 240 component.
  • the azimuth component defines the angle of the vector from the origin around a horizon, i.e. a horizontal displacement.
  • the azimuth is for example denoted as the angle alpha, α.
  • Azimuth can also be more generally defined as a horizontal angle measured clockwise from any fixed reference plane or easily established base direction line.
  • the components of the spherical coordinate system can also be comprised of a radial distance, a polar angle and azimuthal angle. The radial distance is then the length of the vector, the polar angle the altitude and the azimuthal angle the azimuth.
  • the altitude component defines the angle of the vector from the origin and the plane perpendicular to the horizon. For example, the angle between the vector and a reference horizon such as a horizontal plane through the centre of the large screen display.
  • the altitude is for example denoted as the angle beta, β; a minimal sketch converting the azimuth and altitude angles into a direction vector is given directly after this list.
  • a motion controller device 151 is also shown, in the form of a shooter that can be held in hand.
  • the shooter 151 shoots a (virtual) projectile 463 towards a target object 431, in this particular example a target object in the form of a virtual/digital object shown on the large screen 111.
  • the projectile 463 follows a trajectory 462 on the screen towards the target object 431 which corresponds with the trajectory 461 outside the screen 111; in particular the trajectory 462 is an extrapolation of the trajectory 461 from the shooter 151, which trajectory 461 corresponds with the longitudinal axis of the shooter, hence the direction in which it is aimed.
  • the trajectory is determined by determining both the position of the shooter 151 in respect of the screen, or in absolute sense within the scene (real world coordinates), as well as on the determined orientation of the shooter. On the basis of these two variables the system is able to determine if the shot is a miss or a hit.
  • in Fig. 3 an example of a suitable set-up of a professional interactive ride is disclosed, shown from a top view perspective.
  • Fig. 3 shows a scene 400 in which several objects or targets 431, 432 are located throughout the scene 400. These targets can be physical objects or virtual objects shown on a large screen like a display or by projection.
  • the users, in this example six in number, take place in a carriage 410, 420, 430, and each have their own hand held device 411, 412.
  • the carriage (automatically) moves over a track in a predetermined path 450 through the scene 400 and, as it moves, passes several (interactive) objects 431, 432.
  • targets/objects, beacons, users, etc. are indicated by reference numerals. Although only some of them are indicated, it is to be understood that several of these targets, beacons, users, etc. may exist in the scene.
  • the users can use the hand held device to shoot at the targets.
  • the orientation of the hand held devices is determined by the orientation determining units within the devices as described above. For a realistic shooting experience, and optimal interaction of the users with the scene, orientation alone is not enough.
  • in order to determine if each of the users aims and fires correctly, the positions of the users have to be determined as well, and at high speed, in order not to be biased by the movement of the users themselves.
  • the scene 400 is provided with a plurality of beacons 421, 422, 423, etc. These beacons can emit a radio frequency signal but preferably are arranged to receive a radio frequency signal, preferably in the UWB range, that is transmitted by the hand held devices 411, 412.
  • the distance between the beacons 421, 422, 423 and the hand held devices 411, 412 can be calculated.
  • the time of flight, converted to a distance, determines the radius from the beacon.
  • beacon 1, 421 can determine a certain time of flight between when the UWB RF signal was transmitted by the hand held device and the time it arrives at beacon 1, 421. This is translated into the radius 427. In the same way the radius 426 from beacon 2, 422 is determined, as well as the radius of the third beacon, and preferably of even more beacons.
  • the position at which the circles 426, 427, 428 cross each other corresponds to the position 440 determined as the position of the hand held device, and thus of the user.
  • This data is processed and transmitted to the server for further processing by the system and to manipulate aspects of the scene 400, for example to indicate a hit of the target by one of the users if both the position and direction (orientation) of the hand held device correspond with the target.
  • a scene 100 is illustrated in accordance with a first aspect of the invention.
  • the scene 100 is in this example an interactive stationary ride for multiple users, in which users 131, 132 can interact with and manipulate objects 141, 142, 143 in the virtual environment.
  • the interactive virtual environment illustrated in this example is a shooter game and in particular a game wherein multiple users 131, 132 can each manipulate the objects within the virtual environment on the basis of detected gestures, i.e. position, orientation, motion, of their body parts 151, 152.
  • the game illustrated here is a game wherein snowballs are fired by motion of the body parts 151, 152, for example as if they would start from the body or body part in particular.
  • the first user 131 manipulates the object 141 or objects 141, 142, 143 on the screen by moving his or her arm 151.
  • the movement, i.e. motion, defines a trajectory that corresponds with either the trajectory of the object moving on the screen, or with a projectile trajectory towards the object.
  • in one example the gesture is detected and used to manipulate the object, i.e. the target object, and in the other example the gesture is detected and used to manipulate a projectile towards the object, i.e. the target object.
  • there are two objects to be recognised, one being the target object, the other being the projectile, e.g. a snowball.
  • the trajectory of the projectile, e.g. the snowball, is shown on the large screen display 110, and in particular on the active part of the display 111, towards a first target object 141 of the plurality of objects 141, 142, 143 of the virtual environment.
  • the snowball trajectory is not only defined by relative movement, i.e. the movement of the body part in relation to the motion sensor 153, but also by the absolute movement, i.e. by the sum of the movement of the body part in relation to the motion sensor 153 and the position of the motion sensor 153 in relation to the large screen display 110 or particular objects, hence its absolute position within the scene 100.
  • Fig. 4 further shows a fan 191 which can be activated to create a flow of air, and a computer device 120.
  • the computer can also be comprised of a plurality of computers, for example a cluster of computers, whatever is sufficient to process all the data of the system.
  • the computer device 120 can be attached to an active large screen display 110, for example a large computer screen or large size television.
  • alternatively, the large screen display is a passive display and the actual image of the virtual environment on the screen is generated by a projector arrangement 160.
  • the projector will be connected with the computer device 120 via wired communication 122.
  • the present invention is not restricted to the form of communication, i.e. wired or wireless, and is applicable to both forms and implementations thereof.
  • the scene 100 of Fig. 4 further shows additional speakers 181, 182 which are arranged to add an audio component to the virtual environment.
  • the scene 100 of Fig. 4 shows, by way of example, a two-speaker set-up.
  • the invention, as the skilled person will understand, is not restricted to merely a two speaker stereo setup, and is applicable to all audio set-ups, i.e. mono, stereo, and surround sound setups.
  • Fig. 4 further shows a camera system 171 which can for example be used to record the users 131, 132 and to determine for example whether or not a pointing device 151, 152 should be enabled or disabled from/in the game if a user 131, 132 is detected who is operating the pointing device.
  • the camera system 171 can further be used to for example trigger the start of the game, upon detection of movement of any person within the scene or to recognise users, or to extract images of the users such that these images of the users can be used in the virtual environment, e.g. as avatars.
  • Known game controllers are only able to control the aspects of a game or virtual environment by relative movement.
  • the trajectory of the snowball displayed on the screen always starts from the same position. That position is for example a position at the bottom centre of the display, where an end part of a gun, or applicable here, a snowball cannon, is permanently displayed.
  • the user aims at a target somewhere in the screen and when the fire button or trigger is pushed, the snowball starts its trajectory in a straight line from the bottom centre of the display, at the end of the cannon, towards the aimed position on the display, hoping to hit the target, e.g. target object 141, if the aimed position and target object position 141 correspond.
  • the trajectory is different since the starting position of the snowball on the display is not static but corresponds to the relative position between the users 131, 132, or at least their body parts 151, 152, and the display 110. If the body part 151 is located at a certain distance away from the display, illustrated by a position on the Z depth axis of the large display screen 110 as illustrated in Fig. 4, but positioned at the centre of the display, i.e. in the centre of the width of the display, thus at the origin of the X horizontal axis of the large screen display 110 as illustrated in Fig. 4, the trajectory of the projectile starts at the middle of the display, from the bottom towards the aimed position on the display, e.g. target object 141.
  • if the body part 151 is located at the right side of the display 110, thus at a position away from the origin on the X horizontal axis, the trajectory of the snowball starts from the bottom right corner of the screen 110.
  • the motion detection system 153 is able to manipulate an object in a virtual environment, e.g. a trajectory of a snowball in a snowball shooter game or the target object, not only on the basis of the aimed direction but also on the basis of the relative position of the body part 151 in relation to the display 110.
  • the motion sensor device 153 is in particular arranged to define or calculate a trajectory of the projectile by a first, starting position of the body part, a second, end position of the body part and a trajectory defined from the first to the second position.
  • the example shown in Fig. 4 is an example wherein the scene 100 comprises a large screen display 110 connected to the computer device 120.
  • the objects 141, 142, 143 are shown as virtual objects within a virtual environment.
  • the invention however also relates to a scene wherein the objects are physical objects within the scene 100.
  • These objects can be target objects towards which the users should aim the projectiles. If the projectile hits the target object, the object interacts by for example an audio signal, a visual signal or the like.
  • the interaction can also be in the form of an additional score board or other device from which the user can determine if the shot was a hit or miss.
  • the computer device 120 to which the motion sensor device 153 is connected knows the position of each body part 151, 152 and knows the position of the display 110, for example by a predefined device location position value, stored locally or remotely.
  • one or both of the positions of the body parts 151, 152 and the display 110 are determined at certain (calibration) moments in time, for example when a carriage of a ride (not shown) enters the scene where the users 131, 132 can see the display 110 and can start the game.
  • the calibration methods of determining one of the body parts 151, 152 and display 110 positions (or both) can be performed on a real-time or near real-time basis.
  • the motion sensor device 153 is arranged to define or calculate a trajectory of the projectile; that trajectory is defined by the first position of the body part 151, the second position thereof and the trajectory from the first to the second position.
  • the computer device calculates a further virtual trajectory based on the trajectory thus determined.
  • the further virtual trajectory is thus an extrapolation of the determined trajectory. If the position of the display is somewhere within that further virtual trajectory, the projectile is shown on the display. If however, the extrapolated trajectory does not cross the display, the projectile is not shown on the display.
  • the computer device can determine, on the basis of the position of the display and the defined virtual trajectory, i.e. the extrapolated trajectory, how the projectile should be displayed.
  • if the trajectory on the display, i.e. the extrapolation of the trajectory defined by the gesture, via the virtual trajectory between the body part and the display, itself an extrapolation of the first, crosses the target on the display, the computer can count the "shot" as a "hit".
  • in Figs. 5 and 6 the different trajectories are shown.
  • the large screen display 110, with active area 111, shows (besides other elements of the game) two objects 141, 142 that can be manipulated. These are the targets that are aimed at by the two users 131, 132.
  • the first user 131 with first body part 151 is on the left side of the screen 110, thus, as shown in Fig. 4, on the left side of the X-axis. That first user 131 generates a trajectory 211-212-213 with his hand, i.e. the body part 151.
  • a first start point 211, a second end point 212 and the trajectory 213 from the first to the second define the trajectory.
  • the figure also shows an example of relative movement wherein the position of the hand 151 in relation to the display 110 is not relevant and is not determined.
  • the movement 382 of the hand from starting position 371 towards end position 372 defines a trajectory 373. That trajectory defines the control signal and thus the trajectory in which the object is moved on the screen, i.e. the trajectory 323 on the screen from start position 321 towards end position 322.
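As an aid to the reader only, and not as part of the claimed subject-matter, the sketch below illustrates how the azimuth and altitude angles of the spherical coordinate system described in the list above could be converted into a direction vector for the hand held unit. The axis convention (Z forward/backward towards the screen, X up/down, Y left/right) follows the one used in this description; the function name and the degree-based interface are assumptions made for this example.

```python
import math

def direction_from_angles(azimuth_deg: float, altitude_deg: float):
    """Convert azimuth (alpha) and altitude (beta), in degrees, into a unit
    direction vector (x, y, z) for the hand held unit.

    Assumed convention: Z points forward towards the screen, X is the
    vertical (up/down) axis and Y the horizontal (left/right) axis,
    matching the axes used elsewhere in this description.
    """
    a = math.radians(azimuth_deg)   # horizontal angle around the horizon
    b = math.radians(altitude_deg)  # angle above the horizontal plane
    x = math.sin(b)                 # up/down component
    y = math.cos(b) * math.sin(a)   # left/right component
    z = math.cos(b) * math.cos(a)   # component towards the screen
    return (x, y, z)

# Example: aiming 10 degrees to the side and 5 degrees upwards.
print(direction_from_angles(10.0, 5.0))
```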

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Environmental & Geological Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The present invention generally relates to an interactive system for interaction of multiple users with multiple objects within a scene. In particular the present invention relates to a professional interactive system for interaction of a plurality of users with at least one object within a scene, comprising at least one computer device, arranged for receiving and processing said control signals from said at least one human interface device, and for controlling manipulation of said at least one object within said scene, and at least one human interface device being a motion controller device, comprising at least one hand held unit and comprising an orientation determining unit for determining the orientation of the hand held unit, as well as a wireless transmitter unit for transmitting a radio frequency signal throughout said scene, wherein said radio frequency signal comprises identification information for identifying said human interface device from said plurality of human interface devices of said plurality of users, wherein said professional interactive system further comprises a plurality of beacons distributed over said scene, arranged for receiving said radio frequency signal from said wireless transmitter units of said motion controller devices and identifying said motion controller devices as well as determining said positions of said motion controller devices within said scene for interaction with said at least one object within said scene.

Description

Title: Game controller
Description
FIELD OF THE INVENTION
The present invention generally relates to an interactive system for interaction of multiple users with multiple objects within a scene. In particular the present invention relates to a motion controller device to be used to shoot virtual projectiles towards an object displayed on the display screen.
BACKGROUND OF THE INVENTION
Interactive systems are typically known as professional entertainment systems present in theme/amusement parks but also widely and increasingly used at other locations such as public entertainment environments, temporary game locations, at exhibitions or even at home or work locations. Most entertainment systems comprise a large display screen and are non-interactive and only display pre-programmed information, which users cannot manipulate or control.
The concept of an interactive system is that users can interact with the scene, and objects present in the scene or displayed on the large screen display can be manipulated. A dedicated device known as a game controller typically performs the manipulation. In home use, i.e. home entertainment systems with a small screen in a small and static environment, these controllers are for example gamepads, mice, or a pointing device such as a laser gun shooter.
Both home and professional entertainment systems are controlled by a processing device such as a dedicated central processing unit, i.e. a computer. The computer receives the input signals from the input devices, e.g. the gamepads, and processes these signals for controlling manipulation of certain physical objects in the scene or objects of the virtual environment shown on the display.
In professional environments such as in an amusement ride of an amusement/theme park, the number of players playing in the virtual environment is higher than in home environments. A central computer device, or a cluster of central computer devices, is used to calculate and process all the input and output signals of the system to enable a multi-user interactive environment.
Moreover, in professional environments, large numbers of players simultaneously interact with a plurality of objects, e.g. physical objects, digital/virtual objects, or both. The interaction takes place in an area that is far larger than the small space in front of the television in home environments.
Traditional entertainment systems are for example comprised of light gun type game controllers in the form of light guns that are able to manipulate objects within the scene, such as a projectile shot from a shooter, by relative movement thereof. Such a light gun consists of at least one button, e.g. a trigger, to launch a projectile, as well as means for determining the target location on the screen, e.g. a light signal.
Light guns are known from home console systems and are often modelled on a ballistic weapon such as a pistol. Different technical designs are known. For example a detection method is known which involves drawing frames in which each target is sequentially displayed in the form of white light after a full black frame is shown. Such a light gun is for example known from US4813682A. How it works is that in the frame after the full black frame, the target area is the only area on the screen that is white, and the rest of the screen remains black. A photodiode inside the light gun detects this change from low to bright light as well as the duration of the flash to identify a target and optionally multiple targets on screen.
A drawback of these light guns is that they require the use of old-fashioned cathode ray tube screens and cannot be used with for example modern thin-film transistor displays, liquid crystal displays or projection type screens.
More modern type light guns rely on one or several infrared or laser light emitters placed near the screen and one sensor in the gun. When the user presses the trigger of the gun certain values are sent toward the computer, such as the intensity of the infrared or laser light beam. Since the intensity depends on both the distance and the relative angle to the screen, angle sensors are located in the gun. Herewith a trigonometric equation system is solved, on the basis of which the point of impact is determined. For arcade game devices more simple variants are known in which no angle detector is present, but only four light detectors. These are however less accurate since calculation of the point of impact is difficult. Other variants are also known in which the sensors are located around the screen and emitters are located inside the gun.
Multiplayer environments can pose problems for known light gun game systems since the system needs to be able to handle all data from the controllers and be able to distinguish the different controllers from each other. For professional interactive systems such as the systems used in amusement parks, it is not uncommon to have more than 8 or even 16 simultaneous users playing a single game and moving around in a large scene. Home entertainment systems are not capable of handling and processing such high amounts of simultaneous inputs from all users, nor of handling and processing large scenes with a plurality of interactive objects divided over the scene, between which a plurality of users move around in the scene.
Moreover, known multiplayer game environments with pointing devices in the form of laser guns have the drawback that the implementation of a laser light is not only challenging from a technical point of view, due to calibration for example, but also because strict regulations must be met due to the use of laser light. Finally, the known laser guns have limited freedom since they are mostly mounted to a fixed position in the scene or to the carriage moving through the scene, since the system needs these positions to be a constant factor in order for the game to work correctly.
As such, there is a need for an improved professional interactive system wherein at least some of the above mentioned drawbacks are removed. More in particular, there is a need for an improved professional interactive system that can be used as a professional entertainment or infotainment system capable of processing high amounts of simultaneous users such that they can all interact with objects of the scene. Moreover, traditional pointing devices such as laser light guns have restrictions with respect to detection of only certain inputs by the user and limited freedom in using the gun.
SUMMARY OF THE INVENTION
It is an object of the present invention to provide an improved professional interactive system wherein at least some of the above mentioned drawbacks are removed.
It is a further object of the present invention to provide an improved professional interactive system that can be used as a professional entertainment or infotainment system capable of processing high amounts of simultaneous users such that they can all interact with objects of the scene.
Yet another object of the present invention is to provide an improved professional interactive system that can be used as a professional entertainment or infotainment system capable of processing high amounts of simultaneous users such that they can all interact with objects of the scene with an increased realistic experience for the users and a large degree of freedom of control.
In a first aspect of the invention there is provided a professional interactive system for interaction of a plurality of users with at least one object within a scene, said system comprising:
at least one human interface device per user of said plurality of users, arranged for generating control signals corresponding to a user input from said at least one user;
- at least one computer device, arranged for receiving and processing said control signals from said at least one human interface devices, and for controlling manipulation of said at least one object within said scene, characterized in that, said at least one human interface devices are motion controller devices, each comprising at least one hand held unit to be held by said at least one user and comprising an orientation determining unit for determining the orientation of the hand held unit, as well as a wireless transmitter unit for transmitting a radio frequency signal throughout said scene, wherein said radio frequency signal comprises identification information for identifying said human interface device from said plurality of human interface devices of said plurality of users, wherein said professional interactive system further comprises a plurality of beacons distributed over said scene, arranged for receiving said radio frequency signal from said wireless transmitter units of said motion controller devices and identifying said motion controller devices as well as determining said positions of said motion controller devices within said scene for interaction with said at least one object within said scene.
Known professional interactive systems for interaction of one or, more commonly, multiple simultaneous users are comprised of at least one computer device, one or more (large) display screens for displaying a virtual environment and at least one human interface device per user for manipulating objects within the virtual environment.
The computer device is the central device within the system, which consists of a single computer or a cluster of computers to perform all calculations, process all input control signals from the human interface devices and control the manipulation of the objects of the scene as a response to the input signals. The human interface device or devices are arranged to generate the input control signals for the computer device based on input of the user, and to communicate these control signals towards the computer device, which calculates in accordance herewith the movement of the object, for example on the display screen. The human interface device of the present invention is a motion controller device and in particular a motion controller device for which its position within the scene is determined by a radio frequency beacon based localisation principle.
The motion controller device is comprised of at least one unit that can be held in the hand of the user, i.e. a hand held unit, and of a controller unit. The system can determine both the actual position of each device within the scene as well as the orientation of each device. Preferably not only position and orientation can be determined but also other movement or position information such as altitude, and acceleration.
The orientation of the controller unit, i.e. the human interface device / the (hand held) motion controller device, is preferably determined within the hand held unit by an orientation determining unit. In particular the orientation determining unit is arranged to determine the orientation through any one or more of a (tri-axial) accelerometer, a (tri-axial) gyroscope and a (tri-axial) geomagnetic sensor. The person skilled in the art will appreciate that with these sensor(s) the orientation of the device can be determined. The orientation (and preferably also acceleration) is communicated, preferably wirelessly, towards the at least one computer device for processing and manipulation of aspects of the interactive system such as one or more of the object(s) in the scene.
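Purely by way of illustration of how such an orientation determining unit might combine its sensors, the sketch below shows a simple complementary filter that fuses gyroscope rates and accelerometer readings into pitch and roll estimates. The axis assignment, the filter coefficient, the sampling period and the function name are assumptions made for this example and are not prescribed by the invention.

```python
import math

def complementary_filter(pitch, roll, gyro_rates, accel, dt, alpha=0.98):
    """Fuse tri-axial gyroscope rates (rad/s) and accelerometer readings
    into updated pitch and roll angles (rad).

    pitch, roll : previous orientation estimate
    gyro_rates  : (gx, gy, gz) angular rates about X, Y, Z (assumed axes)
    accel       : (ax, ay, az) specific force, approximately gravity at rest
    dt          : sampling period in seconds
    alpha       : weight given to the integrated gyroscope estimate
    """
    gx, gy, gz = gyro_rates
    ax, ay, az = accel

    # Integrate gyroscope rates for a short-term, drift-prone estimate.
    pitch_gyro = pitch + gx * dt
    roll_gyro = roll + gy * dt

    # Derive a long-term, noisy estimate from the gravity vector.
    pitch_acc = math.atan2(ay, math.sqrt(ax * ax + az * az))
    roll_acc = math.atan2(-ax, az)

    # Blend the two estimates.
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    return pitch, roll

# Example: one 10 ms update starting from level, with a small rotation rate.
print(complementary_filter(0.0, 0.0, (0.1, 0.0, 0.0), (0.0, 0.0, 1.0), 0.01))
```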
The system is, as indicated, not only arranged to determine the orientation but also to determine the position of the hand held unit within the scene. To this end the system is comprised of a plurality of beacons. The beacons are distributed evenly over the scene or at specific positions along a path of the users, or of the users within the carriages, throughout the scene. The system, in particular the at least one computer device, is thus arranged to determine the position of the hand held units in the scene on the basis of both the beacons and a wireless transmitter, within the hand held unit, for transmitting and/or receiving wireless radio frequency signals.
The system is arranged to determine the position preferably on the basis of a difference in the signals from/towards the beacons, wherein the difference in particular is any one or more of a time of flight, two way ranging time of flight, double sided two way ranging, symmetrical double sided two way ranging, time of arrival, time difference of arrival, and signal power.
Determining (in real time) the actual location or position by a system based on any of these above mentioned techniques relies on the fact that the distance between the beacon and the hand held unit can be determined if the time between sending and receiving is known, given the fact that the speed of the radio frequency signal is a constant factor, i.e. 299792 km/s, and given the fact that the positions of the beacons (in the real world/actual positions) are known as well. When the travel time or time of flight is known, the distance can be calculated through D = V * t. Since a signal also loses power over time or over a distance upon travelling through the ether, the distance can also be determined by the inverse correlation between signal strength/power and distance.
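As a minimal numerical illustration of the relation D = V * t mentioned above (the constant, the function name and the example figures are chosen for this sketch only):

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # propagation speed of the RF signal

def distance_from_time_of_flight(tof_seconds: float) -> float:
    """Distance in metres covered by the RF signal in the given time of flight."""
    return SPEED_OF_LIGHT_M_PER_S * tof_seconds

# Example: a time of flight of 20 nanoseconds corresponds to roughly 6 metres.
print(distance_from_time_of_flight(20e-9))
```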
If the distance between one of the beacons and the hand held device is known, one can conclude that its position is anywhere on the circumference of a circle having a radius corresponding to the determined distance. If accordingly the distance between the hand held device and multiple beacons is determined, multiple circles can be defined, in which the position of the hand held device lies at the point in which these circles cross each other. The more beacons, the more distances, the more circles, the more precisely one can determine its position. Moreover, with three beacons one could determine the position and a fourth or further beacon can be used to determine if errors are made, and also to further increase accuracy or correct for these errors.
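The crossing-circles principle described above can be illustrated, again only as a sketch under assumptions, by a small two-dimensional trilateration routine: with at least three beacons the circle equations reduce to a linear system whose least-squares solution is the estimated position, and any additional beacon simply adds a row that improves accuracy or exposes errors. The beacon coordinates, the NumPy-based formulation and the function name are choices made for this example.

```python
import numpy as np

def trilaterate(beacons, distances):
    """Estimate the 2D position of a hand held device from beacon positions
    and measured distances (e.g. derived from UWB time of flight).

    beacons   : list of (x, y) beacon positions, at least three
    distances : list of measured distances to those beacons
    """
    beacons = np.asarray(beacons, dtype=float)
    d = np.asarray(distances, dtype=float)

    # Subtracting the first circle equation from the others gives a linear
    # system A @ p = b in the unknown position p = (x, y).
    x0, y0 = beacons[0]
    A = 2.0 * (beacons[1:] - beacons[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(beacons[1:] ** 2, axis=1) - (x0 ** 2 + y0 ** 2))

    # Least-squares solution; extra beacons improve accuracy and expose errors.
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example with three beacons and a device at roughly (2, 3).
print(trilaterate([(0, 0), (10, 0), (0, 10)], [3.606, 8.544, 7.280]))
```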
The beacons and the hand held device communicate either in a one way direction, in which the hand held device is preferably the transmitter and the beacons the receivers, but it could also be the other way round or both could transmit and receive. The first is preferred however since the hand held unit then only needs to transmit a single radio frequency, RF, pulse and thus less power is consumed. It is also preferred that both the beacons and the hand held device have exact internal clocks, e.g. which are synchronised. Alternatively, only the beacon is provided with an exact internal clock.
Preferably the beacons and the hand held device communicate over an ultra wideband RF frequency and more preferably one or multiple channels of an ultra wideband RF frequency according to the IEEE standard 802.15.4, more in particular IEEE 802.15.4-2011 or IEEE 802.15.4-2015. These ultra wideband, UWB, frequencies have the advantage of a high bandwidth/throughput, and low sensitivity to interference by and on WiFi signals. Moreover, due to the short pulses, UWB is not susceptible to fading by reflections.
The motion controller device can in its most simple example be arranged to detect the orientation of the hand held unit, and preferably also the position in respect thereof. Even more preferably, the motion controller device can detect six degrees of freedom, which refers to the freedom of movement of the hand held unit in three-dimensional space. Specifically, the movement of the hand held unit can be detected when moving forward/backward (Z), up/down (X), left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll.
The motion controller device is a device for controlling movement of one or more objects in the scene. The motion controller device determines an orientation of the hand held unit and generates a corresponding control signal thereof, which control signal is sent to the computer device for further processing. This control signal thus at least comprises the orientation of the hand held unit, which corresponds to a direction in which the hand held unit is pointed.
In for example a shooter game, a projectile can be fired by a human interface device such as a gun or the like, the moment of firing, i.e. the trigger, and the trajectory of the projectile are determined by the control signal and hence, by a trigger of the device and movement defined by two values in two perpendicular planes.
In known systems for interaction of multiple users with a virtual environment wherein the human interface device is a laser gun, the gun emits a particle beam in the form of a non-visible light, e.g. a laser light, which is aimed at the large screen display. One or more sensors or cameras are arranged to sense the non-visible light and determine the position aimed at. The computer then determines if the aimed position and the actual position of the target on the screen correspond and if thus the shot was a hit or miss.
The invention is based on the insight that the objects in the scene can be manipulated not only by orientation of the human interface device but also by the position of the human interface device in respect of the object. The orientation and position of the human interface device, in this case the motion controller device, and hence the hand held unit thereof, is used to define a trajectory. In case of for example a shooter game wherein the hand held unit is a gun for firing a projectile in a trajectory towards a physical target object in the scene or a virtual target object shown on a large displays screen, the position of the gun, hence the hand held unit, in respect of the object and the orientation of the gun, thus the direction in which the gun is aimed, determines the trajectory of the projectile and hence if upon firing a fire button or the like, the target object is hit or missed.
Since the computer device can determine or calculate the (relative) position of the hand held device in respect of the object, or even the actual or absolute position of the controller within the scene as well as the orientation of the hand held device, the computer can calculate an extrapolated trajectory thereof and determine if the target object is in that extrapolated trajectory and thus if the shot was a hit or a miss.
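A minimal sketch of such a hit-or-miss decision is given below: the determined position and orientation of the hand held unit define a ray, the ray is extrapolated, and the shot counts as a hit when it passes within a chosen tolerance of the target object. The vector layout, the tolerance parameter and the function name are assumptions made for this example, not features of the invention.

```python
import numpy as np

def is_hit(controller_pos, aim_direction, target_pos, tolerance=0.25):
    """Decide whether a shot fired from controller_pos along aim_direction
    would hit a target at target_pos (all 3D coordinates in scene units).

    The trajectory is extrapolated as a straight ray; the shot is a hit when
    the perpendicular distance from the target to that ray is within the
    tolerance and the target lies in front of the shooter."""
    p = np.asarray(controller_pos, dtype=float)
    d = np.asarray(aim_direction, dtype=float)
    d = d / np.linalg.norm(d)                # unit aiming vector
    t = np.asarray(target_pos, dtype=float)

    along = np.dot(t - p, d)                 # distance along the ray
    if along < 0.0:
        return False                         # target is behind the shooter
    closest = p + along * d                  # closest point on the ray
    return np.linalg.norm(t - closest) <= tolerance

# Example: controller at the origin aiming along +Z, target slightly off-axis.
print(is_hit((0, 0, 0), (0, 0, 1), (0.1, 0.0, 5.0)))  # True
```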
In all examples of the invention, the computer device to which the motion controller device is connected knows or can determine or calculate the position of the motion controller device and knows the position of the object, for example by a predefined object location position and predefined motion controller device position location stored in the computer or remote. In an alternative example, one or both of the positions of the motion controller device and the object are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the object and can start the game. As a second alternative, the calibration methods of determining one of the motion controller devices and object positions (or both), can be performed on a real-time or near real-time basis.
With such a professional interactive system according to the invention, not only can the simultaneous precise orientation of a large amount of hand held human interface devices be determined, but at the same time their exact location can be determined as well. All of this can be done in a large scene, with several devices, i.e. for a large amount of simultaneous players/users, which move around through the scene for example in a train of carriages. With such human interface devices according to the invention there is a large amount of freedom, since they can be used throughout the scene and are not fixed at a certain location in the scene or fixed to an object moving through the scene like the carriages. By using RF based location determining units, the orientation data of the devices can preferably also be contained in the RF communication for the location determination.
In an example the manipulation of the at least one object within the scene is controlled if the at least one object is located in a trajectory corresponding to the determined orientation and a determined position of the motion controller device in respect of the at least one object.
With one example according to the invention the motion controller device is arranged to manipulate an object in a relative manner, i.e. if the hand held unit of the motion controller device is moved forward, the object moves forward as well for example. This example is in particular useful when using a large display screen on which the object can be shown and manipulated, e.g. moved.
In another preferred example according to the invention the object is however controlled in a different manner more in accordance with a shooter game. If the object is for example a target object within the shooter game, the motion controller device can act as a gun. When the gun, i.e. the hand held unit of the motion controller, is aimed at the object, the shot can be counted as a hit. To determine if the object is correctly aimed at, the position of the motion controller device, and in particular the hand held unit is determined by the computer, such in relation to the object. Moreover, the orientation of the hand held unit is determined and both the position and orientation define a trajectory. If the object is within that trajectory, the shot is counted as a hit.
In another example the at least one object within the scene is a physical object, and in another example the at least one object within the scene is a virtual object within a virtual environment displayed on at least one display screen.
In another example the manipulation comprises providing any one or more of the group consisting of: a visual signal, an audio signal, change of position of said at least one object, change of appearance of said at least one object.
As indicated, the object can be a physical object located somewhere within the scene, or a virtual object displayed on a display screen. In the first option, the manipulation of the object is to be understood as for example an audio signal or a visual signal such as a blinking light by which the object indicates that it has been hit. In the other option, of a virtual object, manipulation is to be understood in a broad sense. The virtual object can for example explode, move, fall over or whatever is suitable in the game setting in which it is used. In another example the at least one computer device is arranged to display a projectile on the at least one display screen in a projectile trajectory defined by the determined orientation and position of the at least one hand held device.
In yet another example of a shooter game, the object can be a target object and a further object can be displayed on the display screen in the form of a projectile that is launched towards the target object. That projectile is displayed on the screen in a trajectory that corresponds to the orientation and position of the hand held device.
In another example the at least one computer device is arranged to display the projectile on the at least one display screen in a projectile trajectory defined by an extrapolation of a trajectory defined by the determined orientation and position and a determined movement of the at least one hand held device.
In the first example, the trajectory of the projectile is only determined by the position and orientation of the hand held device, which corresponds to a motion controller device which points (or aims) towards a target object. If besides the position and orientation also the movement of the hand held device is determined, a trajectory of that movement can be determined and used to define an extrapolation of that trajectory as the extrapolated trajectory of the projectile. This way the projectile, for example a snowball in a snowball shooter game, can also travel in non-straight lines. A curvature in the trajectory of the hand held device could be of influence on the trajectory of the projectile.
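One conceivable way to let such a curved movement influence the projectile, sketched here purely as an illustration, is to fit a low-order polynomial to the sampled positions of the hand held device and extrapolate it forward in time. The sampling format, the quadratic fit and the function name are assumptions made for this example.

```python
import numpy as np

def extrapolate_trajectory(times, positions, future_times):
    """Fit a quadratic to sampled 3D positions of the hand held device and
    evaluate it at later time stamps, yielding an extrapolated, possibly
    curved, projectile trajectory.

    times        : 1D array of sample times (s)
    positions    : array of shape (N, 3) with sampled (x, y, z) positions
    future_times : 1D array of times at which to extrapolate
    """
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)
    future_times = np.asarray(future_times, dtype=float)

    # One quadratic per axis captures a simple curvature of the gesture.
    coeffs = [np.polyfit(times, positions[:, axis], deg=2) for axis in range(3)]
    return np.stack([np.polyval(c, future_times) for c in coeffs], axis=1)

# Example: a slightly curving gesture sampled over 0.3 s, extrapolated to 1 s.
t = [0.0, 0.1, 0.2, 0.3]
p = [[0, 0, 0], [0.01, 0.05, 0.1], [0.04, 0.1, 0.2], [0.09, 0.15, 0.3]]
print(extrapolate_trajectory(t, p, [0.5, 1.0]))
```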
In an example the control signal comprises a trigger signal for triggering manipulation of the at least one object and wherein the trigger signal is in particular generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
The control signals comprise the orientation of the hand held unit and the position thereof or of the controller device, such in respect of the object or the large display screen. The control signals can further comprise a trigger signal to trigger the firing of the projectile. A separate trigger unit can do this, or a trigger unit on the hand held device, e.g. a gun trigger, fire button or pull cord.
In an example the motion controller device comprises at least two hand held devices to be held by two hands of the at least one user and a single controller device. One single controller device can control multiple hand held devices, for example two devices, one for each hand of a single user, or one for each user.
In an example the controller unit is further arranged for detecting at least one of pitch, yaw and roll movement of the at least one hand held unit in respect of the controller unit and for generating the control signals thereon for movement of the at least one object within the virtual environment in a trajectory corresponding to the at least one of pitch, yaw and roll movement.
The motion controller device can detect orientation through the orientation determining unit and preferably also motion of the hand held unit wherein the controller unit can detect six degrees of freedom which refers to the freedom of movement of the hand held unit in three-dimensional space. Specifically, the movement of the hand held unit can be detected when moving forward/backward (Z), up/down (X) , left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll.
In an example the at least one computer device is arranged to determine the position of the motion controller device by receiving an identification value comprised in the wireless communication, i.e. the RF signal, towards the at least one computer, determining a corresponding pre-defined position of each beacon stored within a memory of the at least one computer device, and calculating the distance between the hand held devices and the beacons and thus the actual position.
The system preferably comprises multiple motion controller devices, for example 1 per user, or 1 per 2 users or 1 per 4 or 8 users, etc. The motion controller can determine the orientation of the hand held unit and the computer device can calculate the trajectory of the projectile thereof by determining the position of the motion controller device, for example by a position table stored in a memory of the computer device wherein each motion controller device is identified by a unique identification value and a corresponding position in the scene, i.e. relative to the large screen or the physical object. When the computer device receives control signals from the motion controller device it can distinguish each motion controller device by an identification value comprised in the control signals. Then the computer device can access the memory to determine which position belongs to that motion controller device. In an example the at least one computer device is arranged to determine the position of the motion controller device by a visual and/or RF tracking system.
In an alternative manner the computer device can also determine the position of each motion controller device in respect of the large display screen or the physical object by visual recognition of the motion controller device, or in particular of the hand held unit.
The computer device to which the motion controllers are connected preferably knows the position of the controller and knows the position of the display or the physical objects, for example by a predefined display location position and predefined controller position location stored in the computer or remote mapped according to real world coordinates. In an alternative example, one or both of the positions of the controller and the display or objects are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the display and can start the game. As a second alternative, the calibration methods of determining one of the controller and display or object positions (or both) can be performed on a real-time or near real-time basis.
In an example the system is arranged for stereoscopy, and in particular, wherein the system comprises multiple 3D glasses for the at least one user, and wherein the display screen is arranged to display the virtual environment with a depth perspective.
As indicated, in a simple embodiment according to all examples of the invention, the hand held unit defines a trajectory comprising an x-position, y-position and orientation, hence a two dimensional coordinate system used for manipulation of the object(s) in the virtual environment. In a more realistic embodiment, more variables manipulate the movement of the objects. An example thereof is that the speed at which the hand held unit is moved can correspond to the speed variables of the objects within the virtual environment.
In another example more dimensions are added. Thus, not only the X and Y dimensions, but also the Z dimension, e.g. Z position, being the distance from display to controller, is a variable in manipulating the object (projectile). Multiple dimensions based input determination is in particular more realistic if a three-dimensional representation is used. Preferably, the motion controller device is arranged not only for X, Y, and Z position and movement, i.e. motion, but also for so called six degrees of freedom motion detection, which refers to the freedom of movement of a body part in three-dimensional space. Specifically, the body is free to move forward/backward (Z), up/down (X), left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll. As such, the system, i.e. computer, and display are arranged for 3D. The trajectory of the projectile can then be displayed as a 3D representation of the projectile starting from the hand held unit, in an extrapolated trajectory towards a certain target location on the large screen, all in 3D. The person skilled in the art will understand what 3D display methods are applicable, such as a stereoscopic projector, large screen 3D display, etc. The system can also be arranged for both 2D and 3D representation, by use of a 3D capable display or projector that can switch between 2D and 3D, for example by turning off a single lamp in the projector.
In an example the motion controller device is arranged for wireless communication with the at least one computer via one or more of the group comprising Bluetooth, Wi-Fi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN.
The computer is in communicative connection with the motion controller device, either by wire or wireless. The person skilled in the art will understand which wired or wireless communication standards are applicable, e.g. peer-to-peer, Ethernet, 802.11 wireless, 802.15 wireless PAN, Bluetooth, IEEE 802.3, Zigbee, RS422, RS485 and CAN.
In an example the system comprises at least 2 motion controller devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
In an example each of the motion controller devices is arranged to generate the control signals on position and orientation of the hand held unit of at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
In an example the virtual environment is a virtual environment of a shooter game, and wherein the at least one object is a projectile.
The examples of the present invention can be applied in a plurality of different scenes and virtual environments. The preferred scenes or virtual environments are however game environments, and in particular shooter games. The projectile of such a shooter game is thus the projectile used in the shooter. That could be a bullet of a gun, arrow of a bow, a javelin, a spear, etc. This projectile can also be a snowball from a snowball cannon in a snowball shooter game as available from the applicant of the present invention.
In a second aspect there is provided a motion controller device arranged to be used as a human interface device in a system for interaction of multiple users with a virtual environment according to any of the previous descriptions.
In a third aspect there is provided a computer device arranged to be used as a computer device in a system for interaction of multiple users with a virtual environment according to any of the previous descriptions.
Throughout the description of the invention the term computer device is not to be restricted to one single physical computer; the single computer is to be understood as merely an example. The one or more computers could be a physical desktop or preferably a server computer, or, more preferred, a plurality of physical or virtual servers.
In a fourth aspect there is provided an interactive amusement ride comprising a scene and a system for interaction of at least one user with a virtual environment according to any of the previous descriptions, and wherein the interactive amusement ride in particular comprises a track and at least one car or carriage for moving users over the track, wherein the scene, and in particular the car or carriage, comprises a plurality of motion controller devices.
In the invention the virtual environment is comprised of a large screen display. Such is, however, only by way of example. The invention also applies to static displays, i.e. a scene in which the physical objects are the targets of the shooter game and the system is comprised of one or more computers, one or more motion controller devices and one or more physical target objects positioned somewhere in the scene. The trajectory of the projectile is thus not displayed on a screen, due to the absence of the screen, but the system is still arranged to determine a virtual trajectory as an extrapolation of the trajectory defined by the position and orientation of the hand held unit. The system, i.e. the computer thereof, then determines the extrapolation of the trajectory defined by the position and orientation of the hand held unit, and the position of the physical target object, and then calculates whether or not the projectile hits or misses the target object. The user, i.e. player, is informed of a hit or miss by visual and/or audio and/or force feedback information on the motion controller device, e.g. the hand held unit or the controller unit, or an additional device such as a scoreboard, speaker, etc.
In a fifth aspect of the invention there is provided a professional interactive system for interaction of at least one user with at least one object within a scene, the system comprising:
at least one human interface device arranged for generating control signals corresponding to a user input from the at least one user;
at least one computer device arranged for receiving and processing the control signals from the at least one human interface device, and for controlling manipulation of the at least one object within the scene, characterized in that, the at least one human interface device is an optical motion sensor controller device arranged for determining motion and generating the control signals for controlling the manipulation of the at least one object within the scene in correspondence to the determined motion, and wherein the manipulation of the at least one object within the scene is further controlled by a determined position of the motion controller device in respect of the at least one object.
Known professional interactive systems for interaction of one or, mostly a plurality of simultaneous users are comprised of at least one computer device, one or more (large) screen displays for displaying a virtual environment and at least one human interface device per user for manipulating objects within the virtual environment.
The computer device is the central device within the system, which consists of a single computer or a cluster of computers to perform all calculations, process all input control signals from the human interface devices and control manipulation of the objects in the scene as a response to the input signals.
The human interface device or devices are arranged to generate the input control signals for the computer device based on input of the user and to communicate these control signals towards the computer device, which calculates in accordance therewith the movement of the object, for example on the display screen. The human interface device of the present invention is a motion controller device, and in particular an optical motion controller device, which motion controller device is arranged to generate the control signals on position and movement of at least one body part of the user. The motion controller is arranged to generate a depth map of the image, i.e. the scene with the user(s). This can be done in several manners, such as by imaging, e.g. optical motion detection, or by radio frequency signal reflection, e.g. RF motion detection or laser motion detection. In the present invention reference is made to an optical motion controller, however only by way of example. The invention is not restricted to such optical motion detection only, but is also applicable for radio frequency or laser motion detection, such as in all examples demonstrated below.
Optical motion detection can for example be performed by capturing the scene and determining differences in depth within the scene on the basis of differences in colour and/or texture. This way objects can be distinguished from their backgrounds and depths can be determined. An alternative method of distinguishing objects and determining depth perspectives is by transmitting light, e.g. invisible (near-)infrared light (or laser light, or even RF signals), towards the scene and its objects. Then a camera or other type of RF or laser sensor can measure the time-of-flight of the light after it reflects off the objects. Since some objects are further away than others, the time-of-flight of the light reflecting from these objects is larger, hence, depth can be determined.
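As a minimal sketch of the time-of-flight relation just described, assuming the sensor can measure the round-trip time of the emitted pulse (names and values are illustrative only):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_depth(round_trip_time_s):
    """Distance to a reflecting object from the measured round-trip time of the
    emitted (near-)infrared pulse; the light travels out and back, hence the /2."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A reflection arriving 20 ns after emission corresponds to a surface roughly 3 m away.
print(tof_depth(20e-9))
```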
In yet another alternative manner the (optical) motion controller can be arranged to transmit a light pattern to the scene, i.e. illuminate the scene with a pattern. This pattern is comprised of elements that can be individually distinguished. This can be done in several manners, as long as the elements of the pattern can be individually distinguished. Thus each element is unique in one way or the other. This uniqueness can be determined by the shape of the element, e.g. unique characters can be used that all have different shapes, or unique colours can be used wherein each element has the same shape but a unique colour, or it can be determined by its orientation or by the positioning of the element in respect of other elements. Thus examples of such patterns are an image having plural unique characters, plural dots each in a unique colour, or dots with a "random" but unique position within the picture (such as the stars at night: each star appears in the sky as a dot and most dots are alike, however, due to the position of a star in respect of other stars one can distinguish the individual stars).
By illuminating the scene and the objects therein, e.g. the user(s) and their body parts, other objects and background elements are all constantly illuminated with this pattern of uniquely identifiable elements such as dots, characters or coloured dots. Then the difference between the observed and expected element positions can be determined by a camera placed at an offset relative to the light source, i.e. a (near-)infrared transmitter. The difference can be used to calculate the depth at each element.
The camera can also be a stereoscopic camera setup wherein two cameras are placed at an offset relative to each other and at an offset relative to the infrared transmitter. Both can distinguish each element transmitted by the infrared transmitter, and on the basis of triangulation the depth can be calculated for each object of the scene, i.e. for each element that is illuminated thereon.
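For a calibrated stereoscopic pair, this triangulation reduces to the well-known disparity relation depth = focal length × baseline / disparity. The sketch below assumes a simple pinhole model with the focal length expressed in pixels; it is an illustration of the principle, not a prescribed implementation.

```python
def triangulated_depth(focal_length_px, baseline_m, disparity_px):
    """Depth of a pattern element seen by two cameras placed at a known baseline.
    The same element appears shifted (the disparity) between the two images;
    nearby elements shift more than distant ones."""
    if disparity_px <= 0:
        raise ValueError("element must be visible in both images with positive disparity")
    return focal_length_px * baseline_m / disparity_px

# Example: 600 px focal length, 7.5 cm baseline, 15 px disparity -> 3 m depth.
print(triangulated_depth(600, 0.075, 15))
```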
The depth information is used to generate a depth map of the scene, which in turn is used to calculate a skeletal model of any objects of interest within the scene, which preferably are the users. Each dot, e.g. illuminated element, of the skeletal model is mapped in a large database to determine a body part and/or to determine if it belongs to the background of the scene. Then the dots of the skeletal model are grouped, i.e. clustered to these body parts such that body parts of the users can be determined from the scene and motion tracking of these body parts, i.e. movements of the clusters of the skeletal model from one captured frame (video still) to another, can be determined.
In this manner motion tracking of the user and in particular of one or more body parts of the user within the scene is accomplished. That motion tracking is used to control elements of the scene such as objects physically present therein, or objects shown within a virtual environment on a display screen. For example, the virtual environment can be an interactive shooter game wherein users can shoot a projectile towards a target displayed on the large screen. The motion captured body part movement determines the trajectory of the projectile. The determined body part movement of the user, i.e. determined gesture defines a trajectory. The object within the virtual environment then moves within the virtual environment in a corresponding trajectory.
Known gesture-controlled game control, however, is only able to control by relative movement. Thus in a shooter game, wherein a projectile is fired, the trajectory of the projectile displayed on the screen always starts from the same position. That position is for example a position at the bottom centre of the display, where an end part of the gun or the like is permanently displayed. The user aims at a target somewhere on the screen and when the fire button or trigger is pushed, the projectile starts its trajectory in a straight line from the bottom centre of the display towards the aimed position on the display, hoping to hit the target if the aimed position and target position correspond. The trajectory of the gesture then defines the direction and trajectory of the projectile on the large screen, however always starting from the same initial starting point, e.g. from the bottom centre of the large screen where the end part of the gun is displayed.
Such could be sufficient in a single user environment wherein only one user interacts within the virtual environment but in a professional environment such as an amusement park ride wherein plural users simultaneously interact with the virtual environment this is insufficient. In case of for example 8 users, these will all fire projectiles either from the same end part of the gun at the bottom centre of the large screen, or 8 different guns have to be displayed, or the large screen has to be divided in 8 individual small screens, each displaying an end part of a gun.
The invention is based on the insight that, in order to really satisfy the need for interaction of multiple simultaneous users with a scene or virtual environment with an increased realistic experience for the users and a large degree of freedom of control, the position of each user in the scene has to be determined and used to define the manipulation of the objects in the scene, and in particular a trajectory of a projectile in the virtual environment starting from the position of the user towards the aimed position.
In for example an amusement park ride a carriage can ride over a track through a scene wherein the users in the carriage can interact with several objects physically placed within the scene or displayed as virtual objects within the virtual environment displayed on a large screen in the scene. Each user then has a different position within the scene and with respect to the object (physical or virtual). Thus when a user sitting in the carriage at a position on the left side of the object fires a projectile by making a gesture with a body part, the gesture will define the trajectory towards the object, or in particular the trajectory displayed on the screen, which does not start from the bottom centre of the screen but from the left side of the screen, since this is the position of the user relative to the screen. Thus the trajectory of the projectile on the screen is an extrapolation of the trajectory defined by the gesture, which preferably comprises the start point of the gesture, the end point of the gesture, the trajectory of the gesture from start to end point and the relative position of the body part making the gesture relative to the large display screen. Since all users have different positions in the scene, i.e. relative to the large display screen, each projectile trajectory on the screen starts from a different point, by which the projectiles from the individual users are manipulated, i.e. moved, in a realistic manner and each user can be distinguished.
In an example the manipulation of the at least one object within the scene is controlled if the at least one object is located in a trajectory defined as an extrapolation of a trajectory defined by the determined position and movement of the at least one body part of the user.
With one example according to the invention the optical motion controller is arranged to manipulate an object in a relative manner, i.e. if the motion controller determines that the body part of the user is moved forward, the object moves forward as well for example. This example is in particular useful when using a large display screen on which the object can be shown and manipulated, e.g. moved.
In another example according to the invention the object is however controlled in a different manner more in accordance with a shooter game. If the object is for example a target object within the shooter game, the body part can act as a gun. When the gun, as determined by the optical motion controller, is aimed at the object, e.g. moved towards the direction of the object, the shot can be counted as a hit. To determine if the object is correctly aimed at, from the movement of the body part a trajectory is determined. From that trajectory an extrapolated projectile trajectory is determined and the computer calculates if the object is within that projectile trajectory. If that is the case, the shot can be considered a hit.
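A minimal sketch of such a hit test is given below. It assumes that the start and end points of the detected gesture and the target position are available in one common scene coordinate frame and that the target can be approximated by a sphere with a given radius; all names and the tolerance are hypothetical.

```python
import math

def is_hit(gesture_start, gesture_end, target_pos, target_radius):
    """Decide whether a target lies on the extrapolated projectile trajectory.

    The gesture of the body part defines a segment from gesture_start to gesture_end;
    the projectile trajectory is its straight-line extrapolation beyond the end point.
    The shot counts as a hit when the target centre lies within target_radius of that
    ray. All points are (x, y, z) tuples in the same scene coordinate frame.
    """
    d = [e - s for s, e in zip(gesture_start, gesture_end)]
    length = math.sqrt(sum(c * c for c in d))
    if length == 0:
        return False
    d = [c / length for c in d]                       # unit direction of the trajectory
    v = [t - s for s, t in zip(gesture_start, target_pos)]
    along = sum(a * b for a, b in zip(v, d))          # distance of the target along the ray
    if along <= 0:
        return False                                  # target is behind the shooter
    closest = [s + along * c for s, c in zip(gesture_start, d)]
    miss = math.sqrt(sum((t - c) ** 2 for t, c in zip(target_pos, closest)))
    return miss <= target_radius

print(is_hit((0, 1, 3), (0.1, 1.05, 2.8), (1.3, 1.6, 0.4), 0.5))
```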
In an example the at least one object within the scene is a physical object.
In an example the at least one object within the scene is a virtual object within a virtual environment displayed on at least one display screen.
In an example the manipulation comprises providing any one or more of the group consisting of: a visual signal, an audio signal, change of position of the at least one object, change of appearance of the at least one object.
As indicated, the object can be a physical object located somewhere within the scene, or a virtual object displayed on a display screen. In the first option, the manipulation of the object is to be understood as for example an audio signal or a visual signal, such as a blinking light by which the object indicates that it has been hit. In the other option, of a virtual object, manipulation is to be understood in a broad sense. The virtual object can for example explode, move, fall over or whatever is suitable in the game setting in which it is used.
In an example the at least one computer device is arranged to display a projectile on the at least one display screen in a projectile trajectory defined by the determined position and movement of the at least one body part of the user.
In yet another example of a shooter game, the object can be a target object and a further object can be displayed on the display screen in the form of a projectile that is launched towards the target object. That projectile is displayed on the screen in a trajectory that corresponds to the position and movement of the body part.
In an example the control signal comprises a trigger signal for triggering manipulation of the at least one object, in particular generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
The control signals comprise the movement of the body part and the position of the motion controller in respect of the object or the large display screen. The control signals can further comprise a trigger signal to trigger the firing of the projectile. This can be done by a separate trigger unit, by a trigger unit on a hand held device, e.g. a gun trigger, fire button or pull cord, or by a particular movement of the body or a body part that is pre-defined as being the trigger gesture.
In yet another example of a shooter game, the object can be a target object and a further object can be displayed on the display screen in the form of a projectile that is launched towards the target object. That projectile is displayed on the screen in a trajectory that corresponds to the movement of the body part.
In an example the at least one computer device is arranged to determine the position of the at least one body part by determining a position of the optical motion controller device in respect of the object, by receiving an identification value comprised in the control signals and determining a corresponding pre-defined position stored within a memory of the at least one computer device.
The system preferably comprises multiple optical motion controller devices, for example 1 per user, or 1 per 2 users, or 1 per 4 or 8 users, etc. The optical motion controller can determine movement of a body part or multiple body parts and the computer device can calculate the trajectory of the projectile thereof by determining the position of the optical motion controller device, for example by a position table stored in a memory of the computer device wherein each motion controller device is identified by a unique identification value and a corresponding position in the scene, i.e. relative to the large screen or the physical object. When the computer device receives control signals from the optical motion controller device it can distinguish each optical motion controller device by an identification value comprised in the control signals. Then the computer device can access the memory to determine which position belongs to that motion controller device.
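A minimal sketch of such a position table is given below; the identification value is assumed to arrive under a hypothetical "id" key in the control signals, and the stored coordinates are invented for illustration.

```python
# Hypothetical pre-defined position table: each motion controller device is identified
# by the identification value it sends in its control signals, and its position in the
# scene (relative to the large screen) is stored in a memory of the computer device.
CONTROLLER_POSITIONS = {
    0x01: (-1.5, 1.1, 3.0),   # seat on the left of the carriage
    0x02: (-0.5, 1.1, 3.0),
    0x03: ( 0.5, 1.1, 3.0),
    0x04: ( 1.5, 1.1, 3.0),   # seat on the right of the carriage
}

def controller_position(control_signal):
    """Look up the pre-defined scene position of the controller that sent the signal."""
    return CONTROLLER_POSITIONS.get(control_signal["id"])

print(controller_position({"id": 0x03, "gesture": "fire"}))
```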
In an example the at least one computer device is arranged to determine the position of the at least one body part by determining a position of the optical motion controller device in respect of the object and a position of the at least one body part of the user in respect of the optical motion controller device.
If the computer device can determine the relative position of each optical motion controller device in respect of the large display screen or the physical object, and each optical motion controller device can determine the relative position of the body part in respect of the controller, the computer can determine the relative position of the user and the body part in respect of the large display screen or the object on the basis of the sum of both positions.
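Assuming both positions are expressed in one aligned coordinate frame (i.e. without rotation between the controller frame and the scene frame), this sum reduces to a plain vector addition, as sketched below with hypothetical values.

```python
def body_part_scene_position(controller_pos, body_part_offset):
    """Absolute position of the body part relative to the screen: the position of the
    optical motion controller in the scene plus the position of the body part as
    reported by that controller (both in the same, aligned coordinate frame)."""
    return tuple(c + o for c, o in zip(controller_pos, body_part_offset))

# Controller mounted 3 m in front of the screen; the user's hand is 0.8 m to its left.
print(body_part_scene_position((0.0, 1.2, 3.0), (-0.8, 0.1, -0.4)))
```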
In an example the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the optical motion controller device.
In an example the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the user.
In an example the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the body part of the user.
In an alternative, the computer device can also determine the position by visual and/or RF recognition of the optical motion controller device or in particular of the user or the body part thereof. This has the advantage that the optical motion controller device does not have to be stationary and can be moved in the scene since the computer device can determine current positions of each motion controller device at any time.
The computer device to which the optical motion controllers are connected preferably knows the position of the individual controllers and knows the position of the display or the physical objects, for example by a predefined display location position and a predefined controller location position stored in the computer or remotely. In an alternative example, one or both of the positions of the controller and the display or objects are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the display and can start the game. As a second alternative, the calibration methods of determining one or both of the controller and display or object positions can be performed on a real-time or near real-time basis.
In an example the interactive system is arranged for stereoscopy, and in particular, wherein the system comprises at least one pair of 3D glasses for the at least one user, and wherein the display screen is arranged to display the virtual environment with a depth perspective.
In another example multiple dimensions are added. Thus, not only the X and Y dimensions, but also the Z dimension, e.g. the Z position, being the distance from display to controller, is a variable in manipulating the object (projectile). Input determination based on multiple dimensions is in particular more realistic if a three-dimensional representation is used. Preferably, the optical motion controller device is arranged not only for X, Y, and Z position and movement, i.e. motion, but also for so-called six degrees of freedom motion detection, which refers to the freedom of movement of the body part in three-dimensional space. Specifically, the body parts detected are free to move forward/backward (Z), up/down (X), left/right (Y) (translation in three perpendicular axes) combined with rotation about three perpendicular axes, known as pitch, yaw, and roll. As such, the system, i.e. the computer, and the display are arranged for 3D. The trajectory of the projectile can then be displayed as a 3D representation of the projectile starting from the body part, in an extrapolated trajectory towards a certain target location on the large screen, all in 3D. The person skilled in the art will understand what 3D display methods are applicable, such as a stereoscopic projector, a large screen 3D display, etc. The system can also be arranged for both 2D and 3D representation, by use of a 3D capable display or projector that can switch between 2D and 3D, for example by turning off a single lamp in the projector.
In an example the optical motion controller device is arranged for wireless communication with the at least one computer via one or more of the group comprising Bluetooth, Wifi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN.
The computer is in communicative connection with the optical motion controller device, either by wire or wirelessly. The person skilled in the art will understand which wired or wireless communication standards are applicable, e.g. peer-to-peer, Ethernet, 802.11 wireless, 802.15 wireless PAN, Bluetooth, IEEE 802.3, Zigbee, RS422, RS485 and CAN.
In an example the interactive system comprises at least 2 optical motion controller devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
In an example each of the optical motion controller devices is arranged to generate the control signals on movement of the body part or body parts of at least 1 user, or at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
In an example each of the motion controller devices is arranged to generate the control signals on position and movement of at least one body part of at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
In an example the scene is a scene of a shooter game, and wherein the at least one object is a target object to be shot with a projectile.
In an example the at least one computer device is arranged to communicate with a plurality of optical motion controller devices and wherein each respective position thereof is determined by the at least one computer device.
In an example a speed of movement of the at least one body part is determined by the optical motion controller device and wherein a speed of movement of the at least one object on the display screen corresponds to the determined speed of movement of the at least one body part.
In a sixth aspect of the invention there is provided an optical motion controller device arranged to be used as a human interface device in an interactive system for interaction of multiple users with a scene according to any of the previous descriptions.
In a seventh aspect of the invention there is provided a computer device arranged to be used as a computer in an interactive system for interaction of multiple users with a scene according to any of the previous descriptions.
In an eighth aspect of the invention there is provided an interactive amusement ride comprising a scene and an interactive system according to any of the previous descriptions, and wherein the interactive amusement ride in particular comprises a track and at least one car or carriage for moving users over the track, wherein the scene, and in particular the car or carriage, comprises a plurality of the optical motion controller devices.
As indicated above, all examples of the present invention can be applied in a plurality of different virtual environments. The preferred example or embodiment is a virtual environment such as a virtual game environment, and in particular a shooter game. The projectile of such a shooter game is thus the projectile used in the shooter. That could be a bullet of a gun, an arrow of a bow, a javelin, a spear, etc. This projectile can also be a snowball from a snowball cannon in a snowball shooter game as available from the applicant of the present invention.
In a preferred example of the invention the scene comprises a virtual environment having a large screen display. Such is, however, only by way of example. The invention also applies to static displays, i.e. a scene in which the objects are physical objects, e.g. the targets of the shooter game, and the system is comprised of one or more computers, one or more optical motion controller devices and one or more physical target objects positioned somewhere in the scene. The trajectory of the projectile is thus not displayed on a screen, due to the absence of the screen, but the system is still arranged to determine a virtual trajectory as an extrapolation of the trajectory defined by the gesture. The system, i.e. the computer thereof, then determines the extrapolation of the trajectory defined by the gesture, the position of the optical motion controller device and the position of the physical target object, and then calculates whether or not the projectile hits or misses the target object. The user, i.e. player, is informed of a hit or miss by visual and/or audio and/or force feedback information on the optical motion controller device or an additional device such as a scoreboard, speaker, etc. The above-mentioned and other features and advantages of the invention are illustrated in the appended figures and detailed description, which are provided by way of illustration only and which are not limitative to the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows a setup of a system, according to a first aspect of the invention, with a computer, large screen display and multiple users each controlling a single hand held unit.
Figures 2a and 2b show illustrations, according to a first aspect of the invention, of an example of a hand held unit and a controller unit and its orientation principle as well as the trajectory of shooting the projectile towards the screen.
Figure 3 shows a hand held motion controller device, according to a first aspect of the invention, in the embodiment of two hand held units and one controller unit.
Figure 4 shows a setup of a system according to a first aspect of the invention with a computer, large screen display and multiple users and optical motion controller devices.
Figure 5 shows an illustration according to an example of the invention of a trajectory of a projectile defined by a gesture detected by the optical motion controller device and the corresponding trajectory of the projectile on the large screen display.
Figure 6 shows other illustrations according to an example of the invention of trajectories of a projectile defined by detected gestures.
DETAILED DESCRIPTION OF THE DRAWINGS
In the subsequent description of the drawings reference is made to the following:
100 Scene
110 Large screen display
111 Active area of large screen display
120 Computer device
121 Wired communication with large screen display
122 Wired communication with projector
131 First user
132 Second user
141 First target object in virtual environment
142 Second target object in virtual environment
143 Third object in virtual environment
151 Hand held unit for first user
151 a First hand held unit for first user
151 b Second hand held unit for first user
152 Hand held unit for second user
152a First hand held unit for second user
152b Second hand held unit for second user
153 controller unit
156 centre of hand held unit
160 Projector arrangement
161 First projector of stereoscopic projector arrangement
162 Second projector of stereoscopic projector arrangement
171 Camera system
181, 182 Stereo audio speaker system
191 Fan
210 origin of spherical coordinate system
211 First start point
212 Second end point
213 Trajectory
220 aimed position
230 azimuth
240 elevation/altitude
250 horizon
260 vector
261 Trajectory start point on large display screen
262 Trajectory end point on large display screen
263 Trajectory on large display screen
271 Virtual extrapolated trajectory
300 Another example of a trajectory on a large display screen
311 First start point
312 Second end point
313 Trajectory
321 Trajectory start point on large display screen
322 Trajectory end point on large display screen
323 Trajectory on large display screen
331 Virtual extrapolated trajectory
371 First start point
372 Second end point
373 Trajectory
381 First movement
382 Second movement
400 Scene
410 Carriage 1
411 User/player 1
412 User/player 2
421 Beacon 1
426 Radius beacon 2
427 Radius beacon 1
428 Radius beacon 3
420 Carriage 2
422 Beacon 2
423 Beacon x
430 Carriage 3
431 Target 1
432 Target 2
440 Determined position user
450 Path of carriages through scene
X horizontal axis of large display screen
Y vertical axis of large display screen
Z depth axis of large display screen

In Fig. 1 a scene 100 is illustrated in accordance with a first aspect of the invention. The scene is in this example an interactive stationary ride in which multiple users, which users 131, 132 can manipulate objects 141, 142, 143 in the virtual environment. The interactive virtual environment illustrated in this example is a shooter game and in particular a game wherein multiple users 131, 132 can each simultaneously operate their own hand held shooter, e.g. the hand held units 151, 152. The game illustrated here is a game wherein projectiles in the form of snowballs are fired from a snowball cannon 151, 152. Thus the first hand held unit is a first snowball cannon 151, and the second hand held unit is the second snowball cannon 152. The cannons are not attached to the car, carriage or other part of the scene but can be held in hand by the users 131, 132.
The snowball cannons 151, 152 are just illustrative. Many different shapes and forms of motion controller devices are applicable. The first user 131 operates the snowball cannon 151 by moving the cannon in any direction and aiming it at the target objects 141, 142 on the screen, all according to a spherical coordinate system as determined by the orientation determining unit of the motion controller device. Contrary to the conventional pointing devices which use a Cartesian coordinate system to define movement of the objects, the present invention is based on the insight of the use of a spherical coordinate system wherein the orientation of the hand held units 151, 152 can be determined by the orientation determining unit, e.g. through a gyroscope and/or accelerometer.
Fig. 1 further shows a fan 191 which can be activated to create a flow of air, and a computer device 120. In this example it is shown as a single computer device; however, the computer can also be comprised of a plurality of computers, for example a cluster of computers, whatever is sufficient to process all the data of the system. The computer device 120 is either attached to an active large screen display 110, for example a large computer screen or large size television, or, as in the example of Fig. 1, the large screen display is a passive display and the actual image of the virtual environment on the screen is generated by a projector arrangement 160. This can either be a single mono projector for displaying a two-dimensional, 2D, virtual environment, wherein only a single projector 161 and corresponding lens of the projector arrangement 160 is used, or this could be a three-dimensional, 3D, virtual environment, wherein two projectors 161 and 162 and corresponding lenses are used to generate a stereoscopic image with depth perspective. In case of a projector 160 that produces the images, the projector will be connected with the computer device 120 via wired communication 122. As the skilled person will understand, the present invention is not restricted to the form of communication, i.e. wired or wireless, and is applicable to both forms and implementations thereof.
The scene 100 of Fig. 1 further shows additional speakers 181, 182 which are arranged to add an audio component to the virtual environment. The scene 100 of Fig. 1 shows, by way of example, a two-speaker set-up. The invention, as the skilled person will understand, is not restricted to merely a two-speaker stereo set-up, and is applicable to all audio set-ups, i.e. mono, stereo, and surround sound set-ups.
Fig. 1 further shows a camera system 171 which can for example be used to record the users 131, 132 and to determine for example whether or not a motion controller device 151, 152 should be enabled or disabled from/in the game if a user 131, 132 is detected who is operating the pointing device. The camera system 171 can further be used to for example trigger the start of the game, upon detection of movement of any person within the scene, or to extract images of the users such that these images of the users can be used in the virtual environment, e.g. as avatars.
The example shown in Fig. 1 is an example wherein the scene 100 comprises a large screen display 110 connected to the computer device 120. On the display the objects 141, 142, 143 are shown as virtual objects within a virtual environment. The invention however also relates to a scene wherein the objects are physical objects within the scene 100. These objects can be target objects towards which the users should aim the projectiles. If the projectile hits the target object, the object interacts by for example an audio signal, a visual signal or the like. The interaction can also be in the form of an additional score board or other device from which the user can determine if the shot was a hit or miss.
In Fig. 2a an illustration is shown of the functioning of the spherical coordinate system on the motion controller device 151. The centre 156 of the hand held unit 151 corresponds to the origin 210 of the spherical coordinate system. The orientation determining units within the hand held devices 151, 152 of the users 131, 132 are arranged to determine, e.g. by a tri-axial gyroscope and/or accelerometer, the exact orientation of the devices 151, 152. Together with a plurality of beacons (not shown in this figure) distributed over the scene, the exact orientation and the position of the devices 151, 152 can be determined. Thus the vector 260 corresponds to the orientation of the hand held unit 151. The exact orientation in a three-dimensional coordinate system is further determined by an azimuth component 230 and an elevation or altitude component 240. The azimuth component defines the angle of the vector from the origin around a horizon, i.e. a horizontal displacement. The azimuth is for example denoted as the angle alpha, α. Azimuth can also be more generally defined as a horizontal angle measured clockwise from any fixed reference plane or easily established base direction line. In an alternative, the components of the spherical coordinate system can also be comprised of a radial distance, a polar angle and an azimuthal angle. The radial distance is then the length of the vector, the polar angle the altitude and the azimuthal angle the azimuth. The altitude component, or elevation component or polar angle, defines the angle of the vector from the origin and the plane perpendicular to the horizon. For example, the angle between the vector and a reference horizon such as a horizontal plane through the centre of the large screen display. The altitude is for example denoted as the angle beta, β.
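The azimuth and altitude angles can be converted into the Cartesian components of the aiming vector 260 in the usual spherical-to-Cartesian manner. The sketch below assumes one particular, hypothetical axis convention (reference direction along the z-axis, vertical along the y-axis) and serves only to illustrate the conversion.

```python
import math

def aim_vector(azimuth_deg, altitude_deg):
    """Unit vector for the aiming direction of the hand held unit, from the azimuth
    (horizontal angle alpha around the horizon) and altitude/elevation (angle beta
    above the horizon) reported by the orientation determining unit."""
    a, b = math.radians(azimuth_deg), math.radians(altitude_deg)
    return (math.cos(b) * math.sin(a),   # horizontal displacement
            math.sin(b),                 # vertical displacement
            math.cos(b) * math.cos(a))   # forward, towards the reference direction

# Aiming 30 degrees to the side of the reference direction and 10 degrees upwards.
print(aim_vector(30, 10))
```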
In Fig. 2b an example is shown of a motion controller device 151 also in the form of a shooter that can be held in hand. The shooter 151 shoots a (virtual) projectile 463 towards a target object 431, in this particular example a target object in the form of a virtual/digital object shown on the large screen 111. The projectile 463 follows a trajectory 462 on the screen towards the target object 431 which corresponds with the trajectory 461 outside the screen 111; in particular, the trajectory 462 is an extrapolation of the trajectory 461 from the shooter 151, which trajectory 461 corresponds with the longitudinal axis of the shooter, hence the direction in which it is aimed. The trajectory is determined by determining both the position of the shooter 151 in respect of the screen, or in an absolute sense within the scene (real world coordinates), and the determined orientation of the shooter. On the basis of these two variables the system is able to determine if the shot is a miss or a hit.
In Fig. 3 an example of a suitable set-up of a professional interactive ride is disclosed, shown from a top view perspective. Fig. 3 shows a scene 400 in which several objects or targets 431, 432 are located throughout the scene 400. These targets can be physical objects or virtual objects shown on a large screen like a display or by projection. The users, in this example six in total, take place in a carriage 410, 420, 430, and each have their own hand held device 411, 412. The carriage (automatically) moves over a track in a predetermined path 450 through the scene 400 and, as it moves, passes several (interactive) objects 431, 432. It will be appreciated that, for clarity reasons, not all targets/objects, beacons, users, etc. are indicated by a reference figure. However, although only some are, it is to be understood that several of these targets, beacons, users, etc. may exist in the scene.
As the carriages move along the track towards the targets, the users can use the hand held device to shoot at the targets. The orientation of the hand held devices is determined by the orientation determining units within the devices as described above. For a realistic shooting experience, and optimal interaction of the users with the scene, orientation is not enough. In order to determine if each of the users aims and fires correctly, the positions of the users have to be determined as well, at high speed in order not to be biased by the movement of the users themselves. To this end the scene 400 is provided with a plurality of beacons 421, 422, 423, etc. These beacons can emit a radio frequency signal but preferably are arranged to receive a radio frequency signal, preferably in the UWB range, that is transmitted by the hand held devices 411, 412. Through a time of flight, the distance between the beacon 421, 422, 423 and the hand held devices 411, 412 can be calculated. The time of flight, converted to a distance, determines the radius from the beacon. For example beacon 1, 421 can determine a certain time of flight between the moment the UWB RF signal was transmitted by the hand held device and the moment it arrives at beacon 1, 421. This is translated into the radius 427. In the same way the radius 426 from beacon 2, 422 is determined, as well as the radius of the third beacon, and preferably even more beacons. The position at which the circles 426, 427, 428 cross each other corresponds to the position 440 determined as the position of the hand held device, and thus the user. This data is processed and transmitted to the server for further processing by the system and to manipulate aspects of the scene 400, for example to indicate a hit of the target by one of the users if both the position and direction (orientation) of the hand held device correspond with the target.
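The intersection of the circles can be computed with standard trilateration. The sketch below assumes three non-collinear beacons in a common horizontal plane and one-way times of flight that are already available; in practice a two-way ranging or time-difference-of-arrival scheme can be used so that the clocks of the hand held device and the beacons need not be synchronised. All names and values are illustrative.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def trilaterate(beacons, flight_times):
    """Position (x, y) of a hand held device from the time of flight of its UWB signal
    to three beacons at known positions. Each time of flight is converted to a radius;
    the device sits where the three circles intersect.
    beacons      -- [(x1, y1), (x2, y2), (x3, y3)]
    flight_times -- [t1, t2, t3] in seconds
    """
    r1, r2, r3 = (SPEED_OF_LIGHT * t for t in flight_times)
    (x1, y1), (x2, y2), (x3, y3) = beacons
    # Subtracting the circle equations pairwise gives two linear equations in (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a1 * b2 - a2 * b1
    if det == 0:
        raise ValueError("beacons must not be collinear")
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Three beacons around the scene; flight times for a device located at roughly (2, 3).
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
times = [math.dist((2, 3), b) / SPEED_OF_LIGHT for b in beacons]
print(trilaterate(beacons, times))  # -> approximately (2.0, 3.0)
```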
In Fig. 4 a scene 100 is illustrated in accordance with a first aspect of the invention. The scene 100 is in this example an interactive stationary ride for multiple users, in which the users 131, 132 can interact with and manipulate objects 141, 142, 143 in the virtual environment. The interactive virtual environment illustrated in this example is a shooter game and in particular a game wherein multiple users 131, 132 can each manipulate the objects within the virtual environment on the basis of detected gestures, i.e. position, orientation, motion, of their body parts 151, 152. The game illustrated here is a game wherein snowballs are fired by motion of the body parts 151, 152, for example as if they would start from the body or the body part in particular.
The first user 131 manipulates the object 141 or objects 141, 142, 143 on the screen by moving his or her arm 151. The movement, i.e. motion, defines a trajectory that corresponds with either the trajectory of the object moving on the screen, or with a projectile trajectory towards the object. Thus, in one example the gesture is detected and used to manipulate the object, i.e. as a target object, and in the other example the gesture is detected and used to manipulate a projectile towards the object, i.e. the target object. In the latter, there are two objects to be recognised, one being the target object, the other being the projectile, e.g. a snowball.
Thus, in case the gesture trajectory corresponds to the projectile trajectory, the trajectory of the projectile, e.g. the snowball, is shown on the large screen display 110, and in particular on the active part of the display 111, towards a first target object 141 of the plurality of objects 141, 142, 143 of the virtual environment. Moreover, the snowball trajectory is not only defined by relative movement, i.e. the movement of the body part in relation to the motion sensor 153, but also by the absolute movement, i.e. by the sum of the movement of the body part in relation to the motion sensor 153 and the position of the motion sensor 153 in relation to the large screen display 110 or particular objects, hence its absolute position within the scene 100.
Fig. 4 further shows a fan 191 which can be activated to create a flow of air, and a computer device 120. In this example it is shown as a single computer device; however, the computer can also be comprised of a plurality of computers, for example a cluster of computers, whatever is sufficient to process all the data of the system. The computer device 120 is either attached to an active large screen display 110, for example a large computer screen or large size television, or, as in the example of Fig. 4, the large screen display is a passive display and the actual image of the virtual environment on the screen is generated by a projector arrangement 160. This can either be a single mono projector for displaying a two-dimensional, 2D, virtual environment, wherein only a single projector 161 and corresponding lens of the projector arrangement 160 is used, or this could be a three-dimensional, 3D, virtual environment, wherein two projectors 161 and 162 and corresponding lenses are used to generate a stereoscopic image with depth perspective. In case of a projector 160 that produces the images, the projector will be connected with the computer device 120 via wired communication 122. As the skilled person will understand, the present invention is not restricted to the form of communication, i.e. wired or wireless, and is applicable to both forms and implementations thereof.
The scene 100 of Fig. 4 further shows additional speakers 181, 182 which are arranged to add an audio component to the virtual environment. The scene 100 of Fig. 4 shows, by way of example, a two-speaker set-up. The invention, as the skilled person will understand, is not restricted to merely a two-speaker stereo set-up, and is applicable to all audio set-ups, i.e. mono, stereo, and surround sound set-ups.
Fig. 4 further shows a camera system 171 which can for example be used to record the users 131, 132 and to determine for example whether or not a pointing device 151, 152 should be enabled or disabled from/in the game if a user 131, 132 is detected who is operating the pointing device. The camera system 171 can further be used to for example trigger the start of the game, upon detection of movement of any person within the scene or to recognise users, or to extract images of the users such that these images of the users can be used in the virtual environment, e.g. as avatars.
Known game controllers are only able to control the aspects of a game or virtual environment by relative movement. Thus in the snowball shooter game according to Fig. 4, wherein a projectile is fired in the form of a snowball, the trajectory of the snowball displayed on the screen always starts from the same position. That position is for example a position at the bottom centre of the display, where an end part of a gun, or applicable here, a snowball cannon, is permanently displayed. The user aims at a target somewhere on the screen and when the fire button or trigger is pushed, the snowball starts its trajectory in a straight line from the bottom centre of the display, at the end of the cannon, towards the aimed position on the display, hoping to hit the target, e.g. target object 141, if the aimed position and target object position 141 correspond.
With a motion detection system 151, 152, 153 according to the invention, the trajectory is different since the starting position of the snowball on the display is not static but corresponds to the relative position between the users 131, 132, or at least their body parts 151, 152, and the display 110. If the body part 151 is located at a certain distance away from the display, illustrated by a position on the Z depth axis of the large display screen 110 as illustrated in Fig. 4, but positioned at the centre of the display, i.e. in the centre of the width of the display, thus at the origin of the X horizontal axis of the large screen display 110 as illustrated in Fig. 4, the trajectory of the projectile starts at the middle of the display, from the bottom towards the aimed position on the display, e.g. target object 141. But if the body part 151 is located at the right side of the display 110, thus at a position away from the origin on the X horizontal axis, the trajectory of the snowball starts from the bottom right corner of the screen 110. In this way, the motion detection system 153 is able to manipulate an object in a virtual environment, e.g. a trajectory of a snowball in a snowball shooter game or the target object, not only on the basis of the aimed direction but also on the basis of the relative position of the body part 151 in relation to the display 110.
The motion sensor device 153 is in particular arranged to define or calculate a trajectory of the projectile by a first, starting position of the body part, a second, end position of the body part and a trajectory defined from the first to the second position.
The example shown in Fig. 4 is an example wherein the scene 100 comprises a large screen display 110 connected to the computer device 120. On the display the objects 141, 142, 143 are shown as virtual objects within a virtual environment. The invention however also relates to a scene wherein the objects are physical objects within the scene 100. These objects can be target objects towards which the users should aim the projectiles. If the projectile hits the target object, the object interacts by for example an audio signal, a visual signal or the like. The interaction can also be in the form of an additional score board or other device from which the user can determine if the shot was a hit or miss.
In all examples of the invention, the computer device 120 to which the motion sensor device 153 is connected knows the position of each body part 151, 152 and knows the position of the display 110, for example by a predefined device location position value, stored locally or remotely. In an alternative, one or both of the positions of the body parts 151, 152 and the display 110 are determined at certain (calibration) moments in time, for example when a carriage of a ride (not shown) enters the scene where the users 131, 132 can see the display 110 and can start the game. As a second alternative, the calibration methods of determining one or both of the body part 151, 152 and display 110 positions can be performed on a real-time or near real-time basis.
Thus, in an example wherein the motion sensor device 153 is arranged to define or calculate a trajectory of the projectile, that trajectory is defined by the first position of the body part 151, the second position thereof and the trajectory from the first to the second. The computer device then calculates a further virtual trajectory based on the trajectory determined thereby. The further virtual trajectory is thus an extrapolation of the determined trajectory. If the position of the display is somewhere within that further virtual trajectory, the projectile is shown on the display. If, however, the extrapolated trajectory does not cross the display, the projectile is not shown on the display. In the first option, wherein the display is in the extrapolated trajectory, the computer device can determine on the basis of the position of the display and the defined virtual trajectory, i.e. the extrapolated trajectory, how the projectile should be displayed, thus what the starting position of the projectile on the display would have to be, and what trajectory the projectile would then follow from there on. As such, if the trajectory on the display, i.e. the extrapolation of the trajectory defined by the gesture via the virtual trajectory between the body part and the display, crosses the target on the display, the computer can count the "shot" as a "hit".
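A sketch of this determination is given below. It assumes the display lies in a hypothetical plane z = 0 with a known active area, that the gesture is reported as two 3D points, and that the extrapolation is a straight line; the dictionary keys and the numbers are invented for illustration.

```python
def projectile_entry_point(first_pos, second_pos, screen):
    """Where the extrapolated gesture trajectory enters the display, if it does.

    first_pos, second_pos -- (x, y, z) start and end of the detected body part gesture
    screen -- dict with the screen plane at z = screen['z'] and its active area given
              by 'x_min', 'x_max', 'y_min', 'y_max' (hypothetical keys)
    Returns the (x, y) display coordinates of the starting point of the projectile on
    the screen, or None when the extrapolated trajectory does not cross the display.
    """
    (x1, y1, z1), (x2, y2, z2) = first_pos, second_pos
    dz = z2 - z1
    if dz == 0:
        return None                     # gesture parallel to the screen plane
    t = (screen["z"] - z1) / dz         # how far past the end point the screen lies
    if t <= 1.0:
        return None                     # screen is behind, or within, the gesture itself
    x, y = x1 + t * (x2 - x1), y1 + t * (y2 - y1)
    inside = screen["x_min"] <= x <= screen["x_max"] and screen["y_min"] <= y <= screen["y_max"]
    return (x, y) if inside else None

screen = {"z": 0.0, "x_min": -2.0, "x_max": 2.0, "y_min": 0.0, "y_max": 2.5}
print(projectile_entry_point((1.0, 1.0, 3.0), (0.9, 1.1, 2.7), screen))
```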
In Fig. 5 and 6 the different trajectories are shown. In Fig. 5 the large screen display 110, with active area 111, shows (besides other elements of the game) two objects 141, 142 that can be manipulated. These are the targets that are aimed at by the two users 131, 132. The first user 131 with first body part 151 is on the left side of the screen 110, thus, as shown in Fig. 4, on the left side of the X-axis. That first user 131 generates a trajectory 211-212-213 with his hand, i.e. the body part 151. A first start point 211, a second end point 212 and the trajectory 213 from first to second define the trajectory. These variables are communicated to the computer device, which adds the location position variable/value thereto, and on the basis thereof, as well as on the basis of the position of the display 110, the virtual trajectory 271 is calculated. That virtual trajectory 271 is an extrapolation of the trajectory 211-212-213. The virtual trajectory, i.e. the extrapolated trajectory 213, continues on the screen 110 at start point 261 and the trajectory 263 is further continued, as an extrapolation of the trajectory 211-212-213 and the virtual trajectory 271, towards the end point 262 on the screen. If the trajectory 263 displayed on the screen 110 crosses a target object 141, 142, a hit is counted; otherwise, the shot is counted as a miss.
In Fig. 6 yet another trajectory is shown. In this figure the trajectory 311-312-313 of the body part, i.e. the movement of the hand 151, is not a straight line but is curved 381. The extrapolation 331 of that trajectory 311-312-313 is thus also curved according to a corresponding radius. This results, in the example shown here, in a miss of the target object 141, since the extrapolation 331 of the trajectory as displayed on the screen 110 starts at point 321 towards end point 322 via the displayed trajectory 323. This trajectory 323 does not cross a target object 141 and thus the shot is registered by the computer as a missed shot.
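One way to approximate such a curved extrapolation is to measure the turn between the last two segments of the gesture and to keep repeating it, which traces an arc with a corresponding radius. The sketch below works on 2D points (for example the gesture projected onto the screen plane) and is only an illustration of the idea.

```python
import math

def extrapolate_curved(p1, p2, p3, steps=5):
    """Continue a curved gesture beyond its last point.

    p1, p2, p3 -- consecutive 2D points of the detected gesture. The turn between the
    two segments is measured and repeated, so the extrapolation follows an arc with a
    matching radius. Returns a list of extrapolated points after p3.
    """
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    seg_len = math.dist(p2, p3)
    turn = angle(p2, p3) - angle(p1, p2)        # curvature of the gesture per segment
    heading = angle(p2, p3)
    pts, cur = [], p3
    for _ in range(steps):
        heading += turn                         # keep turning by the same amount
        cur = (cur[0] + seg_len * math.cos(heading),
               cur[1] + seg_len * math.sin(heading))
        pts.append(cur)
    return pts

# A gesture that curves to the left keeps curving left when extrapolated.
print(extrapolate_curved((0.0, 0.0), (1.0, 0.0), (1.9, 0.4)))
```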
The figure also shows an example of relative movement wherein the position of the hand 151 in relation to the display 110 is not relevant and is not determined. In this situation the movement 382 of the hand from starting position 371 towards end position 372 defines a trajectory 373. That trajectory defines the control signal and thus the trajectory in which the object is moved on the screen, i.e. the trajectory 323 on the screen from start position 321 towards end position 322.
The skilled person will appreciate that the invention has been described in the foregoing with reference to the described examples. A skilled person may provide modifications and additions to the examples disclosed, which modifications and additions are all comprised by the scope of the appended claims.

Claims

1. A professional interactive system for interaction of a plurality of users with at least one object within a scene, said system comprising:
- at least one human interface device per user of said plurality of users, arranged for generating control signals corresponding to a user input from said at least one user;
- at least one computer device, arranged for receiving and processing said control signals from said at least one human interface devices, and for controlling manipulation of said at least one object within said scene, characterized in that, said at least one human interface devices are motion controller devices, each comprising at least one hand held unit to be held by said at least one user and comprising an orientation determining unit for determining the orientation of the hand held unit, as well as a wireless transmitter unit for transmitting a radio frequency signal throughout said scene, wherein said radio frequency signal comprises identification information for identifying said human interface device from said plurality of human interface devices of said plurality of users, wherein said professional interactive system further comprises a plurality of beacons distributed over said scene, arranged for receiving said radio frequency signal from said wireless transmitter units of said motion controller devices and identifying said motion controller devices as well as determining said positions of said motion controller devices within said scene for interaction with said at least one object within said scene.
2. The professional interactive system according to claim 1 , wherein said at least one object within said scene is manipulated if said at least one object is located in a trajectory corresponding to said determined orientation and a determined position of said motion controller device in respect of said at least one object.
3. The professional interactive system according to claim 1 or 2, wherein said at least one object within said scene is a physical object.
4. The professional interactive system according to claim 1 or 2, wherein said at least one object within said scene is a virtual object within a virtual environment displayed on at least one display screen.
5. The professional interactive system according to any of the previous claims, wherein said manipulation comprises providing any one or more of the group consisting of: a visual signal, an audio signal, change of position of said at least one object, change of appearance of said at least one object.
6. The professional interactive system according to any of the previous claims 4 or 5, wherein said at least one computer device is arranged to display a projectile on said at least one display screen in a projectile trajectory defined by said determined orientation and position of said at least one hand held device.
7. The professional interactive system according to claim 6, wherein said at least one computer device is arranged to display said projectile on said at least one display screen in a projectile trajectory defined by an extrapolation of a trajectory defined by said determined orientation and position and a determined movement of said at least one hand held device.
8. The professional interactive system according to any of the previous claims, wherein said control signal comprises a trigger signal for triggering manipulation of said at least one object, in particular generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
9. The professional interactive system according to any of the previous claims, wherein said motion controller device comprises at least two hand held devices to be held by two hands of said at least one user and a single controller device.
10. The professional interactive system according to any of the previous claims, wherein said motion controller device is further arranged for detecting at least one of pitch, yaw and roll movement of said at least one hand held unit in respect of said motion controller device and for generating said control signals based thereon for movement of said at least one object within said virtual environment in a trajectory corresponding to said at least one of pitch, yaw and roll movement.
11. The professional interactive system according to any of the previous claims, wherein said at least one computer device is arranged to determine said positions of said motion controller devices by receiving said radio frequency signals received by said beacons and determining said position by a difference in said signals, wherein said difference in particular is any one or more of a time of flight, two way ranging time of flight, double sided two way ranging, symmetrical double sided two way ranging, time of arrival, time difference of arrival, signal power.
12. The professional interactive system according to any of the previous claims, wherein said radio frequency signal is an ultra wideband radio frequency signal, in particular an ultra wideband radio frequency channel according to the IEEE 802.15.4 standard, more in particular IEEE 802.15.4-2011 or IEEE 802.15.4-2015.
13. The professional interactive system according to any of the previous claims 11, wherein said orientation determining unit of said motion controller devices comprises at least one of an accelerometer, a gyroscope sensor and a geomagnetic sensor, and wherein said gyroscope and geomagnetic sensors in particular are tri-axial sensors.
14. The professional interactive system according to any of the previous claims, wherein at least said wireless transmitter unit of said motion controller device is arranged to be placed on the body of said at least one user.
15. The professional interactive system according to any of the previous claims 4-14, wherein said interactive system is arranged for stereoscopy, and in particular, wherein said system comprises multiple 3D glasses for said at least one user, and wherein said display screen is arranged to display said virtual environment with a depth perspective.
16. The professional interactive system according to any of the previous claims, wherein said motion controller device is arranged for wireless communication with said at least one computer via one or more of the group comprising Bluetooth, WiFi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN, and wherein said orientation determining unit is arranged to communicate said determined orientation of the hand held unit over said wireless communication connection between said wireless communication unit of said motion controller device and said at least one computer.
17. The professional interactive system according to any of the previous claims, wherein said interactive system comprises at least 2 motion controller devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
18. The professional interactive system according to any of the previous claims, wherein said scene is a scene of a shooter game, and wherein said at least one object is a target object to be shot with a projectile.
19. A motion controller device configured for use as a human interface device in a professional interactive system according to any of the previous claims 1-18.
20. A computer device configured for use as a computer device in a professional interactive system according to any of the previous claims 1-18.
21. An interactive amusement ride comprising a scene and a professional interactive system according to any of the previous claims 1-18, and wherein said interactive amusement ride in particular comprises a track and at least one car or carriage, and preferably a train of multiple cars or carriages, for moving users over said track, wherein said scene, and in particular said car or carriage, comprises a plurality of said motion controller devices.
22. A professional interactive system for interaction of at least one user with at least one object within a scene, said system comprising:
- at least one human interface device arranged for generating control signals corresponding to a user input from said at least one user;
- at least one computer device arranged for receiving and processing said control signals from said at least one human interface device, and for controlling manipulation of said at least one object within said scene, characterized in that, said at least one human interface device is a motion sensor controller device, and in particular an optical motion sensor controller device, arranged for determining motion and generating said control signals for controlling said manipulation of said at least one object within said scene in correspondence to said determined motion, and wherein said manipulation of said at least one object within said scene is further controlled by a determined position of said motion controller device in respect of said at least one object.
23. The professional interactive system according to claim 22, wherein said manipulation of said at least one object within said scene is controlled if said at least one object is located in a trajectory defined as an extrapolation of a trajectory defined by said determined position and movement of said at least one body part of said user.
24. The professional interactive system according to claim 22 or 23, wherein said at least one object within said scene is a physical object.
25. The professional interactive system according to claim 22 or 23, wherein said at least one object within said scene is a virtual object within a virtual environment displayed on at least one display screen.
26. The professional interactive system according to any of the previous claims 22-25, wherein said manipulation comprises providing any one or more of the group consisting of: a visual signal, an audio signal, change of position of said at least one object, change of appearance of said at least one object.
27. The professional interactive system according to any of the previous claims 25 or 26, wherein said at least one computer device is arranged to display a projectile on said at least one display screen in a projectile trajectory defined by said determined position and movement of said at least one body part of said user.
28. The professional interactive system according to any of the previous claims 22-27, wherein said control signal comprises a trigger signal for triggering manipulation of said at least one object, in particular generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
29. The professional interactive system according to any of the previous claims 22-28, wherein said at least one computer device is arranged to determine said position of said at least one body part by determining a position of said optical motion controller device in respect of said object, by receiving an identification value comprised in said control signals and determining a corresponding pre-defined position stored within a memory of said at least one computer device.
30. The professional interactive system according to any of the previous claims 22-29, wherein said at least one computer device is arranged to determine said position of said at least one body part by determining a position of said optical motion controller device in respect of said object and a position of said at least one body part of said user in respect of said optical motion controller device.
31 . The professional interactive system according to any of the previous claims 22-30, wherein said at least one computer device is arranged to determine said position of said at least one body part in respect of said object by determining a position of a visual and/or RF marker provided on said optical motion controller device.
32. The professional interactive system according to any of the previous claims 22-31 , wherein said at least one computer device is arranged to determine said position of said at least one body part in respect of said object by determining a position of a visual and/or RF marker provided on said user.
33. The professional interactive system according to any of the previous claims 22-32, wherein said at least one computer device is arranged to determine said position of said at least one body part in respect of said object by determining a position of a visual and/or RF marker provided on said body part of said user.
34. The professional interactive system according to any of the previous claims 22-33, wherein said interactive system is arranged for stereoscopy, and in particular, wherein said system comprises at least one pair of 3D glasses for said at least one user, and wherein said display screen is arranged to display said virtual environment with a depth perspective.
35. The professional interactive system according to any of the previous claims 22-34, wherein said optical motion controller device is arranged for wireless communication with said at least one computer via one or more of the group comprising Bluetooth, WiFi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN.
36. The professional interactive system according to any of the previous claims 22-35, wherein said interactive system comprises at least 2 optical motion controller devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
37. The professional interactive system according to any of the previous claims 22-36, wherein each of said motion controller devices is arranged to generate said control signals based on position and movement of at least one body part of at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
38. The professional interactive system according to any of the previous claims 22-37, wherein said scene is a scene of a shooter game, and wherein said at least one object is a target object to be shot with a projectile.
39. The professional interactive system according to any of the previous claims 22-38, wherein said at least one computer device is arranged to communicate with a plurality of optical motion controller devices and wherein each respective position thereof is determined by said at least one computer device.
40. The professional interactive system according to any of the previous claims 22-39, wherein a speed of movement of said at least one body part is determined by said optical motion controller device and wherein a speed of movement of said at least one object on said display screen corresponds to said determined speed of movement of said at least one body part.
41. An optical motion controller device arranged to be used as a human interface device in a professional interactive system for interaction of multiple users with a scene according to any of the previous claims 22-40.
42. A computer device arranged to be used as a computer in a professional interactive system for interaction of multiple users with a scene according to any of the previous claims 22-40.
43. An interactive amusement ride comprising a scene and a professional interactive system according to any of the previous claims 22-40, and wherein said interactive amusement ride in particular comprises a track and at least one car or carriage for moving users over said track, wherein said scene, and in particular said car or carriage, comprises a plurality of said optical motion controller devices.

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
NL2014976 2015-06-17
NL2014974 2015-06-17
NL2014974A NL2014974B1 (en) 2015-06-17 2015-06-17 Hand held controller.
NL2014976A NL2014976B1 (en) 2015-06-17 2015-06-17 Gesture game controlling.

Publications (2)

Publication Number Publication Date
WO2016204617A2 (en)
WO2016204617A3 WO2016204617A3 (en) 2017-02-09

Family

ID=56682229

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NL2016/050434 Ceased WO2016204617A2 (en) 2015-06-17 2016-06-17 Game controller

Country Status (1)

Country Link
WO (1) WO2016204617A2 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4813682A (en) 1985-08-09 1989-03-21 Nintendo Co., Ltd. Video target control and sensing circuit for photosensitive gun

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11148042B2 (en) * 2013-12-23 2021-10-19 Ebay Inc. Geo location questing
US11030806B2 (en) * 2014-11-15 2021-06-08 Vr Exit Llc Combined virtual and physical environment

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111790140A (en) * 2020-07-07 2020-10-20 杭州脸脸会网络技术有限公司 Interactive method, system, computer device, and readable storage medium
CN111939561A (en) * 2020-08-31 2020-11-17 聚好看科技股份有限公司 Display device and interaction method
CN111939561B (en) * 2020-08-31 2023-09-19 聚好看科技股份有限公司 Display devices and interaction methods
WO2022212665A1 (en) * 2021-04-01 2022-10-06 Universal City Studios Llc Interactive environment with portable devices
WO2023192423A1 (en) * 2022-03-30 2023-10-05 Universal City Studios Llc Systems and methods for producing responses to interactions within an interactive environment
US11995249B2 (en) 2022-03-30 2024-05-28 Universal City Studios Llc Systems and methods for producing responses to interactions within an interactive environment
US12449915B2 (en) 2022-03-30 2025-10-21 Universal City Studios Llc Systems and methods for producing responses to interactions within an interactive environment
WO2024064295A1 (en) * 2022-09-22 2024-03-28 Universal City Studios Llc Systems and methods for positional tracking of interactive devices
WO2024254486A3 (en) * 2023-06-09 2025-01-09 Universal City Studios Llc System and method for operational management of wireless device

Also Published As

Publication number Publication date
WO2016204617A3 (en) 2017-02-09

Similar Documents

Publication Publication Date Title
WO2016204617A2 (en) Game controller
US10584940B2 (en) System and method for marksmanship training
US9132342B2 (en) Dynamic environment and location based augmented reality (AR) systems
US9244525B2 (en) System and method for providing user interaction with projected three-dimensional environments
CN107469343B (en) Virtual reality interaction method, device and system
KR101366444B1 (en) Real time interoperable virtual shooting system
CN110585712A (en) Method, device, terminal and medium for throwing virtual explosives in virtual environment
CN102735100A (en) Individual light weapon shooting training method and system by using augmented reality technology
WO2012020409A1 (en) Mobile gaming platform system and method
US10222176B2 (en) Simulated gun shooting and target position sensing apparatus and method
CN108043032A (en) Shooting game system based on AR
CN106061571A (en) Interactive virtual reality systems and methods
CN207012537U (en) Simulation gun structure and simulation device
CN113827949A (en) Screen shooting range and screen shooting game method using artificial intelligence technology
EP3405269A2 (en) Improved laser game system
CN108465224A (en) Table tennis track analysis system
NL2014976B1 (en) Gesture game controlling.
CN110631411A (en) Virtual shooting training control method and system
KR101938458B1 (en) shooting method with rotating mapped images
NL2014974B1 (en) Hand held controller.
KR101706170B1 (en) Survival smart archery game system
WO2016167664A2 (en) Game controller
US20140309036A1 (en) Set of body motions and methods to control game play using a laser gun in Computer Games.
CN210466815U (en) Large-screen projection type portable ground-to-air missile training device
NL2014665B1 (en) Shooter game controller.

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 16750505

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 16750505

Country of ref document: EP

Kind code of ref document: A2