WO2018117985A1 - Projector system for spherical platform - Google Patents
- Publication number
- WO2018117985A1 (PCT/TH2016/000102)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- images
- projector system
- sphere
- computer
- platform according
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/3147—Multi-projection systems
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/04—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
- G09B9/05—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles the view from a vehicle being simulated
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/02—Simulators for teaching or training purposes for teaching control of vehicles or other craft
- G09B9/08—Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
- G09B9/30—Simulation of view from aircraft
- G09B9/32—Simulation of view from aircraft by projected image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- the invention relates to the field of computer systems engineering, in relation to a projector system for a spherical platform of a virtual reality (VR), augmented reality (AR), or mixed reality (MR) gameplay or environment for the purpose of gaming, exercising, training, or other leisure and business activities, wherein the projector system is useful for projecting the images and related sensory cues from the environment, which is seen or experienced by a player through controllers, interactive objects, and/or technologies, onto the sphere seamlessly throughout the entire 360° surface of the spherical platform.
- VR virtual reality
- AR augmented reality
- MR mixed reality
- a spherical platform is a platform that enables simulation of, duplication of, or interaction with a real-world experience that can be presented in various forms, and which normally includes a set of complementary technologies, such as a simulator headset or goggles, an input controller, and/or a motion simulator, so that the first-person player feels further engaged with his/her virtual environment through visual, auditory, and tactile or physical interactions or stimulations from the computing and various components inside the platform, which receive inputs from the player's input controller and respond to the player in real time.
- projection technology is used so that the images seen by the player in the virtual environment, possibly through the headset, are partially projected onto the spherical portion of the sphere for a third-person viewpoint from outside the sphere.
- a conventional motion simulator which utilizes a motion simulation system can simultaneously duplicate the movements of a car, motorcycle, or other vehicle so that the player can feel the fully functioning, real-world experience of racing, wherein the images that the first-person player sees can be partially projected onto the spherical portion of the sphere.
- such a conventional motion simulator cannot duplicate full movements in all directions and so cannot be effectively used together with a VR technology in which the player is immersed in a freely open virtual environment.
- the conventional motion simulator may also utilize one part of the sphere, wherein that part is often opaque and cannot easily be viewed by an external party located outside the sphere.
- the external party may therefore not feel as engaged as the first-person player; unlike the spherical platform according to this invention, which is designed for projection of images onto the translucent, transparent, or semi-transparent spherical surface so that the external party may feel more involved with the player and will be inclined to queue up for the platform for gaming or other activities.
- other conventional VR simulators are available for use with a VR headset or goggles and can simulate the player's head movements in many directions, but still cannot rotate 360° like the spherical platform. Further, such a VR simulator does not have a third-person display or projection portion like the spherical platform.
- the spherical platform has become a popular platform for VR, AR, and MR technology integrations, and the inventors of this invention have provided an important improvement over the conventional projector system, which can merely project images seen by the first-person player on one part or portion of the sphere, by integrating two computing systems with embedded image correction and blending technologies into a set of projectors for seamless projection of the images seen by the player onto the sphere, either in part or over the whole 360° surface of the sphere, for improved attractiveness and engagement by external parties.
- a projector system for a spherical platform uniquely presents a process for projecting and displaying an image or series of images seen by a first-person player inside the VR, AR, or MR environment onto the outer surface of the sphere, in part or in whole (360°), in order for the audience outside to see the same images that the player is seeing, or to experience the same thing, in real time.
- the projector system mainly comprises: 1) a first computer for receiving inputs from a controller, a position-tracking sensor, image information and head-tracking information from the headset, and other input devices, and further processing the inputs received for feeding back to or interacting with the player via a motion simulator, the controller, or the display panel in the headset; and 2) a second computer for receiving the image information from the first computer, wherein the information shall preferably be corrected, integrated, and optimized by the set of cameras and the image correction and optimization technologies present in the second computer before and during transmission of said images for projection onto the sphere by a set of projectors.
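As an illustration only, the two-computer division of labour described above can be sketched schematically; every type and field name below is hypothetical, since the patent does not specify data formats:

```python
from dataclasses import dataclass

@dataclass
class PlayerInput:
    controller: str     # e.g. a gamepad or one-handed controller
    head_pose: tuple    # head-tracking information from the headset
    position: tuple     # position-tracking sensor reading

def first_computer(inp: PlayerInput) -> dict:
    """Process player inputs into the per-frame image information that is
    fed back to the headset display and forwarded for projection.
    A schematic sketch only - not the patent's actual implementation."""
    return {"view_from": inp.head_pose, "player_at": inp.position}

def second_computer(frame: dict, num_projectors: int = 4) -> list:
    """Fan the received frame out into one feed per projector, standing in
    for the correction/integration stage described in the patent."""
    return [{"projector": i, "frame": frame} for i in range(num_projectors)]
```

A single frame flows first-computer-to-second-computer and emerges as one feed per projector, mirroring the pipeline of the claimed system.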
- the image correction, integration, and optimization process for projection of images onto the whole sphere requires the steps of: receiving the images from the virtual environment by the second computer; using the cameras with a fisheye distortion effect to correct each of the image distortions; integrating the images from the cameras and transmitting them to the projectors; and using the geometric correction and edge-blending technology to correct or fine-tune the projected images on the round surface of the sphere and to adjust the edges of the images accordingly for optimal display or projection of the player's environment over the whole 360° of the sphere.
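The fisheye stage of the steps above can be illustrated with a small sketch. The lens model is an assumption (the patent names a "fisheye distortion effect" without specifying one); an equidistant model, where image radius grows linearly with ray angle, is used here:

```python
import math

def fisheye_remap(x, y, fov_deg=120.0):
    """Map a normalized rectilinear coordinate (x, y in [-1, 1]) to the
    equidistant-fisheye coordinate that sees the same scene ray.
    Assumed equidistant lens model, for illustration only."""
    half_fov = math.radians(fov_deg) / 2.0
    f = 1.0 / math.tan(half_fov)      # focal length for a [-1, 1] image plane
    r_rect = math.hypot(x, y)
    if r_rect == 0.0:
        return 0.0, 0.0               # the optical axis maps to the centre
    theta = math.atan2(r_rect, f)     # angle of the incoming ray
    r_fish = theta / half_fov         # equidistant: radius grows linearly with angle
    scale = r_fish / r_rect
    return x * scale, y * scale
```

Points away from the centre are pushed outward relative to the rectilinear view, which is the barrel-shaped pre-distortion that later wraps more naturally onto the round surface.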
- the invention is disclosed in fuller detail with examples and illustrations in the later section of the detailed description of the invention.
- FIG. 1 illustrates a flow diagram of one embodiment of the projector system for spherical platform according to this invention wherein the environment is a VR environment.
- FIG. 2 illustrates a flow diagram of one embodiment of the image correction and optimization steps before projection of the image onto at least one part or the whole surface of the sphere.
- FIG. 3 illustrates a set of examples of software set-ups for the four cameras in the second computer, wherein FIG. 3A shows one example of a texture or view seen by a front camera, FIG. 3B shows one example of a texture or view seen by a back camera, FIG. 3C shows one example of a texture or view seen by a left camera, and FIG. 3D shows one example of a texture or view seen by a right camera.
- FIG. 4 illustrates one set of examples of the image before applying a camera fisheye distortion effect (FIG. 4A) and after applying the camera fisheye distortion effect (FIG. 4B).
- FIG. 5 illustrates one set of examples of the image before applying a geometric correction and edge-blending technology (FIG. 5A and FIG. 5C), and after applying the geometric correction and edge-blending technology (FIG. 5B).
- FIG. 6 illustrates one example of a possible set-up arrangement of the four cameras in the second computer, wherein each camera exhibits at least a 120° FOV.
- FIG. 7 illustrates one example of a possible set-up arrangement of four projectors facing inward for projection of images onto the surface of the sphere.
- FIG. 8 illustrates one example of the sphere system with an internal motion simulator set-up, wherein the motion simulator provides a seat for a player to sit on and is movable on certain axes on a rail.
- a projector system (1) for a spherical platform for at least one of a VR, AR, or MR gameplay or environment, as shown in one embodiment, can fundamentally comprise at least two main components: 1) a first computer (3); and 2) a second computer (12) in a real-time image projection module (11) together with a set of projectors (13-16).
- the motion simulator (6), as depicted in FIG. 8, further comprises a seat (40) for the first-person player (10) to sit on and is possibly driven, on pitch and yaw axes on a rail (41), by an electro-mechanical system.
- the input controller (4) used by the player (10) can be chosen from any conventional controller, such as a two-handed controller, a one-handed controller, a gamepad, a smartphone, a keyboard, a mouse, an input glove, a joystick, a remote control, a footpad, an armband, a foot band, an ankle band, a full bodysuit controller, an aim controller, or a combination of at least two of these controllers.
- the controller (4) preferably comprises a set of position-tracking sensors for tracking the player's position, action, and movement in the VR, AR, or MR environment.
- the headset (9) may be chosen from a conventional headset that shall comprise the head-tracking sensors or device (7) for tracking the player's head position and movement in the virtual environment.
- the structure of the sphere (2) according to this invention may be chosen from any conventional sphere, but is preferably made up of transparent, semi-transparent, or translucent round-shaped materials with a set of projection films placed or adhered on the surface of said sphere, as shown in FIG. 8.
- the material used for the sphere may be chosen from glass, plastic, or a combination of the two. Further, the plastic material may be chosen from acrylic, polycarbonate, or ABS plastic.
- the projectors used in this invention may comprise at least two projectors, or most preferably four projectors in accordance with FIG. 7 for optimal coverage of the whole 360° of the sphere, and preferably project the images (30) seen by the player (10) in the VR, AR, or MR environment onto at least one part or the whole of the sphere (2), so that the same images (30) seen in the environment by the first-person player (10) can also be seen by the audience from outside the sphere.
- the second computer (12), which receives information from the first computer (3), comprises at least a set of two cameras for at least producing, processing, and projecting images, as depicted in FIG. 6, wherein each camera provides at least a 30° field of view (FOV).
- FOV field of view
- the second computer may further connect to and transmit image information to a set of corresponding projectors for projecting images (30) onto the sphere, accordingly.
- the second computer (12) comprises a set of four cameras (25-28) which connect and transmit image information to a set of corresponding four projectors (13-16) for projecting or displaying the images (30) onto the sphere (2).
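The FIG. 6 arrangement of four cameras, each with at least a 120° FOV, placed at equal yaw intervals, leaves an overlap band at every seam for blending; a minimal sketch of that geometry (the helper names are illustrative, not from the patent):

```python
def camera_yaws(num_cameras=4):
    """Yaw (heading) of each camera in degrees: front, right, back, left
    for the four-camera case of FIG. 6."""
    return [i * 360.0 / num_cameras for i in range(num_cameras)]

def seam_overlap_deg(num_cameras=4, fov_deg=120.0):
    """Angular overlap between adjacent cameras at equal yaw spacing.
    Four cameras 90 degrees apart with 120-degree FOVs leave a 30-degree
    shared band at every seam for edge blending."""
    spacing = 360.0 / num_cameras
    return fov_deg - spacing
```

With the patent's minimum of 120° per camera, every seam retains a 30° shared band; a narrower FOV equal to the spacing would leave no overlap to blend.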
- the second computer (12) shall comprise the following three technologies for seamless projection of images via at least two projectors onto the sphere (2):
- a multi-display technology for aiding the projection of images to cover a large surface or the whole sphere (2), wherein possible examples of said technology are AMD Eyefinity technology and a graphics expansion module, which can be utilized individually or interchangeably;
- a geometric correction and edge-blending technology for correcting or fine-tuning the projected images on the rounded surface of the sphere and adjusting the edges of the images accordingly, so that no gaps are seen between images projected onto the sphere (2) by different projectors and the projected images are fused to one another.
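A common way to realize edge blending, shown here only as an illustrative sketch since the patent does not fix a blending curve, is a linear intensity ramp across each overlap band so that the two projectors sharing a seam always sum to full brightness:

```python
def blend_weight(pos, overlap):
    """Per-pixel intensity weight for one projector across a seam.
    pos is the normalized position across the projector's image (0..1),
    overlap is the normalized width of the shared band at each edge.
    Linear ramp assumed for simplicity; real systems often apply a
    gamma-corrected ramp instead."""
    if pos < overlap:             # entering band: ramp up from 0 to 1
        return pos / overlap
    if pos > 1.0 - overlap:       # exiting band: ramp down from 1 to 0
        return (1.0 - pos) / overlap
    return 1.0                    # interior: full intensity
```

At any point in the shared band, one projector's ramp-down mirrors its neighbour's ramp-up, so the combined brightness stays constant and no seam is visible.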
- the second preferred camera arrangement may comprise a set of four cameras (25-28) connecting to a fifth camera (29) with a setting for orthographic projection wherein said fifth camera (29) receives and integrates the image information or textures and transmits them to a set of corresponding four projectors (13-16) for further projecting or displaying images (30) onto the sphere (2).
- the fifth camera (29) may receive the corrected or fine-tuned images or textures from the four cameras (25-28) having the camera fisheye distortion effect before transmitting the images to the projectors (13-16), accordingly.
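One hedged reading of the fifth-camera integration is a simple tiling of the four per-camera textures into a single frame before projection; the 2×2 layout below is an assumption for illustration, as the patent only states that camera (29) receives and integrates the textures:

```python
def composite_quadrants(front, back, left, right):
    """Tile the four per-camera textures into the single frame that the
    orthographic fifth camera (29) would hand to the projectors (13-16).
    Textures are row-major lists of rows; the 2x2 quadrant layout is an
    illustrative assumption, not specified by the patent."""
    assert len({len(t) for t in (front, back, left, right)}) == 1
    top = [fr + br for fr, br in zip(front, back)]      # front | back
    bottom = [lr + rr for lr, rr in zip(left, right)]   # left  | right
    return top + bottom
```

An orthographic camera over such a tiled plane captures all four textures without perspective distortion, which is why the orthographic setting matters for this stage.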
- FIG. 2 illustrates the flow diagram of one embodiment of the image correction, integration, and optimization process carried out by the second computer (12) and its set of cameras, beginning with 1) receiving the images from the virtual environment by the second computer (12), wherein:
- FIG. 3A shows an example of a view from the front camera (25)
- FIG. 3B shows an example of a view from the back camera (26)
- FIG. 3C shows an example of a view from the left camera (27)
- FIG. 3D shows an example of a view from the right camera (28)
- 2) using the cameras (25-28) with the fisheye distortion effect to normalize or correct each of the image distortions, as illustrated in FIG. 4A to FIG. 4B; 3) having the fifth camera (29) integrate the images from the four cameras (25-28) and transmit them to the projectors (13-16); and 4) using the geometric correction and edge-blending technology to correct or fine-tune the projected images on the 3-D rounded surface of the sphere and to adjust the edges of the images accordingly for optimal display or projection of the player's environment onto the sphere (2).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- Aviation & Aerospace Engineering (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Geometry (AREA)
- Controls And Circuits For Display Device (AREA)
- Projection Apparatus (AREA)
Abstract
The invention concerns a projector system (1) for a spherical platform fundamentally comprising two components: 1) a first computer (3) for receiving inputs, processing the information received from the inputs, responding to or interacting with a player via various objects or equipment inside the environment, and transmitting the information to a real-time image projection module (11); and 2) a second computer (12) in the real-time image projection module (11), for receiving the information transmitted from the first computer (3) and transmitting it to a set of projectors (13-16) in said module (11) in order to display onto the sphere (2) the images seen or perceived by the player (10). The second computer (12), which receives information from the first computer (3), may further comprise at least one set of two cameras (25-28) for at least producing, processing, and projecting images, each camera covering a field of view (FOV) of at least 30°. These cameras (25-28) in the second computer (12) may further be connected to, and transmit image information to, a set of corresponding projectors (13-16) for projecting images accordingly onto at least part or the whole of the sphere.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/TH2016/000102 WO2018117985A1 (fr) | 2016-12-21 | 2016-12-21 | Système de projecteur pour plate-forme sphérique |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/TH2016/000102 WO2018117985A1 (fr) | 2016-12-21 | 2016-12-21 | Système de projecteur pour plate-forme sphérique |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018117985A1 true WO2018117985A1 (fr) | 2018-06-28 |
Family
ID=62627572
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/TH2016/000102 Ceased WO2018117985A1 (fr) | 2016-12-21 | 2016-12-21 | Système de projecteur pour plate-forme sphérique |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2018117985A1 (fr) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090187389A1 (en) * | 2008-01-18 | 2009-07-23 | Lockheed Martin Corporation | Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave |
| CN101783011A (zh) * | 2010-01-08 | 2010-07-21 | 宁波大学 | 一种鱼眼镜头的畸变校正方法 |
| US20130088577A1 (en) * | 2010-05-18 | 2013-04-11 | Teknologian Tutkimuskeskus Vtt | Mobile device, server arrangement and method for augmented reality applications |
| CN103398701A (zh) * | 2013-07-31 | 2013-11-20 | 国家测绘地理信息局卫星测绘应用中心 | 一种基于物方投影面的星载非共线tdi ccd影像拼接方法 |
| CN104168315A (zh) * | 2014-08-08 | 2014-11-26 | 三星电子(中国)研发中心 | 一种全视角平滑的沉浸式显示方法和系统 |
| CN205608380U (zh) * | 2016-04-20 | 2016-09-28 | 胡晓东 | 一种vr体验设备、球形模拟器及其视听系统 |
| US20160353089A1 (en) * | 2015-05-27 | 2016-12-01 | Google Inc. | Capture and render of panoramic virtual reality content |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Greengard | Virtual reality | |
| Miles et al. | A review of virtual environments for training in ball sports | |
| Murray | Building virtual reality with unity and steamvr | |
| CN105373224B (zh) | 一种基于普适计算的混合现实游戏系统及方法 | |
| Goradia et al. | A review paper on oculus rift & project morpheus | |
| US20170150108A1 (en) | Autostereoscopic Virtual Reality Platform | |
| US10049496B2 (en) | Multiple perspective video system and method | |
| US20130225305A1 (en) | Expanded 3d space-based virtual sports simulation system | |
| WO2021106803A1 (fr) | Système de classe, terminal de visualisation, procédé de traitement d'informations et programme | |
| US20180033328A1 (en) | Immersive vehicle simulator apparatus and method | |
| US20180261120A1 (en) | Video generating device, method of controlling video generating device, display system, video generation control program, and computer-readable storage medium | |
| Benzeroual et al. | Cyber (motion) sickness in active stereoscopic 3D gaming | |
| GB2535729A (en) | Immersive vehicle simulator apparatus and method | |
| WO2018117985A1 (fr) | Système de projecteur pour plate-forme sphérique | |
| JP7547501B2 (ja) | Vr映像空間生成システム | |
| TWI697848B (zh) | 多人連線遊戲的體感遊戲系統及其方法 | |
| WO2015196877A1 (fr) | Plate-forme à réalité virtuelle autostéréoscopique | |
| Bondarenko et al. | Learning Ball Juggling in Mixed Reality with Haptic Feedback | |
| EP3136372A1 (fr) | Appareil simulateur de véhicule immersif et procédé | |
| Bondarenko et al. | Understanding human behavior in mixed reality juggling simulator with haptic interaction | |
| TWM593891U (zh) | 互動式虛擬實境遊戲系統 | |
| JP7011746B1 (ja) | コンテンツ配信システム、コンテンツ配信方法、及びコンテンツ配信プログラム | |
| Loviscach | Playing with all senses: Human–Computer interface devices for games | |
| Tack | The influence of haptic interaction for virtual reality | |
| James et al. | The effects of post-processing techniques on simulator sickness in virtual reality |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16924414; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 16924414; Country of ref document: EP; Kind code of ref document: A1 |