WO2020240512A1 - Collaborative immersive cave network - Google Patents
- Publication number
- WO2020240512A1 (PCT/IB2020/055147)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- cave
- master
- request
- caves
- user
- Prior art date
- Legal status
- Ceased
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
- H04N7/157—Conference systems defining a virtual conference space and using avatars or agents
Abstract
An immersive and interactive CAVE system for teleconferencing or multiplayer interactions enables users at a remote CAVE that may not have the 3D content data locally at its own site. The system enables each local site to configure a hardware system that includes the CAVE setup and a router for connecting to the allocated virtual private cloud (VPC) service. The system further includes two sub-systems: i) a streaming and synchronization system, and ii) a co-interaction system.
Description
COLLABORATIVE IMMERSIVE CAVE NETWORK
CROSS-REFERENCE TO RELATED APPLICATION
[001] This is a US provisional patent application.
FIELD OF THE INVENTION
[002] A general description of the field of the invention.
BACKGROUND
[003] A cave automatic virtual environment (CAVE) is an immersive virtual reality environment where projectors are directed to between three and six of the walls of a room-sized cube. A CAVE is typically a video theater situated within a larger room. The walls of a CAVE are typically made up of rear-projection screens or flat panel displays. The floor may be a downward-projection screen, a bottom projected screen or a flat panel display. The projection systems are typically high-resolution due to the near distance viewing which requires very small pixel sizes to retain the illusion of reality. The user wears 3D glasses inside the CAVE to see 3D graphics generated by the CAVE. People using the CAVE can see objects apparently floating in the air, and can walk around them, getting a proper view of what they would look like in reality.
[004] The frame of early CAVEs had to be built from non-magnetic materials such as wood to minimize interference with the electromagnetic sensors; the change to infrared tracking has removed that limitation. A CAVE user's movements are tracked by sensors typically attached to the 3D glasses, and the video continually adjusts to retain the viewer's perspective. Computers control both this aspect of the CAVE and the audio aspect. There are typically multiple speakers placed at multiple angles in the CAVE, providing 3D sound to complement the 3D video.
[005] When it comes to teleconferencing, however, existing teleconferencing software only supports desktop sharing. This creates a technological barrier for an immersive display, such as a CAVE, to be shared across devices during a teleconferencing session. At the same time, in order to create a CAVE-like environment in a teleconferencing session, a large amount of 3D data is required. The storage and management of such large-scale 3D data are expensive, especially given the multiple sessions and locations for such a teleconferencing session. A given teleconferencing host and participant would need to synchronize extremely large volumes of 3D data for visualization.
[006] Moreover, multiple-remote-user interaction over large-scale 3D content is only available via head-mounted VR devices. This limitation makes an interactive and immersive CAVE in a teleconferencing session difficult and may require multiple head-mounted VR devices for all participants. This further adds prohibitive costs to such a session.
[007] Embodiments of the invention attempt to solve or address one or more of the technical problems identified.
SUMMARY
[008] Aspects of the invention overcome shortcomings of prior approaches by including embodiments of the invention having various connection modes for linking and synchronizing interaction content across a network of remote CAVEs.
[009] In another aspect, embodiments of the invention address the situation in which users at a remote CAVE do not have the 3D content data locally at their own site. In that case, if a user would like to visualize and interact with the 3D content from a master site, the remote site would have to download all of the 3D content held at the master site, a very time-consuming process given the large data size.
[0010] In a further aspect, embodiments of the invention overcome issues that arise when multiple users interact with the same set of 3D content and their interactions result in mis-synchronization. For example, one user may rotate a 3D object clockwise while, at the same time, another user rotates the same 3D object anticlockwise.
[0011] In one aspect, embodiments of the invention provide a visual hardware and software system for experiencing an immersive virtual reality environment while being able to collaboratively interact with remote users in a teleconferencing session.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] Persons of ordinary skill in the art may appreciate that elements in the figures are illustrated for simplicity and clarity so not all connections and options have been shown to avoid obscuring the inventive aspects. For example, common but well-understood elements that are useful or necessary in a commercially feasible embodiment may often not be depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure. It will be further appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. It will also be understood that the terms and expressions used herein may be defined with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein.
[0013] FIG. 1 is an exemplary diagram illustrating a CAVE system setup to connect to a collaborative CAVE system according to one embodiment of the invention.
[0014] FIG. 2 is a diagram illustrating a network configuration with multiple master CAVEs streaming visual display to multiple slave CAVEs according to one embodiment of the invention.
[0015] FIG. 3 is a flow chart illustrating a lock and wait method according to one
embodiment of the invention.
[0016] FIG. 4 is a diagram illustrating a portable computing device according to one embodiment of the invention.
[0017] FIG. 5 is a diagram illustrating a remote computing device according to one
embodiment of the invention.
DETAILED DESCRIPTION
[0019] The present invention may now be described more fully with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific exemplary embodiments by which the invention may be practiced. These illustrations and exemplary embodiments may be presented with the understanding that the present disclosure is an exemplification of the principles of one or more inventions and may not be intended to limit any one of the inventions to the embodiments illustrated. The invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the
scope of the invention to those skilled in the art. Among other things, the present invention may be embodied as methods, systems, computer readable media, apparatuses, or devices. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. The following detailed description is, therefore, not to be taken in a limiting sense.
[0020] Without traveling to where the 3D data content and CAVE are physically on-site, remote CAVE systems without up-to-date 3D data content may automatically update the 3D data content while connected to the virtual private cloud (VPC) via aspects of the invention. This is advantageous to various applications such as in architectural settings, immersive gaming environment, interactive 3D video conferencing, etc.
[0021] All remote CAVE systems, with or without local 3D data content, can interact with, collaborate on, and manipulate the same 3D data content in their local immersive environment. The interaction methods include:
[0022] voice conferencing;
[0023] touchable interaction; and
[0024] multiplayer video game interaction.
[0025] In one aspect, embodiments of the invention may enable each local site to configure a hardware system that includes the CAVE setup and a router for connecting to the allocated virtual private cloud (VPC) service. A software system of embodiments of the invention may include two sub-systems: i) a streaming and synchronization system, and ii) a co-interaction system.
[0026] For example, the streaming and synchronization system may include features such as a) remotely displaying the same 3D virtual reality environment in a CAVE by streaming from a master CAVE without transmitting the 3D data to the slave CAVEs, and b) automatically synchronizing the 3D data stored in all connected master CAVEs.
[0027] The co-interaction system of embodiments of the invention may include a real-time graphics engine which resolves simultaneous interaction conflicts using a lock and wait approach.
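The per-site setup described above (a CAVE plus a router joined to an allocated VPC, driven by the two sub-systems) might be captured in a configuration record along the following lines. This is a minimal sketch in Python; every field name, such as `role` or `vpc_endpoint`, and the router address are illustrative assumptions rather than anything taken from the patent.

```python
# Hypothetical per-site configuration: CAVE hardware plus VPC connection.
# All field names are illustrative assumptions.

def validate_site(config):
    """Minimal sanity check: masters hold 3D data locally; slaves do not."""
    role = config["cave"]["role"]
    has_data = config["cave"]["has_3d_data"]
    return (role == "master") == has_data

site_config = {
    "site_name": "Site A",
    "cave": {
        "role": "master",        # "master" (holds 3D data) or "slave"
        "projected_walls": 4,    # a CAVE projects onto three to six walls
        "has_3d_data": True,
    },
    "network": {
        "router": "192.0.2.1",           # placeholder documentation address
        "vpc_endpoint": "vpc.example.net",  # hypothetical VPC service name
    },
    "subsystems": ["streaming_and_synchronization", "co_interaction"],
}
print(validate_site(site_config))  # True
```

A check like `validate_site` reflects the streaming design below: slave CAVEs receive only streamed video, so a slave entry claiming local 3D data would indicate a misconfigured site.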
[0028] Referring to FIG. 1, an exemplary diagram illustrates a CAVE system setup to connect to a collaborative CAVE system according to one embodiment of the invention. For example, a CAVE master as illustrated may include projectors with speakers and microphone systems in a room. The person who is conducting the teleconference may stand before one or more screens, with an interaction tool in hand to interact with a virtual object or the like.
[0029] With this setup, in order to obtain the benefits of aspects of the invention, specific network configurations may be needed. For example, FIG. 2 is a diagram illustrating a network configuration with multiple master CAVEs streaming a visual display to multiple slave CAVEs according to one embodiment of the invention. As illustrated, the lines in FIG. 2 indicate connections established between all CAVEs, where synchronization between master CAVEs is conducted via these connections to the cloud. All slave CAVEs stream from the nearest master CAVE in the network. For example, Signal 1 and Signal 2 indicate signals streaming from the nearest master CAVE(s) to slave CAVEs. Note that the 3D data does not exist locally in any slave CAVE.
[0030] In another embodiment, all the master CAVEs (e.g., Master 1, Master 2, and Master 3) are connected to each other regardless of where they are in the network, as they are all connected within a VPC. Slave CAVEs (e.g., in a teleconference setting, client CAVEs), however, are connected to the nearest master CAVE within a local area network (e.g., via Connection 1). In another embodiment, slave CAVEs may also connect to the nearest master CAVE via Connection 2.
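The "nearest master CAVE" selection can be sketched as follows. This is a hypothetical illustration in which "nearest" is taken to mean lowest measured round-trip latency, which is one plausible reading of the network distance the text refers to; the class and field names are assumptions, not from the patent.

```python
# Hypothetical nearest-master selection for a slave CAVE: pick the master
# with the lowest measured round-trip latency (one reading of "network
# distance"). Names and latency figures are illustrative.

class MasterCave:
    def __init__(self, name, latency_ms):
        self.name = name
        self.latency_ms = latency_ms  # measured round-trip time to this master

def nearest_master(masters):
    """Return the master CAVE with the lowest measured latency."""
    return min(masters, key=lambda m: m.latency_ms)

masters = [
    MasterCave("Master 1", 42.0),
    MasterCave("Master 2", 8.5),    # e.g. on the same LAN, so lowest latency
    MasterCave("Master 3", 120.0),
]
print(nearest_master(masters).name)  # Master 2
```

In practice a slave might re-measure these latencies periodically, but the patent text only requires that each slave end up attached to its nearest master.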
[0031] Once the CAVEs have been established in the network configuration illustrated in FIG. 2, aspects of the invention incorporate a streaming and synchronization system. For example, once one master CAVE initializes by opening a streaming session, any slave CAVEs (i.e., those without 3D content data) can join the session at any time thereafter.
[0032] In another embodiment, remote master CAVEs (i.e., with 3D content data) may join the streaming session for initial data synchronization to ensure data completeness. Note that once the session is started, no other master CAVEs may join as master CAVEs.
[0033] Moreover, once the streaming session is started, the rendering from master CAVEs is streamed to the nearest slave CAVEs by network distance. Upon receiving user interaction with the 3D data from any master CAVE, the local master CAVE will synchronize visually and digitally (i.e., the 3D data) with all other master CAVEs connected inside the VPC.
[0034] In one embodiment, the master CAVE which initializes the streaming session may be regarded as "local"; those master CAVEs which join the session thereafter are considered "remote".
[0035] All the master CAVEs will then update the visuals via streaming to their corresponding connected slave CAVEs. Upon user interaction with the 3D data from any slave CAVE, the slave CAVE which initiates the interaction will send the interaction signal to its connecting (nearest) master CAVE. The master CAVE which receives the interaction signal will trigger UI event handling locally, and regard the event as "user interaction with the 3D data from a master CAVE" as described above.
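The relay described in the preceding paragraphs (slave forwards an interaction to its master, the master handles the event locally and synchronizes the other masters in the VPC, and every master re-streams the update to its own slaves) can be sketched as follows. This is a minimal illustration under assumed names; the real system streams rendered video rather than a version counter.

```python
# Hedged sketch of the interaction relay: a slave forwards an event to its
# connected master; the master applies it locally, streams the result to
# its slaves, and synchronizes all other masters in the VPC, which in turn
# re-stream to their own slaves. A scene version number stands in for the
# actual streamed rendering. All names are illustrative assumptions.

class Master:
    def __init__(self, name, vpc):
        self.name = name
        self.vpc = vpc          # shared list of all masters in the VPC
        self.slaves = []
        self.scene_version = 0
        vpc.append(self)

    def handle_interaction(self, event, from_peer=False):
        self.scene_version += 1             # apply the event to local 3D data
        for slave in self.slaves:
            slave.receive_stream(self.scene_version)
        if not from_peer:                   # originator syncs the other masters
            for peer in self.vpc:
                if peer is not self:
                    peer.handle_interaction(event, from_peer=True)

class Slave:
    def __init__(self, master):
        self.master = master                # nearest master CAVE
        master.slaves.append(self)
        self.last_frame = 0                 # streamed view only, no local 3D data

    def receive_stream(self, version):
        self.last_frame = version

    def interact(self, event):
        self.master.handle_interaction(event)  # relay to the connected master

vpc = []
m1, m2 = Master("Master 1", vpc), Master("Master 2", vpc)
s1, s2 = Slave(m1), Slave(m2)
s1.interact("rotate clockwise")
print(m1.scene_version, m2.scene_version, s2.last_frame)  # 1 1 1
```

One interaction at a slave of Master 1 thus reaches the slave of Master 2 without any 3D data ever leaving the masters, matching the streaming-only role of slave CAVEs above.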
[0036] FIG. 3 is a flow chart illustrating a lock and wait method according to one embodiment of the invention. In another embodiment, FIG. 3 illustrates the second sub-system, in which the co-interaction system may include a real-time graphics engine that resolves simultaneous interaction conflicts using a lock and wait approach, as illustrated in the flow chart. In one embodiment, the flow chart starts with a user event request. In one embodiment, the user event request may come from a user who wishes to request an interaction with an object.
[0037] In one example, the wait time frame may be 5 seconds. In another embodiment, the waiting time frame may be a function of the computational power of the master/slave CAVEs. For example, more powerful systems may have a shorter wait time while less powerful systems may have a longer wait time.
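A minimal sketch of how such a lock and wait scheme might look, assuming one lock per interactive object and a timeout derived from a hypothetical compute-power score; the names, the inverse-power formula, and the 5-second base are illustrative assumptions built on the example above, not the patent's actual flow chart.

```python
# Hedged sketch of lock and wait: the first event to arrive locks the
# target object; a later conflicting request waits up to a timeout before
# being rejected. The timeout shrinks as (hypothetical) compute power
# grows, echoing "more powerful systems may have a shorter wait time".
import threading

def wait_time(power_score, base_seconds=5.0):
    """Illustrative rule: power 1.0 -> 5.0 s, power 2.0 -> 2.5 s."""
    return base_seconds / power_score

class InteractiveObject:
    def __init__(self):
        self._lock = threading.Lock()

    def request_interaction(self, user, apply_fn, timeout):
        # Lock and wait: block up to `timeout` seconds for the object
        # to become free, then apply the interaction or give up.
        if self._lock.acquire(timeout=timeout):
            try:
                apply_fn()       # e.g. rotate the 3D object
                return True      # interaction accepted
            finally:
                self._lock.release()
        return False             # conflicting interaction still held the lock

obj = InteractiveObject()
ok = obj.request_interaction("user A", lambda: None, timeout=wait_time(1.0))
print(ok)  # True: no contention, so the lock is acquired immediately
```

In the clockwise/anticlockwise example from the summary, whichever rotation request locks the object first is applied; the other waits and, if the lock is not released within its timeout, is rejected rather than producing a mis-synchronized scene.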
[0038] FIG. 4 may be a high-level illustration of a portable computing device 801 communicating with a remote computing device 841, but the application may be stored and accessed in a variety of ways. In addition, the application may be obtained in a variety of ways, such as from an app store, from a web site, from a store Wi-Fi system, etc. There may be various versions of the application to take advantage of the benefits of different computing devices, different languages, and different API platforms.
[0039] In one embodiment, a portable computing device 801 may be a mobile device 112 that operates using a portable power source 855 such as a battery. The portable computing device 801 may also have a display 802 which may or may not be a touch sensitive display. More specifically, the display 802 may have a capacitance sensor, for example, that may be used to provide input data to the portable computing device 801. In other embodiments, an input pad 804 such as arrows, scroll wheels, keyboards, etc., may be used to provide inputs to the portable computing device 801. In addition, the portable computing device 801 may have a microphone 806 which may accept and store verbal data, a camera 808 to accept images and a speaker 810 to communicate sounds.
[0040] The portable computing device 801 may be able to communicate with a
computing device 841 or a plurality of computing devices 841 that make up a cloud of computing devices 811. The portable computing device 801 may be able to communicate in a variety of ways. In some embodiments, the communication may be wired such as through an Ethernet cable, a USB cable or RJ6 cable. In other embodiments, the communication may be wireless such as through Wi-Fi (802.11 standard),
BLUETOOTH, cellular communication or near field communication devices. The communication may be direct to the computing device 841 or may be through a communication network 102 such as cellular service, through the Internet, through a private network, through BLUETOOTH, etc. FIG. 4 may be a simplified illustration of the physical elements that make up a portable computing device 801 and FIG. 5 may be a
simplified illustration of the physical elements that make up a server type computing device 841.
[0041] FIG. 4 may be a sample portable computing device 801 that is physically configured to be part of the system. The portable computing device 801 may have a processor 850 that is physically configured according to computer executable instructions. It may have a portable power supply 855 such as a battery which may be rechargeable. It may also have a sound and video module 860 which assists in displaying video and sound and may turn off when not in use to conserve power and battery life. The portable computing device 801 may also have volatile memory 865 and non-volatile memory 870. It may have GPS capabilities 880 that may be a separate circuit or may be part of the processor 850. There also may be an input/output bus 875 that shuttles data to and from the various user input devices such as the microphone 806, the camera 808 and other inputs, such as the input pad 804, the display 802, and the speakers 810, etc. It may also control communication with the networks, whether through wireless or wired devices. Of course, this is just one embodiment of the portable computing device 801, and the number and types of portable computing devices 801 is limited only by the imagination.
[0042] As a result of the system, better information may be provided to a user at a point of sale. The information may be user specific and may be required to exceed a threshold of relevance. As a result, users may make better informed decisions. The system does more than merely speed up a process; it uses a computing system to achieve a better outcome.
[0043] The physical elements that make up the remote computing device 841 may be further illustrated in FIG. 5. At a high level, the computing device 841 may include
digital storage such as a magnetic disk, an optical disk, flash storage, non-volatile storage, etc. Structured data may be stored in the digital storage, such as in a database.
The server 841 may have a processor 1000 that is physically configured according to computer executable instructions. It may also have a sound and video module 1005 which assists in displaying video and sound and may turn off when not in use to conserve power and battery life. The server 841 may also have volatile memory 1010 and non-volatile memory 1015.
[0044] The database 1025 may be stored in the memory 1010 or 1015 or may be
separate. The database 1025 may also be part of a cloud of computing devices 841 and may be stored in a distributed manner across a plurality of computing devices 841. There also may be an input/output bus 1020 that shuttles data to and from the various user input devices such as the microphone 806, the camera 808, the inputs such as the input pad 804, the display 802, and the speakers 810, etc. The input/output bus 1020 also may control communication with the networks, through either wireless or wired devices. In some embodiments, the application may reside on the local computing device 801 and, in other embodiments, the application may reside on the remote computing device 841. Of course, this is just one embodiment of the server 841, and the number and types of server computing devices 841 are limited only by the imagination.
[0045] The user devices, computers and servers described herein may be general purpose computers that may have, among other elements, a microprocessor (such as from the Intel® Corporation, AMD®, ARM®, Qualcomm®, or MediaTek®); volatile and non-volatile memory; one or more mass storage devices (e.g., a hard drive); various user input devices, such as a mouse, a keyboard, or a microphone; and a video display system. The
user devices, computers and servers described herein may be running on any one of many operating systems, including, but not limited to, WINDOWS®, UNIX®, LINUX®, MAC OS®, iOS®, or Android®. It is contemplated, however, that any suitable operating system may be used for the present invention. The servers may be a cluster of web servers, which may each be LINUX® based and supported by a load balancer that decides which of the cluster of web servers should process a request based upon the current request load of the available server(s).
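The request-load-based routing described above can be sketched as follows. This is a minimal illustration, not the implementation of any particular product; the function name and the server names are hypothetical.

```python
def pick_server(request_loads):
    """Return the name of the server with the lowest current request load.

    request_loads: dict mapping server name -> number of in-flight requests.
    """
    if not request_loads:
        raise ValueError("no servers available")
    # min() with a key function selects the dict key whose value is smallest,
    # i.e. the currently least-loaded web server in the cluster.
    return min(request_loads, key=request_loads.get)


# Each incoming request is routed to the least-loaded server; hypothetical loads:
loads = {"web-1": 12, "web-2": 3, "web-3": 7}
```

A production balancer would also track health checks and update the load counts as requests start and finish, but the selection step reduces to this comparison.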
[0046] The user devices, computers and servers described herein may communicate via networks, including the Internet, WAN, LAN, Wi-Fi, other computer networks (now known or invented in the future), and/or any combination of the foregoing. It should be understood by those of ordinary skill in the art having the present specification, drawings, and claims before them that networks may connect the various components over any combination of wired and wireless conduits, including copper, fiber optic, microwaves, and other forms of radio frequency, electrical and/or optical communication techniques.
It should also be understood that any network may be connected to any other network in a different manner. The interconnections between computers and servers in the system are examples. Any device described herein may communicate with any other device via one or more networks.
[0047] The example embodiments may include additional devices and networks beyond those shown. Further, the functionality described as being performed by one device may be distributed and performed by two or more devices. Multiple devices may also be combined into a single device, which may perform the functionality of the combined devices.
[0048] The various participants and elements described herein may operate one or more computer apparatuses to facilitate the functions described herein. Any of the elements in the above-described Figures, including any servers, user devices, or databases, may use any suitable number of subsystems to facilitate the functions described herein.
[0049] Any of the software components or functions described in this application may be implemented as software code or computer readable instructions that may be executed by at least one processor using any suitable computer language such as, for example, Java, C++, or Perl, using, for example, conventional or object-oriented techniques.
[0050] The software code may be stored as a series of instructions or commands on a non-transitory computer readable medium, such as a random access memory (RAM), a read only memory (ROM), a magnetic medium such as a hard-drive or a floppy disk, or an optical medium such as a CD-ROM. Any such computer readable medium may reside on or within a single computational apparatus and may be present on or within different computational apparatuses within a system or network.
[0052] It should be understood that the present invention as described above may be
implemented in the form of control logic using computer software in a modular or integrated manner. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will know and appreciate other ways and/or methods to implement the present invention using hardware, software, or a combination of hardware and software.
[0053] The above description is illustrative and is not restrictive. Many variations of the invention will become apparent to those skilled in the art upon review of the disclosure.
The scope of the invention should, therefore, be determined not with reference to the above description, but instead with reference to the pending claims, along with their full scope of equivalents.
[0054] One or more features from any embodiment may be combined with one or more features of any other embodiment without departing from the scope of the invention. A recitation of "a", "an" or "the" is intended to mean "one or more" unless specifically indicated to the contrary. Recitation of "and/or" is intended to represent the most inclusive sense of the term unless specifically indicated to the contrary.
[0055] One or more of the elements of the present system may be claimed as means for accomplishing a particular function. Where such means-plus-function elements are used to describe certain elements of a claimed system it will be understood by those of ordinary skill in the art having the present specification, figures and claims before them, that the corresponding structure is a general purpose computer, processor, or
microprocessor (as the case may be) programmed to perform the particularly recited function using functionality found in any general purpose computer without special programming and/or by implementing one or more algorithms to achieve the recited functionality. As would be understood by those of ordinary skill in the art, the algorithm may be expressed within this disclosure as a mathematical formula, a flow chart, a narrative, and/or in any other manner that provides sufficient structure for those of ordinary skill in the art to implement the recited process and its equivalents.
[0056] While the present disclosure may be embodied in many different forms, the drawings and discussion are presented with the understanding that the present disclosure
is an exemplification of the principles of one or more inventions and is not intended to limit any one of the inventions to the embodiments illustrated.
[0057] The present disclosure provides a solution to the long-felt need described above.
In particular, the systems and methods described herein may be configured for improving CAVE applications in a teleconferencing setting. Further advantages and modifications of the above described system and method will readily occur to those skilled in the art. The disclosure, in its broader aspects, is therefore not limited to the specific details, representative system and methods, and illustrative examples shown and described above. Various modifications and variations can be made to the above specification without departing from the scope or spirit of the present disclosure, and it is intended that the present disclosure covers all such modifications and variations provided they come within the scope of the following claims and their equivalents.
Claims
1. A system for resolving conflicts in cave automatic virtual environment (CAVE) interactions among multiple users, comprising:
receiving a user event request from a local CAVE, said user event request comprising a request from a user to interact with an object available in an interactive multiplayer CAVE session connected via a computer network;
determining whether the object is currently in a locked state;
in response to determining that the object is not in the locked state, determining whether the request was sent from a master CAVE, said master CAVE comprising a CAVE system that initiated the session;
in response to determining that the request was sent from the master CAVE, triggering a graphical user interface (GUI) event handler locally at the local CAVE;
in response to determining that the request was sent from a system other than the master CAVE, triggering another GUI event handler from a nearest master CAVE in the computer network;
issuing a command to put all interactive items in the locked state;
synchronizing visual elements in all CAVE systems; and
releasing all interactive items from the locked state.
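The conflict-resolution flow recited in claim 1 can be sketched in code as follows. This is a minimal single-process illustration under simplifying assumptions: the class, method, and CAVE names are hypothetical, the "nearest master" lookup is reduced to the session master, and network dispatch to remote CAVEs is stubbed out.

```python
class CaveNetwork:
    """Hypothetical sketch of the claim-1 conflict-resolution flow."""

    def __init__(self, master_cave_id, cave_ids):
        self.master_cave_id = master_cave_id  # the CAVE that initiated the session
        self.cave_ids = cave_ids
        self.locked = False        # one lock covering all interactive items
        self.object_states = {}    # object_id -> latest state, mirrored to every CAVE

    def handle_user_event(self, sender_cave_id, object_id, new_state):
        # Step 1: a request against a locked object is rejected (conflict).
        if self.locked:
            return False

        # Step 2: put all interactive items in the locked state for the update.
        self.locked = True
        try:
            if sender_cave_id == self.master_cave_id:
                # Request from the master CAVE: trigger the GUI event handler locally.
                self._trigger_gui_event(self.master_cave_id, object_id, new_state)
            else:
                # Otherwise trigger the handler at the nearest master CAVE
                # (here simply the session master, standing in for a
                # topology-aware proximity search).
                self._trigger_gui_event(self._nearest_master(sender_cave_id),
                                        object_id, new_state)
            # Step 3: synchronize visual elements across all CAVE systems.
            self.object_states[object_id] = new_state
            return True
        finally:
            # Step 4: release all interactive items from the locked state.
            self.locked = False

    def _nearest_master(self, sender_cave_id):
        return self.master_cave_id  # placeholder for a real nearest-master lookup

    def _trigger_gui_event(self, handler_cave_id, object_id, new_state):
        pass  # a real system would dispatch to that CAVE's GUI event handler
```

The `try`/`finally` pairing mirrors the claim's lock-then-release ordering: the visual synchronization happens while the lock is held, and the lock is released even if the event handler fails.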
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202080003847.9A CN113841416A (en) | 2019-05-31 | 2020-05-30 | Interactive immersive cave network |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201962855660P | 2019-05-31 | 2019-05-31 | |
| US62/855,660 | 2019-05-31 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020240512A1 true WO2020240512A1 (en) | 2020-12-03 |
Family
ID=73552724
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2020/055147 Ceased WO2020240512A1 (en) | 2019-05-31 | 2020-05-30 | Collaborative immersive cave network |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN113841416A (en) |
| WO (1) | WO2020240512A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030227487A1 (en) * | 2002-06-01 | 2003-12-11 | Hugh Harlan M. | Method and apparatus for creating and accessing associative data structures under a shared model of categories, rules, triggers and data relationship permissions |
| US20090187389A1 (en) * | 2008-01-18 | 2009-07-23 | Lockheed Martin Corporation | Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave |
| CN101690150A (en) * | 2007-04-14 | 2010-03-31 | 缪斯科姆有限公司 | virtual reality-based teleconferencing |
| WO2018102649A1 (en) * | 2016-12-02 | 2018-06-07 | Google Llc | Collaborative manipulation of objects in virtual reality |
| US20190114061A1 (en) * | 2016-03-23 | 2019-04-18 | Bent Image Lab, Llc | Augmented reality for the internet of things |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8838526B2 (en) * | 2011-06-24 | 2014-09-16 | Salesforce.Com, Inc. | Systems and methods for supporting transactional message handling |
| KR102352933B1 (en) * | 2016-09-09 | 2022-01-20 | 삼성전자주식회사 | Method and apparatus for processing 3 dimensional image |
| WO2018131803A1 (en) * | 2017-01-10 | 2018-07-19 | 삼성전자 주식회사 | Method and apparatus for transmitting stereoscopic video content |
| US20180314322A1 (en) * | 2017-04-28 | 2018-11-01 | Motive Force Technology Limited | System and method for immersive cave application |
2020
- 2020-05-30 WO PCT/IB2020/055147 patent/WO2020240512A1/en not_active Ceased
- 2020-05-30 CN CN202080003847.9A patent/CN113841416A/en active Pending
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030227487A1 (en) * | 2002-06-01 | 2003-12-11 | Hugh Harlan M. | Method and apparatus for creating and accessing associative data structures under a shared model of categories, rules, triggers and data relationship permissions |
| CN101690150A (en) * | 2007-04-14 | 2010-03-31 | 缪斯科姆有限公司 | virtual reality-based teleconferencing |
| US20090187389A1 (en) * | 2008-01-18 | 2009-07-23 | Lockheed Martin Corporation | Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave |
| US20190114061A1 (en) * | 2016-03-23 | 2019-04-18 | Bent Image Lab, Llc | Augmented reality for the internet of things |
| WO2018102649A1 (en) * | 2016-12-02 | 2018-06-07 | Google Llc | Collaborative manipulation of objects in virtual reality |
Non-Patent Citations (1)
| Title |
|---|
| MANJREKAR SIDDHESH; SANDILYA SHUBHRIKA; BHOSALE DEESHA; KANCHI SRAVANTHI; PITKAR ADWAIT; GONDHALEKAR MAYUR: "CAVE: An Emerging Immersive Technology - A Review", 2014 UKSIM-AMSS 16TH INTERNATIONAL CONFERENCE ON COMPUTER MODELLING AND SIMULATION, 26 March 2014 (2014-03-26), pages 131 - 136, XP032738424 * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN113841416A (en) | 2021-12-24 |
Similar Documents
| Publication | Title | Publication Date |
|---|---|---|
| US12033241B2 (en) | Scene interaction method and apparatus, electronic device, and computer storage medium | |
| JP2023082119A (en) | Virtual scene information interaction method, device, electronic device, storage medium and computer program | |
| WO2017117278A1 (en) | Displaying content from multiple devices | |
| US12086301B2 (en) | System for multi-user collaboration within a virtual reality environment | |
| JP6743273B2 (en) | Collaborative Immersive Live Action 360 Degree Video and Virtual Reality | |
| Pape et al. | Virtual heritage at iGrid 2000 | |
| CN109076203B (en) | System for projecting immersive audiovisual content | |
| US20230206571A1 (en) | System and method for syncing local and remote augmented reality experiences across devices | |
| CN111602118A (en) | Audio, video and control system for implementing virtual machine | |
| CN110471355A (en) | A kind of venue multiple-terminal control system | |
| CN114189743B (en) | Data transmission method, device, electronic equipment and storage medium | |
| CN114816315A (en) | Volume control for audio and video conferencing applications | |
| CN109696873B (en) | KTV box and optical video control method thereof | |
| WO2020240512A1 (en) | Collaborative immersive cave network | |
| JP4951912B2 (en) | Method, system, and program for optimizing presentation visual fidelity | |
| US20230186552A1 (en) | System and method for virtualized environment | |
| HK40062110A (en) | Interactive immersive cave network | |
| WO2012053001A2 (en) | Virtual office environment | |
| Cohen et al. | Directional selectivity in panoramic and pantophonic interfaces: Flashdark, Narrowcasting for Stereoscopic Photospherical Cinemagraphy, Akabeko Ensemble | |
| JP2002251637A (en) | Dynamic cell management method in 3D shared virtual space communication service and 3D shared virtual space communication system | |
| US20230334751A1 (en) | System and method for virtual events platform | |
| US20230401076A1 (en) | Dynamic input interaction | |
| JP2024073794A (en) | VIRTUAL SPACE GENERATION DEVICE, VIRTUAL SPACE GENERATION PROGRAM, AND VIRTUAL SPACE GENERATION METHOD | |
| West et al. | Designing a VR Arena: Integrating Virtual Environments and Physical Spaces for Social Sensorial Data-Driven Experiences | |
| CN117221641A (en) | Virtual interaction method, device, equipment and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20813970; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20813970; Country of ref document: EP; Kind code of ref document: A1 |