WO2025189176A1 - System and method for visual information arrangements in a user-configurable format - Google Patents
System and method for visual information arrangements in a user-configurable format
- Publication number
- WO2025189176A1 (PCT/US2025/019076)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- window
- video
- updated
- degree video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- Apparatuses and methods consistent with example embodiments relate to virtual reality
- VR virtual reality
- Command center operators are required to maintain situational awareness of a large array of media sources simultaneously.
- Traditionally command centers have utilized large arrays of displays to allow the simultaneous viewing of information across different media sources including but not limited to dashboards, video feeds, camera feeds, sensor visualizations, and data visualizations. Effective performance of the tasks required of command center operators has depended to a large extent upon the operator’s physical presence within the command center to allow viewing of a large array of data sources simultaneously.
- a presentation system comprises: a wireless receiver configured to wirelessly receive, from a remote unit, 360 degree video data and global positioning data; a display unit; and a user-end unit operatively connected to the wireless receiver and to the display unit, the user-end unit comprising a non-transitory storage medium storing instructions and a processor configured to execute the instructions and thereby: render a 360 degree video from a point of view based on the 360 degree video data; create a user interface (UI) comprising a plurality of windows based on the 360 degree video data and the global positioning data, the plurality of windows comprising: a video window displaying the 360 degree video from the point of view, and at least one overlay window comprising a menu window displaying an interactive settings menu comprising a user-controllable indication of the point of view, and the global positioning data; control the display unit to display the UI; receive user input comprising a change of the indication of the point of view; render an updated 360 degree video from an updated point of view based on the 360 degree video data and the user input; and control the display unit to display an updated video window comprising the updated 360 degree video.
- each of the at least one overlay window is an overlay displayed over the video window.
- the at least one overlay window may further comprise a map window displaying a map and an indication on the map of a location of the remote unit; and control of the display unit to display the UI may comprise controlling the display unit to display, in the map window, an automatically updated map and an automatically updated indication based on the global positioning data.
- the processor may be further configured to: receive user input comprising an instruction to change one of a size and a position of one of the at least one overlay windows to one of a new size and a new position; and control the display unit to display an updated UI displaying the one of the at least one overlay windows in the one of the new size and the new position.
- the processor may be further configured to receive user input comprising an instruction to change display of the map in the map window; and control the display unit to display an updated UI displaying a changed display of the map in the map window.
- a method of data presentation comprising: wirelessly receiving 360 degree video data and global positioning data from a remote unit; rendering a 360 degree video from a point of view based on the 360 degree video data; creating a user interface (UI) comprising a plurality of windows based on the 360 degree video data and the global positioning data, the plurality of windows comprising: a video window displaying the 360 degree video from the point of view, and at least one overlay window comprising a menu window displaying an interactive settings menu comprising a user-controllable indication of the point of view, and the global positioning data; controlling a display unit to display the UI; receiving user input comprising a change of the indication of the point of view; rendering an updated 360 degree video from an updated point of view based on the 360 degree video data and the user input; controlling the display unit to display an updated video window comprising the updated 360 degree view.
- UI user interface
- the creating the UI may comprise creating the UI such that each of the at least one overlay window is an overlay displayed over the video window.
- the at least one overlay window may further comprise a map window displaying a map and an indication on the map of a location of the remote unit; and the controlling the display unit to display the UI may comprise controlling the display unit to display, in the map window, an automatically updated map and an automatically updated indication based on the global positioning data.
- the method may further comprise receiving user input comprising an instruction to change one of a size and a position of one of the at least one overlay windows to one of a new size and a new position; and controlling the display unit to display an updated UI displaying the one of the at least one overlay windows in the one of the new size and the new position.
- the method may further comprise receiving user input comprising an instruction to change display of the map in the map window; and controlling the display unit to display an updated UI displaying a changed display of the map in the map window.
- a presentation and monitoring system comprising: a wireless transceiver configured to wirelessly communicate with a remote unit comprising a camera; a display unit; and a user-end unit operatively connected to the wireless transceiver and to the display unit, the user-end unit comprising a non-transitory storage medium storing instructions and a processor configured to execute the instructions and thereby: receive, via the wireless transceiver, a data stream of 360 degree video data from the camera; render a 360 degree video based on the 360 degree video data; create a user interface (UI) comprising a plurality of windows based on the 360 degree video data and the global positioning data, the plurality of windows comprising: a video window displaying the 360 degree video from the point of view, and at least one overlay window comprising a status menu displaying an internet protocol (IP) address of the camera and a status of one of the camera and a wireless connection between the wireless transceiver and the camera; control the display unit to display the UI; transmit to the camera, via the wireless transceiver, a status request; determine whether a response to the status request is received; determine an updated status based on one of a received response to the status request and a determination of no response received; create an updated UI based on the updated status; and control the display unit to display the updated UI.
- the processor may be configured to create the UI such that each of the at least one overlay window is an overlay displayed over the video window.
- the status may comprise one of a ping status and a latency of a wireless transmission.
- the processor may be further configured to, in response to a change of the status: perform a response operation comprising at least one of: initiating a reconnection with the camera; transmitting an instruction for reconfiguration to the camera; throttling, buffering, pausing, or disabling rendering of the 360 degree video; switching to a mixed reality mode; increasing a bitrate; changing a transmission format.
- a presentation and monitoring method comprising: wirelessly communicating with a remote unit comprising a camera; wirelessly receiving a data stream of 360 degree video data from the camera; rendering a 360 degree video based on the 360 degree video data; creating a user interface (UI) comprising a plurality of windows based on the 360 degree video data and the global positioning data, the plurality of windows comprising: a video window displaying the 360 degree video from the point of view, and at least one overlay window comprising a status menu displaying an internet protocol (IP) address of the camera and a status of one of the camera and a wireless connection between the wireless transceiver and the camera; controlling a display unit to display the UI; transmitting to the camera, via the wireless transceiver, a status request; determining whether a response to the status request is received; determining an updated status based on one of a received response to the status request and a determination of no response received; creating an updated UI based on the updated status; and controlling the display unit to display the updated UI.
- IP internet protocol
- the creating the UI may comprise creating the UI such that each of the at least one overlay window is an overlay displayed over the video window.
- the status may comprise one of a ping status and a latency of a wireless transmission.
- the method may further comprise: in response to a change of the status, performing a response operation comprising at least one of: initiating a reconnection with the camera; transmitting an instruction for reconfiguration to the camera; throttling, buffering, pausing, or disabling rendering of the 360 degree video; switching to a mixed reality mode; increasing a bitrate; changing a transmission format.
- FIG. 2 is a block diagram of the required communication flows for the control of audiovisual hardware utilizing software capable of rendering a video wall within a head-mounted display;
- FIGS. 3A, 3B, and 3C, and 3A1, 3A2, 3B1, 3B2, 3C1, and 3C2 are diagrammatic block and flow diagram illustrations of various components according to exemplary uses of the disclosed systems and methods;
- FIG. 4 is a diagrammatic illustration of a head-mounted display, host computer, control devices and sensors typical of Virtual Reality Systems that are capable of being utilized with exemplary implementations of exemplary embodiments of disclosed systems and methods;
- FIGS. 5A, 5B, 5C, and 5D are diagrammatic illustrations of elements of examples of user interfaces according to exemplary implementations of exemplary embodiments of disclosed systems and methods;
- FIG. 6A is a diagrammatic illustration of a VR or MR display, tracking and input system capable of being utilized with, or deploying, exemplary implementations of exemplary embodiments of disclosed systems and methods;
- FIGS. 6B and 6C are illustrative examples of various components capable of being utilized in exemplary implementations of exemplary embodiments of disclosed systems and methods;
- FIGs. 7 and 8 show examples of screen captures of a display unit of a UI unit according to one or more example embodiments.
- FIG. 9A is a schematic illustration of a system according to an example embodiment.
- FIG. 10 illustrates a scene representing a system according to one or more example embodiments.
- FIGs. 11, 12, 13, and 14 show examples of screen captures of a display unit of a UI unit according to one or more example embodiments.
- FIG. 15 is a flow diagram of a presentation method according to one or more example embodiments.
- FIG. 16 is a flow diagram of a presentation and monitoring method according to one or more example embodiments.
- FIG. 17 shows an example of a screen capture of a display unit of a UI unit according to one or more example embodiments.
- example embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein.
- the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
- the terms such as “unit,” “-er (-or),” and “module” described in the specification refer to an element for performing at least one function or operation, and may be implemented in hardware, software, or a combination of hardware and software.
- the embodiments described herein relate to utilization of a head mounted display and virtual reality engine to decode and render video streams, transmit control messages to enable control of remote headend hardware and store/recall preset layouts either in synchronization with a physical video wall or as an extension of a physical video wall.
- Such implementations are not practical for secure command center operations.
- display devices used for the viewing of secure content must not have software packages installed that enable them to access, transmit or store sensitive data. Further display devices must be physically separated from networks with connected devices that host sensitive information.
- Current VR implementations that allow screen sharing and video playout fail to provide a secure method to enable integration with traditional command center infrastructure based on technical requirements for access to files on the host or access to networks containing sensitive information.
- the current industry accepted secure command center video display implementation relies on hardware architectures consistent with the exemplary diagram shown in FIG. 1, inclusive only of devices 101, 102, 103, 104, 105, 106, 107, 108, 109 and 115. Such architectures, by design, mitigate the possibility of data spillage.
- Any VR/AR solution implemented in a secure environment must act as a display device that maintains logical and physical separation from networks or hosts containing sensitive information and must not store the video images it displays.
- the embodiments described herein approach these problems from a different perspective. Instead of utilizing the host computer and HMD as a repository for content or a host client that accesses data from a server, the system is utilized exclusively for content playout of real time streams of video, in the same manner that a stateless display device displays content but neither stores nor accesses the source of the content.
- exemplary embodiments of the disclosed system and methods allow users of the system to receive and view video streams from computers and other video sources while being physically disconnected from the network that the source computers and devices generating that video are connected to, providing an enhanced level of segmentation and security.
- Exemplary embodiments described herein provide technologies and techniques for using a 3D engine and VR/AR capable HMD to reproduce and arrange video images and process control messages to and from the hardware devices typically controlled by an AV Control System processor, such as Video Teleconferencing Hardware, Pan-Tilt-Zoom robotic camera systems, Video Switching infrastructure, Video Processing hardware, Lighting systems, Building management systems, displays, and any other device with an API, relay control, GPIO, and logic
- systems and methods are provided for the implementation of various configurations and use cases which may be implemented utilizing the disclosed methods and system.
- Examples of such implementations include the provisioning of multiple VR/AR HMD systems that share the same unicast streams, providing mirroring of content across all media streaming textures within the 3D engine; multiple VR/AR HMD systems that each receive their own unique unicast video streams, allowing individual unique content to be displayed in each HMD; and variations of the system where the VR/AR HMD system is either collocated in the command center where the video switching and streaming encoders are installed, or where the VR/AR HMD is remotely located and connected via a secure encrypted VPN, GRE tunnel, or other TCP/IP protocol.
- Exemplary embodiments described herein further can include systems and methods for generating a virtual reality environment, wherein the virtual reality environment contains one or more three-dimensional virtual controls; providing the virtual reality environment for display, through a virtual reality device; obtaining input through manipulation of the one or more virtual controls, wherein the manipulation is made using the virtual reality device and the manipulation follows at least one movement pattern associated with the one or more virtual controls; determining, based on the input, content of the one or more virtual controls, and the movement pattern associated with the manipulation, changes to the virtual environment wherein the changes are reflective of the manipulation of the one or more virtual controls; and providing the changes to the virtual reality device for display.
- FIG. 1 is a block diagram of an exemplary system according to exemplary embodiments including devices/components/modules/functions 101, 102, 103, 104, 105, 106, 107, 108, 109, 110 and 115 representative of those included in a command center AV architecture to which the disclosed system can be attached to enable extension of the display wall
- the architecture can be scaled to support an unlimited number of input devices (101, 102, 103, 104, 105, 106) and an unlimited number of output devices (115) depending on the requirements of the system, where modules 101 are COTS personal computers with graphics cards installed (typically HDMI or USB outputs would be available), and module 102 is a COTS USB KVM transmitter that connects to a matrix switching headend. Module 101 would typically be connected to module 102 using an HDMI or DisplayPort connection, and module 102 to the Network Switch or Video Matrix Switcher 107 using multimode or single mode fiber or CATx cabling dependent on the application.
- Connection between 110 and 112 can be any video format that shares compatibility between 110 and 112.
- Common implementations will include HDMI, 12G SDI, Display Port and DVI for connection between 110 and 112.
- the output of 110 and the input of 112 may be configured such that the maximum resolution per IP encoder is delivered from the video wall processor 110 to the IP Encoder 112. This can enable the maximum amount of video information to be transmitted to the VR/AR engine per stream.
- IP Video Encoders 112 can be connected to a COTS network switch device 114 using standard Ethernet protocol
- Device 114 can be either a single switch or a LAN composed of multiple switches, routers, servers and hosts as required.
- a device 113, a hardware encryption device or VPN appliance, may be inserted.
- a decryption device or VPN device 113 would be inserted at the point of ingress back to a physically and logically secured LAN environment. Connection from 113 at the ingress point would then be typically connected to the network switch 114 for distribution of data inside the LAN.
- the principle of this portion of the system is that hardware encryption devices 113 can be implemented as a part of the system where the signals encoded by 112 require encryption and transmission in a secure manner.
- arrangement and positioning of individual source video content from within the pixel space of the video encoding device 112 can be managed by the video processing external hardware device 110.
- individual input sources 101, 103, 105 and 106 can be arranged within a single video stream based on the windowing configuration applied in device 110.
- the result of the arrangement of input sources by device 110 can be observed in FIG. 5B where individual media streaming textures can contain one image or many images.
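- As a purely illustrative sketch of the windowing described above, the following Python fragment computes one rectangle per input source inside the single frame delivered to an IP encoder; the near-square grid layout and the 4k canvas are assumptions for illustration, not part of the disclosure.

```python
import math

def tile_sources(num_sources: int, width: int = 3840,
                 height: int = 2160) -> list[tuple[int, int, int, int]]:
    """Return one (x, y, w, h) rectangle per source, tiling the frame
    delivered to a single IP encoder in a near-square grid."""
    cols = math.ceil(math.sqrt(num_sources))
    rows = math.ceil(num_sources / cols)
    w, h = width // cols, height // rows
    return [((i % cols) * w, (i // cols) * h, w, h)
            for i in range(num_sources)]

# Example: arrange four input sources (e.g. 101, 103, 105, 106) in a 2x2 grid.
rects = tile_sources(4)
```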
- a preset window launch button 403 can also be provided such that, for example, pressing control button 403 causes the preset control menu 505 to show and hide within the environment.
- a magnification window launch button 404 can also be provided such that pressing the button 404 will pop up magnification window 501.
- a source selection page popup button 405 can also be provided such that pressing button 405 will pop up source selection control menu 505.
- a rotary selector 406 can be controlled, for example, by placing the user's thumb over the rotary selection button on a controller.
- FIGS. 5A-5D are screen captures showing exemplary implementations of embodiments of the disclosed system user interface as viewed in a VR headset as device 375 depicted in a non-limiting example of FIG. 6B.
- a magnification window can be conceptualized as a display in the virtual environment
- the magnification window 501 is a virtual object in the 3D environment with a media streaming texture applied to the object.
- the media streaming texture is connected logically in software, for example to an ffmpeg software plugin as set forth in the example of Appendix C.
- the plugin has a stream url field that can be user defined.
- When a stream URL is present at the defined address, the magnification window will then display video on the media streaming texture.
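- A minimal sketch of this media streaming texture pattern, assuming OpenCV's ffmpeg backend in place of the ffmpeg plugin of Appendix C (which is not reproduced here); the stream URL and the texture-upload call are illustrative placeholders.

```python
import time
import cv2

STREAM_URL = "rtsp://192.0.2.10:554/stream1"  # hypothetical user-defined URL

def upload_to_texture(frame) -> None:
    """Placeholder for the 3D engine call that applies a decoded frame
    to the magnification window's media streaming texture."""
    pass

def run_media_streaming_texture() -> None:
    while True:
        cap = cv2.VideoCapture(STREAM_URL, cv2.CAP_FFMPEG)
        if not cap.isOpened():
            time.sleep(1.0)   # no stream at the defined address yet; retry
            continue
        while True:
            ok, frame = cap.read()
            if not ok:
                break          # stream dropped; reconnect
            upload_to_texture(frame)
        cap.release()
```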
- a positioning control 502 for the magnification window can be provided.
- the magnification window can be positioned in the 3D environment by selecting the positioning control bar 502 and dragging the window in the 3D space. Controls have been enabled for X, Y, Z positioning of the magnification window in the coordinate plane of the 3D environment.
- a close window button 503 can be provided which allows the magnification window to be closed, hiding it from view in the 3D environment.
- This menu contains sources configured via a web application hosted on COTS control processor 108.
- the sources shown on this UI element are representative of physical video inputs to the system as shown in 101, 103, 105 and 106.
- the naming and configuration of sources is executed via a web browser.
- the user is able to select a source as shown in 505 and route it to a destination as shown in 507.
- a command is sent from COTS PC 116 to processor 108 utilizing an API such as in the example of Appendix A, a physical video source 102 is switched at the video matrix switcher 107 to an input of the COTS video wall processor 110, and video outputs of the video wall processor 110 are physically connected to the IP video encoders 112 at the stream URLs defined by the user during system setup.
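- Appendix A is not reproduced here. As a purely hypothetical sketch of the command flow just described, assume a JSON-over-HTTP control API on control processor 108; the endpoint, port, and field names below are invented for illustration only.

```python
import requests

CONTROL_PROCESSOR = "http://192.0.2.20:8080"  # hypothetical address of 108

def route_source_to_display(source_id: int, display_id: int) -> bool:
    """Ask the control processor to switch a physical source to the
    matrix-switcher path feeding the chosen virtual display."""
    resp = requests.post(
        f"{CONTROL_PROCESSOR}/api/route",  # hypothetical endpoint
        json={"source": source_id, "destination": display_id},
        timeout=2.0,
    )
    return resp.status_code == 200

# Example: route the source feeding 102 to virtual display 507.
# route_source_to_display(102, 507)
```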
- IP video streams are decoded and displayed in the 3D environment on the media streaming texture.
- Source selection buttons are representative of physical video sources connected to the system. By selecting a 505 source and then selecting a virtual display 507, API commands are sent to the hardware devices as shown in FIG. 2 such that physical video routes are executed, resulting in the video being displayed in the virtual display 507.
- a preset button 506 can be provided. Presets are storable and recallable configuration methods by which a user can store all information related to source/destination routing, video wall processor settings, and X, Y, Z positioning of windows in the 3D environment. A preset is stored by clicking and holding for 3 seconds on a typical 506 type button.
- a virtual display 507 with media streaming texture applied thereto can be provided.
- the virtual display is an object in the 3D environment.
- the media streaming texture is a software component that enables playout of video as a texture applied to an object.
- FIG. 5D illustrates an exemplary image 508 of the application of a video wall processor for windowing of display sources within a single stream according to exemplary implementations of various embodiments of the present disclosure.
- a windowing control button 509 can be provided that allows control of the video wall processor 110 associated with the virtual display 507 with media streaming texture applied.
- a command can be transmitted as depicted in FIG. 2 which results in a modification of the tiling of video sources at the output of the 110 video wall processor. The result is a change in the arrangement of sources shown on a virtual display 507.
- FIG. 5B illustrates an exemplary image 510 of the specialty controls available for sources defined as compatible with robotic pan tilt zoom control parameters for cameras according to exemplary implementations of various embodiments of the present disclosure.
- When a camera source is routed to a virtual display 507, controls are displayed to enable the user to send PTZ control messages to a type 105 device capable of robotic or cropping-based PTZ control.
- FIG. 5B further illustrates an exemplary image 511 of the specialty control open/close parameter to enable PTZ control buttons to be displayed or hidden over the virtual display 507 media streaming texture of a compatible routed source according to exemplary implementations of various embodiments of the present disclosure.
- FIG. 6A is a diagrammatic illustration of an example of a virtual reality (VR) or mixed reality (MR) display, tracking and input system including a PC 116 driving an AR/VR HMD.
- VR virtual reality
- MR mixed reality
- 3C1, and 3C2 of Pub. No. US 2018/0082477 A1 illustrate components of a VR system of the type that can be used complementarily with, or improved by, exemplary embodiments described in this disclosure.
- APPENDIX A provides an exemplary API Reference document demonstrative of the required control command set and possible syntax protocols between the VR/AR Command center application and the connected hardware control server.
- APPENDIX B provides an exemplary source code for control system communication with VR/AR Command Center application as well as external hardware that can be controlled via the VR/AR Command Center user from within the AR/VR Command Center application
- Related art VR systems typically utilize software on a host computer to store, process, and play out video content in a 3D engine and HMD.
- An example of this architecture is the display of a locally stored MP4 video on a video streaming texture within the 3D environment.
- the 3D engine accesses a file on the host computer and renders it in the 3D engine to be viewed in the HMD.
- Another method for the display of video within a 3D engine from a remotely hosted content source requires the transfer of data from a server to the host where it is locally buffered and played out in the 3D engine.
- a host computer and, in some implementations a server, must be loaded with software that is allowed access to the display drivers and file structures of a computer which might contain sensitive information.
- Such implementations are not practical for secure command center operations.
- display devices used for the viewing of secure content must not have software packages installed that enable them to access, transmit or store sensitive data. Further display devices must be physically separated from networks with connected devices that host sensitive information.
- HMDs that display a user’s surroundings via optical or video see-through camera devices.
- one or more example embodiments may enable blending of a physical display with remote information from 360 degree cameras and/or Internet of Things (IoT) sensors.
- IoT Internet of Things
- One or more example embodiments may enable one or more of various features and/or functions on a remote vehicle (i.e., an unmanned craft), including, but not limited to: high fidelity, low latency 360 degree video; wireless video reliability and monitoring tools; Global Positioning System (GPS) data
- GPS Global Positioning System
- HUD heads-up display
- IoT data such as battery and/or fuel level or other internal conditions of the unmanned craft, motor speed, etc.
- One or more example embodiments described herein may bring one or more of these informational elements into a three dimensional (3D) VR environment.
- One or more example embodiments may include a HMD enabling a user to have a small logistic and spatial footprint while the 3D nature of the data and display convey to the user an understanding of spatial relationships and situational awareness from the perspective of an unmanned craft.
- One or more example embodiments described herein may provide a 360 degree video enabling a user to have awareness of the immediate vantage point of an unmanned craft.
- one or more example embodiments may enable a user to observe aft of the craft and to thereby verify, for example, deployment of a sensor or ordnance that is ejected from the craft, and to then immediately view in a forward direction to thereby ascertain threats and/or obstacles that may be in the way of the craft.
- these functions may be achieved without moving mechanical parts or the latency associated with joystick or other mechanical maneuvering of cameras.
- a default perspective may provide fore, aft, starboard, and port perspectives or some blend thereof.
- a display option may additionally include a bearing or heading compass to thereby enable a user to maintain cardinal direction awareness.
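- As an illustrative sketch of how such perspectives can be realized, the default fore/aft/starboard/port views amount to yaw offsets applied before sampling the equirectangular 360 degree frame; the frame dimensions and preset table below are assumptions.

```python
FRAME_W, FRAME_H = 3840, 1920  # example 4k equirectangular frame

# Default perspectives expressed as yaw offsets, in degrees.
PRESET_YAW = {"fore": 0.0, "starboard": 90.0, "aft": 180.0, "port": 270.0}

def direction_to_pixel(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Map a view direction (yaw 0-360, pitch -90..90) to the pixel at
    the center of that view in the equirectangular frame."""
    u = (yaw_deg % 360.0) / 360.0      # longitude -> horizontal position
    v = (90.0 - pitch_deg) / 180.0     # latitude  -> vertical position
    return int(u * FRAME_W) % FRAME_W, min(int(v * FRAME_H), FRAME_H - 1)

# Example: look aft with 10 degrees of upward pitch.
x, y = direction_to_pixel(PRESET_YAW["aft"], 10.0)
```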
- a system may enable vehicle locations in the format of GPS coordinates to be referenced to third party mapping systems. As shown in FIG. 8, a current location of a vehicle can be shown while a 360 degree video is being displayed, and the map location of the vehicle may be displayed in an overlay that is capable of being manipulated by a user.
- the overlay may be delivered as a Bing Map (Microsoft) or ArcGIS Map (ESRI), but may alternately be configured for any map or space including user supplied point cloud data, for example.
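- As a hedged sketch of referencing GPS coordinates to such mapping systems, the standard Web Mercator ("slippy map") projection converts a vehicle's latitude/longitude to overlay pixel coordinates; the zoom level and tile size below are illustrative.

```python
import math

def latlon_to_pixel(lat: float, lon: float, zoom: int,
                    tile_size: int = 256) -> tuple[float, float]:
    """Convert WGS84 lat/lon to global pixel coordinates at a zoom level."""
    world = tile_size * (2 ** zoom)  # full map width in pixels
    x = (lon + 180.0) / 360.0 * world
    lat_rad = math.radians(lat)
    y = (1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad))
         / math.pi) / 2.0 * world
    return x, y

# Example: position a map pin for a vehicle reported at 38.8895N, 77.0353W.
px, py = latlon_to_pixel(38.8895, -77.0353, zoom=15)
```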
- a system may provide a display of vehicles on a three-axis map, including coordinates and elevation, thereby enabling a user to see vehicles in relation to one another and the landscape.
- One or more example implementations may facilitate a user’s understanding of relationships between and among vehicles and topography in a manner that is both quick and intuitive.
- One or more example embodiments described herein provide methods and systems for facilitating large scale arrangements of visual information in a user-configurable format within a head mounted display (HMD).
- HMD head mounted display
- An HMD may be referred to herein by the descriptive, non-limiting term “Headwall” for purposes of clarity and conciseness.
- One or more example embodiments described herein may relate to utilization of an HMD and VR engine to decode and render video streams, transmit control messages to enable control of remote headend hardware, and store/recall preset layouts either in synchronization with a physical video wall or as an extension of a physical video wall.
- FIG. 9A is a schematic illustration of a system according to an example embodiment.
- the system 500 includes a remote unit 200 in wireless communication with a user-end unit 300 connected to a user interface (UI) unit 450.
- FIG. 9A illustrates a single remote unit 200, a single user-end unit 300, and a single UI unit 450.
- the architecture of the system 500 may be scaled to support multiple remote units 200, multiple user-end units 300, and multiple UI units 450, and may include any one or more of devices 101, 102, 103, 104, 105, 106, 107, 108, 109, and 115, as discussed with respect to FIG. 1.
- FIG. 9B is a schematic illustration of another system according to an example embodiment.
- Remote unit - The remote unit 200 may be, for example, but is not limited to, a remote vehicle or other craft and may include one or more of a camera 210, a GPS unit 220, one or more other sensors 230, and a wireless transmitter 240.
- the camera 210, GPS unit 220, and one or more other sensors 230 are operatively connected to the wireless transmitter 240 configured to transmit data to the user-end unit 300.
- the camera 210 of the remote unit 200 may be any commercially available camera configured to capture images and therefrom create a 360 degree field of view.
- the camera 210 may have 4k resolution in an equirectangular frame, for example, and may obtain video images from about 15 to about 60 frames per second.
- these features of the camera 210 are merely examples and the camera may have any configuration as would be understood by one of skill in the art.
- An encoder included, for example, within the camera 210 or as an external hardware device may be configured to encode video from the camera 210 into a standards-based format such as, but not limited to, RTSP, RTP, or RTMP, and may output a 4k stream using, for example, UDP, TCP, or HLS depending on system requirements. It should be noted that certain 360 degree cameras may not support RTSP. RTMP may be used, for example, between about 10-25 Mbps and may provide the least latency for certain cameras, for example, the Instapro camera. The 4k stream may be wirelessly output from the wireless transmitter 240.
- GPS - The GPS unit 220 of the remote unit 200 may be a GPS beacon.
- Wireless receiver - A wireless receiver 550 is operatively connected to a user-end unit 300.
- the wireless receiver 550 is configured to receive the 4k stream transmitted from the remote unit 200 and to transmit the stream to the user-end unit 300, for example to a decoder therewithin.
- the wireless receiver 550 may receive data directly via the GPS unit 220 of the remote unit 200, or may receive data via an intermediary server using, for example, textual requests and responses over a socket.
- a request for GPS data can first be written to a designated socket on a service daemon on a GPS device.
- Time-Position-Velocity (TPV) reports may then be read from the GPS device/server at every measurement epoch.
- Responses can be in the format of, for example, but not limited to, GPSD responses built on top of JavaScript Object Notation (JSON).
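- A minimal sketch of this request/response pattern, assuming the gpsd service daemon's standard JSON protocol (a WATCH request written to the daemon's socket, then newline-delimited reports filtered for the TPV class); host and port are gpsd defaults, and the surrounding application code is omitted.

```python
import json
import socket

def watch_tpv(host: str = "127.0.0.1", port: int = 2947):
    """Yield TPV reports from a gpsd service daemon as they arrive."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        # Request streaming JSON reports from the daemon's socket.
        sock.sendall(b'?WATCH={"enable":true,"json":true}\n')
        buf = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:
                break  # daemon closed the connection
            buf += chunk
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                try:
                    report = json.loads(line)
                except json.JSONDecodeError:
                    continue
                if report.get("class") == "TPV":
                    yield report  # lat/lon/alt/speed per measurement epoch

# Example: print positions as they arrive.
# for tpv in watch_tpv():
#     print(tpv.get("lat"), tpv.get("lon"))
```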
- the wireless receiver 550 may likewise receive data directly from the one or more sensors 230 via the wireless transmitter 240 or may receive sensor data via an intermediary server using, for example, textual requests and responses over a socket.
- the user-end unit 300 may be referred to as a Topaz unit and may include the wireless receiver 550 configured to receive data from the remote unit 200.
- the wireless receiver 550 may be external to and operatively connected to the Topaz unit 300.
- the Topaz unit 300 may include a decoder and a rendering engine and may be operatively connected to the UI unit 450.
- the Topaz unit 300 may be configured to operate, for example, Windows 10 Pro or Windows 11; may utilize LAV filters, for example ffmpeg-based DirectShow Splitters and Decoders; and may utilize Varjo
- the Topaz unit 300 may be embodied by a controller.
- the controller can be implemented, at least in part, in digital electronic circuitry, analog electronic circuitry, or in computer hardware, firmware, software, or a combination thereof.
- These components can be implemented, for example, as a computer program product such as a computer program, program code or computer instructions tangibly embodied in an information carrier, or in a machine-readable storage device, for execution by, or to control the operation of, data processing apparatus such as a programmable processor, a computer, or multiple computers.
- the computer program product may comprise a computer-readable media storing program instructions, executable by a processor, to implement various operations.
- the media may include, alone or in combination with the program instructions, data files, data structures, and the like.
- the media and program instructions may be specially designed and constructed for the purposes of example embodiments described herein, or they may be of a kind well-known and available to those of skill in the art.
- Examples of computer-readable media include, but are not limited to: magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVD; magneto-optical media such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like.
- the media may also include a transmission medium such as one or more of optical lines, electrically- conductive lines, and wave guides.
- Examples of program instructions include, but are not limited to: machine code, such as produced by a compiler, and files containing higher level code that may be executed by the controller using an interpreter.
- Hardware devices described herein may be configured to act as one or more software modules in order to perform the operations of one or more example embodiments described herein.
- a computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- a computer program can be deployed to be executed on one computer or other device or on multiple device at one site or distributed across multiple sites and interconnected by a communication network.
- functional programs, codes, and code segments for accomplishing features described herein can be easily developed by programmers skilled in the art.
- Operations associated with the example embodiments can be performed by one or more programmable processors executing a computer program, code or instructions to perform functions (e.g., by operating on input data and/or generating an output). Operations can also be performed by, and apparatuses described herein can be implemented as, special purpose logic circuitry, e.g., a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC), for example.
- FPGA field programmable gate array
- ASIC application-specific integrated circuit
- DSP digital signal processor
- a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- a processor may also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- a general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
- a processor will receive instructions and data from a read-only memory or a random access memory or both.
- the essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data.
- a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic disks, magneto-optical disks, or optical disks.
- Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example, semiconductor memory devices, e.g., electrically programmable read-only memory (EPROM).
- Computer-readable non-transitory media includes all types of computer readable media, including magnetic storage media, optical storage media, flash media and solid state storage media. It should be understood that software can be installed in and sold with a central processing unit (CPU) device.
- CPU central processing unit
- software can be obtained and loaded into the CPU device, including obtaining the software through physical medium or distribution system, including, for example, from a server owned by the software creator or from a server not owned but used by the software creator.
- the software can be stored on a server for distribution over the Internet.
- NFC Near-Field Communication
- the Topaz unit 300 may read text and/or JSON data from one or more sockets on sensor events, for example every measurement epoch, or may request data from a REST/SOAP server.
- the text and/or JSON responses may be parsed to obtain measurement and/or other status data and may be used to update data of a state of one or more virtual objects in the user’s environment as provided by the UI.
- GPS and or sensor requests may occur automatically by default, may be manually initiated, or may be configured on-the-fly.
- Topaz 300 may include a Decoder and a Rendering engine.
- Monitoring system - The Topaz unit 300 may include a monitoring system configured to check the system health of the remote unit 200 including the camera 210, GPS unit 220, and other sensor elements 230. Topaz 300 may also be configured to determine a health of the wireless data stream from the remote unit 200, for example, to obtain a ping status and/or a latency of the wireless transmission. The health of the remote unit 200 and of the wireless transmission may each be obtained automatically according to a fixed or adaptive schedule or may be manually initiated.
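- A minimal sketch of such a monitoring loop, assuming a TCP connect round-trip as a stand-in for a ping; the target addresses, ports, and polling schedule are illustrative assumptions.

```python
import socket
import time

# Hypothetical addresses of the remote unit's camera and GPS daemon.
TARGETS = {"camera": ("192.0.2.30", 554), "gpsd": ("192.0.2.31", 2947)}

def probe(host: str, port: int, timeout: float = 1.0) -> float | None:
    """Return round-trip latency in seconds, or None if unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

def poll_health(interval: float = 5.0):
    """Yield a status dict per polling epoch, e.g. {"camera": 0.012,
    "gpsd": None}; None maps to an "Unreachable" indicator."""
    while True:
        yield {name: probe(*addr) for name, addr in TARGETS.items()}
        time.sleep(interval)
```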
- Topaz 300 can output an alert to a user via the UI unit 450.
- the alert may be in the form of one or more status indicators on the UI.
- Topaz 300 may control the UI unit 450 to enable control features for a user to manually initiate reconnection, or reconfigure camera and sensor services on the fly.
- one of various strategies may be selected, either automatically by the Topaz unit 300 or manually by a user via the UI unit 450.
- the strategies may include, for example, operations to optimize available bandwidth usage in consideration of the rendering overhead of a virtual environment provided by the UI unit 450 and a current priority.
- the operations may include, but are not limited to, for example: throttling, buffering, pausing, or disabling video rendering, and switching to mixed reality mode in a low bandwidth mode; increasing bitrate and/or changing a transmission format to thereby adapt to a high signal strength, or at a critical stage of a mission; and pausing sensor updates and visuals at a last known value while initiating reconnection attempts.
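- As an illustrative sketch only, the automatic selection among such strategies might be expressed as a small decision function over measured link conditions and mission priority; the thresholds and strategy names below are assumptions, not the disclosed logic.

```python
from enum import Enum, auto

class Strategy(Enum):
    INCREASE_BITRATE = auto()      # high signal strength / critical stage
    THROTTLE = auto()              # conserve constrained bandwidth
    MIXED_REALITY_LOW_BW = auto()  # disable video rendering, keep MR view
    PAUSE_AND_RECONNECT = auto()   # hold last known values, retry link

def choose_strategy(latency_s: float | None, signal_strength: float,
                    critical_stage: bool) -> Strategy:
    """Pick a response operation from link health and mission priority."""
    if latency_s is None:                    # link lost entirely
        return Strategy.PAUSE_AND_RECONNECT
    if signal_strength > 0.8 or critical_stage:
        return Strategy.INCREASE_BITRATE
    if latency_s > 0.5:                      # severe lag: shed video load
        return Strategy.MIXED_REALITY_LOW_BW
    return Strategy.THROTTLE
```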
- FIG. 10 illustrates a scene representing a system according to one or more example embodiments.
- the UI unit 450 may include a display unit and a user control unit
- the display unit and user control unit may be integrated, as in a headset device, or may be physically separate units operatively connected to each other and to the Topaz unit 300.
- the display unit may include any one or more of a headset device; a display of a hand- held unit such as a mobile phone or tablet; and a fixed display screen (see FIG. 10).
- the user control unit may include any one or more of a hand-tracking controller; mouse; gloves; sensors configured to determine hand and finger positions and gestures such as pinching, gripping, poking, and palm indications; joystick; keyboard; and any other user control unit as would be understood by one of skill in the art. Any one or more of the elements of the user control unit may be wired or wirelessly connected to the display unit and/or the Topaz unit 300.
- the UI unit 450 may support any of various modes of control and content viewing.
- Content and menus may be provided using video pass through cameras to superimpose content and menu features over a camera feed.
- the Topaz unit may control the UI unit to enable a user to: toggle between VR and augmented reality (AR) modes using video pass through; pop up a magnification window or any one or more of various menus; and use hand motions to perform virtual near or distant “grabs” to pick up, move, and/or scale slates.
- Topaz 300 may receive streaming video information and GPS data and may use the streaming video information and GPS data in combination to display video and/or a virtual environment following a GPS target, or to control the display unit.
- FIGs. 7, 8, 11 -14, and 17 show examples of screen captures of a display unit of a UI unit 450 according to one or more example embodiments.
- a Topaz unit 300 may control the UI unit 450 to enable monitoring of the health of streamed data via text and/or color indicators displayed to a user as “reachable” in conjunction with an indicator, for example, a green diamond, indicating the stream health of both video and sensor feeds.
- One or more display options may include, but are not limited to, an ability to pan, tilt, and yaw a spherical image to provide an optimal viewing perspective from the point of view of an unmanned craft. In other words, a default perspective may provide fore, aft, starboard, and port perspectives or some blend thereof.
- a display option may additionally include a bearing or heading compass to thereby enable a user to maintain cardinal direction awareness.
- a current location of a vehicle can be shown while a 360 degree video is being displayed, and the map location of the vehicle may be displayed in an overlay that is capable of being manipulated by a user.
- the overlay may be delivered as a Bing Map (Microsoft) or ArcGIS Map (ESRI), but may alternately be configured for any map or space including user supplied point cloud data, for example.
- the monitoring system of the Topaz unit 300 may control the UI unit 450 to display information of a status of any one or more elements of the remote unit 200. For example, as shown in FIG. 11, a Status tab (shown in the upper left) may be displayed showing an internet protocol (IP) address for a camera and GPSD daemon that Topaz 300 is trying to reach. The status may be shown as “Reachable” with a green or other color diamond or other indicator shape, or as “Unreachable” with a red or other color diamond or other indicator shape. A Settings tab may also be displayed for the user.
- IP internet protocol
- the Settings tab may include text input boxes, for example for Camera IP and GPS IP, and a user changing the GPS IP address in the text input box may automatically switch a map pin to the location of a Silvus radio or other GPS-enabled unit with that IP.
- Topaz 300 may be configured such that a user does not need to press any Enter key when updating the IP address.
- Topaz 300 may be configured such that when a user changes the camera IP address, however, the user may be required to manually click “Refresh Stream” on the Settings tab.
- the Topaz unit 300 may automatically perform up to a predetermined number of attempts, for example five attempts, to refresh and search for the stream. In such a case, the display may show, for example “Refreshing Stream (Attempt x/n).”
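- A minimal sketch of this bounded refresh behavior, assuming hypothetical open_stream() and show_status() callables standing in for the decoder and UI; the attempt limit mirrors the “Attempt x/n” display described above.

```python
import time

MAX_ATTEMPTS = 5  # the "predetermined number of attempts"

def refresh_stream(open_stream, show_status) -> bool:
    """Retry the video stream after an IP change, reporting progress."""
    for attempt in range(1, MAX_ATTEMPTS + 1):
        show_status(f"Refreshing Stream (Attempt {attempt}/{MAX_ATTEMPTS})")
        if open_stream():          # hypothetical decoder call
            show_status("Reachable")
            return True
        time.sleep(1.0)            # brief pause between attempts
    show_status("Unreachable")
    return False
```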
- Topaz 300 may be configured to control the UI unit such that a user can move XYZ sliders, use +/- buttons, or enter a desired integer value.
- Topaz 300 may be configured to control the UI unit 450 such that a user can set a height of the compass as displayed to be at a comfortable level (e.g., at eye-level or just above) depending on the user’s preference by using the “Height” slider (see FIG. 11).
- the user may also reset the bearing 000 to a current orientation of the user’s headset.
- Topaz 300 may be configured to control the UI unit 450 to provide an extended reality (XR) interface in which a displayed map forms a center of interaction for the user.
- the map may indicate a single map pin following the coordinates of a tracked asset with an associated 360 video stream.
- This asset can be configured by updating the GPS and Camera IP addresses.
- a button with a camera icon on the map pin can be pressed by the user, for example, via a hand interaction to toggle and display the 360 video feed from the asset around the camera.
- the UI unit 450 thus enables the user to experience an on-location real-time point-of-view of the camera.
- Topaz 300 may be configured to control the UI unit 450 to provide a palm menu and map interaction options for operational convenience.
- FIGS. 10, 11, and 12 show examples of screen captures of a display unit of a UI unit 450 according to one or more example embodiments.
- Topaz 300 may be configured to control the UI unit 450 to provide a palm menu.
- the user may hold out either hand with palm facing the user’s face to bring up a palm menu in XR.
- This functionality may provide the user with options such as, but not limited to: zooming the map in/out by one level per button press; recentering the XR space so the map is in front of the user; toggling the visibility of the map; and exiting one or more applications.
- Topaz 300 may enable the user to move or scale the displayed map using a transparent black bounding ring around it. For example, pinching the ring with a single hand and moving it may allow the user to move the map; pinching the ring with both hands and then pulling the hands apart or towards each other may allow the user to scale the map to be larger or smaller.
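- As an illustrative sketch of the two-hand scale gesture, the map's scale factor can be taken as the ratio of the current distance between the pinching hands to the distance when the pinch began; the hand-position tuples below are placeholders for tracked hand data.

```python
import math

class PinchScale:
    """Track a two-hand pinch on the bounding ring and derive map scale."""

    def __init__(self, initial_scale: float = 1.0):
        self.base_scale = initial_scale
        self.start_dist: float | None = None

    def begin(self, left_hand, right_hand) -> None:
        """Both hands pinched the ring: record the starting separation."""
        self.start_dist = math.dist(left_hand, right_hand)

    def update(self, left_hand, right_hand) -> float:
        """While both pinches are held, scale by the distance ratio."""
        if not self.start_dist:
            return self.base_scale
        return (self.base_scale
                * math.dist(left_hand, right_hand) / self.start_dist)

    def end(self, left_hand, right_hand) -> None:
        """Pinch released: commit the current scale."""
        self.base_scale = self.update(left_hand, right_hand)
        self.start_dist = None
```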
- Topaz 300 may enable the user to press the button with the camera icon on the map pin to thereby allow the user to toggle between a mixed-reality view and a 360 video stream.
- FIG. 15 is a flow diagram of a presentation method according to one or more example embodiments.
- a presentation method may include operations of: receiving wireless data 13-1; rendering a 360 degree video 13-2; creating and displaying a UI 13-3; automatically updating the UI 13-4 and/or receiving a user input 13-5 and updating the UI based on the user input 13-6; and displaying the updated UI 13-3.
- the wireless data may be received, via wireless receiver/transmitters, from a remote unit.
- the creation and display of the UI may include a user-end unit/Topaz unit creating the UI and controlling a display unit to display the UI according to any one or more of the example embodiments described herein.
- the user input may be received via a user interacting with a menu or other element of the displayed UI via a user input unit according to one or more example embodiments herein. For example, a user may enter new information or may manipulate a user input to create an instruction to change a window included in the UI.
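- A minimal sketch of the FIG. 15 flow, with receive/render/display/input supplied as placeholder callables for the components described above; the loop structure is an assumption consistent with operations 13-1 through 13-6.

```python
# Minimal sketch of the presentation loop of FIG. 15. The four callables
# are placeholders for the wireless receiver (13-1), the 360 degree
# renderer (13-2), the UI creation/display step (13-3/13-4), and the
# user input poll (13-5/13-6).
def presentation_loop(receive, render_360, display_ui, poll_user_input):
    point_of_view = None  # updated by user input (13-5)
    while True:
        data = receive()                          # 13-1: video + GPS data
        video = render_360(data, point_of_view)   # 13-2: render 360 video
        display_ui(video, data)                   # 13-3/13-4: show/update UI
        user_input = poll_user_input()            # 13-5: change of view?
        if user_input is not None:
            point_of_view = user_input            # 13-6: update on input
```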
- FIG. 16 is a flow diagram of a presentation and monitoring method according to one or more example embodiments.
- a presentation and monitoring method may include operations of: receiving wireless data 14-1; rendering a 360 degree video 14-2; creating and displaying a UI 14-3; and obtaining status information of, for example, one of a camera of a remote unit and a wireless connection with a camera or other element of the remote unit 14-4.
- the status information may be obtained by transmitting a status update request, such as a ping, to the camera or other element, receiving a response to the update request, determining that there is no response to the update request, or determining a latency of a transmission.
- the method may further include updating the UI based on the determined updated status 14-6.
- the method may return to displaying the UI 14-3 or may, if there is a change in the status 14-7 (YES), perform a response operation according to any one of the example embodiments discussed herein.
- the rendering and display of a 360 video in conjunction with the display of a UI with one or more overlying windows, menus, information, and the like enables a user to have unparalleled situational awareness, to make swift decisions, and to seamlessly integrate the user into existing command structures.
- controlling the display of a UI based on user input contributes to the immersive and holistic perspective of the operational environment provided to the user.
- Related art video systems installed on unmanned vehicles have been plagued by a critical limitation—tunnel vision. This narrow field of view constrains a user’s situational awareness and can be particularly challenging in the ever-changing and often hostile conditions faced by military personnel.
- a system may contribute to eradication of the limitations of tunnel vision, and contribute to enabling military commanders and operators to monitor potential threats from all angles, track multiple targets simultaneously, and respond proactively to evolving situations.
- a system may not only enhance security but also contribute to accelerating the decision-making processes — a vital component in the rapid and dynamic context of military operations.
- With 360-degree video according to one or more example systems described herein, commanders and operators can detect threats from any angle, monitor multiple points of interest simultaneously, and achieve a level of situational awareness that was once unattainable.
- 360-degree video according to one or more example systems described herein contributes to providing a user with an immersive and comprehensive view of their operational surroundings. This panoramic perspective allows them to monitor multiple unmanned vehicles simultaneously, detect threats from any direction, and make rapid and informed decisions.
- 360-degree video contributes to eliminating blind spots and ensuring that no detail escapes notice. Users can survey vast areas and monitor critical points of interest simultaneously.
- 360-degree video contributes to a user’s ability to respond swiftly to emerging threats or opportunities. With interactive tools and real-time data overlays, they can quickly assess situations, make informed decisions, and redirect assets as needed.
- provision of an updating status, including but not limited to a status of a remote camera and a status of a wireless connection with a remote camera, a remote GPS unit, and/or another remote unit, as well as the ability to perform operations in response to a change in the status, either automatically or manually, contributes to provision of a video delivery system that is optimized for unreliable network conditions and tactical environments, in which:
- bandwidth can be limited
- interference is commonplace
- network reliability is often compromised
- Monitoring and response operations provide an ability to adapt to fluctuating conditions, contributing to ensuring that military commanders and operators receive continuous, low-latency video feeds even when facing unreliable networks. They further contribute to an ability to prioritize video feeds efficiently within the military's tactical framework. This interoperability enhances the resilience and effectiveness of command and control operations, even in the most challenging and unpredictable environments.
- Monitoring and response operations provide an ability to automatically adjust performance to maintain a consistent, high-quality video stream, ensuring that commanders and operators receive uninterrupted, real-time visual information, and to enhance the resilience and effectiveness of command and control operations, even in bandwidth-constrained or interference-prone scenarios.
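- one plausible form of such automatic adjustment, sketched below under an assumed three-step quality ladder and assumed latency/loss thresholds (none of which are specified by the disclosure), steps the stream down as the measured link degrades:

```python
# Illustrative quality ladder; resolutions, bitrates, and thresholds are
# assumptions for this sketch, not values taken from the disclosure.
QUALITY_LADDER = [
    {"name": "high",   "resolution": "3840x1920", "bitrate_kbps": 20000},
    {"name": "medium", "resolution": "2560x1280", "bitrate_kbps": 8000},
    {"name": "low",    "resolution": "1280x640",  "bitrate_kbps": 2000},
]

def select_quality(latency_ms: float, loss_pct: float) -> dict:
    """Step down the ladder as measured latency or packet loss worsens,
    trading resolution and bitrate for a continuous, low-latency feed."""
    if latency_ms < 100.0 and loss_pct < 1.0:
        return QUALITY_LADDER[0]
    if latency_ms < 300.0 and loss_pct < 5.0:
        return QUALITY_LADDER[1]
    return QUALITY_LADDER[2]
```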
- integration of GPS data contributes to more precise tracking and coordination of unmanned vehicles. Whether guiding drones through complex airspace or maneuvering autonomous ground vehicles in challenging terrain, Topaz ensures that remote fleets operate with pinpoint accuracy.
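- as one hedged illustration of such integration, received global positioning data can be projected into the 360-degree video so that a remote unit appears as a marker at its true bearing; the sketch below computes the standard initial great-circle bearing from the viewer to a target and maps it to a horizontal pixel, assuming an equirectangular frame whose left edge corresponds to due north:

```python
import math

def bearing_deg(viewer_lat: float, viewer_lon: float,
                target_lat: float, target_lon: float) -> float:
    """Initial great-circle bearing from viewer to target, in degrees
    clockwise from north (0-360)."""
    phi1, phi2 = math.radians(viewer_lat), math.radians(target_lat)
    dlon = math.radians(target_lon - viewer_lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def marker_x(bearing: float, frame_width_px: int) -> int:
    """Horizontal pixel of a marker in an equirectangular 360-degree frame
    (assumes pixel 0 corresponds to bearing 0, i.e. north)."""
    return int(bearing / 360.0 * frame_width_px) % frame_width_px
```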
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Human Computer Interaction (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Presentation and monitoring systems and methods are provided for a user end unit in wireless communication with one or more remote units. The user end unit wirelessly receives data, including 360-degree video data and global positioning data, from a remote unit. The user end unit renders a 360-degree video, and a user interface (UI) including multiple windows is displayed such that one or more windows, whose sizes and positions can be changed, can be displayed as overlaid on a window displaying the 360-degree video. The user end unit may further monitor the status of one or more elements of the remote unit, including the 360-degree camera, and of the wireless communication between the user end unit and the remote unit, and status information and interactive options associated therewith can be displayed on one or more of the UI windows.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463563068P | 2024-03-08 | 2024-03-08 | |
| US63/563,068 | 2024-03-08 |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| WO2025189176A1 true WO2025189176A1 (fr) | 2025-09-12 |
| WO2025189176A8 WO2025189176A8 (fr) | 2025-10-02 |
| WO2025189176A9 WO2025189176A9 (fr) | 2025-10-30 |
Family
ID=96991659
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/019076 Pending WO2025189176A1 (fr) | 2024-03-08 | 2025-03-08 | System and method for arrangements of visual information in user-configurable format |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025189176A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160088259A1 (en) * | 2011-01-17 | 2016-03-24 | Eric C. Anderson | System and method for interactive internet video conferencing |
| WO2018027067A1 (fr) * | 2016-08-05 | 2018-02-08 | Pcms Holdings, Inc. | Procédés et systèmes pour vidéo panoramique avec diffusion en continu collaborative en direct |
| US20180150994A1 (en) * | 2016-11-30 | 2018-05-31 | Adcor Magnet Systems, Llc | System, method, and non-transitory computer-readable storage media for generating 3-dimensional video images |
Worldwide Applications (1)
- 2025-03-08 WO PCT/US2025/019076 patent/WO2025189176A1/fr active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025189176A8 (fr) | 2025-10-02 |
| WO2025189176A9 (fr) | 2025-10-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10380800B2 (en) | System and method for linking and interacting between augmented reality and virtual reality environments | |
| US20230111408A1 (en) | Techniques for capturing and rendering videos with simulated reality systems and for connecting services with service providers | |
| EP3596931B1 (fr) | Procédé et appareil de conditionnement et de diffusion en continu de contenu multimédia de réalité virtuelle | |
| US20140320529A1 (en) | View steering in a combined virtual augmented reality system | |
| US10274737B2 (en) | Selecting portions of vehicle-captured video to use for display | |
| US9420156B2 (en) | Imaging control system, control apparatus, control method, and storage medium | |
| US20020149617A1 (en) | Remote collaboration technology design and methodology | |
| AU2002305105B2 (en) | Remote collaboration technology design and methodology | |
| US8854415B2 (en) | Motion responsive video capture during a video conference | |
| US11900530B1 (en) | Multi-user data presentation in AR/VR | |
| US11924393B2 (en) | Shared viewing of video among multiple users | |
| US11647354B2 (en) | Method and apparatus for providing audio content in immersive reality | |
| EP3438935A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, programme | |
| US20180278995A1 (en) | Information processing apparatus, information processing method, and program | |
| KR102200115B1 (ko) | 다시점 360도 vr 컨텐츠 제공 시스템 | |
| AU2019271924A1 (en) | System and method for adjusting an image for a vehicle mounted camera | |
| WO2025189176A1 (fr) | System and method for arrangements of visual information in user-configurable format | |
| US20250001298A1 (en) | System and method for arrangements of visual information in user-configurable format | |
| WO2022030209A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations, et système de traitement d'informations | |
| WO2024143026A1 (fr) | Dispositif de manipulation d'objet virtuel, programme associé et système d'affichage d'objet virtuel | |
| JP2023067635A (ja) | 情報処理装置、情報処理方法及び情報処理プログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 25768905 Country of ref document: EP Kind code of ref document: A1 |