WO2025003379A1 - Usage of a mobile device or integrated device supporting and/or enabling motion synchronized experiences on a moving platform
- Publication number
- WO2025003379A1 (PCT/EP2024/068213)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- experience
- platform
- data
- portable device
- integrated device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- B60K35/233—Head-up displays [HUD] controlling the size or position in display areas of virtual images depending on the condition of the vehicle or the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- B60K35/235—Head-up displays [HUD] with means for detecting the driver's gaze direction or eye points
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/29—Instruments characterised by the way in which information is handled, e.g. showing information on plural displays or prioritising information according to driving conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/65—Instruments specially adapted for specific vehicle types or users, e.g. for left- or right-hand drive
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/80—Arrangements for controlling instruments
- B60K35/81—Arrangements for controlling instruments for controlling displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/167—Vehicle dynamics information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/31—Virtual images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/33—Illumination features
- B60K2360/347—Optical elements for superposition of display information
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/583—Data transfer between instruments
Definitions
- An XR headset as an example of an experience device can be designed as a head-mounted device that can display or present VR content (VR - virtual reality) and/or AR content (AR - augmented reality) and/or MR content (MR - mixed reality) to the user wearing the XR headset, in particular by providing a pose-aware and motion-synchronized output to the eyes of the user.
- XR is a subset of all spatial media output devices, in the following referred to as “experience devices”, that replace or augment real-world signals with pose-aware and motion-synchronized virtual signals, in the following referred to as “spatial content”.
- Spatial audio headphones or a 2D screen that functions as a portable camera or window into a virtual 3D space are other examples of experience devices and are equally taken into account.
- the experience device can comprise at least one sensor that provides a sensor signal that is correlated with a pose (position and/or spatial orientation/rotation) and/or motion and/or acceleration of the experience device in space / the surroundings.
- the at least one sensor can for example comprise at least one camera and/or an IMU (inertial measurement unit).
- a spatial media output device in a moving platform, in particular in a vehicle (like a passenger vehicle or a truck or a passenger bus or a motorbike) or a plane or a ship/boat, comes with the technical problem that the at least one sensor of the experience device will sense both the movements of the user inside the platform and the movements of the platform in the environment in the same way.
- the two types of position/movement need to be distinguished when presenting spatial content to the user via a pose- and motion-tracked experience device. Otherwise, when the platform moves in a curve with the user sitting still inside the platform, the spatial content will be changed in the same way as when the user turns the device inside the platform with world reference tracking. In the case of vehicle reference tracking, the motion relative to the world is unknown.
- the two types are preferably separated as information on the pose of the vehicle in relation to its environment / surroundings and information on the pose of the experience device in relation to the platform, as sketched below.
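To make the separation concrete, the following minimal sketch (an illustration, not taken from the disclosure) composes and decomposes the two pose types as homogeneous transforms: the device-in-world pose is the product of the platform-in-world pose and the device-in-platform pose, so the in-platform pose can be recovered once the platform's world pose is measured. All function and variable names are assumptions.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def device_pose_in_world(T_world_platform, T_platform_device):
    """Device-in-world pose = platform-in-world composed with device-in-platform."""
    return T_world_platform @ T_platform_device

def device_pose_in_platform(T_world_platform, T_world_device):
    """Recover the in-platform pose by removing the platform's own world motion."""
    return np.linalg.inv(T_world_platform) @ T_world_device
```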
- the disclosure relates to a method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or at least one integrated device.
- a movement and/or pose of the portable device and/or integrated device relative to the platform is limited or fixed. This feature can enhance the accuracy, reliability, and overall performance of the system in which the portable and/or integrated device operates and/or support the usage of further devices.
- the portable device and/or integrated device can achieve higher positional accuracy. This stabilization can reduce errors caused by unintended movements or vibrations, ensuring that the devices maintain a consistent and precise location within the platform. The stabilization of the devices can lead to improved quality of sensor data.
- Sensors such as accelerometers, gyroscopes, and cameras can operate more effectively when the device is held in a stable position, reducing noise and enhancing the fidelity of measurements and observations. Fixing or limiting the movement of the device can facilitate more consistent calibration.
- a stable device position can simplify the calibration process, ensuring that once the device is calibrated relative to the platform, it remains accurate over time, thus reducing the need for frequent recalibrations.
- the integration of data from multiple sources (e.g., GNSS, IMUs, cameras, lasers) also benefits: the consistency in device positioning can enable more accurate data fusion, leading to better overall system performance.
- the experience device refers to a digital or electronic device designed to present, interact with, or process digital content in either two-dimensional (2D) or three-dimensional (3D) formats.
- This encompasses devices that facilitate immersive and interactive experiences by integrating various sensors, processing units, and communication interfaces.
- Experience devices can be used for applications such as virtual reality (VR), augmented reality (AR), mixed reality (MR), gaming, media consumption, and spatial audio experiences.
- the experience device can comprise a display unit capable of rendering 2D or 3D content.
- This display unit can provide visual output that may include stereoscopic visuals for enhanced depth perception in 3D experiences.
- the experience device can include sensors such as inertial measurement units (IMUs), global navigation satellite system (GNSS) receivers, cameras, optical sensors, magnetometers, barometers, and/or accelerometers. These sensors can be used to track the device’s pose, movement, and acceleration relative to its environment.
- the experience device can be equipped with a processing unit to handle computational tasks. This processing unit can perform data pre-processing, map integration, and rendering of virtual scenes.
- the experience device can support wired or wireless communication interfaces such as Bluetooth, Wi-Fi, Ultra-Wideband (UWB), and/or Ethernet.
- the experience device can be powered by internal batteries, vehicle power systems, or wireless charging.
- the experience device can provide a user-friendly interface for setup, calibration, and real-time data interaction. This interface can display instructions and feedback to the user.
- the experience device can adapt to changes in the environment, such as vehicle speed and external conditions, to dynamically update the user experience.
- Examples of experience devices can include VR headsets such as but not limited to the Oculus Rift, HTC Vive, and PlayStation VR, which can provide immersive 3D virtual environments with stereoscopic displays and spatial audio.
- Examples of experience devices can include AR headsets such as Microsoft HoloLens and Magic Leap One, which can overlay digital content onto the real world, allowing interaction with both physical and virtual objects.
- the experience device can deliver synchronized 2D or 3D content to passengers, enhancing travel experiences.
- the experience device can provide, but is not limited to, immersive 2D or 3D gaming experiences with motion tracking and spatial audio.
- the experience device can offer realistic simulations and training environments using VR or AR headsets.
- the experience device can enhance the experience of watching movies, shows, and other media content with spatial audio and interactive features.
- Examples of experience devices can include smartphones, tablets, screens and/or the platform infotainment system.
- the movable and/or moving platform refers to any vehicle, apparatus, or structure that can transport or be transported from one location to another.
- This platform can provide a base for various devices and systems, including experience devices, and can offer an environment where positional and motion data can be crucial for synchronized experiences.
- Such platforms can range from personal transportation vehicles to larger commercial or industrial systems.
- the movable and/or moving platform can be capable of transporting itself or being transported from one location to another. This mobility can be powered by various means, including but not limited to engines, motors, manual force, or external transport systems.
- the movable and/or moving platform can integrate with experience devices to provide synchronized data on position, orientation, movement, status of the platform (e.g., standing still, charging, or others) and environmental conditions. This integration can enhance the functionality of experience devices, enabling immersive and interactive experiences.
- the movable and/or moving platform can be equipped with sensors such as GNSS receivers, IMUs, cameras, optical sensors, lasers, wheel sensors, steering sensors, radar sensors, accelerometers, gyroscopes, barometers, and/or magnetometers.
- the movable and/or moving platform can provide power to integrated systems and devices, which can include internal batteries, connection to an external power source, or energy generation systems like solar panels or generators.
- the platform can support communication interfaces for data exchange with integrated devices and external systems. This can include wired connections, wireless technologies like Bluetooth, Wi-Fi, and cellular networks.
- the platform can adapt to various environmental conditions, providing data and feedback to integrated devices to ensure synchronized and optimal performance.
- Examples can include, but are not limited to, cars, motorcycles, bicycles, and scooters. These platforms can integrate with experience devices to provide synchronized data on movement and environmental conditions. All the mentioned examples can be equipped with the various existing drive concepts. Further examples can include buses, trains, trams, and subways. These platforms can support experience devices used by passengers for entertainment, navigation, and information. Further examples can include airplanes, helicopters, drones, and gliders. These platforms can provide dynamic environments for experience devices, integrating data on position, orientation, speed, acceleration, and/or environmental conditions. Further examples can include boats, ships, submarines, and yachts. These platforms can integrate with experience devices for navigation, entertainment, and environmental monitoring. Further examples can include trucks, forklifts, cranes, and construction vehicles.
- These platforms can provide critical data to experience devices used for operational efficiency and safety. Further examples can include rockets, space shuttles, and space stations. These platforms can offer unique environments for experience devices, incorporating data on space conditions, trajectory, and onboard systems. Further examples can include recreational vehicles (RVs), caravans, and mobile homes. These platforms can integrate with experience devices to enhance travel and living experiences with synchronized data on movement and location.
- the portable device refers to any compact and mobile electronic device that can be carried and operated by a user. Such devices are designed for mobility and can perform various computational, communication, and entertainment functions independently or in conjunction with other systems. Portable devices can integrate a range of sensors, communication interfaces, and power sources to support their diverse functionalities.
- a portable device can be designed to be compact and lightweight, facilitating ease of transport and use by an individual.
- the portable device can comprise a display unit for visual output, which can range from simple monochrome screens to high-resolution color displays.
- the display can support touch input and/or physical buttons for user interaction.
- the portable device can include sensors such as inertial measurement units (IMUs), global navigation satellite system (GNSS) receivers, cameras, lasers, optical sensors, magnetometers, barometers, and/or accelerometers. These sensors can enable the device to track position, orientation, movement, speed, acceleration and/or environmental conditions.
- the portable device can be equipped with a processing unit to perform computational tasks. This processing unit can handle data processing, application execution, and communication management.
- the portable device can support various wired and wireless communication interfaces such as Bluetooth, Wi-Fi, cellular networks, NFC, and USB. These interfaces can facilitate data exchange with other devices and systems.
- the portable device can be powered by an internal rechargeable battery, which can be charged through wired connections (e.g., USB) or wireless charging technologies.
- the portable device can include internal storage for data and applications. It can also support expandable storage options such as microSD cards.
- the portable device can provide an interactive user interface, which can include a touchscreen, physical buttons, or voice commands. This interface can facilitate user interaction with applications and device functions.
- the portable device can be designed for high mobility, allowing users to carry and use it across different locations and environments.
- the portable device can also include an interface to receive SIM cards for storing data and/or establishing communication with a mobile network.
- Examples can include but are not limited to devices like the Apple iPhone, Samsung Galaxy, and Google Pixel, which can perform a wide range of functions including communication, navigation, and multimedia playback. Further, examples can include devices like the Apple iPad, Samsung Galaxy Tab, and Microsoft Surface, which can offer larger screens and enhanced capabilities for productivity and entertainment. Further, examples can include devices like the Apple Watch, Samsung Galaxy Watch, and Fitbit, which can provide fitness tracking, notifications, and communication features on a compact wearable platform.
- Portable devices can enable voice calls, video calls, messaging, and email communication while on the move.
- Portable devices can use GNSS and mapping applications to provide real-time navigation and location services.
- Portable devices can offer multimedia playback, including music, videos, and games, providing entertainment in various environments.
- Portable devices can support office applications, note-taking, and document editing, allowing users to work from anywhere, e.g., in the vehicle.
- Portable devices can track physical activities, monitor health metrics, and provide fitness guidance.
- the integrated device refers to an electronic system or component that is built into a larger apparatus or platform, for instance a vehicle, designed to perform specific functions as part of that overall system.
- Such devices can be embedded within the structure of the apparatus, seamlessly combining their capabilities with the platform’s operation.
- Integrated devices can interact with other components and systems within the apparatus to enhance functionality, efficiency, and user experience.
- An integrated device can be embedded within a larger apparatus or platform, ensuring seamless operation and interaction with the host system.
- the integrated device can perform specific functions, such as sensing, processing, communication, or control, which are essential to the operation of the host system.
- the integrated device can include various sensors such as inertial measurement units (IMUs), global navigation satellite system (GNSS) receivers, wheel ticks, steering information, cameras, lasers, optical sensors, magnetometers, barometers, and/or accelerometers. These sensors can provide critical data for the host system's operation.
- the integrated device can comprise a processing unit that handles computational tasks, data processing, and system control. This unit can work in conjunction with the host system’s processor or operate independently.
- the integrated device can, but is not limited to, support wired or wireless communication interfaces such as Bluetooth, Wi-Fi, Ethernet, CAN bus, FlexRay, and proprietary protocols. These interfaces can enable data exchange with other components of the host system or external devices.
- the integrated device can draw power from the host system’s power supply, ensuring consistent operation without the need for separate power sources.
- the integrated device can interface with the host system’s user interface, allowing users to interact with its functionalities through the main system controls and displays. Examples can include a respective processor device for a motor vehicle.
- the processor circuit may comprise at least one microprocessor and/or at least one microcontroller and/or at least one FPGA (Field Programmable Gate Array) and/or at least one DSP (Digital Signal Processor) and/or at least one ASIC (Application Specific Integrated Circuit).
- the processor circuit may also comprise a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit) and/or an NPU (Neural Processing Unit).
- the processor device may comprise program code comprising instructions which, when executed by the processor circuit, perform the method steps according to the method of the disclosure.
- the program code may be stored in a data storage of the processor circuit.
- the processor circuit can, for example, be based on at least one circuit board and/or on at least one SoC (System on Chip).
- the processor device may include integrated satellite navigation units for providing navigation, position, orientation and/or velocity data.
- the method comprises several method steps.
- in a first step, at least one measurement is performed to determine a pose and/or motion and/or acceleration of the platform.
- This initial measurement can establish a precise baseline for the platform's current state, enabling accurate tracking and subsequent calculations.
- By determining the platform's pose, motion, and acceleration, platform-induced movements can effectively be differentiated from those of any devices or users on the platform. This can enhance the accuracy of sensor data, improve the reliability of subsequent measurements, and optimize system performance.
- this step can reduce computational complexity in later stages by providing clear initial conditions, thus enabling more efficient data processing and analysis.
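As a hedged illustration of this differentiation, the sketch below subtracts the measured platform acceleration from a device reading after rotating it into the platform frame. Gravity compensation and timing alignment are omitted for brevity, and all names are assumptions, not terms from the disclosure.

```python
import numpy as np

def user_induced_acceleration(a_device, a_platform, R_platform_device):
    """Estimate the acceleration caused by the user, in the platform frame.

    a_device: accelerometer reading of the portable/experience device (device frame)
    a_platform: platform acceleration from the first measurement step (platform frame)
    R_platform_device: rotation taking device-frame vectors into the platform frame
    """
    a_device_in_platform = R_platform_device @ np.asarray(a_device)
    return a_device_in_platform - np.asarray(a_platform)
```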
- in a further step, the measurement is leveraged to enable a motion-synchronized experience on at least one experience device. This allows highly immersive and realistic experiences to be created by aligning the content on the experience device with the actual motion of the platform. By synchronizing the experience device with the platform's movement, motion sickness can be reduced and user comfort enhanced, particularly in, but not limited to, applications such as virtual reality (VR) or augmented reality (AR). Additionally, this synchronization can improve the accuracy of spatial audio and visual elements, making interactions more intuitive and engaging. Overall, leveraging these measurements can optimize the performance of the experience device, delivering a seamless and coherent user experience.
- in a further step, the at least one experience device is operated based on the measurement.
- This step can ensure that the experience device operates in harmony with the platform's movements, thereby enhancing the accuracy and realism of the user experience.
- By basing the experience device's operation on precise measurements, it can dynamically adjust the content and interactions to reflect real-time changes in the platform's pose, motion, and acceleration. This can lead to improved user engagement and reduced motion sickness in immersive applications like virtual reality (VR) and augmented reality (AR), but is not limited to these. Additionally, it can optimize the experience device's performance by minimizing latency and ensuring smooth, coherent experiences, thereby increasing user satisfaction and system efficiency.
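As one hedged illustration of operating the experience device on the measurement, the sketch below shows a single iteration of an update loop: world-anchored content is rendered with the composed world pose, while cabin-fixed content needs only the in-platform head pose. The callbacks are placeholders assumed to return/accept 4x4 numpy transforms; they are not APIs from the disclosure.

```python
def update_frame(get_platform_pose, get_head_pose_in_platform, render):
    """One iteration of a motion-synchronized update loop (illustrative only)."""
    T_world_platform = get_platform_pose()          # step 1: platform measurement
    T_platform_head = get_head_pose_in_platform()   # in-cabin head tracking
    T_world_head = T_world_platform @ T_platform_head
    # world-anchored content uses T_world_head; cabin-fixed content T_platform_head
    render(camera_world=T_world_head, camera_cabin=T_platform_head)
```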
- the method as described above provides a number of technical advantages.
- the method tracks the platform’s dynamics, which can be crucial for distinguishing platform-induced movements from user actions. Leveraging these precise measurements to enable a motion-synchronized experience on the experience device can significantly enhance user immersion and comfort, while operating the experience device based on these measurements can ensure seamless, real-time interactions, thereby optimizing the overall performance and user satisfaction of the system.
- a method for operating a portable device and/or an integrated device in a movable and/or moving platform is provided.
- a movement and/or pose of the portable device and/or integrated device relative to the platform is motion-limited or fixed.
- the portable device and/or integrated device supports at least one experience device being located in the platform to determine a pose and/or a movement and/or an acceleration of the respective experience device relative to the platform.
- the method as described above provides a number of technical advantages.
- the system can achieve higher accuracy and stability in tracking the platform's dynamics.
- This stabilization can enhance the precision of determining the pose, movement, and acceleration of the experience device relative to the platform, ensuring that the experience device receives reliable and consistent data. Consequently, the experience device can operate more effectively, delivering a synchronized and immersive user experience. Additionally, this method can reduce computational complexity and improve system efficiency by providing clear and stable reference points for data processing and analysis.
- the movable and/or moving platform comprises at least one of a bike, rollercoaster, industrial vehicle, car, bus, train, truck, plane, helicopter, and/or ship, or a similar moving platform.
- the listing is not limited to the listed examples. This versatility can enable the method to be applied across a wide range of transportation modes, enhancing its applicability and usefulness in various contexts. It can deliver consistent and reliable performance in diverse environments, ensuring accurate and synchronized experiences regardless of the specific platform used.
- the portable device comprises:
- This versatility can allow the method to leverage the diverse capabilities of various portable devices, enhancing its adaptability and user accessibility. Users can benefit from seamless integration and consistent performance across different portable devices, ensuring a wide range of applications and improved user experience.
- the listing of portable devices is not limited to the listed examples.
- the integrated device comprises at least one integrated sensor within or attached to the platform, selected from the group consisting of an inertial measurement unit (IMU), global navigation satellite system (GNSS) receiver and antenna, camera, optical sensor (laser sensor), radar, wheel sensor, steering sensor, magnetometer, gyroscope, and accelerometer, and wherein the integrated device is configured to gather data from these sensors to determine the pose and/or movement and/or acceleration of the platform.
- the experience device comprises a unit for presenting audio and/or visual content, including spatial and/or VR content and/or AR content and/or MR content, or a display unit for 2D content, or a vehicle infotainment system capable of rendering content based on data received from the portable device and/or integrated device, in particular in a motion-synchronized manner. It can enable the experience device to deliver highly immersive and interactive experiences by leveraging accurate and real-time data from the portable and/or integrated device. Consequently, users can benefit from synchronized audio and/or visual content that enhances realism and engagement, improves spatial awareness in VR/AR/MR applications, and provides a seamless transition between different content types.
- the portable device and/or the integrated device performs loading and pre-processing of map data, generating a virtual scene and/or experience and/or audio and/or rendered images and/or virtual elements from the map data, and streaming the pre-processed content to the at least one experience device. It can enable the system to deliver highly detailed and immersive virtual experiences by offloading intensive processing tasks to the portable or integrated device. Consequently, the experience device can operate more efficiently, as it receives pre-processed content ready for immediate display, reducing latency and enhancing performance. This can lead to improved user experiences, with smoother and more responsive interactions, and can expand the range of applications by providing rich, contextually relevant virtual environments seamlessly integrated with real-world data.
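A minimal sketch of this offloading pattern, assuming a JSON-over-socket transport and a hypothetical load_map_tile helper (neither is specified in the disclosure): the portable/integrated device does the heavy map loading and pre-processing, then streams render-ready scene data to the experience device.

```python
import json
import socket

def stream_scene(load_map_tile, platform_position, sock: socket.socket):
    """Pre-process map data on the hub device and stream it for rendering.

    load_map_tile and the message schema are illustrative assumptions; a real
    system would use whatever map source and transport the platform provides.
    """
    tile = load_map_tile(platform_position)      # heavy I/O and parsing stay here
    scene = {
        "origin": platform_position,             # anchor of the virtual scene
        "roads": tile["roads"],                  # pre-processed, render-ready geometry
    }
    sock.sendall(json.dumps(scene).encode() + b"\n")  # device renders as-is
```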
- the portable device and/or the integrated device provides calibration instructions on a display unit of the portable device and/or on a display unit of the platform and/or on the experience device, enabling accurate alignment of the portable device and/or experience device within the platform. It can ensure precise alignment and synchronization of devices by guiding the user through a clear and intuitive calibration process. Consequently, the accuracy and reliability of data collected and processed by the system can be significantly improved, leading to enhanced performance of motion-synchronized experiences. Additionally, this can reduce user error during setup, ensuring consistent and optimal operation across different environments and applications. By facilitating accurate alignment, a seamless and immersive user experience, enhancing overall satisfaction and functionality can be provided.
- the portable device and/or the integrated device enables, for multiple experience devices within the platform, synchronized spatial audio and/or location- and/or motion-aware audio and/or visual experiences based on the relative poses and motion of the experience devices. This can ensure that all experience devices within the platform operate in harmony, creating a cohesive and immersive multi-user environment.
- the system can enhance the realism and interactivity of the shared experience, making it more engaging for users.
- this synchronization can reduce latency and inconsistencies between devices, improving the overall quality and fluidity of the user experience. Consequently, users can benefit from a more intuitive and immersive interaction with the digital content, whether for entertainment, training, or operational purposes.
- the portable device and/or the integrated device integrates data from platform sensors (e.g., steering angle, wheel data, acceleration) to enhance the accuracy and/or redundancy and/or robustness of the user experience when using the at least one experience device.
- Integrating sensor data from the platform can lead to more precise and reliable tracking of the platform's dynamics, which in turn can improve the synchronization and performance of the experience device.
- This enhanced accuracy can reduce errors and inconsistencies in the user experience, providing a smoother and more immersive interaction with the content.
- the redundancy of sensor data can increase the system's fault tolerance and robustness, ensuring consistent operation even if some sensors fail or provide erroneous data. Consequently, users can enjoy a more stable, reliable, and engaging experience, which can be particularly beneficial in safety-critical applications and complex interactive environments.
- computational tasks for audio and/or visual content are dynamically distributed between the portable device and/or an integrated device and the at least one experience device based on computational load and task complexity.
- the system can optimize resource utilization, ensuring that each device operates within its capacity for maximum efficiency. This can lead to enhanced performance and responsiveness of the experience device, as tasks are allocated based on real-time demands and device capabilities.
- this approach can reduce latency and prevent bottlenecks, resulting in smoother audio and visual experiences for the user. Additionally, it can improve the overall robustness and reliability of the system by balancing the load and mitigating the risk of overloading any single device. Consequently, users can benefit from a seamless, high-quality interactive experience, with consistent performance even under varying computational demands.
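One possible policy for such dynamic distribution is sketched below; the thresholds and load metrics are assumptions, not a mechanism taken from the disclosure. Cheap tasks stay on the experience device to avoid a network round trip, while expensive tasks go to whichever side currently reports the lower load.

```python
def assign_task(task_cost, hub_load, experience_load, offload_threshold=0.2):
    """Decide where a task should run: 'hub' or 'experience_device'.

    task_cost, hub_load and experience_load are normalized to [0, 1];
    the 0.2 threshold is an illustrative assumption.
    """
    if task_cost < offload_threshold:
        return "experience_device"   # cheap tasks: run locally, no round trip
    return "hub" if hub_load < experience_load else "experience_device"
```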
- the portable device and/or the integrated device receives map data from external sources (including internet services), processes the data, and transmits it to the at least one experience device to create a virtual environment that matches the real-world road network.
- the system can generate a highly accurate and realistic virtual environment, enhancing the user’s immersive experience. This capability can improve the relevance and context of the virtual content, making it more engaging and useful for navigation, training, or entertainment purposes.
- the synchronization of virtual environments with real-world road networks can enhance the safety and effectiveness of applications such as driver assistance systems or educational simulations. Consequently, users can benefit from a more reliable and contextually accurate interactive experience, which can lead to increased satisfaction and utility in various applications.
- the portable device and/or the integrated device automatically activates and calibrates the experience device upon detecting its fixed position within the platform using sensor heuristics and/or NFC elements.
- This automatic activation and calibration can significantly enhance the ease of use and user experience by eliminating the need for manual setup, thereby saving time and reducing the potential for user error.
- the system can ensure precise and reliable calibration, which can improve the accuracy and performance of the experience device.
- this capability can lead to a more seamless and intuitive interaction with the system, as the experience device is always correctly aligned and ready for use. Consequently, users can enjoy a more efficient and effective setup process, leading to enhanced satisfaction and overall system reliability.
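A hedged sketch of one such sensor heuristic: if the device's recent acceleration time signal correlates strongly with the vehicle's, the device can be assumed fixed in its mount and calibration can start automatically. The correlation threshold and window handling are assumptions; an NFC element or charging state could trigger the same check.

```python
import numpy as np

def is_mounted(a_device, a_vehicle, min_corr=0.95):
    """Heuristic mount detection over two equally sampled acceleration windows."""
    a = np.asarray(a_device, dtype=float)
    b = np.asarray(a_vehicle, dtype=float)
    a, b = a - a.mean(), b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return False                       # flat signals carry no evidence
    return float(np.dot(a, b) / denom) >= min_corr
```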
- the portable device and/or the integrated device act as a hub, gathering data from multiple sources (e.g., vehicle sensors, internet services) and distributing it to at least one experience device.
- the portable or integrated device can centralize data collection and management, ensuring that the experience device receives comprehensive and consistent information from diverse sources. This capability can enhance the accuracy and richness of the content provided to the experience device, improving the overall user experience.
- the centralized hub can streamline data processing and reduce latency, enabling more responsive and real-time interactions. Consequently, users can benefit from a seamless integration of various data inputs, leading to more immersive and informative experiences across different applications.
- the portable device and/or the integrated device uses optical sensors, e.g., cameras and/or lasers and/or IR sensors, and/or UWB and/or BLE and/or other electromagnetic based sensors to estimate the pose of the experience device relative to the platform and/or other experience device within the platform.
- the portable device and/or the integrated device incorporates weather and/or environmental data from internet services and/or from detected objects outside of the platform into the at least one experience device, providing a realistic representation and/or artistic re-interpretation of the current conditions; and wherein the portable device and/or an integrated device integrates data from external data sources, such as internet services or the like, to enhance the experience.
- the system can deliver highly realistic and contextually relevant experiences that reflect current conditions, thereby increasing user immersion and engagement. This capability can also allow for creative and artistic re-interpretations, offering users unique and dynamic visual and sensory experiences.
- integrating data from various internal and/or external sources, like sensors and/or data services, can enrich the content and functionality of the experience device, ensuring it remains up-to-date and relevant. Consequently, users can benefit from a more immersive, informative, and adaptable experience, enhancing overall satisfaction and utility in a wide range of applications.
- the portable device and/or the integrated device provides motion-synchronized experiences not limited to spatial content, including 2D games and music, by providing relevant motion and location data to the respective experience device(s).
- the system can enhance the interactivity and immersion of various content types, such as 2D games and music.
- This synchronization can create a more engaging user experience by aligning in-game actions or musical rhythms with the user's physical movements and the platform's dynamics.
- this capability can expand the applicability of the system beyond spatial content, making it versatile and valuable for a wider range of entertainment and educational applications. Consequently, users can benefit from enriched, dynamic experiences that respond to their movements, leading to greater enjoyment and a deeper connection with the content.
- the portable device and/or the integrated device uses additional GNSS antennas mounted on the platform to improve localization accuracy.
- By utilizing additional GNSS antennas, higher precision in determining the platform's position and movement can be achieved.
- This enhanced localization accuracy can significantly improve the reliability and performance of navigation and tracking functions, ensuring that the experience device operates with greater fidelity and responsiveness.
- the improved accuracy can benefit various applications, including augmented reality (AR), virtual reality (VR), and other location-based services, by providing more precise and stable positioning data. Consequently, users can experience increased immersion, safety, and effectiveness in a wide range of interactive and real-time applications.
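As an illustration of what a second antenna can add, the sketch below derives the platform heading from the baseline between two antenna fixes, which a single antenna cannot provide while the platform stands still. The local ENU coordinates and front/rear antenna layout are assumptions.

```python
import math

def heading_from_antennas(front_enu, rear_enu):
    """Platform heading in degrees clockwise from north.

    front_enu / rear_enu are (east, north) coordinates of two GNSS antenna
    fixes in a shared local ENU frame (illustrative assumption).
    """
    d_east = front_enu[0] - rear_enu[0]
    d_north = front_enu[1] - rear_enu[1]
    return math.degrees(math.atan2(d_east, d_north)) % 360.0
```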
- the portable device and/or the integrated device coordinates multiple experience devices within the platform, ensuring that all experience devices receive consistent data for synchronized experiences.
- the system can ensure a harmonized and unified user experience across all devices.
- This synchronization can enhance the overall immersive quality of the experience by maintaining consistent visual, audio, and interactive elements, which is crucial for applications such as multiplayer gaming, collaborative VR environments, and synchronized media consumption.
- providing consistent data to all experience devices can reduce latency and discrepancies, thereby improving the responsiveness and realism of the shared experience. Consequently, users can benefit from a seamless and cohesive interaction, increasing satisfaction and engagement in multi-user and multi-device scenarios.
- the portable device and/or the integrated device uses redundant sensors to validate the accuracy of the data and ensure a correct calibration of the experience device.
- Utilizing redundant sensors can significantly enhance the reliability and accuracy of the system by cross-verifying data from multiple sources. This redundancy can detect and correct errors, ensuring that the experience device maintains precise calibration and operates optimally. Additionally, the use of redundant sensors can improve fault tolerance, allowing the system to continue functioning correctly even if one sensor fails or provides inaccurate data. Consequently, users can benefit from a more robust and dependable experience, with increased accuracy and reliability in various applications.
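A minimal sketch of such validation, under the assumption that the redundant sensors report the same scalar quantity: the median-fused value tolerates a single faulty sensor, and the calibration is accepted only when all readings agree within a tolerance. The tolerance is application-specific and assumed here.

```python
import statistics

def validate_redundant(readings, tolerance):
    """Fuse redundant readings of one quantity and flag disagreement.

    Returns (is_valid, fused_value); the median keeps one faulty sensor
    from skewing the fused result.
    """
    fused = statistics.median(readings)
    is_valid = all(abs(r - fused) <= tolerance for r in readings)
    return is_valid, fused
```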
- the pose of the portable device and/or the integrated device can be determined and/or adjusted based on user input on the device's display, and/or platform display, and/or the experience device display.
- Allowing user input to determine or adjust the device's pose can significantly enhance the accuracy and flexibility of the system by enabling precise manual calibration.
- This user-driven adjustment capability can accommodate various usage scenarios and individual preferences, ensuring that the device operates optimally in different environments. Additionally, this feature can improve the user experience by providing intuitive and direct control over the device's configuration, leading to greater satisfaction and usability. Consequently, users can benefit from a more accurate, adaptable, and user-friendly system, enhancing the overall effectiveness and enjoyment of the experience.
- the portable device and/or the integrated device ensures correct positioning of virtual avatars in a multi-user experience based on the relative pose and/or motion of the at least one experience device and the body poses tracked by those devices. Ensuring accurate positioning of virtual avatars can significantly enhance the realism and immersion of multi-user experiences, making interactions more natural and intuitive. This capability can prevent discrepancies in avatar placement, which can disrupt the user experience and reduce the effectiveness of collaborative or interactive applications. Additionally, by accurately tracking and synchronizing body poses, the system can facilitate more engaging and cohesive virtual environments, where users can interact seamlessly. Consequently, users can enjoy a more immersive, interactive, and cohesive experience, leading to higher satisfaction and greater utility in applications such as virtual meetings, gaming, and collaborative workspaces.
- the portable device and/or the integrated device provides an interactive user interface to guide the user through the setup and calibration process, ensuring correct installation and configuration.
- This interactive user interface can significantly enhance the ease of use and accessibility of the system by providing clear, step-by-step instructions, which can reduce the potential for user error during setup and calibration. By facilitating correct installation and configuration, it can ensure optimal performance and reliability, improving the accuracy and effectiveness of the experience device. Additionally, this guided process can save time and effort for users, leading to a more efficient and satisfactory user experience. Consequently, users can benefit from a more intuitive and streamlined setup process, resulting in enhanced functionality and overall satisfaction with the system.
- the portable device and/or the integrated device is mounted and/or placed and/or held in various poses within the platform, including fixed mounts, handheld positions, and/or semi-rigid attachments and/or placements.
- This flexibility in mounting and positioning can significantly enhance the adaptability and usability of the portable device and/or integrated device across different environments and use cases.
- By allowing the device to be securely fixed, handheld, or semi-rigidly attached, it can ensure consistent and reliable performance regardless of the installation method.
- this versatility can cater to diverse user preferences and operational requirements, improving the overall user experience. Consequently, users can benefit from a more flexible and adaptable system that maintains optimal functionality and reliability in various scenarios, leading to increased satisfaction and broader applicability.
- the portable device and/or the integrated device uses data on the relative positions of the experience device to provide a spatial audio experience, ensuring that sound directionality matches the visual scene.
- the system can deliver highly immersive and realistic spatial audio, enhancing the overall user experience.
- This alignment of sound directionality with the visual scene can create a more cohesive and engaging environment, significantly improving the sense of presence and immersion for the user.
- this capability can enhance the effectiveness of applications such as virtual reality (VR), augmented reality (AR), and gaming, where accurate spatial audio is crucial for a realistic experience. Consequently, users can benefit from a more intuitive and immersive auditory experience, which complements the visual content and enhances the overall quality and satisfaction with the system.
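The following sketch illustrates the idea with a deliberately simple panning model, a stand-in for a real HRTF renderer, which the disclosure does not specify: the source position is transformed into the head frame and its azimuth is mapped to left/right gains. The axis convention (head x forward, y left) is an assumption.

```python
import numpy as np

def stereo_gains(source_in_platform, T_platform_head):
    """Map a platform-frame source position to (left_gain, right_gain)."""
    p = np.append(np.asarray(source_in_platform, dtype=float), 1.0)
    x, y = (np.linalg.inv(T_platform_head) @ p)[:2]   # source in the head frame
    azimuth = np.arctan2(y, x)                        # 0 = straight ahead, + = left
    pan = np.sin(azimuth)                             # -1 (right) ... +1 (left)
    return 0.5 * (1.0 + pan), 0.5 * (1.0 - pan)
```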
- the portable device and/or the integrated device automatically detects its mount pose within the platform and adjusts the calibration and data processing accordingly.
- Automatic detection of the mount pose can significantly enhance the system’s ease of use and accuracy by eliminating the need for manual adjustments. This capability can ensure that the device is always correctly calibrated, leading to more precise and reliable data processing.
- By dynamically adjusting calibration and data processing based on the detected mount pose, the system can maintain optimal performance regardless of the device’s position within the platform. Consequently, users can benefit from a more intuitive and reliable system, which can improve the overall user experience and satisfaction by providing consistent and accurate functionality in various mounting scenarios.
- the portable device and/or the integrated device pre-processes data, such as map data or virtual scenes, and streams the processed data to the experience device for rendering.
- the system can offload computationally intensive operations from the experience device, thereby enhancing its performance and responsiveness. This can lead to smoother and more efficient rendering of complex scenes, improving the overall user experience.
- pre-processing data before streaming it to the experience device can reduce latency and ensure that the visual and interactive elements are delivered in real time. Consequently, users can benefit from a more seamless and immersive experience, with high-quality visuals and interactions that enhance the effectiveness and enjoyment of the system.
- This disclosure describes a solution that uses a portable mobile device that may be operated in the platform and that may therefore allow performing separate measurements, i.e., (cyclic or periodic) measurements concerning the pose and/or motion and/or acceleration of the platform in the environment (e.g., a geoposition) and (cyclic or periodic) measurements concerning the pose and/or motion and/or acceleration of the headset in the platform. These measurements can be fused to compute smoother pose and/or motion and/or acceleration estimates.
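One way such a fusion could look is sketched below as a simple complementary filter; the disclosure only states that the measurements can be fused, so the filter choice and the alpha value are assumptions. Alpha trades smoothness against latency.

```python
def fuse(previous_estimate, new_measurement, alpha=0.9):
    """One complementary-filter step on a scalar or numpy-array measurement."""
    return alpha * previous_estimate + (1.0 - alpha) * new_measurement

def fuse_stream(measurements, alpha=0.9):
    """Run the filter over a cyclic/periodic measurement stream."""
    estimate = measurements[0]
    for m in measurements[1:]:
        estimate = fuse(estimate, m, alpha)
    return estimate
```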
- a portable mobile device can be a smartphone or a tablet PC (PC - personal computer) or a smart watch or a laptop or a multi-functional device with similar capabilities.
- a device of that type may come with suitable hardware, in particular at least one sensor and/or at least one receiver that may be used for implementing the solution.
- the device may in particular obtain the two separate measurements concerning a) the platform relative to the outdoor environment of the platform (e.g., a world reference frame on Earth) and b) the headset relative to the platform.
- the following aspects provide advantages.
- the “moving platform” is referred to as “vehicle” without the intent of limiting the scope of this disclosure
- the “portable and/or integrated device” is referred to as “device” without the intent of limiting the scope of this disclosure:
- the device can be fixed in the vehicle using a rigid mount so that it does not change its position relative to the vehicle
- the mount can have an anti-slipping surface or a mechanical grip or a magnetic grip
- limited motion can be achieved without a mount; sensor filters can mitigate the error between the device and the vehicle
- the mount can have an additional GNSS antenna (Global Navigation Satellite System) that can be connected to the device
- the antenna can be wired in the car interior or at arbitrary places such as under the hood, on the roof, under the chassis, in the trunk.
- the antenna can be wirelessly connected to the device, for example, the device could connect to a roof antenna of the platform.
- the antenna or an antenna receiver for the antenna may be linked to the device by a wired or wireless link for providing a received signal, in particular geo-position data, to the device.
- the additional antenna may have a magnetic base element for placing the antenna on a magnetic surface, e.g., a car chassis
- At least one supportive sensor may be provided in the vehicle that may be connected to the device, if the corresponding internal sensor of the device does not provide sufficient precision
- the calibration can be triggered by using a certain RFID tag, active charging state or similar.
- the calibration can automatically be started depending on sensor heuristics.
- the calibration and/or the connectivity of one or more Experience Devices can be provided on the integrated display of the Device and/or on the Experience Device.
- the calibration and mounting position can be validated using other (redundant) sensors such as optical sensors.
- the device can have at least one of GNSS technology, inertial sensors, Wi-Fi, magnetometer, map data, cellular, radio, UWB, optical (including, for example, camera(s), laser, radar, or the like) and/or barometer and (in case of several such data sources) fuse their data and/or transmit raw data (all or one or some) to one or more experience devices, via a wireless and/or wired connection
- One or multiple experience devices can use these data for multiple (alternative or concurrent) purposes, including
- one or several reference elements that can be active or passive, for example a beacon, a set of LEDs, a QR code or similar can (additionally but not necessarily) be placed on the headset to be identified by the device for the purpose of computing relative pose or motion or similar
- Multiple experience devices can estimate the relative position with respect to the device (and/or the device with respect to the experience device/s) and by that compute relative spatial information
- optical data including cameras, laser and/or similar
- electromagnetic data captured by the experience device and/or by the device
- any or all of this data can be used for multi-user-experiences to place the characters in a common digital environment, e.g., cockpit or scene at the corresponding offset, and/or for single-user but social experiences
- Spatial audio in the experience device can depend on the relative pose of the headset with regard to the device in the vehicle.
- the mobile device also can receive map data and forward it to the experience device. E.g., by receiving it from the internet via the cellular network and forwarding it via Wi-Fi tethering to the respective experience device
- the experience device/s can send map requests to the device in order to create a virtual environment that matches the road network
- workload can be outsourced from the Experience Device's processing unit to the Device and be shared with multiple Experience Devices. This can be:
- An interface can be provided in the device that can connect to the vehicle for receiving data from the vehicle, e.g., for using at least one sensor of the vehicle for obtaining data on vehicle position and/or orientation and/or motion and/or acceleration
- the computational tasks for providing the experience device pose and/or spatial content may be distributed or shared among the device and the experience device. This can be done dynamically, e.g., as a function of computational load and/or complexity of the task.
- device detects experience device/s
- experience device/s detect the device; the position of the device in the vehicle may be known or unknown, and the position relative to the device may suffice
- device and/or experience device/s may use magnetic and/or optical and/or UWB markers and/or electromagnetic sensors for tracking using, e.g., triangulation / trilateration
- the device may detect that it is placed inside the vehicle in a mount and/or inside a predefined area using activity of the (wireless / wired) charging system and/or an NFC element in the mount and/or by checking whether its acceleration profile (time signal) matches the vehicle acceleration profile; support mode and/or calibration is then automatically activated
- the user can trigger a calibration with regard to the device: the relative location and/or orientation and/or position (pose) is observed
- pose of the device in the vehicle may be known (by user input or an RFID tag / NFC element in the vehicle, but not limited to these approaches)
- the device may receive GNSS signals and/or camera, optical sensor, laser, magnetometer, accelerometer, and/or cellular data, and/or similar, may fuse them or not, and transmits them to the experience device/s in real time to enable geo-localization for the experience device/s
- One or multiple experience devices may track the pose relative to the device (e.g., by radio signal triangulation) and by that gain relative spatial information (see the sketch following this list)
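The separation described in the bullets above, between motion of the platform in the world and motion of the headset in the cabin, can be pictured with a short sketch. The following Python fragment is a minimal illustration, not the disclosed implementation; the function name and the use of scipy rotations are assumptions:

```python
# Minimal sketch: split a world-referenced headset rotation into
# "vehicle in world" and "headset in vehicle" for vehicle-reference tracking.
from scipy.spatial.transform import Rotation as R

def headset_in_vehicle(r_world_vehicle: R, r_world_headset: R) -> R:
    """Cabin-relative headset rotation from two world-referenced rotations."""
    return r_world_vehicle.inv() * r_world_headset

# Example: the platform yaws 30 degrees through a curve while the user sits
# still; the cabin-relative rotation should remain (close to) the identity.
r_vehicle = R.from_euler("z", 30, degrees=True)
r_headset = R.from_euler("z", 30, degrees=True)
print(headset_in_vehicle(r_vehicle, r_headset).as_euler("zyx", degrees=True))  # ~[0, 0, 0]
```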
- the disclosure also includes a portable device.
- the portable device is configured to perform the disclosed method.
- the portable device according to the disclosure is preferably designed as a smartphone, tablet, PDA, smartwatch, or the like.
- the disclosure also includes an integrated device.
- the integrated device is configured to perform the disclosed method.
- the integrated device according to the disclosure is preferably designed as a control unit of a platform, preferably a vehicle.
- the portable device is configured to act as a hub and integrate data from the platform, internet services, and/or data from its own sensors, in particular for supporting and enabling motion- synchronized experiences.
- the disclosure also comprises a computer-readable storage medium comprising program code which, when executed by a computer, portable device, or a computer network, causes it to execute an embodiment of the method according to the disclosure.
- the storage medium may be provided at least in part as a non-volatile data storage (e.g., as a flash memory and/or as an SSD - solid state drive) and/or at least in part as a volatile data storage (e.g. as a RAM - random access memory).
- the storage medium can be arranged in the computer, portable device, or computer network. However, the storage medium can also be operated as a so-called Appstore server and/or cloud server on the Internet, for example.
- the computer or computer network can provide a processor circuit with, for example, at least one microprocessor.
- the program code can be provided as binary code and/or as assembler code and/or as source code of a programming language (e.g., C) and/or as a program script (e.g. Python).
- the disclosure also comprises a computer program having program code or program means, the computer program being stored on a computer-readable medium, wherein the program code or the program means, when the computer program is executed on a computer or a computer-based processing unit, causes the computer or the computer-based processing unit to execute a method according to any of the preceding method claims.
- the disclosure also includes combinations of the features of the embodiments described.
- the disclosure also includes implementations each comprising a combination of the features of several of the described embodiments, provided that the embodiments have not been described as mutually exclusive.
- Fig. 1 schematically illustrates a system comprising a platform and a portable device
- Fig. 2 schematically illustrates a flow chart of the method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or at least one integrated device.
- Fig. 1 depicts a system for operating at least one experience device within a moving platform.
- the system is presented as the movable and/or moving platform 1.
- the movable and/or moving platform 1 is shown with various components integrated into its structure.
- a portable device 2 and an integrated device 3 are located within/on the movable and/or moving platform 1.
- the portable device 2 could be a smartphone, tablet PC, smart watch, or multi-functional device.
- the integrated device 3 comprises sensors such as an inertial measurement unit, GNSS receiver and antenna, wheel data, steering data, speed data, camera, optical sensor (e.g., lasers and/or radar), magnetometer, and/or accelerometer. These sensors gather data to determine the pose, movement, and/or acceleration of the movable and/or moving platform 1.
- Platform sensors 5 are distributed within the movable and/or moving platform 1 to enhance the accuracy, redundancy, and robustness of user experience when using the experience device 4.
- the experience device 4 is configured to present spatial content such as audio, visual, VR, AR, or MR content based on data received from the portable device 2 and/or integrated device 3.
- the experience device 4 can also present 2D content and is capable of rendering motion-synchronized content.
- the system performs measurements to determine the pose, motion, and acceleration of the movable and/or moving platform 1. These measurements enable a motion-synchronized experience on the experience device 4.
- the experience device 4 operates based on the measurements from the portable device 2 and/or integrated device 3.
- the world frame 7 represents the reference coordinate system for external data integration, ensuring the alignment of the VR or other digital world with the real-world environment. There is a need to align the virtual reality (VR) world (or other digital world) with the real world using three coordinate systems: the world coordinate system 7, the vehicle coordinate system 6, and the coordinate system of the experience device 4. These systems work together to provide a synchronized and immersive experience.
- the world coordinate system 7 serves as the reference frame representing the external world, crucial for aligning digital scenes with the real-world road network.
- the vehicle coordinate system 6 is essential for understanding the vehicle's movement and orientation within the world frame 7. It utilizes data from sensors such as, but not limited to the inertial measurement unit (IMU), GNSS receiver and antenna, wheel data, steering data, speed data, camera, optical sensor, (e.g., lasers and/or radar), magnetometer, and/or accelerometer to determine the vehicle's pose and/or movement and/or acceleration accurately.
- the experience device coordinate system ensures that the experience device’s orientation and position are synchronized with the vehicle's movements and the user's viewing direction. This synchronization is vital for maintaining a cohesive digital experience. It is important to synchronize the vehicle's rotation with the experience device rotation, especially when the experience device lacks reliable inside-out tracking. This involves using external devices to measure and transmit data about the vehicle's movement to the experience device, ensuring accurate alignment. Therefore, the portable device 2, such as, but not limited to a smartphone, attached to the platform 1 is used. Alternatively, or additionally, an integrated device 3 integrated into the moving platform 1 can be used.
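The interplay of the three coordinate systems can be sketched with homogeneous transforms: the pose of the experience device 4 in the world frame 7 follows from chaining the pose of the vehicle frame 6 in the world frame with the pose of the device in the vehicle frame. The snippet below is a hedged illustration; the 4x4 matrix representation and the helper names are assumptions, not taken from the disclosure.

```python
import numpy as np

def make_T(R_mat: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
    T = np.eye(4)
    T[:3, :3] = R_mat
    T[:3, 3] = t
    return T

def rot_z(deg: float) -> np.ndarray:
    """Rotation matrix for a yaw of `deg` degrees about the z-axis."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# frame 6 (vehicle) in frame 7 (world): 90 deg yaw, 100 m east, 50 m north
T_world_vehicle = make_T(rot_z(90), np.array([100.0, 50.0, 0.0]))
# experience device in frame 6: 0.5 m forward, 1.2 m up, no extra rotation
T_vehicle_device = make_T(rot_z(0), np.array([0.5, 0.0, 1.2]))
# chaining yields the device pose in the world frame
T_world_device = T_world_vehicle @ T_vehicle_device
print(T_world_device[:3, 3])  # world position of the experience device
```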
- the portable device 2 and/or integrated device 3 gathers data from the inertial measurement unit (IMU), GNSS receiver and antenna, wheel data, steering data, speed data, camera, optical sensor (e.g., lasers and/or radar), magnetometer, and/or accelerometer, processes it, and/or transmits it raw for processing on the experience device 4.
- This setup facilitates vehicle localization and motion synchronization without relying on fixed vehicle sensors, enhancing flexibility and usability.
- the portable device 2 and/or integrated device 3 can integrate weather conditions and other data from internet services into the digital experience. By processing this information within the world frame, it enhances the realism and contextual relevance of the digital scenes, providing users with a more immersive and interactive experience.
- the portable device 2 and/or integrated device 3 perform loading and preprocessing of map data, generating virtual scenes, experiences, audio, rendered images, and virtual elements from the map data, and streaming the pre-processed and/or not pre-processed content to the experience device 4.
- the portable device 2 and/or integrated device 3 enable multiple experience devices 4-x within the movable and/or moving platform 1 to provide synchronized spatial audio, location-aware, and motion-aware audio and visual experiences based on the relative positions of the experience devices 4-x.
- the portable device 2 and/or integrated device 3 integrate data from platform sensors 5 such as but not limited to wheel ticks and/or vehicle speed and/or steering angle and/or GNSS data and/or optical data and/or IMU data to enhance the user experience accuracy and robustness.
- Computational tasks for audio and visual content are dynamically distributed between the portable device 2 and/or integrated device 3 and the experience device 4 based on computational load and task complexity.
- the portable device 2 and/or integrated device 3 can automatically activate and calibrate the experience device 4 upon detecting its temporary fixed position within the movable and/or moving platform 1 using sensor heuristics and/or NFC elements.
- the portable device 2 and/or integrated device 3 act as a hub, gathering data from multiple sources and distributing it to the experience device 4 (a minimal sketch of this hub role follows the description of Fig. 1 below).
- the portable device 2 and/or integrated device 3 incorporate weather and environmental data from internet services and detected objects outside of the platform into the experience device 4, providing a realistic representation or artistic re-interpretation of the current conditions.
- the portable device 2 and/or integrated device 3 integrate data from external data sources, such as but not limited to internet services, to enhance the experience.
- the pose of the portable device 2 and/or integrated device 3 can be determined and adjusted based on user input on the portable and/or integrated device's display, platform display, or the experience device display.
- the portable device 2 and/or integrated device 3 ensure correct positioning of virtual avatars in a multi-user experience based on the relative pose and motion of the experience device 4 and the body poses tracked by those devices.
- the portable device 2 and/or integrated device 3 is mounted, placed, or held in various poses within the movable and/or moving platform 1, including fixed mounts, handheld positions, and/or semi-rigid attachments.
- the portable device 2 and/or integrated device 3 can automatically detect its pose within the movable and/or moving platform 1 and adjust the calibration and data processing accordingly.
- the portable device 2, when used in the movable and/or moving platform 1, is also configured to perform the method 200 for operating the portable device 2 and/or the integrated device 3 in the movable and/or moving platform 1, wherein a movement and/or pose of the portable device 2 and/or integrated device 3 relative to the platform 1 is limited or fixed, and wherein the portable device 2 and/or integrated device 3 supports at least one experience device 4 located in the platform 1 in determining a pose and/or a movement and/or an acceleration of the respective experience device 4 relative to the platform 1.
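As announced above, a toy sketch of the hub role of the portable device 2 and/or integrated device 3 is given below. The class and callback names are invented for illustration and the sample data are placeholders; the real data sources and distribution channels would be those described throughout this disclosure.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Hub:
    """Illustrative hub: polls data sources and fans the result out to sinks."""
    sources: Dict[str, Callable[[], dict]] = field(default_factory=dict)
    sinks: List[Callable[[dict], None]] = field(default_factory=list)

    def add_source(self, name: str, poll: Callable[[], dict]) -> None:
        self.sources[name] = poll

    def add_sink(self, push: Callable[[dict], None]) -> None:
        self.sinks.append(push)

    def tick(self) -> None:
        # gather one sample from every source (platform sensors,
        # internet services, the device's own sensors)
        frame = {name: poll() for name, poll in self.sources.items()}
        # distribute the combined frame to every registered experience device
        for push in self.sinks:
            push(frame)

hub = Hub()
hub.add_source("imu", lambda: {"accel": (0.0, 0.0, 9.81)})
hub.add_source("weather", lambda: {"condition": "rain"})
hub.add_sink(lambda frame: print("experience device <-", frame))
hub.tick()
```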
- Fig. 2 illustrates the method 100 for operating at least one experience device 4 in a movable and/or moving platform 1 using at least one portable device 2 and/or at least one integrated device 3, wherein a movement and/or pose of the portable device 2 and/or integrated device 3 relative to the movable and/or moving platform 1 is limited or fixed.
- the method 100 can be processed by the portable device 2 and/or integrated device 3 and/or the experience device 4, when used in the movable and/or moving platform 1.
- the method 100 comprises several method steps.
- In a first step 110, at least one measurement is performed to determine a pose and/or motion and/or acceleration of the platform 1.
- In a further step 120, the measurement is leveraged to enable a motion-synchronized experience on the at least one experience device 4.
- In a further step 130, the at least one experience device 4 is operated based on the measurement.
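A minimal sketch of the cyclic execution of steps 110, 120 and 130 might look as follows; the stubbed return values and the update rate are assumptions for illustration only.

```python
import time

def perform_measurement() -> dict:
    """Step 110: determine pose/motion/acceleration of the platform 1 (stub)."""
    return {"yaw_deg": 12.0, "accel": (0.0, 0.0, 9.81)}

def leverage(measurement: dict) -> dict:
    """Step 120: derive motion-synchronized state for the experience device 4."""
    return {"scene_yaw_deg": measurement["yaw_deg"]}

def operate(state: dict) -> None:
    """Step 130: drive the experience device 4 with the derived state."""
    print("render with", state)

for _ in range(3):  # cyclic or periodic execution
    operate(leverage(perform_measurement()))
    time.sleep(1 / 60)  # illustrative 60 Hz update rate (an assumption)
```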
Abstract
The disclosure relates to a method (100) for operating at least one experience device (4) in a movable and/or moving platform (1) using at least one portable device (2) and/or at least one integrated device (3), wherein a movement and/or pose of the portable device (2) and/or integrated device (3) relative to the platform (1) is limited or fixed, wherein the method comprises the steps of: - Performing at least one measurement (110) to determine a pose and/or motion and/or acceleration of the platform (1); - Leveraging the measurement (120) to enable a motion-synchronized experience on the at least one experience device (4); and - Operating (130) the at least one experience device (4) based on the measurement.
Description
Usage of a mobile device or integrated device supporting and/or enabling motion synchronized experiences on a moving platform
An XR headset as an example of an experience device can be designed as a head mounted device that can display or present a VR content (VR - virtual reality) and/or AR content (AR - augmented reality) and/or MR content (MR - mixed reality) to the user wearing the XR headset, in particular by providing a pose-aware and motion-synchronized output to the eyes of the user. As such, XR is a subset of all spatial media output devices, in the following referred to as “experience devices”, that replace or augment real-world signals with pose-aware and motion-synchronized virtual signals, in the following referred to as “spatial content”. Spatial audio headphones or a 2D screen that functions as a portable camera or window into a virtual 3D space are other examples for experience devices and are equally taken into account.
For synchronizing movements of the user’s head with the shown spatial content, the experience device can comprise at least one sensor that provides a sensor signal that is correlated with a pose (position and/or spatial orientation/rotation) and/or motion and/or acceleration of the experience device in space / the surroundings. The at least one sensor can for example comprise at least one camera and/or an IMU (inertial measurement unit).
Operating such a spatial media output device in a moving platform, in particular in a vehicle (like a passenger vehicle or a truck or a passenger bus or a motorbike) or a plane or a ship/boat, comes with the technical problem that the at least one sensor of the experience device will sense both the
movements of the user inside the platform as well as the movements of the platform in the environment in the same way.
These two types of position/movement need to be distinguished when presenting spatial content to the user via a pose- and motion-tracked experience device. Otherwise, when the platform moves in a curve with the user sitting still inside the platform, the spatial content will be changed in the same way as when the user turns the device inside the platform with world reference tracking. In the case of vehicle reference tracking, the motion relative to the world is unknown. The two types are preferably separated as an information on the pose of the vehicle in relation to its environment / surroundings and an information on the pose of the experience device in relation to the platform.
Thus, there is a need in the state of the art to provide a solution for a method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or integrated device that can overcome, at least partially, the disadvantages of the above-described approaches.
Therefore, it is an objective of the disclosure to provide the method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or integrated device and a portable device, each of which is suitable for enriching the known state of the art.
According to a first aspect, the disclosure relates to a method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or at least one integrated device.
A movement and/or pose of the portable device and/or integrated device relative to the platform is limited or fixed. This feature can enhance the accuracy, reliability, and overall performance of the system in which the portable and/or integrated device operates and/or support the usage of further devices. By limiting or fixing the movement and/or pose of the portable device and/or integrated device relative to the platform, the portable
device and/or integrated device can achieve higher positional accuracy. This stabilization can reduce errors caused by unintended movements or vibrations, ensuring that the devices maintain a consistent and precise location within the platform. The stabilization of the devices can lead to improved quality of sensor data. Sensors such as accelerometers, gyroscopes, and cameras can operate more effectively when the device is held in a stable position, reducing noise and enhancing the fidelity of measurements and observations. Fixing or limiting the movement of the device can facilitate more consistent calibration. A stable device position can simplify the calibration process, ensuring that once the device is calibrated relative to the platform, it remains accurate over time, thus reducing the need for frequent recalibrations. Further, when the movement and/or pose of the device is controlled, the integration of data from multiple sources (e.g., GNSS, IMUs, cameras, lasers) can be more reliable. The consistency in device positioning can enable more accurate data fusion, leading to better overall system performance.
In the meaning of the disclosure, the experience device refers to a digital or electronic device designed to present, interact with, or process digital content in either two-dimensional (2D) or three-dimensional (3D) formats. This encompasses devices that facilitate immersive and interactive experiences by integrating various sensors, processing units, and communication interfaces. Experience devices can be used for applications such as virtual reality (VR), augmented reality (AR), mixed reality (MR), gaming, media consumption, and spatial audio experiences.
The experience device can comprise a display unit capable of rendering 2D or 3D content. This display unit can provide visual output that may include stereoscopic visuals for enhanced depth perception in 3D experiences. The experience device can include sensors such as inertial measurement units (IMUs), global navigation satellite system (GNSS) receivers, cameras, optical sensors, magnetometers, barometers, and/or accelerometers. These sensors can be used to track the device’s pose, movement, and acceleration relative to its environment. The experience device can be equipped with a
processing unit to handle computational tasks. This processing unit can perform data pre-processing, map integration, and rendering of virtual scenes. The experience device can support wired or wireless communication interfaces such as Bluetooth, Wi-Fi, Ultra-Wideband (UWB), and/or Ethernet. These interfaces can facilitate data exchange with integrated devices or other experience devices. The experience device can be powered by internal batteries, vehicle power systems, or wireless charging. The experience device can provide a user-friendly interface for setup, calibration, and real-time data interaction. This interface can display instructions and feedback to the user. The experience device can adapt to changes in the environment, such as vehicle speed and external conditions, to dynamically update the user experience.
Examples of experience devices can include VR headsets such as but not limited to the Oculus Rift, HTC Vive, and PlayStation VR, which can provide immersive 3D virtual environments with stereoscopic displays and spatial audio. Examples of experience devices can include AR headsets such as Microsoft HoloLens and Magic Leap One, which can overlay digital content onto the real world, allowing interaction with both physical and virtual objects. The experience device can deliver synchronized 2D or 3D content to passengers, enhancing travel experiences. The experience device can provide, but is not limited to, immersive 2D or 3D gaming experiences with motion tracking and spatial audio. The experience device can offer realistic simulations and training environments using VR or AR headsets. The experience device can enhance the experience of watching movies, shows, and other media content with spatial audio and interactive features.
Examples of experience devices can include smartphones, tablets, screens and/or the platform infotainment system.
In the meaning of the disclosure, the movable and/or moving platform refers to any vehicle, apparatus, or structure that can transport or be transported from one location to another. This platform can provide a base for various devices and systems, including experience devices, and can offer an environment where positional and motion data can be crucial for
synchronized experiences. Such platforms can range from personal transportation vehicles to larger commercial or industrial systems.
The movable and/or moving platform can be capable of transporting itself or being transported from one location to another. This mobility can be powered by various means, including but not limited to engines, motors, manual force, or external transport systems. The movable and/or moving platform can integrate with experience devices to provide synchronized data on position, orientation, movement, status of the platform (e.g., standing still, charging, or others) and environmental conditions. This integration can enhance the functionality of experience devices, enabling immersive and interactive experiences. The movable and/or moving platform can be equipped with sensors such as GNSS receivers, IMUs, cameras, optical sensors, lasers, wheel sensors, steering sensors, radar sensors, accelerometers, gyroscopes, barometers, and/or magnetometers. These sensors can track for example the platform’s position, orientation, speed, acceleration and/or environmental conditions. The movable and/or moving platform can provide power to integrated systems and devices, which can include internal batteries, connection to an external power source, or energy generation systems like solar panels or generators. The platform can support communication interfaces for data exchange with integrated devices and external systems. This can include wired connections, wireless technologies like Bluetooth, Wi-Fi, and cellular networks. The platform can adapt to various environmental conditions, providing data and feedback to integrated devices to ensure synchronized and optimal performance.
Examples can include, but are not limited to, cars, motorcycles, bicycles, and scooters. These platforms can integrate with experience devices to provide synchronized data on movement and environmental conditions. All the mentioned examples can be equipped with the various existing drive concepts. Further examples can include buses, trains, trams, and subways. These platforms can support experience devices used by passengers for entertainment, navigation, and information. Further examples can include airplanes, helicopters, drones, and gliders. These platforms can provide
dynamic environments for experience devices, integrating data on position, orientation, speed, acceleration and/or environmental conditions. Further examples can include boats, ships, submarines, and yachts. These platforms can integrate with experience devices for navigation, entertainment, and environmental monitoring. Further examples can include trucks, forklifts, cranes, and construction vehicles. These platforms can provide critical data to experience devices used for operational efficiency and safety. Further examples can include rockets, space shuttles, and space stations. These platforms can offer unique environments for experience devices, incorporating data on space conditions, trajectory, and onboard systems. Further examples can include recreational vehicles (RVs), caravans, and mobile homes. These platforms can integrate with experience devices to enhance travel and living experiences with synchronized data on movement and location.
In the meaning of the disclosure, the portable device refers to any compact, and mobile electronic device that can be carried and operated by a user. Such devices are designed for mobility and can perform various computational, communication, and entertainment functions independently or in conjunction with other systems. Portable devices can integrate a range of sensors, communication interfaces, and power sources to support their diverse functionalities.
A portable device can be designed to be compact and lightweight, facilitating ease of transport and use by an individual. The portable device can comprise a display unit for visual output, which can range from simple monochrome screens to high-resolution color displays. The display can support touch input and/or physical buttons for user interaction. The portable device can include sensors such as inertial measurement units (IMUs), global navigation satellite system (GNSS) receivers, cameras, lasers, optical sensors, magnetometers, barometers, and/or accelerometers. These sensors can enable the device to track position, orientation, movement, speed, acceleration and/or environmental conditions. The portable device can be equipped with a processing unit to perform computational tasks. This
processing unit can handle data processing, application execution, and communication management. The portable device can support various wired and wireless communication interfaces such as Bluetooth, Wi-Fi, cellular networks, NFC, and USB. These interfaces can facilitate data exchange with other devices and systems. The portable device can be powered by an internal rechargeable battery, which can be charged through wired connections (e.g., USB) or wireless charging technologies. The portable device can include internal storage for data and applications. It can also support expandable storage options such as microSD cards. The portable device can provide an interactive user interface, which can include a touchscreen, physical buttons, or voice commands. This interface can facilitate user interaction with applications and device functions. The portable device can be designed for high mobility, allowing users to carry and use it across different locations and environments. The portable device can also include an interface to receive SIM cards for storing data and/or establishing a communication to a mobile network.
Examples can include but are not limited to devices like the Apple iPhone, Samsung Galaxy, and Google Pixel, which can perform a wide range of functions including communication, navigation, and multimedia playback. Further, examples can include devices like the Apple iPad, Samsung Galaxy Tab, and Microsoft Surface, which can offer larger screens and enhanced capabilities for productivity and entertainment. Further, examples can include devices like the Apple Watch, Samsung Galaxy Watch, and Fitbit, which can provide fitness tracking, notifications, and communication features on a compact wearable platform.
Portable devices can enable voice calls, video calls, messaging, and email communication while on the move. Portable devices can use GNSS and mapping applications to provide real-time navigation and location services. Portable devices can offer multimedia playback, including music, videos, and games, providing entertainment in various environments. Portable devices can support office applications, note-taking, and document editing, allowing
users to work from anywhere, e.g., in the vehicle. Portable devices can track physical activities, monitor health metrics, and provide fitness guidance.
In the meaning of the disclosure, the integrated device refers to an electronic system or component that is built into a larger apparatus or platform, for instance a vehicle, designed to perform specific functions as part of that overall system. Such devices can be embedded within the structure of the apparatus, seamlessly combining their capabilities with the platform’s operation. Integrated devices can interact with other components and systems within the apparatus to enhance functionality, efficiency, and user experience.
An integrated device can be embedded within a larger apparatus or platform, ensuring seamless operation and interaction with the host system. The integrated device can perform specific functions, such as sensing, processing, communication, or control, which are essential to the operation of the host system. The integrated device can include various sensors such as inertial measurement units (IMUs), global navigation satellite system (GNSS) receivers, wheel ticks, steering information, cameras, lasers, optical sensors, magnetometers, barometers, and/or accelerometers. These sensors can provide critical data for the host system's operation. The integrated device can comprise a processing unit that handles computational tasks, data processing, and system control. This unit can work in conjunction with the host system’s processor or operate independently. The integrated device can, but is not limited to, support wired or wireless communication interfaces such as Bluetooth, Wi-Fi, Ethernet, CAN bus, FlexRay, and proprietary protocols. These interfaces can enable data exchange with other components of the host system or external devices. The integrated device can draw power from the host system’s power supply, ensuring consistent operation without the need for separate power sources. The integrated device can interface with the host system’s user interface, allowing users to interact with its functionalities through the main system controls and displays.
Examples can include a respective processor device for a motor vehicle. The processor circuit may comprise at least one microprocessor and/or at least one microcontroller and/or at least one FPGA (Field Programmable Gate Array) and/or at least one DSP (Digital Signal Processor) and/or at least one ASIC (Application Specific Integrated Circuit). In particular, a CPU (Central Processing Unit), a GPU (Graphical Processing Unit) or an NPU (Neural Processing Unit) can be used as the respective microprocessor. Furthermore, the processor device may comprise program code comprising instructions which, when executed by the processor circuit, perform the method steps according to the method of the disclosure. The program code may be stored in a data storage of the processor circuit. The processor circuit can, for example, be based on at least one circuit board and/or on at least one SoC (System on Chip). The processor device may include integrated satellite navigation units for providing navigation, position, orientation and/or velocity data.
The method comprises several method steps.
In a first step, performing at least one measurement to determine a pose and/or motion and/or acceleration of the platform takes place. This initial measurement can establish a precise baseline for the platform's current state, enabling accurate tracking and subsequent calculations. By determining the platform's pose, motion, and acceleration, the system can effectively differentiate between platform-induced movements and those of any devices or users on the platform. This can enhance the accuracy of sensor data, improve the reliability of subsequent measurements, and optimize system performance. Furthermore, this step can reduce computational complexity in later stages by providing clear initial conditions, thus enabling more efficient data processing and analysis.
In a further step, leveraging the measurement to enable a motion-synchronized experience on at least one experience device takes place. This allows highly immersive and realistic experiences to be created by aligning the content on the experience device with the actual motion of the platform. By
synchronizing the experience device with the platform's movement, motion sickness can be reduced and user comfort enhanced, particularly in but not limited to applications such as virtual reality (VR) or augmented reality (AR). Additionally, this synchronization can improve the accuracy of spatial audio and visual elements, making interactions more intuitive and engaging. Overall, leveraging these measurements can optimize the performance of the experience device, delivering a seamless and coherent user experience.
In a further step, operating the at least one experience device based on the measurement takes place. This step can ensure that the experience device operates in harmony with the platform's movements, thereby enhancing the accuracy and realism of the user experience. By basing the experience device's operation on precise measurements, it can dynamically adjust the content and interactions to reflect real-time changes in the platform's pose, motion, and acceleration. This can lead to improved user engagement and reduced motion sickness in immersive applications like virtual reality (VR) and augmented reality (AR), but not limited to these devices. Additionally, it can optimize the experience device's performance by minimizing latency and ensuring smooth, coherent experiences, thereby increasing user satisfaction and system efficiency.
The method as described above provides a number of technical advantages. The method tracks the platform’s dynamics, which can be crucial for distinguishing platform-induced movements from user actions. Leveraging these precise measurements to enable a motion-synchronized experience on the experience device can significantly enhance user immersion and comfort, while operating the experience device based on these measurements can ensure seamless, real-time interactions, thereby optimizing the overall performance and user satisfaction of the system.
As a further solution, a method for operating a portable device and/or an integrated device in a movable and/or moving platform is provided. A movement and/or pose of the portable device and/or integrated device relative to the platform is motion-limited or fixed. The portable device and/or
integrated device supports at least one experience device being located in the platform to determine a pose and/or a movement and/or an acceleration of the respective experience device relative to the platform.
The method as described above provides a number of technical advantages. By limiting or fixing the movement and/or pose of the portable device and/or integrated device relative to the platform, the system can achieve higher accuracy and stability in tracking the platform's dynamics. This stabilization can enhance the precision of determining the pose, movement, and acceleration of the experience device relative to the platform, ensuring that the experience device receives reliable and consistent data. Consequently, the experience device can operate more effectively, delivering a synchronized and immersive user experience. Additionally, this method can reduce computational complexity and improve system efficiency by providing clear and stable reference points for data processing and analysis.
According to an embodiment, the movable and/or moving platform comprises at least a bike, rollercoaster, industrial vehicle, car, bus, train, truck, plane, helicopter, and/or ship, and/or a similar moving platform. The listing is not limited to the listed examples. This versatility can enable the method to be applied across a wide range of transportation modes, enhancing its applicability and usefulness in various contexts. It can deliver consistent and reliable performance in diverse environments, ensuring accurate and synchronized experiences regardless of the specific platform used.
According to an embodiment, the portable device comprises:
- a smartphone,
- a tablet PC,
- a smart watch, and/or
- a multi-functional device.
This versatility can allow the method to leverage the diverse capabilities of various portable devices, enhancing its adaptability and user accessibility. Users can benefit from seamless integration and consistent performance across different portable devices, ensuring a wide range of applications and
improved user experience. The listing of portable devices is not limited to the listed examples.
According to an embodiment, the integrated device comprises at least one integrated sensor within or attached to the platform, selected from the group consisting of an inertial measurement unit (IMU), global navigation satellite system (GNSS) receiver and antenna, camera, optical sensor (laser sensor), radar, wheel sensor, steering sensor, magnetometer, gyroscope, and accelerometer, and wherein the integrated device is configured to gather data from these sensors to determine the pose and/or movement and/or acceleration of the platform. This configuration can ensure highly accurate and reliable data collection from multiple sensor sources, enabling precise determination of the platform’s dynamics. Consequently, the system can leverage this data to enhance the synchronization and performance of experience devices, leading to improved user experiences, enhanced safety, and optimized operational efficiency across various applications.
According to an embodiment, the experience device comprises a unit for presenting audio and/or visual content, including spatial and/or VR content and/or AR content and/or MR content, or a display unit for 2D content, or a vehicle infotainment system capable of rendering content based on data received from the portable device and/or integrated device, in particular in a motion-synchronized manner. It can enable the experience device to deliver highly immersive and interactive experiences by leveraging accurate and real-time data from the portable and/or integrated device. Consequently, users can benefit from synchronized audio and/or visual content that enhances realism and engagement, improves spatial awareness in VR/AR/MR applications, and provides a seamless transition between different content types. This can lead to an overall enhanced user experience, increased satisfaction, and broader applicability across various entertainment, training, and operational scenarios.
According to an embodiment, the portable device and/or the integrated device performs loading and pre-processing of map data, generating a virtual scene and/or experience and/or audio and/or rendered images and/or virtual elements from the map data, and streaming the pre-processed content to the at least one experience device. It can enable the system to deliver highly detailed and immersive virtual experiences by offloading intensive processing tasks to the portable or integrated device. Consequently, the experience device can operate more efficiently, as it receives pre-processed content ready for immediate display, reducing latency and enhancing performance. This can lead to improved user experiences, with smoother and more responsive interactions, and can expand the range of applications by providing rich, contextually relevant virtual environments seamlessly integrated with real-world data.
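The load, pre-process, and stream split could be pictured as below. This is a hedged sketch: fetch_map_tile is a placeholder rather than a real map API, and the UDP/JSON wire format is an invented stand-in for whatever streaming channel is actually used.

```python
import json
import socket

def fetch_map_tile(lat: float, lon: float) -> dict:
    """Placeholder for map data loading (e.g., from a navigation service)."""
    return {"roads": [[(lat, lon), (lat + 0.001, lon)]]}

def preprocess(tile: dict) -> dict:
    """Turn raw map geometry into a simplified virtual-scene description."""
    return {"meshes": [{"kind": "road", "points": seg} for seg in tile["roads"]]}

def stream(scene: dict, sock: socket.socket, addr=("127.0.0.1", 9000)) -> None:
    """Push the pre-processed scene to the experience device (illustrative UDP)."""
    sock.sendto(json.dumps(scene).encode(), addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
stream(preprocess(fetch_map_tile(48.1, 11.6)), sock)
```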
According to an embodiment, the portable device and/or the integrated device provides calibration instructions on a display unit of the portable device and/or on a display unit of the platform and/or on the experience device, enabling accurate alignment of the portable device and/or experience device within the platform. It can ensure precise alignment and synchronization of devices by guiding the user through a clear and intuitive calibration process. Consequently, the accuracy and reliability of data collected and processed by the system can be significantly improved, leading to enhanced performance of motion-synchronized experiences. Additionally, this can reduce user error during setup, ensuring consistent and optimal operation across different environments and applications. By facilitating accurate alignment, a seamless and immersive user experience, enhancing overall satisfaction and functionality can be provided.
According to an embodiment, the portable device and/or the integrated device enable multiple experience devices within the platform to provide synchronized spatial audio and/or location-aware and/or motion-aware audio and/or visual experiences based on the relative poses and motion of the experience devices. This can ensure that all experience devices within the platform operate in harmony, creating a cohesive and immersive multi-user
environment. By synchronizing spatial and motion-aware audio and visual elements, the system can enhance the realism and interactivity of the shared experience, making it more engaging for users. Furthermore, this synchronization can reduce latency and inconsistencies between devices, improving the overall quality and fluidity of the user experience. Consequently, users can benefit from a more intuitive and immersive interaction with the digital content, whether for entertainment, training, or operational purposes.
According to an embodiment, the portable device and/or the integrated device integrates data from platform sensors (e.g., steering angle, wheel data, acceleration) to enhance the accuracy and/or redundancy and/or robustness of the user experience when using the at least one experience device. Integrating sensor data from the platform can lead to more precise and reliable tracking of the platform's dynamics, which in turn can improve the synchronization and performance of the experience device. This enhanced accuracy can reduce errors and inconsistencies in the user experience, providing a smoother and more immersive interaction with the content. Additionally, the redundancy of sensor data can increase the system's fault tolerance and robustness, ensuring consistent operation even if some sensors fail or provide erroneous data. Consequently, users can enjoy a more stable, reliable, and engaging experience, which can be particularly beneficial in safety-critical applications and complex interactive environments.
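One hedged way to picture such redundancy is a consistency check between two independent speed estimates, e.g., one from wheel data and one from integrating IMU acceleration; the residual threshold below is an invented tuning value.

```python
def fused_speed(v_wheel: float, v_imu: float, max_residual: float = 1.5) -> float:
    """Redundant speed estimate: average when the sources agree, fall back to
    wheel odometry when the IMU-integrated value drifts away (values in m/s)."""
    if abs(v_wheel - v_imu) > max_residual:
        return v_wheel  # distrust the drifting integrator
    return 0.5 * (v_wheel + v_imu)

print(fused_speed(13.9, 14.4))  # sources agree -> 14.15
print(fused_speed(13.9, 19.0))  # sources disagree -> 13.9
```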
According to an embodiment, computational tasks for audio and/or visual content are dynamically distributed between the portable device and/or an integrated device and the at least one experience device based on computational load and task complexity. By distributing computational tasks dynamically, the system can optimize resource utilization, ensuring that each device operates within its capacity for maximum efficiency. This can lead to enhanced performance and responsiveness of the experience device, as tasks are allocated based on real-time demands and device capabilities. Furthermore, this approach can reduce latency and prevent bottlenecks,
resulting in smoother audio and visual experiences for the user. Additionally, it can improve the overall robustness and reliability of the system by balancing the load and mitigating the risk of overloading any single device. Consequently, users can benefit from a seamless, high-quality interactive experience, with consistent performance even under varying computational demands.
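Purely as an illustration, the dynamic distribution could follow a load-based rule such as the one below; the load values, the load cap, and the flat task-cost model are assumptions, not the disclosed scheduling policy.

```python
def assign(task_cost: float, device_load: float, headset_load: float,
           cap: float = 0.8) -> str:
    """Pick where to run a task, given current loads and a load cap in [0, 1]."""
    if headset_load + task_cost <= cap:
        return "experience_device"              # render locally, lowest latency
    if device_load + task_cost <= cap:
        return "portable_or_integrated_device"  # offload to the hub device
    return "defer"                              # both busy: queue or simplify

print(assign(task_cost=0.3, device_load=0.9, headset_load=0.4))  # experience_device
print(assign(task_cost=0.3, device_load=0.4, headset_load=0.7))  # portable_or_integrated_device
```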
According to an embodiment, the portable device and/or the integrated device receives map data from external sources (including internet services), processes the data, and transmits it to the at least one experience device to create a virtual environment that matches the real-world road network. By receiving and processing real-world map data, the system can generate a highly accurate and realistic virtual environment, enhancing the user’s immersive experience. This capability can improve the relevance and context of the virtual content, making it more engaging and useful for navigation, training, or entertainment purposes. Furthermore, the synchronization of virtual environments with real-world road networks can enhance the safety and effectiveness of applications such as driver assistance systems or educational simulations. Consequently, users can benefit from a more reliable and contextually accurate interactive experience, which can lead to increased satisfaction and utility in various applications.
According to an embodiment, the portable device and/or the integrated device automatically activates and calibrates the experience device upon detecting its fixed position within the platform using sensor heuristics and/or NFC elements. This automatic activation and calibration can significantly enhance the ease of use and user experience by eliminating the need for manual setup, thereby saving time and reducing the potential for user error. By leveraging sensor heuristics or NFC elements, the system can ensure precise and reliable calibration, which can improve the accuracy and performance of the experience device. Furthermore, this capability can lead to a more seamless and intuitive interaction with the system, as the experience device is always correctly aligned and ready for use.
Consequently, users can enjoy a more efficient and effective setup process, leading to enhanced satisfaction and overall system reliability.
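Detection of the fixed position via sensor heuristics could, for example, correlate the device's acceleration time signal with the platform's; the sketch below uses a normalized lag-0 cross-correlation and an assumed decision threshold.

```python
import numpy as np

def is_mounted(dev_accel: np.ndarray, veh_accel: np.ndarray,
               threshold: float = 0.9) -> bool:
    """True if the two acceleration profiles (same sampling) move together."""
    dev = (dev_accel - dev_accel.mean()) / (dev_accel.std() + 1e-9)
    veh = (veh_accel - veh_accel.mean()) / (veh_accel.std() + 1e-9)
    corr = float(np.mean(dev * veh))  # normalized cross-correlation at lag 0
    return corr > threshold

t = np.linspace(0, 10, 500)
braking = np.where(t > 5, -3.0, 0.0)           # vehicle brakes after 5 s
noisy = braking + 0.05 * np.random.randn(500)  # device senses the same event
print(is_mounted(noisy, braking))              # True -> activate calibration
```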
According to an embodiment, the portable device and/or the integrated device act as a hub, gathering data from multiple sources (e.g., vehicle sensors, internet services) and distributing it to at least one experience device. Acting as a hub, the portable or integrated device can centralize data collection and management, ensuring that the experience device receives comprehensive and consistent information from diverse sources. This capability can enhance the accuracy and richness of the content provided to the experience device, improving the overall user experience. Furthermore, the centralized hub can streamline data processing and reduce latency, enabling more responsive and real-time interactions. Consequently, users can benefit from a seamless integration of various data inputs, leading to more immersive and informative experiences across different applications.
According to an embodiment, the portable device and/or the integrated device uses optical sensors, e.g., cameras and/or lasers and/or IR sensors, and/or UWB and/or BLE and/or other electromagnetic based sensors to estimate the pose of the experience device relative to the platform and/or other experience device within the platform. Utilizing these sensors for pose estimation can significantly enhance the precision and accuracy of tracking the experience device's position and orientation. This improved accuracy can lead to more seamless and immersive user experiences, as the system can provide more reliable and synchronized visual and spatial feedback. Additionally, the ability to estimate pose relative to other experience devices can facilitate coordinated multi-user interactions and shared experiences within the platform. Consequently, users can benefit from more engaging and realistic interactions, improving the overall effectiveness and satisfaction with the system.
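For ranging sensors such as UWB, the relative pose estimate can be illustrated with classic trilateration. The sketch below solves the linearized range equations by least squares; the anchor layout and names are invented for the example.

```python
import numpy as np

def trilaterate(anchors: np.ndarray, dists: np.ndarray) -> np.ndarray:
    """Position from >= 4 anchor ranges: subtract the first range equation
    from the others, which yields a linear system 2*(a_i - a_0) @ x = b_i."""
    a0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - a0)
    b = d0**2 - dists[1:]**2 + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# four hypothetical UWB anchors fixed in the vehicle frame (metres)
anchors = np.array([[0, 0, 0], [2, 0, 0], [0, 1.5, 0], [0, 0, 1.2]], float)
truth = np.array([0.7, 0.4, 0.9])                # headset position to recover
dists = np.linalg.norm(anchors - truth, axis=1)  # ideal range measurements
print(trilaterate(anchors, dists))               # ~[0.7, 0.4, 0.9]
```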
According to an embodiment, the portable device and/or the integrated device incorporates weather and/or environmental data from internet services and/or from detected objects outside of the platform into the at least one experience
device, providing a realistic representation and/or artistic re-interpretation of the current conditions; and wherein the portable device and/or an integrated device integrates data from external data sources, such as internet services or the like, to enhance the experience. By incorporating real-time weather and environmental data, the system can deliver highly realistic and contextually relevant experiences that reflect current conditions, thereby increasing user immersion and engagement. This capability can also allow for creative and artistic re-interpretations, offering users unique and dynamic visual and sensory experiences. Furthermore, integrating data from various internal and/or external sources, like sensors and/or data services, can enrich the content and functionality of the experience device, ensuring it remains up-to-date and relevant. Consequently, users can benefit from a more immersive, informative, and adaptable experience, enhancing overall satisfaction and utility in a wide range of applications.
According to an embodiment, the portable device and/or the integrated device provides motion-synchronized experiences not limited to spatial content, including 2D games and music, by providing relevant motion and location data to the respective experience device(s). By leveraging motion and location data, the system can enhance the interactivity and immersion of various content types, such as 2D games and music. This synchronization can create a more engaging user experience by aligning in-game actions or musical rhythms with the user's physical movements and the platform's dynamics.
Additionally, this capability can expand the applicability of the system beyond spatial content, making it versatile and valuable for a wider range of entertainment and educational applications. Consequently, users can benefit from enriched, dynamic experiences that respond to their movements, leading to greater enjoyment and a deeper connection with the content.
According to an embodiment, the portable device and/or the integrated device uses additional GNSS antennas mounted on the platform to improve localization accuracy. By utilizing additional GNSS antennas, higher precision in determining the platform's position and movement can be achieved. This enhanced localization accuracy can significantly improve the
reliability and performance of navigation and tracking functions, ensuring that the experience device operates with greater fidelity and responsiveness. Furthermore, the improved accuracy can benefit various applications, including augmented reality (AR), virtual reality (VR), and other location-based services, by providing more precise and stable positioning data. Consequently, users can experience increased immersion, safety, and effectiveness in a wide range of interactive and real-time applications.
According to an embodiment, the portable device and/or the integrated device coordinates multiple experience devices within the platform, ensuring that all experience devices receive consistent data for synchronized experiences. By coordinating multiple experience devices, the system can ensure a harmonized and unified user experience across all devices. This synchronization can enhance the overall immersive quality of the experience by maintaining consistent visual, audio, and interactive elements, which is crucial for applications such as multiplayer gaming, collaborative VR environments, and synchronized media consumption. Additionally, providing consistent data to all experience devices can reduce latency and discrepancies, thereby improving the responsiveness and realism of the shared experience. Consequently, users can benefit from a seamless and cohesive interaction, increasing satisfaction and engagement in multi-user and multi-device scenarios.
According to an embodiment, the portable device and/or the integrated device uses redundant sensors to validate the accuracy of the data and ensure a correct calibration of the experience device. Utilizing redundant sensors can significantly enhance the reliability and accuracy of the system by cross-verifying data from multiple sources. This redundancy can detect and correct errors, ensuring that the experience device maintains precise calibration and operates optimally. Additionally, the use of redundant sensors can improve fault tolerance, allowing the system to continue functioning correctly even if one sensor fails or provides inaccurate data. Consequently, users can benefit from a more robust and dependable experience, with increased accuracy and reliability in various applications.
According to an embodiment, the pose of the portable device and/or the integrated device can be determined and/or adjusted based on user input on the device's display, and/or platform display, and/or the experience device display. Allowing user input to determine or adjust the device's pose can significantly enhance the accuracy and flexibility of the system by enabling precise manual calibration. This user-driven adjustment capability can accommodate various usage scenarios and individual preferences, ensuring that the device operates optimally in different environments. Additionally, this feature can improve the user experience by providing intuitive and direct control over the device's configuration, leading to greater satisfaction and usability. Consequently, users can benefit from a more accurate, adaptable, and user-friendly system, enhancing the overall effectiveness and enjoyment of the experience.
According to an embodiment, the portable device and/or the integrated device ensures correct positioning of virtual avatars in a multi-user experience based on the relative pose and/or motion of the at least one experience device and the body poses tracked by those devices. Ensuring accurate positioning of virtual avatars can significantly enhance the realism and immersion of multi-user experiences, making interactions more natural and intuitive. This capability can prevent discrepancies in avatar placement, which can disrupt the user experience and reduce the effectiveness of collaborative or interactive applications. Additionally, by accurately tracking and synchronizing body poses, the system can facilitate more engaging and cohesive virtual environments, where users can interact seamlessly. Consequently, users can enjoy a more immersive, interactive, and cohesive experience, leading to higher satisfaction and greater utility in applications such as virtual meetings, gaming, and collaborative workspaces.
According to an embodiment, the portable device and/or the integrated device provides an interactive user interface to guide the user through the setup and calibration process, ensuring correct installation and configuration. This interactive user interface can significantly enhance the ease of use and
accessibility of the system by providing clear, step-by-step instructions, which can reduce the potential for user error during setup and calibration. By facilitating correct installation and configuration, it can ensure optimal performance and reliability, improving the accuracy and effectiveness of the experience device. Additionally, this guided process can save time and effort for users, leading to a more efficient and satisfactory user experience. Consequently, users can benefit from a more intuitive and streamlined setup process, resulting in enhanced functionality and overall satisfaction with the system.
According to an embodiment, the portable device and/or the integrated device is mounted and/or placed and/or held in various poses within the platform, including fixed mounts, handheld positions, and/or semi-rigid attachments and/or placements. This flexibility in mounting and positioning can significantly enhance the adaptability and usability of the portable device and/or integrated device across different environments and use cases. By allowing the device to be securely fixed, handheld, or semi-rigidly attached, it can ensure consistent and reliable performance regardless of the installation method. Additionally, this versatility can cater to diverse user preferences and operational requirements, improving the overall user experience. Consequently, users can benefit from a more flexible and adaptable system that maintains optimal functionality and reliability in various scenarios, leading to increased satisfaction and broader applicability.
According to an embodiment, the portable device and/or the integrated device uses data on the relative positions of the experience device to provide a spatial audio experience, ensuring that sound directionality matches the visual scene. By leveraging positional data, the system can deliver highly immersive and realistic spatial audio, enhancing the overall user experience. This alignment of sound directionality with the visual scene can create a more cohesive and engaging environment, significantly improving the sense of presence and immersion for the user. Additionally, this capability can enhance the effectiveness of applications such as virtual reality (VR), augmented reality (AR), and gaming, where accurate spatial
audio is crucial for a realistic experience. Consequently, users can benefit from a more intuitive and immersive auditory experience, which complements the visual content and enhances the overall quality and satisfaction with the system.
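As a minimal sketch of how sound directionality could be matched to the visual scene, the function below converts a sound source position given in the vehicle frame into azimuth, elevation, and distance relative to the listener's head. The function name and the frame conventions are assumptions for illustration only.

```python
import numpy as np

def spatial_audio_params(T_headset_vehicle, source_pos_vehicle):
    """Return azimuth/elevation/distance of a sound source relative to the
    listener's head, given a transform mapping vehicle-frame points into
    the headset frame (i.e., the inverse of the headset pose in the vehicle).
    """
    p = T_headset_vehicle @ np.append(source_pos_vehicle, 1.0)
    x, y, z = p[:3]
    distance = np.linalg.norm(p[:3])
    azimuth = np.arctan2(y, x)   # left/right panning angle (axis convention assumed)
    elevation = np.arcsin(z / max(distance, 1e-9))
    return azimuth, elevation, distance
```

These parameters could then feed any spatial audio renderer so that, for example, a virtual engine sound stays anchored to the front of the vehicle as the user turns their head.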
According to an embodiment, the portable device and/or the integrated device automatically detects its mount pose within the platform and adjusts the calibration and data processing accordingly. Automatic detection of the mount pose can significantly enhance the system’s ease of use and accuracy by eliminating the need for manual adjustments. This capability can ensure that the device is always correctly calibrated, leading to more precise and reliable data processing. Additionally, by dynamically adjusting calibration and data processing based on the detected mount pose, the system can maintain optimal performance regardless of the device’s position within the platform. Consequently, users can benefit from a more intuitive and reliable system, which can improve the overall user experience and satisfaction by providing consistent and accurate functionality in various mounting scenarios.
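One plausible realization of such automatic detection, sketched here under the assumption that a time-aligned vehicle acceleration signal is available (e.g., from platform sensors), is to correlate the device's own acceleration profile with that of the vehicle; the threshold value is an arbitrary illustrative choice.

```python
import numpy as np

def is_mounted(device_accel, vehicle_accel, threshold=0.9):
    """Heuristic mount detection: if the device's acceleration time signal
    closely matches the vehicle's, the device is assumed to be fixed or
    motion-limited in the platform. Both inputs are 1-D numpy arrays of
    acceleration magnitudes sampled at the same rate and time-aligned.
    """
    a = device_accel - device_accel.mean()
    b = vehicle_accel - vehicle_accel.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom < 1e-9:          # both signals flat: no evidence either way
        return False
    correlation = float(np.dot(a, b) / denom)
    return correlation >= threshold
```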
According to an embodiment, the portable device and/or the integrated device pre-processes data, such as map data or virtual scenes, and streams the processed data to the experience device for rendering. By performing pre-processing tasks, the system can offload computationally intensive operations from the experience device, thereby enhancing its performance and responsiveness. This can lead to smoother and more efficient rendering of complex scenes, improving the overall user experience. Additionally, preprocessing data before streaming it to the experience device can reduce latency and ensure that the visual and interactive elements are delivered in real-time. Consequently, users can benefit from a more seamless and immersive experience, with high-quality visuals and interactions that enhance the effectiveness and enjoyment of the system.
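The following sketch illustrates the pre-processing idea in its simplest form: the device filters map data down to the tiles near the current geoposition before streaming, so the experience device only receives what it needs to render. The tile dictionary layout, the coarse distance filter, and the `send` callable are illustrative assumptions.

```python
import json

def preprocess_and_stream(map_tiles, geoposition, radius_m, send):
    """Pre-filter map data to the tiles near the platform's current
    geoposition and stream only those to the experience device, offloading
    the filtering work from the headset. `send` is any callable that
    delivers bytes, e.g., over a Wi-Fi tethering socket.
    """
    lat, lon = geoposition
    nearby = [
        tile for tile in map_tiles
        if abs(tile["lat"] - lat) * 111_000 < radius_m   # ~meters per degree latitude
        and abs(tile["lon"] - lon) * 111_000 < radius_m  # coarse filter, ignores cos(lat)
    ]
    send(json.dumps({"tiles": nearby}).encode("utf-8"))
```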
The above-described subject matter may be summarized in other words and in relation to a possible, more specific embodiment of the disclosure as described below, the following description not being construed as limiting the disclosure.
This disclosure describes a solution that uses a portable mobile device that may be operated in the platform and that may therefore allow performing separate measurements, i.e., (cyclic or periodic) measurements concerning the pose and/or motion and/or acceleration of the platform in the environment (e.g., a geoposition) and (cyclic or periodic) measurements concerning the pose and/or motion and/or acceleration of the headset in the platform. These measurements can be fused to compute smoother pose and/or motion and/or acceleration. In particular, such a portable mobile device can be a smartphone or a tablet PC (PC - personal computer) or a smart watch or a laptop or a multi-functional device with similar capabilities. Due to its multifunctional nature, a device of that type may come with suitable hardware, in particular at least one sensor and/or at least one receiver that may be used for implementing the solution. By operating the device in the platform in a fixed or limited relative position to the platform, the device may in particular obtain the two separate measurements concerning a) the platform relative to the outdoor environment of the platform (e.g., a world reference frame on Earth) and b) the headset relative to the platform.
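By way of illustration only, the sketch below fuses the two separate measurements for a single degree of freedom (heading): the platform's heading in the world frame is composed with the headset's heading in the platform, and a simple complementary filter smooths the result against the previous estimate. The function name and the filter weight are assumptions; a full implementation would fuse all pose dimensions.

```python
import numpy as np

def fuse_headset_world_heading(heading_vehicle_world,
                               heading_headset_vehicle,
                               prev_estimate,
                               alpha=0.98):
    """Fuse (a) the platform heading in the world frame (e.g., from GNSS/IMU)
    with (b) the headset heading relative to the platform, and smooth the
    composed headset-in-world heading against the previous estimate.
    Angles are in radians; `alpha` weights the new composed measurement.
    """
    composed = heading_vehicle_world + heading_headset_vehicle
    # Blend on the unit circle to avoid wrap-around artifacts at +/- pi.
    s = alpha * np.sin(composed) + (1 - alpha) * np.sin(prev_estimate)
    c = alpha * np.cos(composed) + (1 - alpha) * np.cos(prev_estimate)
    return float(np.arctan2(s, c))
```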
For operating or supporting or enabling an experience device in a moving platform, the following aspects provide advantages. In the sequel, the “moving platform” is referred to as “vehicle” without the intent of limiting the scope of this disclosure, and the “portable and/or integrated device” is referred to as “device” without the intent of limiting the scope of this disclosure:
- The device can be fixed in the vehicle using a rigid mount so that it doesn’t change its position relative to the vehicle
- the mount can have an anti-slip surface or a mechanical grip or a magnetic grip
- Limited motion can be achieved without a mount; sensor filters can mitigate the resulting error between the device and the vehicle
- the mount can have an additional GNSS antenna (Global Navigation Satellite System) that can be connected to the device
- the antenna can be wired into the car interior or placed at arbitrary places such as under the hood, on the roof, under the chassis, or in the trunk.
- the antenna can be wirelessly connected to the device, for example, the device could connect to a roof antenna of the platform. The antenna or an antenna receiver for the antenna may be linked to the device by a wired or wireless link for providing a received signal, in particular geo-position data, to the device.
- the additional antenna may have a magnetic base element for placing the antenna on a magnetic surface, e.g., a car chassis
- in general, at least one supportive sensor (GNSS (including all satellite system constellations), IMU, motion) may be provided in the vehicle and may be connected to the device, if the corresponding internal sensor of the device does not provide sufficient precision
- When the device is mounted and/or fixed and/or motion-limited in the platform, the user can calibrate the system
- Alternatively, the calibration can be triggered by using a certain RFID tag, an active charging state, or similar.
- Additionally, or alternatively, the calibration can automatically be started depending on sensor heuristics.
- Instructions regarding the mounting position, the calibration and/or the connectivity of one or more Experience Devices can be provided on the integrated display of the Device and/or on the Experience Device. The calibration and mounting position can be validated using other (redundant) sensors such as optical sensors.
- The device can have at least one of GNSS technology, inertial, Wi-Fi, magnetometer, map data, cellular, radio, UWB, optical (including, for example, camera/s, laser, radar or the like) and/or barometer data sources and (in case of several such data sources) fuse their data and/or transmit raw data (all or one or some) to one or more experience devices, via a wireless and/or wired connection
- One or multiple experience devices can use these data for multiple (alternative or concurrent) purposes, including
- receiving motion and/or location information of the vehicle
- processing motion and/or location information of the vehicle
- tracking the motion, position and/or orientation of the experience device/s relative to the device and/or vehicle and/or the other experience device/s in the vehicle, and by that achieving
- experience device/s 6dof tracking (dof - degrees of freedom, i.e., 3 translational, 3 rotational)
- and/or experience device/s 3dof tracking (3 translational degrees of freedom or 3 rotational degrees of freedom)
- and/or any subset of these dimensions (for example heading only)
- and/or headset drift mitigation for position and/or orientation
- one or several reference elements (devices) that can be active or passive, for example a beacon, a set of LEDs, a QR code or similar can (additionally but not necessarily) be placed on the headset to be identified by the device for the purpose of computing relative pose or motion or similar
- Multiple experience devices can estimate the relative position with respect to the device (and/or the device with respect to the experience device/s) and by that compute relative spatial information
- the position and/or orientation of the device in the vehicle
- can be determined on the device’s display by user input
- and/or can be determined on the experience device/s by user input
- and/or can be determined via analysis of optical data (including cameras, laser and/or similar) and/or electromagnetic data captured by the experience device and/or by the device
- and/or can be pre-defined by the mount position
- any or all of this data can be used for multi-user experiences to place the characters in a common digital environment, e.g., a cockpit or scene at the corresponding offset, and/or for single-user but social experiences
- Spatial audio in the experience device can depend on the relative pose of the headset with regard to the device in the vehicle,
- and/or on the vehicle location and/or pose and/or motion
- The mobile device can also receive map data and forward it to the experience device, e.g., by receiving it from the internet via the cellular network and forwarding it via Wi-Fi tethering to the respective experience device
- the experience device/s can send map requests to the device in order to create a virtual environment that matches the road network
- and/or can receive map data from the device
- Content-specific pre-calculations can be done on the Device. In other words, workload can be outsourced from the Experience Device's processing unit to the Device and be shared with multiple Experience Devices. This can be:
(1) Preprocessing the map data (preprocessed / pre-filtered map data streaming)
(2) Generation of a virtual scene or experience generated from map data (geometry/point cloud/texture/description/story streaming)
(3) Renderings of that virtual scene or experience (video/audio streaming)
- An interface can be provided in the device that can connect to the vehicle for receiving data from the vehicle, e.g., for using at least one sensor of the vehicle for obtaining data on vehicle position and/or orientation and/or motion and/or acceleration
- The computational tasks for providing the experience device pose and/or spatial content may be distributed or shared among the device and the experience device. This can be done dynamically, e.g., as a function of computational load and/or complexity of the task.
- Possible solution for enabling the in-vehicle motion-synchronized mode (device interacts with experience device): when it is detected that the device is in a fixed or limited pose, or when actively activated by the user, a calibration for the device is called
- Use of device as reference point for defining the pose of the experience device relative to the vehicle or to the device itself: device detects experience device/s, experience device/s detect device (position of device in vehicle may be known or unknown, position relative to device may suffice)
- device and/or experience device/s may use magnetic and/or optical and/or UWB markers and/or electromagnetic sensors for tracking using, e.g., triangulation / trilateration (see the trilateration sketch after this list),
- the device may detect that it is placed inside the vehicle in a mount and/or inside a predefined area using activity of the (wireless / wired) charging system and/or an NFC element in the mount and/or by checking that the device's acceleration profile (time signal) matches the vehicle's acceleration profile: support mode and/or calibration is automatically activated
- Once the device is fixed and/or motion-limited, the user can call a calibration with regard to the device: the relative location and/or orientation and/or position (pose) is observed
- pose of device in vehicle may be known (by user input or an RFID tag / NFC element in the vehicle, but not limited to these approaches)
- The device may receive GNSS signals and/or camera, optical sensor, laser, magnetometer, accelerometer, and/or cellular data, and/or similar, may fuse them or not, and transmits them to the experience device/s in real time, to enable geo-localization for the experience device/s
- One or multiple experience devices may track the pose relative to the device (e.g., by radio signal triangulation) and by that gain
- either 6dof tracking support
- or 3dof tracking support
- or a subset of these degrees of freedom (e.g., heading only)
- or drift mitigation for 3dof (degrees of freedom)
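The trilateration mentioned in the list above can be sketched as a small least-squares problem, assuming the positions of at least four anchors (e.g., UWB markers in the vehicle) and range measurements to each are known. This is an illustrative outline, not the claimed implementation.

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Estimate a 3-D position from distances to known anchor points by
    linearizing the range equations and solving the resulting least-squares
    system. Requires at least four non-coplanar anchors for a 3-D fix.
    """
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    # Subtract the first range equation from the others to eliminate the
    # quadratic unknown terms, yielding a linear system A x = b.
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position
```

In practice the range measurements are noisy, so the least-squares residual also gives a cheap plausibility check on the estimate.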
Up to now, the disclosure has been described with respect to the claimed method. Features, advantages or alternative embodiments herein can be assigned to the other claimed objects (e.g., portable and/or integrated device, system, processor circuit) and vice versa. In other words, the subject matter which is claimed or described with respect to the claimed method can be improved with features described or claimed in the context of the portable and/or integrated device, system, or processor circuit and vice versa. In this case, the functional features of the method are embodied by structural units of the portable and/or integrated device, system, and vice versa, respectively. Generally, in computer science a software implementation and
a corresponding hardware implementation are equivalent. Thus, for example, a method step for “storing” data may be performed with a storage unit and respective instructions to write data into the storage. For the sake of avoiding redundancy, although the device may also be used in the alternative embodiments described with reference to the method, these embodiments are not explicitly described again for the device.
The disclosure also includes a portable device. The portable device is configured to perform the disclosed method. The portable device according to the disclosure is preferably designed as a smartphone, tablet, PDA, smartwatch, and the like.
The disclosure also includes an integrated device. The integrated device is configured to perform the disclosed method. The integrated device according to the disclosure is preferably designed as a control unit of a platform, preferably a vehicle.
According to an embodiment, the portable device is configured to act as a hub and integrate data from the platform, internet services, and/or data from its own sensors, in particular for supporting and enabling motion-synchronized experiences.
As a further solution, the disclosure also comprises a computer-readable storage medium comprising program code which, when executed by a computer, portable device, or a computer network, causes it to execute an embodiment of the method according to the disclosure. The storage medium may be provided at least in part as a non-volatile data storage (e.g., as a flash memory and/or as an SSD - solid state drive) and/or at least in part as a volatile data storage (e.g. as a RAM - random access memory). The storage medium can be arranged in the computer, portable device, or computer network. However, the storage medium can also be operated as a so-called Appstore server and/or cloud server on the Internet, for example. The computer or computer network can provide a processor circuit with, for example, at least one microprocessor. The program code can be provided as
binary code and/or as assembler code and/or as source code of a programming language (e.g., C) and/or as a program script (e.g. Python).
As a further solution, the disclosure also comprises a computer program having program code or program means stored on a computer-readable medium, wherein the program code or the program means, when the computer program is executed on a computer or a computer-based processing unit, causes the computer or the computer-based processing unit to execute a method according to any of the preceding method claims.
The disclosure also includes combinations of the features of the embodiments described. Thus, the disclosure also includes implementations each comprising a combination of the features of several of the described embodiments, provided that the embodiments have not been described as mutually exclusive.
The embodiments as explained are preferred embodiments of the disclosure. In the embodiments, the described components of the embodiments each represent individual features of the disclosure which are to be considered independently of each other and which also further develop the disclosure independently of each other. Therefore, the disclosure is also intended to include combinations of the features of the embodiments other than those shown. Furthermore, the described embodiments can also be supplemented by further features of the disclosure already described.
Brief Description of the Drawings
In the following, the disclosure will further be described with reference to exemplary embodiments illustrated in the figures, in which:
Fig. 1 schematically illustrates a system comprising a platform and a portable device, and
Fig. 2 schematically illustrates a flow chart of the method for operating at least one experience device in a movable and/or moving platform using at least one portable device and/or at least one integrated device.
In the figures, identical reference signs denote elements with the same function.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, in order to provide a thorough understanding of the current disclosure. It will be apparent to one skilled in the art that the current disclosure may be practiced in other embodiments that depart from these specific details. For example, the skilled artisan will appreciate that the current disclosure may be practiced with any application for different functionalities or for different computing entities.
Fig. 1 depicts a system for operating at least one experience device within a moving platform. The system is presented as the movable and/or moving platform 1. The movable and/or moving platform 1 is shown with various components integrated into its structure. A portable device 2 and an integrated device 3 are located within/on the movable and/or moving platform 1. The portable device 2 could be a smartphone, tablet PC, smart watch, or multi-functional device. The integrated device 3 comprises sensors such as an inertial measurement unit, GNSS receiver and antenna, wheel data, steering data, speed data, camera, optical sensor (e.g., lasers and/or radar), magnetometer, and/or accelerometer. These sensors gather data to determine the pose, movement, and/or acceleration of the movable and/or moving platform 1.
Platform sensors 5 are distributed within the movable and/or moving platform 1 to enhance the accuracy, redundancy, and robustness of user experience when using the experience device 4. The experience device 4 is configured to present spatial content such as audio, visual, VR, AR, or MR content based on data received from the portable device 2 and/or integrated device
3. The experience device 4 can also present 2D content and is capable of rendering motion-synchronized content.
The system performs measurements to determine the pose, motion, and acceleration of the movable and/or moving platform 1. These measurements enable a motion-synchronized experience on the experience device 4. The experience device 4 operates based on the measurements from the portable device 2 and/or integrated device 3. The world frame 7 represents the reference coordinate system for external data integration, ensuring the alignment of the VR or other digital world with the real-world environment. There is a need to align the virtual reality (VR) world (or other digital world) with the real world using three coordinate systems: the world coordinate system 7, the vehicle coordinate system 6, and the coordinate system of the experience device 4. These systems work together to provide a synchronized and immersive experience. The world coordinate system 7 serves as the reference frame representing the external world, crucial for aligning digital scenes with the real-world road network. It ensures accurate and immersive experiences by providing a stable reference for all movements and positions. Operating on top of the world coordinate system 7, the vehicle coordinate system 6 is essential for understanding the vehicle's movement and orientation within the world frame 7. It utilizes data from sensors such as, but not limited to, the inertial measurement unit (IMU), GNSS receiver and antenna, wheel data, steering data, speed data, camera, optical sensor (e.g., lasers and/or radar), magnetometer, and/or accelerometer to determine the vehicle's pose and/or movement and/or acceleration accurately.
Controlled by both the vehicle's movement and the user's head movements, the experience device coordinate system ensures that the experience device’s orientation and position are synchronized with the vehicle's movements and the user's viewing direction. This synchronization is vital for maintaining a cohesive digital experience. It is important to synchronize the vehicle's rotation with the experience device rotation, especially when the experience device lacks reliable inside-out tracking. This involves using external devices to measure and transmit data about the vehicle's movement to the experience device, ensuring accurate alignment. Therefore, the
portable device 2, such as, but not limited to, a smartphone, attached to the platform 1 is used. Alternatively, or additionally, an integrated device 3 integrated into the moving platform 1 can be used. These devices gather data from an inertial measurement unit (IMU), GNSS receiver and antenna, wheel data, steering data, speed data, camera, optical sensor (e.g., lasers and/or radar), magnetometer, and/or accelerometer, process it and/or transmit it raw for processing on the experience device 4. This setup facilitates vehicle localization and motion synchronization without relying on fixed vehicle sensors, enhancing flexibility and usability. Additionally, the portable device 2 and/or integrated device 3 can integrate weather conditions and other data from internet services into the digital experience. By processing this information within the world frame, it enhances the realism and contextual relevance of the digital scenes, providing users with a more immersive and interactive experience.
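Purely as an illustrative sketch of the three-frame alignment described above, the snippet below chains a vehicle-in-world pose (world frame 7 to vehicle frame 6) with a headset-in-vehicle pose to obtain the headset pose in the world frame. The yaw-only rotations and the numeric values are assumptions chosen for readability.

```python
import numpy as np

def yaw_transform(x, y, z, yaw):
    """4x4 homogeneous transform from a position and a heading (yaw, rad)."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = (x, y, z)
    return T

# Hypothetical measurements: vehicle pose in the world frame 7 (e.g., from
# GNSS + IMU) and headset pose in the vehicle frame 6 (e.g., from the
# portable device 2 tracking the experience device 4).
T_world_vehicle = yaw_transform(4_500_123.0, 612_480.0, 0.0, yaw=1.1)
T_vehicle_headset = yaw_transform(0.5, -0.3, 1.2, yaw=-0.2)

# Headset pose in the world frame: the alignment the rendering needs so the
# digital scene stays registered to the real-world road network.
T_world_headset = T_world_vehicle @ T_vehicle_headset
```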
The portable device 2 and/or integrated device 3 perform loading and preprocessing of map data, generating virtual scenes, experiences, audio, rendered images, and virtual elements from the map data, and streaming the pre-processed and/or not pre-processed content to the experience device 4. The portable device 2 and/or integrated device 3 enable multiple experience devices 4-x within the movable and/or moving platform 1 to provide synchronized spatial audio, location-aware, and motion-aware audio and visual experiences based on the relative positions of the experience devices 4-x.
The portable device 2 and/or integrated device 3 integrate data from platform sensors 5, such as but not limited to wheel ticks and/or vehicle speed and/or steering angle and/or GNSS data and/or optical data and/or IMU data, to enhance the user experience accuracy and robustness. Computational tasks for audio and visual content are dynamically distributed between the portable device 2 and/or integrated device 3 and the experience device 4 based on computational load and task complexity. The portable device 2 and/or integrated device 3 can automatically activate and calibrate the experience
device 4 upon detecting its temporary fixed position within the movable and/or moving platform 1 using sensor heuristics and/or NFC elements.
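A minimal sketch of such dynamic task distribution, with an invented scoring scheme and threshold purely for illustration, could look as follows:

```python
def assign_render_task(task_complexity, device_load, headset_load,
                       offload_threshold=0.75):
    """Decide where a rendering/audio task runs, distributing work between
    the portable/integrated device and the experience device based on load
    and complexity. Loads are normalized to [0, 1]; complexity is an
    abstract cost estimate on the same scale.
    """
    # Prefer the experience device for cheap tasks to save streaming latency.
    if task_complexity < 0.2 and headset_load < offload_threshold:
        return "experience_device"
    # Heavy tasks go to whichever side currently has more headroom.
    return "device" if device_load <= headset_load else "experience_device"
```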
The portable device 2 and/or integrated device 3 act as a hub, gathering data from multiple sources and distributing it to the experience device 4. The portable device 2 and/or integrated device 3 incorporate weather and environmental data from internet services and detected objects outside of the platform into the experience device 4, providing a realistic representation or artistic re-interpretation of the current conditions. The portable device 2 and/or integrated device 3 integrate data from external data sources, such as but not limited to internet services, to enhance the experience.
The pose of the portable device 2 and/or integrated device 3 can be determined and adjusted based on user input on the portable and/or integrated device's display, platform display, or the experience device display. The portable device 2 and/or integrated device 3 ensure correct positioning of virtual avatars in a multi-user experience based on the relative pose and motion of the experience device 4 and the body poses tracked by those devices.
The portable device 2 and/or integrated device 3 is mounted, placed, or held in various poses within the movable and/or moving platform 1, including fixed mounts, handheld positions, and/or semi-rigid attachments. The portable device 2 and/or integrated device 3 can automatically detect its pose within the movable and/or moving platform 1 and adjust the calibration and data processing accordingly.
The portable device 2, when used in the movable and/or moving platform 1, is also configured to process the method 200 for operating the portable device 2 and/or the integrated device 3 in the movable and/or moving platform 1, wherein a movement and/or pose of the portable device 2 and/or integrated device 3 relative to the platform 1 is limited or fixed, and wherein the portable device 2 and/or integrated device 3 supports at least one experience device 4 being located in the platform 1 to determine a pose and/or a movement
and/or an acceleration of the respective experience device 4 relative to the platform 1.
Fig. 2 illustrates the method 100 for operating at least one experience device 4 in a movable and/or moving platform 1 using at least one portable device 2 and/or at least one integrated device 3, wherein a movement and/or pose of the portable device 2 and/or integrated device 3 relative to the movable and/or moving platform 1 is limited or fixed. The method 100 can be processed by the portable device 2 and/or integrated device 3 and/or the experience device 4, when used in the movable and/or moving platform 1. The method 100 comprises several method steps. In a first step 110, at least one measurement is performed to determine a pose and/or motion and/or acceleration of the platform 1. In a further step 120, the measurement is leveraged to enable a motion-synchronized experience on the at least one experience device 4. In a further step 130, the at least one experience device 4 is operated based on the measurement.
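For illustration, the three method steps can be pictured as a small processing pipeline; the data types and the stubbed measurement below are hypothetical and only mirror the structure of steps 110 to 130.

```python
from dataclasses import dataclass

@dataclass
class PlatformMeasurement:
    heading: float   # rad, platform heading in the world frame
    speed: float     # m/s

def perform_measurement() -> PlatformMeasurement:
    """Step 110: determine pose/motion of the platform (stubbed here)."""
    return PlatformMeasurement(heading=0.35, speed=13.9)

def leverage_measurement(m: PlatformMeasurement) -> dict:
    """Step 120: derive rendering parameters for a motion-synchronized scene."""
    return {"camera_yaw": m.heading, "scene_scroll_rate": m.speed}

def operate_experience_device(params: dict) -> None:
    """Step 130: drive the experience device with the derived parameters."""
    print(f"rendering with {params}")

operate_experience_device(leverage_measurement(perform_measurement()))
```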
Reference Numerals
1 moving platform
2 portable device
3 integrated device
4 experience device
4-x multiple experience devices
5 platform sensors
6 vehicle frame
7 world frame
100 method for operating at least one experience device
110-130 Method steps
200 method for operating a portable device and/or an integrated device
Claims
1. A method (100) for operating at least one experience device (4) in a movable and/or moving platform (1) using at least one portable device (2) and/or at least one integrated device (3), wherein a movement and/or pose of the portable device (2) and/or integrated device (3) relative to the platform (1) is limited or fixed, the method comprising the steps of:
- performing at least one measurement (110) to determine a pose and/or motion and/or acceleration of the platform (1);
- leveraging the measurement (120) to enable a motion-synchronized experience on the at least one experience device (4); and
- operating (130) the at least one experience device (4) based on the measurement.
2. A method (200) for operating a portable device (2) and/or an integrated device (3) in a movable and/or moving platform (1), wherein a movement and/or pose of the portable device (2) and/or integrated device (3) relative to the platform (1) is motion-limited or fixed, and wherein the portable device (2) and/or integrated device (3) supports at least one experience device (4) being located in the platform (1) to determine a pose and/or a movement and/or an acceleration of the respective experience device (4) relative to the platform (1).
3. The method (100, 200) according to any of the preceding claims, wherein the movable and/or moving platform (1) comprises at least a bike, rollercoaster, industrial vehicle, car, bus, train, truck, plane, helicopter, and/or ship, and/or the like moving platform.
4. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) comprises:
- a smartphone,
- a tablet PC,
- a smart watch, and/or
- a multi-functional device.
5. The method (100, 200) according to any of the preceding claims, wherein the integrated device (3) comprises at least one integrated sensor within or attached to the platform (1), selected from the group consisting of an inertial measurement unit (IMU), global navigation satellite system (GNSS) receiver, camera, optical sensor, magnetometer, accelerometer, wheel sensor, and steering sensor, and wherein the integrated device (3) is configured to gather data from these sensors to determine the pose and/or movement and/or acceleration of the platform (1).
6. The method (100, 200) according to any of the preceding claims, wherein the experience device (4) comprises a unit for presenting audio and/or visual content, including spatial and/or VR content and/or AR content and/or MR content, or a display unit for 2D content, or a vehicle infotainment system capable of rendering content based on data received from the portable device (2) and/or integrated device (3), in particular in a motion-synchronized manner.
7. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) performs loading and pre-processing of map data, generating a virtual scene and/or experience and/or audio and/or rendered images and/or virtual elements from the map data, and streaming the pre-processed content to the at least one experience device (4).
8. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) enable multiple experience devices (4-x) within the platform (1) to provide synchronized spatial audio and/or location- and/or motion-aware audio and/or visual experiences based on the relative poses and motion of the experience devices (4-x).
9. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) integrates data from platform sensors (5) to enhance the accuracy and/or redundancy and/or robustness of the user experience when using the at least one experience device (4).
10. The method (100, 200) according to any of the preceding claims 6 to 9, wherein computational tasks for audio and/or visual content are dynamically distributed between the portable device (2) and/or the integrated device (3) and the at least one experience device (4) based on computational load and task complexity.
11. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) automatically activates and calibrates the experience device (4) upon detecting its fixed position within the platform (1) using sensor heuristics and/or NFC elements.
12. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) act as a hub, gathering data from multiple sources and distributing it to the at least one experience device (4).
13. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) incorporates weather and/or environmental data from internet services and/or from internal and/or peripheral sensors, and/or objects detected by those sensors into the at least one experience device (4), providing a realistic representation and/or artistic re-interpretation of the current conditions; and wherein the portable device (2) and/or an integrated device (3) integrates data from external data sources, such as internet services or the like, to enhance the experience.
14. The method (100, 200) according to any of the preceding claims, wherein pose data describing the pose of the portable device (2) and/or the integrated device (3) can be determined and/or adjusted based on user input on the device's display, and/or platform display, and/or the experience device display.
15. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) ensures correct positioning of virtual avatars in a multi-user experience based on the relative pose and/or motion of the at least one experience device (4) and the body poses tracked by those devices.
16. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) is mounted and/or placed and/or held in various poses within the platform (1), including fixed mounts, handheld positions, and/or semi-rigid attachments and/or placements.
17. The method (100, 200) according to any of the preceding claims, wherein the portable device (2) and/or the integrated device (3) automatically detects its mount pose within the platform (1) and adjusts the calibration and data processing accordingly.
18. A portable device (2) that is configured to perform a method according to any of the preceding method claims.
19. A computer program having program code or program means stored on a computer-readable medium, wherein the program code or the program means, when the computer program is executed on a computer or a computer-based processing unit, causes the computer or the computer-based processing unit to execute a method according to any of the preceding method claims.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102023116974 | 2023-06-27 | ||
| DE102023116974.2 | 2023-06-27 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025003379A1 true WO2025003379A1 (en) | 2025-01-02 |
Family
ID=91759389
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2024/068213 Pending WO2025003379A1 (en) | 2023-06-27 | 2024-06-27 | Usage of a mobile device or integrated device supporting and/or enabling motion synchronized experiences on a moving platform |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025003379A1 (en) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180081426A1 (en) * | 2016-09-21 | 2018-03-22 | Apple Inc. | Relative intertial measurement system |
| WO2023072779A1 (en) * | 2021-10-29 | 2023-05-04 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for ascertaining an installation pose of a vehicle-mounted inertial sensor system in a motor vehicle |
| US20230161169A1 (en) * | 2021-11-22 | 2023-05-25 | Toyota Jidosha Kabushiki Kaisha | Image display system |
Non-Patent Citations (1)
| Title |
|---|
| MCGILL, MARK, ET AL: "PassengXR: A Low Cost Platform for Any-Car, Multi-User, Motion-Based Passenger XR Experiences", THE ADJUNCT PUBLICATION OF THE 35TH ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY, ACMPUB27, NEW YORK, NY, USA, 29 October 2022 (2022-10-29), pages 1 - 15, XP058912728, ISBN: 978-1-4503-9427-7, DOI: 10.1145/3526113.3545657 * |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24737766; Country of ref document: EP; Kind code of ref document: A1 |
| | DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |