
WO2024136903A1 - Methods and systems for defining virtual boundaries - Google Patents

Methods and systems for defining virtual boundaries

Info

Publication number
WO2024136903A1
WO2024136903A1 (PCT/US2022/082374)
Authority
WO
WIPO (PCT)
Prior art keywords
boundary
distance
initial
location
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2022/082374
Other languages
French (fr)
Inventor
Qingan Yan
Qi XIONG
Yifan Yang
Yu Gao
Pan JI
Yi Xu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Innopeak Technology Inc
Original Assignee
Innopeak Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Innopeak Technology Inc filed Critical Innopeak Technology Inc
Priority to PCT/US2022/082374 priority Critical patent/WO2024136903A1/en
Priority to CN202280101568.5A priority patent/CN120266080A/en
Publication of WO2024136903A1 publication Critical patent/WO2024136903A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted

Definitions

  • the present invention is directed to extended reality systems and methods.
  • XR: extended reality
  • AR: augmented reality
  • VR: virtual reality
  • Important design considerations and challenges for XR devices include performance, cost, and power consumption.
  • existing XR devices have been inadequate in setting virtual boundaries for reasons further explained below.
  • the location of an XR device is used as a center of a boundary.
  • a distance between the XR device and a limb location is used as a radius of the boundary.
  • a user may use a limb and/or a controller to modify the boundary.
  • a system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions.
  • One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions.
  • One general aspect of the present invention includes a method for operating an extended reality device. The method includes initiating a boundary-setting process. The method also includes determining the initial location of the extended reality (XR) device. The method also includes capturing at least a first image using a first camera. The method also includes identifying a first limb from the first image. The method also includes determining a first distance between the first limb and the initial location.
  • the method also includes comparing the first distance to a minimum distance.
  • the method also includes generating a radius value by selecting a greater value of the first distance and the minimum distance.
  • the method also includes obtaining a default height value.
  • the method also includes setting a ceiling value to the default height value.
  • the method also includes defining an initial boundary, the initial boundary may include a radius around the initial location and a ceiling based on the ceiling value.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • Implementations may include one or more of the following features.
  • the method may include determining the initial location using at least an accelerometer.
  • the method may include generating a notification to start a boundary-defining process for a user.
  • the method may include determining shapes and key points associated with the first limb.
  • the method may include: generating a second image using a second camera, identifying the first limb from the second image, and determining the first distance based on a parallax between the first image and the second image.
  • the method may include: identifying a second limb, determining a second distance between the second limb and the initial location, comparing the first distance and the second distance, and selecting the greater value of the first distance and the second distance.
  • the method may include: detecting an absence of the initial boundary and initiating the boundary-setting process upon detecting the absence.
  • the method may include modifying the ceiling value based on a second distance of the first limb.
  • the method may include modifying the ceiling value using a controller.
  • the method may include modifying the initial boundary based on the movements of the first limb.
  • One general aspect includes the method where the notification may include a visual indication.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • Implementations may include one or more of the following features.
  • the method where the notification may include an audio indication.
  • Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
  • One general aspect includes a method for operating an extended reality device.
  • the method also includes initiating a boundary setting process.
  • the method also includes determining an initial location of the extended reality (XR) device.
  • the method also includes connecting the XR device to a first controller.
  • the method also includes determining a first location of the first controller.
  • the method also includes determining a first distance between the first controller and the initial location.
  • the method also includes comparing the first distance to a minimum distance.
  • the method also includes generating a radius value by selecting the greater value of the first distance and the minimum distance.
  • the method also includes obtaining a default height value.
  • the method also includes setting a ceiling value to the default height value.
  • the method also includes defining an initial boundary, the initial boundary may include a radius around the initial location and a ceiling based on the ceiling value.
  • Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
  • Implementations may include one or more of the following features.
  • the method may include providing a user interface for modifying the initial boundary using the first controller.
  • the method may further include capturing an image of an environment surrounding the XR device, generating an output image using the initial boundary and the image, and displaying the output image.
  • Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
  • the XR apparatus includes a housing having a front side and a rear side.
  • the XR apparatus also includes a first camera configured on the front side, the first camera is configured to capture a plurality of two-dimensional (2D) images at a predefined frame rate, the plurality of 2D images including a first image.
  • the XR apparatus also includes a sensor module configured for determining the first location of the XR apparatus.
  • the XR apparatus also includes a wireless communication interface connected to a controller device.
  • the XR apparatus also includes a display configured on the rear side of the housing, the display is configured to display an output image.
  • the XR apparatus also includes a memory coupled to the first camera and configured to store the plurality of 2D images.
  • the XR apparatus also includes a processor coupled to the memory.
  • the XR apparatus also includes where the processor is configured to determine the first distance between a first limb and the first location.
  • the processor is also configured to compare the first distance to a minimum distance.
  • the processor is also configured to generate a radius value by selecting the greater value of the first distance and the minimum distance.
  • the processor is also configured to define an initial boundary, the initial boundary being substantially cylindrical and may include a radius around the initial location and a ceiling.
  • the processor is also configured to generate the output image using the first image and the initial boundary.
  • Implementations may include one or more of the following features.
  • the first distance may be based on a second location of the controller device being held in the first limb.
  • the display is configured to show a user interface for modifying the initial boundary.
  • the XR apparatus may include a second camera, the processor being configured to determine the first distance using a parallax between the first camera and the second camera.
  • embodiments of the present invention provide many advantages over conventional techniques. Among other things, through a natural interaction provided by the present invention in terms of stretching (arm or leg) gestures relative to a real-world environment, XR equipment users can define their virtual safety boundary dynamically, making the user experience more intuitive. Additionally, the whole interaction is achieved in 3D space instead of a 2D plane, which offers users more freedom to define a safety boundary whose shape can be modified according to the real-world environment.
  • Embodiments of the present invention can be implemented in conjunction with existing systems and processes.
  • 3D natural safety boundary definition according to the present invention can be used in a wide variety of XR systems, including Head-Mount Display (HMD) devices that are equipped with range-sensing components.
  • various techniques according to the present invention can be adopted into existing XR systems via software or firmware updates. There are other benefits as well.
  • Figure 1A is a simplified diagram illustrating an external view of XR device 115 according to the embodiments of the present invention.
  • Figure 1B is a simplified block diagram illustrating components of XR device 115 according to the embodiments of the present invention.
  • Figure 2 is a simplified diagram illustrating a cylindrical boundary defined by a user according to the embodiments of the present invention.
  • Figure 3 is a simplified diagram illustrating an output image including a boundary overlaying an image of the environment according to embodiments of the present invention.
  • Figure 4 is a simplified diagram illustrating a method 400 for setting an XR boundary using one or more limbs according to the embodiments of the present invention.
  • Figure 5 is a simplified diagram illustrating a method 500 for setting an XR boundary using one or more limbs according to the embodiments of the present invention.
  • the present invention is directed to extended reality systems and methods.
  • the location of an XR device is used as a center of a boundary.
  • a distance between the XR device and a limb location is used as a radius of the boundary.
  • a user may use hands/feet and/or a controller to modify the boundary.
  • Augmented reality provides an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory.
  • Virtual reality (VR) provides a simulated experience in a virtual and realistically modeled environment.
  • extended reality (XR) includes all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearable devices.
  • a guardian system is adopted to allow users to define a room-scale play space or simply utilize a pre-defined stationary circular area as a safety zone. When the user inside the zone moves closer to the safety boundary, a warning signal will be displayed.
  • the room-scale boundary is designated through user interaction. For example, a user can draw a continuous virtual curve on the real-world ground image and the system will lift the curve to three dimensions (3D) to form a closed boundary.
  • a stationary mode can automatically place a pre-defined circle around the user, overlaid on the image of the real ground. The system then lifts the circle to 3D to form a virtual cylindrical wall.
  • Such drawing is designated only in 2D space on the image of the ground plane; neither 3D interaction nor body movements are fully explored to assist safety boundary generation more naturally.
  • for an XR device, it is common to track in real time the 6-degrees-of-freedom (6DOF) positions and orientations (3 translational degrees and 3 rotational degrees) of both a headset and one or more hand-held controllers or sensors.
  • the trackable signals from one or more hand-held controllers or sensors can be utilized in a user interface for enlarging/reducing an initial safety boundary.
  • a user interface for generating the safety boundary while using an AR/VR head-mounted device can be applied to simply and dynamically provide maximum freedom while still preventing collisions with objects in the real-world environment.
  • Figure 1A is a simplified diagram illustrating an external view of an XR device 115 remotely coupled with a controller device 125 according to the embodiments of the present invention.
  • This diagram is merely an example, which should not unduly limit the scope of the claims.
  • One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
  • XR device 115 as shown can be configured as VR, AR, or others.
  • XR device 115 may include a small housing for AR applications or a relatively larger housing for VR applications.
  • Cameras 180A and 180B are configured on the front side of XR device 115.
  • cameras 180A and 180B are respectively mounted on the left and right sides of the XR device 115.
  • additional cameras may be configured below cameras 180A and 180B to provide an additional field of view and range estimation accuracy.
  • Display 185 is configured on the backside of XR device 115.
  • display 185 may be a semitransparent display that overlays information on an optical lens in AR applications.
  • display 185 may include a non-transparent display.
  • the XR device 115 is an XR headset or an HMD device, wearable by a user.
  • a user interface 195 may be built on the display 185 to allow the user to dynamically set or directly view a safety boundary for operating the XR device 115 in a real-world environment.
  • a controller device 125 is included and wirelessly coupled with the XR device 115.
  • the XR device 115 is a head-mount display (HMD) device wearable by a user, and the controller device 125 is a hand-held controller used by the user in XR applications.
  • the range or position of the controller device 125 may be sensed by the XR device 115.
  • the controller device 125 may be held in a hand (or foot) of the user, thus providing a trackable 6DOF position and orientation of the hand (or foot) in motion.
  • the controller device 125 contains or is coupled with one or more sensors, e.g., motion/gravity sensor, lidar sensor, parallax sensor, audio sensor, etc.
  • One or more sensors can provide corresponding position or range information via a connection/synchronization device in the controller device 125, which communicates the information wirelessly to the XR device 115.
  • Figure 1B is a simplified block diagram illustrating components of XR device 115 and controller device 125 according to the embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims.
  • One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
  • an XR device 115 (e.g., AR headset 115n as shown, or the like) is worn or used by a user 120n
  • a controller device 125 (e.g., hand-held controller) held in one of the hands of the user 120n is communicatively coupled with the XR device 115.
  • the controller device 125 might include, in general without limitation, at least one of the sensors, mode selector, I/O keys, display, and/or the like.
  • the processor 150 might be communicatively coupled (e.g., via a bus, via wired connectors, or via electrical pathways (e.g., traces and/or pads, etc.) of printed circuit boards ("PCBs") or integrated circuits ("ICs"), and/or the like) to each of one or more of the Lidar 130, the accelerometer 135, the Connection/Synchronization device 140, the parallax device 145, the data store 155, the audio output like speaker(s) or earpiece(s) 160, the eye-tracking sensor(s) 165, the light source(s) 170, the audio sensor(s) or microphone(s) 175, the front camera(s) 180, display 185 including a user interface 195, and/or the communication interface 190, and/or the like.
  • the sensors, including the Lidar 130, the accelerometer 135, the Connection/Synchronization device 140, the parallax device 145, the audio output like speaker(s) or earpiece(s) 160, the eye-tracking sensor(s) 165, the light source(s) 170, and the audio sensor(s) or microphone(s) 175, are disposed in a combined sensor module.
  • data store 155 may include dynamic random-access memory (DRAM) and/or non-volatile memory.
  • images captured by cameras 180 may be temporarily stored in the DRAM for processing, and executable instructions (e.g., hand shape calibration and hand gesture identification algorithms) may be stored in the non-volatile memory.
  • data store 155 may be implemented as a part of the processor 150 in a system-on-chip (SoC) arrangement.
  • the eye-tracking sensor(s) 165, which might include, without limitation, at least one of one or more cameras, one or more motion sensors, or one or more tracking sensors, and/or the like, track where the user's eyes are looking; in conjunction with computation processing by the processor 150, this is compared with images or videos taken in front of the XR device 115.
  • the processor 150 can perform computation processing to yield range estimate information of a target in a certain coordinate system, for example, the distance of the limb relative to the XR device.
  • the light source 170 might include a laser source, a white light source, an infrared light source, and the like.
  • the light source 170 may work together with the Lidar 130 to perform a dynamic range estimation for the user within a real-world environment during XR applications.
  • the audio sensor(s) 175 might include, but is not limited to, microphones, sound sensors, noise sensors, and/or the like, and might be used to receive or capture voice signals, sound signals, and/or noise signals, or the like.
  • the front cameras 180 include their respective lenses and sensors used to capture images or video of an area in front of the XR device 115.
  • front cameras 180 include cameras 180A and 180B as shown in Figure 1B, and they are configured respectively on the left and right sides of the headset housing.
  • the sensors of the front camera 180 may be low-resolution monochrome sensors, which are not only energy-efficient (without color filter and color processing thereof), but also relatively inexpensive, both in terms of device size and cost.
  • the images captured by the cameras 180A and 180B can be used to identify the hand (or foot) of the user, either in stretching or retracting gestures.
  • the identification of the hand (or foot) includes collecting and tracking information about the corresponding 6DOF position and orientation of the user’s hand (or foot) in motion during the XR applications. In some implementations, this information can be utilized via a graphical user interface 195 being directly viewable on the display 185 by the user to naturally define or modify the safety boundary of the XR applications.
  • the motion/gravity sensor or accelerometer 135, disposed in the headset device or the controller or directly attached to the fingers of the hand (or foot) of the user 120n, might provide data for directly determining the location of the corresponding headset device, controller, or the hand (or foot).
  • the location is indicated by a 6DOF position and orientation of the object with respect to the XR device 115n.
  • the processor 150 is configured to perform hand detection and hand prediction processes.
  • processor 150 includes a central processing unit (CPU), graphic processing unit (GPU), and neural processing unit (NPU).
  • hand detection processes may be performed by the NPU
  • hand prediction may be performed by the CPU and/or NPU.
  • the processor 150 is able to process all data (numerical or graphical) collected by sensors in the XR device 115 as well as the controller device 125 and produce results of object identification, location tracking, range estimation, distance measurement, orientation determination, and other information for generating user-defined safety boundary and more.
  • the field of view of each front camera 180 overlaps with the field of view of the eyes of the user 120, so the captured images or video correspond to what the user sees.
  • the display screen(s) and/or projector(s) 185 may be used to display or project the generated image overlays (and/or to display an output image or video that combines the generated image overlays superimposed over images or video of the actual area).
  • the communication interface 190 provides wired or wireless communication with other devices and/or networks.
  • communication interface 190 may be connected to a computer for tether operations, where the computer provides the processing power needed for graphic-intensive applications.
  • controller device 125 communicates wirelessly with the communication interface 190 of the XR device 115. All sensor data may be synchronized and transmitted through a connection/synchronization device for the controller device. At least in some instances, sensor data information about the 6DOF position and orientation, control information, and mode-selector setting information can be loaded or retrieved by the user, who holds the controller device 125 in hand or controls it remotely through voice. These sensor data signals, control signals, and setting signals may still be synchronized and transmitted dynamically to the XR headset following the motion of the user in the XR application.
  • the 6DOF position and orientation of the hand (or foot) in stretching or retracting gestures can be measured or determined dynamically to define an initial safety boundary and to further edit the safety boundary based on the user's XR activity in a real-world environment.
  • XR device with or without a controller device may include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods. Further details of methods for defining and modifying user safety boundaries for XR applications and related techniques are discussed with reference to the following figures.
  • FIG. 2 is a simplified diagram illustrating a cylindrical boundary defined by a user according to the embodiments of the present invention.
  • This diagram is merely an example, which should not unduly limit the scope of the claims.
  • One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
  • the present invention is designed to provide a natural interaction that allows a user of an XR application to define or set the safety boundary, according to their real-world environment, while wearing an XR headset device.
  • the user can dynamically generate a virtual safety boundary by stretching their hands and (or) feet.
  • the system tracks the position of the hands and feet, for example, by tracking hand-held controllers using LED lights, tracking bare hands using computer vision, placing sensors on fingers, etc.
  • the term “computer vision” is broadly defined to include various types of cameras and capturing devices that allow XR apparatus to capture images that are later processed. This permits the user to build their safety boundary from a 3D space instead of on a 2D ground plane.
  • the present invention offers the user different types of safety boundary generation interfaces.
  • the XR device user wears a head-mount display (HMD) device, e.g., the XR device 115 of Figure 1B, whose three-dimensional (3D) position can be easily determined in a coordinate system defined by the user standing on a ground plane when the XR application starts.
  • the user may initiate a boundary-setting process by using one hand to generate a cylinder as the safety boundary.
  • a central axis of the cylinder can be defined by the 3D position of the HMD device and the direction of gravity.
  • the HMD device measures the distance from the hand to the central axis based on the hand’s position data. The data might be collected by one of the sensors attached to the hand or on a hand-held controller.
  • the data then might be synchronized by a connection/synchronization device in the controller and sent to the processor in the HMD device to process for measuring the distance.
  • the measured distance can be used as a radius to form a regular cylindrical safety boundary as shown in Figure 2. Therefore, when the user stretches their arm outward or inward to change the distance, the radius of the cylinder, i.e., the size of the safety boundary may expand or shrink accordingly.
  • the minimum size of the cylinder can be enforced (e.g., the radius is set to at least 0.5m). If both hands are used, two distances can be collected simultaneously. The larger one of the two distances is adopted to be the radius of the cylinder (e.g., as shown in Figure 2).
  • the HMD device may be coupled with one or more hand-held controllers (e.g., controller device 125 in Figures 1A and 1B).
  • the controller coupled with the processor in the HMD device is configured to measure the 6DOF position and orientation of itself (or correspondingly the hand holding the controller) with respect to the same coordinate system as the HMD device.
  • the HMD device processes one of the alternative sensor signals, such as head and hand positions tracked by computer vision algorithms, by a motion/gravity or accelerometer sensing program, by Lidar range estimation, or by other detecting and tracking methods, to generate a dynamic user-defined safety boundary.
  • the cylindrical safety boundary (or other shapes) will follow the movement of the HMD device (i.e., the central axis always passes through the location of the HMD and is parallel to the gravity direction).
  • the safety boundary will be fixed in shape and size. This provides the user with a natural interaction to generate a user-defined safety boundary, here, a simple cylindrical safety boundary.
  • the cylindrical boundary (with the radius being fixed) can extend infinitely in the vertical direction.
  • a base at each end of the cylinder can also be designated so that the safety boundary is bounded by two end surfaces.
  • the two bases are perpendicular to the gravity direction; based on them, the height of the cylinder can be measured by the processor in the HMD device and can be initially set to a pre-defined value.
  • the user may want to edit the height of the cylindrical safety boundary in addition to the radius of the initial cylinder boundary. Once the user is satisfied with the radius, a height adjustment mode can be initiated (e.g., by clicking a button on the controller).
  • the user can use the controller to adjust the two ends of the cylinder. They can touch their feet using the controller to indicate the bottom end of the cylinder. They can also use HMD cameras to look at the feet or ground and use computer vision and parallax devices to estimate the positions of the feet or ground. In some other instances, a pre-configured body height from the user can also be a feasible alternative. To compute the bottom end, the system can subtract the adjusted body height from the HMD device location. In yet other instances, the user can also use sensors attached to the raised hand of an upward-stretched arm to indicate the top base of the cylinder. As hand or foot sensor signals change, the safety boundary will change accordingly.
  • the HMD device coupled with the controller device can be configured to generate user-defined safety boundaries in other fixed shapes, such as a capsule shape, a sphere, an ellipsoid, or a combination of several geometric primitives, without limiting to cylindrical radius and height.
  • the present invention allows the user to change the parameters of these geometric primitives using tracked locations of hands, controllers, feet, and HMD devices.
  • FIG. 3 is a simplified diagram illustrating an output image including a boundary overlaying an image of the environment according to embodiments of the present invention.
  • This diagram is merely an example, which should not unduly limit the scope of the claims.
  • a mesh boundary overlays an image 303.
  • the mesh boundary includes a cylindrical portion 301 and a modified portion 302.
  • cylindrical portion 301 is formed based on a radius value
  • the modified portion 302 may be formed by manual modification from a user.
  • the present invention also allows the user to further modify it and create an irregularly shaped safety boundary.
  • the HMD device or XR system and methods afford the most freedom as compared with the previous two operation modes.
  • the system first creates an initial safety boundary as in the previous two modes.
  • the user can use a user interface on the display of the HMD device to further modify the safety boundary.
  • the HMD device can switch the display from virtual scene to video pass-through mode.
  • the safety boundary is graphically illustrated as a mesh-like interface overlaying the environment background image.
  • the user can place their hand (or hand-held controller) outside of the boundary so that the safety boundary will be updated based on the updated location of the hand.
  • the process resembles the boundary being "pushed" out by the hand movement.
  • This update can happen at each frame or every several frames or only when a button on the hand-held controller is pressed to ensure that the update is intended by the user.
  • a new continuous boundary surface will be generated, leaving irrelevant parts unchanged.
  • the XR device and operation methods are configured to project the 3D location onto the horizontal cross-section plane to compute a 2D position on that plane.
  • a spline curve can be leveraged to connect this 2D position with its nearest boundary into a new smooth boundary.
  • the new boundary on the horizontal cross-section plane is then lifted back to 3D to form a new safety boundary (a simplified sketch of this push-and-smooth update follows this list).
  • the user may use the intersection of a ground plane (where the user's feet are standing) and a virtual laser originating from the handheld controller as the indication of a boundary update.
  • 3D mesh deformation can also be used to modify initial boundaries that are represented as a 3D mesh.
  • FIG. 4 is a simplified diagram illustrating a method 400 for setting an XR safety boundary using one or more hands according to the embodiments of the present invention.
  • This diagram is merely an example, which should not unduly limit the scope of the claims.
  • One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
  • one or more steps may be added, removed, repeated, replaced, rearranged, modified, and/or overlapped, and they should not unduly limit the claims.
  • the present invention provides one of the methods for operating an extended reality (XR) device, i.e., method 400 for setting an XR safety boundary.
  • a boundary-setting process is initiated.
  • the XR device (e.g., XR device 115 shown in Figures 1A and 1B)
  • the boundary-setting process then is initiated by the user.
  • the XR device is configured to detect an absence of an XR safety boundary which automatically triggers the initiation of the boundary setting process by a notification to the user.
  • the notification may be through an audio indication through an audio output or a visual indication through a user interface display.
  • the user can use the user interface on the display of the XR device for setting the safety boundary.
  • the initial location of the XR device is determined.
  • the XR device is a head-mount display (HMD) device as illustrated in Figure 1A and Figure 1B.
  • a 3D position of the HMD device at its initial location can be determined by the HMD device.
  • the processor of the HMD device can process sensor data from the cameras and accelerometer to calculate the coordinates corresponding to the initial location of the HMD device in a coordinate system based on the user standing on a ground surface.
  • the coordinates define the initial 3D position of the HMD device which can be stored in its data store or retrieved for conducting the boundary-setting process.
  • the initial 3D position of the HMD device can be used to calculate a virtual central axis that passes through the position in parallel to the gravity direction.
  • the information of the virtual axis can also be stored in the data store of the HMD device.
  • At step 406, at least a first image is captured using a first camera disposed in the HMD device. For example, capturing the first image is triggered as the boundary-setting process is initiated.
  • Two cameras 180A and 180B are respectively mounted on the left and right sides of the HMD device 115.
  • the first image captured by camera 180A on the left side of the HMD device is aimed at the user's left hand as the left arm is stretched out.
  • a second image of the left hand is also captured by the second camera for example, by camera 180B on the right side of the HMD device.
  • a second image of the right hand is captured by the second camera aiming at the user’s right hand.
  • a first limb is identified from the first image.
  • the first limb is the left hand.
  • the identification of the hand from the captured image is a pre-established hand-gesture identification function of the HMD device.
  • hand identification is conducted by determining shapes and key points associated with the first limb based on multiple 3D key points detection, tracking, depth calculation, and other computations performed in the processor of the HMD device. At least a 3D position representing the first limb in the same coordinate system can be determined and stored in the data store of the HMD device.
  • a second limb, i.e., the right hand
  • a second image is captured for determining shapes and key points associated with the second limb, so that a 3D position representing the right hand in the same coordinate system is also determined and stored in the data store.
  • a first distance between the first limb and the initial location of the HMD device is determined.
  • the first distance refers to the distance from the 3D position representing the first limb to the virtual central axis in the gravity direction that passes through the 3D position of the HMD device.
  • the first distance determination is performed by the processor of the HMD device.
  • the first distance may be determined by parallax data between the first image and the second image respectively captured by a first camera and a second camera both aiming at the first limb.
  • the first distance is compared with a minimum distance.
  • the minimum distance is a preset value (e.g., 0.5m) stored in the data store of the HMD device.
  • a radius value is generated by selecting the greater of the first distance and the minimum distance.
  • a second distance is also retrieved from the data store.
  • the radius value is selected as the greatest value among the first distance, the minimum distance, and the second distance.
  • a default height value is obtained.
  • the default height value is pre-stored in the data store of the HMD device.
  • the default height value is equal to or larger than an average human height.
  • the default height value is an average human height plus an average arm length of a human.
  • a ceiling value is set to the default height value. In the previous example, the ceiling value can be used by the user to set a top-end boundary of the cylindrical-shaped boundary with the ground plane (where the user stands) being set to the bottom-end boundary.
  • the user uses the user interface on the display of the HMD device to complete the setting of an initial cylinder boundary based on the radius around the initial location of the HMD device and a ceiling based on the ceiling value.
  • the initial boundary is a fixed, cylindrical-shaped boundary for starting the operation of the XR device.
  • the user may adjust the ceiling value to reset the height of the safety boundary.
  • the user may adjust the radius value of the safety boundary based on a second distance of the first limb relative to the virtual axis that passes through the HMD device.
  • the user may adjust the geometric primitives differently, dependent on the radius value, the first distance, the second distance, and the ceiling value obtained for the initial safety boundary, to generate an alternately shaped safety boundary, including a capsule shape, a sphere, an ellipsoid, or a combination of several geometric primitives.
  • the user may use a controller to obtain the alternative geometric primitives for generating a fixed shape safety boundary.
  • the user may use the HMD device to look at one foot of the user to adjust the geometric primitives.
  • the user may generate a user interface as a 3D mesh overlapping an environmental image as an output image displayed on the display of the HMD device.
  • the user may use a hand or a controller in hand to interactively adjust the 3D mesh, e.g., using 3D mesh deformation, in a manner natural to the user during XR application.
  • Figure 5 is a simplified diagram illustrating method 500 for setting an XR boundary using one or more limbs according to the embodiments of the present invention.
  • This diagram is merely an example, which should not unduly limit the scope of the claims.
  • One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
  • one or more steps may be added, removed, repeated, replaced, rearranged, modified, and/or overlapped, and they should not unduly limit the claims.
  • the present invention provides one of the methods for operating an extended reality (XR) device, i.e., the method 500 for setting an XR safety boundary.
  • the XR device (e.g., XR device 115 shown in Figures 1A and 1B)
  • the XR device is a headset device worn by a user and turned on to power its cameras, light sources, and various sensors so they are ready for capturing images or collecting data, to set a display to show a user interface overlapping a background image, and to enable a processor to be ready for processing the captured images and collected sensor data; the boundary-setting process is then initiated by the user.
  • a notification is generated for the user to start a boundary-defining process.
  • the notification is an audio indication provided through an audio output of the headset device.
  • the notification is a visual indication provided via a graphic user interface on the display of the headset device.
  • the user can use the user interface on the display for inputting parameters and editing the settings for the XR safety boundary.
  • the XR device is configured to detect an absence of an XR safety boundary during its operation, which automatically triggers the initiation of the boundarysetting process by a notification to the user.
  • the notification may be through an audio indication through an audio output or a visual indication through the user interface displayed as an output image on top of an environmental image around the XR device.
  • the initial location of the XR device is determined.
  • the XR device (i.e., the headset device) is illustrated in Figure 1A and Figure 1B.
  • a 3D position of the headset device at its initial location can be determined by the headset device itself in a coordinate system based on the user standing on a ground surface.
  • the initial 3D position of the headset device can be stored in its data store or retrieved for conducting the boundary-setting process.
  • the initial 3D position of the headset device can be used to calculate a virtual central axis that passes through the position in parallel to the gravity direction.
  • the information of the virtual axis can also be stored in the data store of the headset device.
  • the XR device is connected to a first controller.
  • the first controller is a controller device 125 shown in Figure 1B and contains a connection device 124 to establish wireless communication with a communication interface 190 of the XR headset device 115.
  • the first location of the first controller is determined.
  • the headset is worn on the head of the user and the first controller is held by one limb (e.g., left hand) of the user and connected wirelessly to the headset.
  • the first location of the first controller is a location reached by the user's left hand, first stretched out while holding the first controller.
  • At least a 3D position corresponding to the first location of the first controller in the same coordinate system based on the user can be determined via image parallax, or calculated from sensor data collected from one of the accelerometer, Lidar, parallax device, and the like, and transferred to the processor in the headset device.
  • the 3D position corresponding to the first location of the first controller is stored in the data store of the headset device.
  • a second controller held by the right hand of the user, is also connected wirelessly to the headset.
  • a 3D position corresponding to a second location of the second controller in the same coordinate system can be calculated by the processor of the headset device based on similar sensor data from the second controller.
  • the 3D position corresponding to the second location of the second controller is also stored in the data store of the headset device.
  • a first distance between the first location and the initial location of the headset device is determined.
  • the first distance refers to the distance from the 3D position of the first controller at the first location to the virtual central axis in the gravity direction that passes through the 3D position of the headset device at its initial location.
  • the first distance determination is performed by the processor of the headset device.
  • the first distance is compared with a minimum distance.
  • the minimum distance can be a preset value (e.g., 0.5m) stored in the data store of the headset device.
  • a radius value is generated by selecting the greater of the first distance and the minimum distance.
  • a second distance is also retrieved from the data store.
  • the radius value is selected as the greatest value among the first distance, the minimum distance, and the second distance.
  • these steps 510, 512, and 514 can be performed easily by the processor of the XR headset device.
  • the radius value can be output via the user interface displayed on the display of the headset device. Then, the user can set an initial cylindrical-shaped boundary (without a ceiling) having the radius value via the user interface.
  • a default height value is obtained.
  • the default height value is pre-stored in the data store of the headset device.
  • the default height value is equal to or larger than an average human height, referring to the height from the ground on which the user stands to the headset worn by the user.
  • the default height value is an average human height plus an average human arm length, assuming that the user raises an arm upward at maximum stretch.
  • a ceiling value is set to the default height value.
  • the ceiling value can be used by the user to set a top-end plane for the initial cylindrical-shaped boundary with a ground plane (where the user stands) being set to the bottom-end plane of the initial cylindrical-shaped boundary.
  • the user uses the user interface displayed on the display of the headset device to complete the setting of an XR safety boundary based on the radius around the initial location of the headset device and a ceiling based on the ceiling value.
  • the initial boundary is a fixed, cylindrical-shaped boundary for starting the operation of the XR device.
  • the user may adjust the ceiling value to reset the height of the safety boundary.
  • the user may adjust the radius value of the safety boundary based on a second distance of the first controller relative to the virtual axis that passes through the HMD device, where the second distance at some location/orientation is larger than the radius value normally used.
  • the user may adjust one or more geometric primitives differently, dependent on the radius value, the first distance, the second distance, and the ceiling value obtained for the initial safety boundary, to generate an alternately shaped safety boundary, including a capsule shape, a sphere, an ellipsoid, or a combination of several geometric primitives.
  • the user may use a controller to obtain the alternative geometric primitives for generating a fixed shape safety boundary.
  • the user may use the HMD device to look at one foot of the user to adjust one of the geometric primitives.
  • the user may generate a user interface as a 3D mesh overlapping an environmental image as an output image displayed on the display of the HMD device.
  • the user may use a hand or a controller in hand to interactively adjust the 3D mesh, e.g., using 3D mesh deformation, in a manner natural to the user during XR application.
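
As referenced in the boundary-modification items above, pushing the boundary with a hand amounts to projecting the hand's 3D location onto the horizontal cross-section plane, enlarging the nearby part of the outline, and lifting the result back to 3D. The Python sketch below is a simplified illustration under assumptions: it substitutes a smooth cosine falloff for the spline re-fit described above, assumes the device sits at the origin of the cross-section plane, and uses hypothetical names throughout.

```python
import math

def push_boundary(boundary_2d, hand_2d, falloff_rad=0.5):
    """Deform a closed 2D boundary (the horizontal cross-section, device at
    the origin) so it encloses the projected hand position. Vertices near
    the hand's direction are moved outward with a smooth cosine falloff;
    the disclosure describes a spline re-fit, which this approximates.
    The result can then be lifted back to 3D as a vertical wall."""
    hand_r = math.hypot(*hand_2d)
    hand_theta = math.atan2(hand_2d[1], hand_2d[0])
    pushed = []
    for x, y in boundary_2d:
        r, theta = math.hypot(x, y), math.atan2(y, x)
        # angular distance between this vertex and the hand direction
        d = abs((theta - hand_theta + math.pi) % (2 * math.pi) - math.pi)
        if hand_r > r and d < falloff_rad:
            w = 0.5 * (1.0 + math.cos(math.pi * d / falloff_rad))  # 1 at hand, 0 at edge
            r += w * (hand_r - r)
        pushed.append((r * math.cos(theta), r * math.sin(theta)))
    return pushed

# Push a 2 m circular cross-section outward with a hand projected at 2.6 m:
circle = [(2 * math.cos(a * math.pi / 18), 2 * math.sin(a * math.pi / 18)) for a in range(36)]
new_cross_section = push_boundary(circle, (2.6, 0.0))
```

Irrelevant parts of the outline stay unchanged, matching the described behavior of leaving the rest of the boundary intact.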

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention is directed to extended reality systems and methods. According to a specific embodiment, the location of an XR device is used as a center of a boundary. A distance between the XR device and a limb location is used as a radius of the boundary. A user may use a limb and/or a controller to modify the boundary. There are other embodiments as well.

Description

METHODS AND SYSTEMS FOR DEFINING VIRTUAL BOUNDARIES
BACKGROUND OF THE INVENTION
[0001] The present invention is directed to extended reality systems and methods.
[0002] Over the last decade, extended reality (XR) devices — including both augmented reality (AR) devices and virtual reality (VR) devices — have become increasingly popular. Important design considerations and challenges for XR devices include performance, cost, and power consumption. Among other features, existing XR devices have been inadequate in setting virtual boundaries for reasons further explained below.
[0003] It is desired to have new and improved XR systems and methods thereof.
BRIEF SUMMARY OF THE INVENTION
[0004] The present invention is directed to extended reality systems and methods. According to a specific embodiment, the location of an XR device is used as a center of a boundary. A distance between the XR device and a limb location is used as a radius of the boundary. A user may use a limb and/or a controller to modify the boundary. There are other embodiments as well.
[0005] A system of one or more computers can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination of them installed on the system that in operation causes or cause the system to perform the actions. One or more computer programs can be configured to perform particular operations or actions by virtue of including instructions that, when executed by data processing apparatus, cause the apparatus to perform the actions. One general aspect of the present invention includes a method for operating an extended reality device. The method includes initiating a boundary-setting process. The method also includes determining the initial location of the extended reality (XR) device. The method also includes capturing at least a first image using a first camera. The method also includes identifying a first limb from the first image. The method also includes determining a first distance between the first limb and the initial location. The method also includes comparing the first distance to a minimum distance. The method also includes generating a radius value by selecting a greater value of the first distance and the minimum distance. The method also includes obtaining a default height value. The method also includes setting a ceiling value to the default height value. The method also includes defining an initial boundary, the initial boundary may include a radius around the initial location and a ceiling based on the ceiling value. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
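For concreteness, the radius and ceiling selection described in this aspect can be condensed into a short Python sketch. This is a minimal illustration under assumptions, not the claimed implementation: the names (BoundaryCylinder, define_initial_boundary) are hypothetical, and while the 0.5 m minimum radius appears elsewhere in this disclosure, the 2.5 m default height is a placeholder.

```python
from dataclasses import dataclass

@dataclass
class BoundaryCylinder:
    center_xy: tuple   # initial XR-device location projected onto the ground plane
    radius: float      # meters
    ceiling: float     # meters above the ground plane

def define_initial_boundary(device_xy, first_distance,
                            min_distance=0.5, default_height=2.5):
    """Sketch of the summarized method: the radius is the greater of the
    measured limb distance and an enforced minimum, and the ceiling is
    set to a default height value."""
    radius = max(first_distance, min_distance)  # select the greater value
    return BoundaryCylinder(center_xy=device_xy, radius=radius,
                            ceiling=default_height)

# Example: a hand detected 0.8 m from the device's vertical central axis
boundary = define_initial_boundary((0.0, 0.0), first_distance=0.8)
print(boundary.radius, boundary.ceiling)  # 0.8 2.5
```

In the two-limb variant described below, the same max() selection simply extends over both measured distances.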
[0006] Implementations may include one or more of the following features. The method may include determining the initial location using at least an accelerometer. The method may include generating a notification to start a boundary-defining process for a user. The method may include determining shapes and key points associated with the first limb. The method may include: generating a second image using a second camera, identifying the first limb from the second image, and determining the first distance based on a parallax between the first image and the second image. The method may include: identifying a second limb, determining a second distance between the second limb and the initial location, comparing the first distance and the second distance, and selecting the greater value of the first distance and the second distance. The method may include: detecting an absence of the initial boundary and initiating the boundary-setting process upon detecting the absence. The method may include modifying the ceiling value based on a second distance of the first limb. The method may include modifying the ceiling value using a controller. The method may include modifying the initial boundary based on the movements of the first limb. The method may include initiating and/or modifying the initial boundary using a controller. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
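The parallax-based distance determination mentioned above is, in effect, stereo triangulation between the first and second cameras. The sketch below assumes a calibrated, rectified stereo pair; the function name, focal length, and baseline are illustrative assumptions, not parameters from this disclosure.

```python
def distance_from_parallax(x_left_px, x_right_px,
                           focal_length_px=600.0, baseline_m=0.10):
    """Estimate the depth of a limb keypoint from its horizontal disparity
    between the first (left) and second (right) camera images of a
    rectified stereo pair. Returns depth in meters."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("expected positive disparity for a point in front of the cameras")
    return focal_length_px * baseline_m / disparity

# The same hand keypoint appears 30 px apart between the two images:
print(distance_from_parallax(340.0, 310.0))  # 2.0 m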
[0007] One general aspect includes the method where the notification may include a visual indication. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
[0008] Implementations may include one or more of the following features. The method where the notification may include an audio indication. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer- accessible medium.
[0009] One general aspect includes a method for operating an extended reality device. The method also includes initiating a boundary setting process. The method also includes determining an initial location of the extended reality (XR) device. The method also includes connecting the XR device to a first controller. The method also includes determining a first location of the first controller. The method also includes determining a first distance between the first controller and the initial location. The method also includes comparing the first distance to a minimum distance. The method also includes generating a radius value by selecting the greater value of the first distance and the minimum distance. The method also includes obtaining a default height value. The method also includes setting a ceiling value to the default height value. The method also includes defining an initial boundary, the initial boundary may include a radius around the initial location and a ceiling based on the ceiling value. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
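Elsewhere in this disclosure, the first distance is described as the distance from the controller's tracked 3D position to a vertical central axis through the device, parallel to the gravity direction. A minimal sketch of that computation follows, with hypothetical names and coordinates in meters.

```python
import math

def distance_to_central_axis(controller_pos, device_pos,
                             gravity_dir=(0.0, 0.0, -1.0)):
    """Horizontal distance from a tracked 3D controller position to the
    vertical axis through the headset: the component of the offset
    vector perpendicular to the gravity direction."""
    offset = [c - d for c, d in zip(controller_pos, device_pos)]
    norm = math.sqrt(sum(g * g for g in gravity_dir))
    g = [x / norm for x in gravity_dir]
    along = sum(o * gi for o, gi in zip(offset, g))       # vertical component
    perp = [o - along * gi for o, gi in zip(offset, g)]   # horizontal component
    return math.sqrt(sum(p * p for p in perp))

# Controller held 0.7 m out and 0.4 m below the headset:
print(distance_to_central_axis((0.7, 0.0, 1.2), (0.0, 0.0, 1.6)))  # 0.7
```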
[0010] Implementations may include one or more of the following features. The method may include providing a user interface for modifying the initial boundary using the first controller. The method may further include capturing an image of an environment surrounding the XR device, generating an output image using the initial boundary and the image, and displaying the output image. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
[0011] One general aspect includes an extended reality (XR) apparatus. The XR apparatus includes a housing having a front side and a rear side. The XR apparatus also includes a first camera configured on the front side, the first camera is configured to capture a plurality of two-dimensional (2D) images at a predefined frame rate, the plurality of 2D images including a first image. The XR apparatus also includes a sensor module configured for determining the first location of the XR apparatus. The XR apparatus also includes a wireless communication interface connected to a controller device. The XR apparatus also includes a display configured on the rear side of the housing, the display is configured to display an output image. The XR apparatus also includes a memory coupled to the first camera and configured to store the plurality of 2D images. The XR apparatus also includes a processor coupled to the memory. The XR apparatus also includes where the processor is configured to determine the first distance between a first limb and the first location. The processor is also configured to compare the first distance to a minimum distance. The processor is also configured to generate a radius value by selecting the greater value of the first distance and the minimum distance. The processor is also configured to define an initial boundary, the initial boundary being substantially cylindrical and may include a radius around the initial location and a ceiling. The processor is also configured to generate the output image using the first image and the initial boundary. Other embodiments of this aspect include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods.
[0012] Implementations may include one or more of the following features. The first distance may be based on a second location of the controller device being held in the first limb. The display is configured to show a user interface for modifying the initial boundary. The XR apparatus may include a second camera, the processor being configured to determine the first distance using a parallax between the first camera and the second camera. The initial boundary may be characterized by a central axis passing through a location of the XR apparatus, the central axis is parallel to a gravity direction. Implementations of the described techniques may include hardware, a method or process, or computer software on a computer-accessible medium.
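Given a substantially cylindrical boundary whose central axis is parallel to the gravity direction, checking a tracked point against the boundary reduces to a distance-to-axis and height comparison, which is how a warning can be raised as the user approaches the boundary (as noted in the background). The sketch below is illustrative only; the 0.3 m warning margin and the function name are assumptions.

```python
import math

def check_against_boundary(point_xyz, center_xy, radius, ceiling,
                           warn_margin=0.3):
    """Classify a tracked point (headset, hand, or controller) against a
    cylindrical safety boundary with its base on the ground plane (z=0):
    'safe', 'warning' near a surface, or 'outside'."""
    d_axis = math.hypot(point_xyz[0] - center_xy[0], point_xyz[1] - center_xy[1])
    height = point_xyz[2]
    if d_axis >= radius or height >= ceiling or height <= 0.0:
        return "outside"
    if (radius - d_axis) <= warn_margin or (ceiling - height) <= warn_margin:
        return "warning"
    return "safe"

# A hand 1.3 m from the axis inside a 1.5 m radius, 2.5 m tall cylinder:
print(check_against_boundary((1.3, 0.0, 1.6), (0.0, 0.0), radius=1.5, ceiling=2.5))  # warning
```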
[0013] It is to be appreciated that embodiments of the present invention provide many advantages over conventional techniques. Among other things, the present invention provides a natural interaction, in the form of stretching (arm or leg) gestures relative to a real-world environment, that allows XR equipment users to define their virtual safety boundary dynamically, making the user experience more intuitive. Additionally, the whole interaction is achieved in 3D space instead of on a 2D plane, which offers users more freedom to define a safety boundary whose shape is modifiable according to the real-world environment.
[0014] Embodiments of the present invention can be implemented in conjunction with existing systems and processes. For example, 3D natural safety boundary definition according to the present invention can be used in a wide variety of XR systems, including Head-Mount Display (HMD) devices that are equipped with range-sensing components. Additionally, various techniques according to the present invention can be adopted into existing XR systems via software or firmware updates. There are other benefits as well.
[0015] The present invention achieves these benefits and others in the context of known technology. However, a further understanding of the nature and advantages of the present invention may be realized by reference to the latter portions of the specification and attached drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0016] Figure 1A is a simplified diagram illustrating an external view of XR device 115 according to the embodiments of the present invention.
[0017] Figure 1B is a simplified block diagram illustrating components of XR device 115 according to the embodiments of the present invention.
[0018] Figure 2 is a simplified diagram illustrating a cylindrical boundary defined by a user according to the embodiments of the present invention.

[0019] Figure 3 is a simplified diagram illustrating an output image including a boundary overlaying an image of the environment according to embodiments of the present invention.

[0020] Figure 4 is a simplified diagram illustrating a method 400 for setting an XR boundary using one or more limbs according to the embodiments of the present invention.
[0021] Figure 5 is a simplified diagram illustrating a method 500 for setting an XR boundary using one or more limbs according to the embodiments of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0022] The present invention is directed to extended reality systems and methods. According to a specific embodiment, the location of an XR device is used as a center of a boundary. A distance between the XR device and a limb location is used as a radius of the boundary. A user may use a hand/foot and/or a controller to modify the boundary. There are other embodiments as well.
[0023] Augmented reality (AR) provides an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory, and olfactory. Virtual reality (VR) provides a simulated experience in a virtual and realistically modeled environment. More generally, extended reality (XR) includes all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearable devices. With the advent of XR applications, user-defined safety boundary features, i.e., a closed boundary whose interior is deemed a safe space in a real-world environment for user activities in a simulated scene, are becoming more and more important application concerns.
[0024] Among other features, many popular AR and VR applications have been deployed with a similar safety boundary setup to designate an obstacle-free play area and keep users away from undesired collisions with real-world objects. For example, a guardian system is adopted to allow users to define a room-scale play space or simply utilize a pre-defined stationary circular area as a safety zone. When the user inside the zone moves closer to the safety boundary, a warning signal will be displayed. The room-scale boundary is designated through user interaction. For example, a user can draw a continuous virtual curve on the real-world ground image and the system will lift the curve to three dimensions (3D) to form a closed boundary. In another example, a stationary mode can automatically place a pre-defined circle around the user, overlaid on the image of the real ground. The system then lifts the circle to 3D to form a virtual cylindrical wall. However, such drawing is only designated in 2D space on the image of the ground plane. Neither 3D interaction nor body movements are fully explored to assist the safety boundary generation more naturally.
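By way of illustration only (this sketch is not part of the patent disclosure), the proximity-warning behavior of such a stationary circular zone can be approximated in a few lines of Python; the function names, the 0.3 m warning margin, and the 2D user-position input are all assumptions made for the sketch.

import math

# Hypothetical helper: signed horizontal distance from the user to the
# circular boundary wall; positive values mean the user is inside the zone.
def distance_to_circular_boundary(user_xy, center_xy, radius_m):
    dx = user_xy[0] - center_xy[0]
    dy = user_xy[1] - center_xy[1]
    return radius_m - math.hypot(dx, dy)

# Hypothetical helper: trigger the warning once the user comes within
# warn_margin_m meters of the virtual wall (margin value assumed).
def should_warn(user_xy, center_xy, radius_m, warn_margin_m=0.3):
    return distance_to_circular_boundary(user_xy, center_xy, radius_m) < warn_margin_m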
[0025] In an XR device, it is common to track in a timely manner the 6-degrees-of-freedom (6DOF) positions and orientations (3 translational degrees and 3 rotational degrees) of both a headset and one or more hand-held controllers or sensors. The trackable signals from one or more hand-held controllers or sensors can be utilized in a user interface for enlarging/reducing an initial safety boundary. By applying the present invention, a user interface for generating the safety boundary while using an AR/VR head-mounted device can simply and dynamically provide the most freedom while still preventing collisions with objects in the real-world environment.
[0026] The following description is presented to enable one of ordinary skill in the art to make and use the invention and to incorporate it in the context of particular applications. Various modifications, as well as a variety of uses in different applications, will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to a wide range of embodiments. Thus, the present invention is not intended to be limited to the embodiments presented but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
[0027] In the following detailed description, numerous specific details are set forth in order to provide a more thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced without necessarily being limited to these specific details. In other instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention.
[0028] The reader’s attention is directed to all papers and documents which are filed concurrently with this specification, and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference. All the features disclosed in this specification (including any accompanying claims, abstract, and drawings) may be replaced by alternative features serving the same, equivalent, or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.

[0029] Furthermore, any element in a claim that does not explicitly state “means for” performing a specified function, or “step for” performing a specific function, is not to be interpreted as a “means” or “step” clause as specified in 35 U.S.C. Section 112, Paragraph 6. In particular, the use of “step of” or “act of” in the Claims herein is not intended to invoke the provisions of 35 U.S.C. 112, Paragraph 6.

[0030] Please note, if used, the labels left, right, front, back, top, bottom, forward, reverse, clockwise, and counter-clockwise have been used for convenience purposes only and are not intended to imply any particular fixed direction. Instead, they are used to reflect relative locations and/or directions between various portions of an object.
[0031] Figure 1A is a simplified diagram illustrating an external view of an XR device 115 remotely coupled with a controller device 125 according to the embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0032] It is to be understood that the term “extended reality” (XR) is broadly defined, which includes virtual reality (VR), augmented reality (AR), and/or other similar technologies. For example, XR device 115 as shown can be configured for VR, AR, or others. Depending on the specific implementation, XR device 115 may include a small housing for AR applications or a relatively larger housing for VR applications. Cameras 180A and 180B are configured on the front side of XR device 115. For example, cameras 180A and 180B are respectively mounted on the left and right sides of the XR device 115. In various applications, additional cameras may be configured below cameras 180A and 180B to provide an additional field of view and range estimation accuracy. Display 185 is configured on the backside of XR device 115. For example, display 185 may be a semitransparent display that overlays information on an optical lens in AR applications. In VR implementations, display 185 may include a non-transparent display. Optionally, the XR device 115 is an XR headset or an HMD device, wearable by a user. In some implementations, a user interface 195 may be built on the display 185 to allow the user to dynamically set or directly view a safety boundary for operating the XR device 115 in a real-world environment.
[0033] In some embodiments, a controller device 125 is included and wirelessly coupled with the XR device 115. For example, the XR device 115 is a head-mount display (HMD) device wearable by a user and the controller device 125 is a hand-held controller used by the user in XR applications. In various applications, the range or position of the controller device 125 may be sensed by the XR device 115. In some implementations, the controller device 125 may be held in a hand (or by a foot) of the user, thus providing a trackable 6DOF position and orientation of the hand (or foot) in motion. Optionally, the controller device 125 contains or is coupled with one or more sensors, e.g., a motion/gravity sensor, lidar sensor, parallax sensor, audio sensor, etc. The one or more sensors can provide corresponding position or range information via a connection/synchronization device to or in the controller device 125, which communicates the information wirelessly to the XR device 115.

[0034] Figure 1B is a simplified block diagram illustrating components of XR device 115 and controller device 125 according to the embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications.
[0035] In some embodiments, an XR device 115 (e.g., AR headset 115n as shown, or the like, worn or used by a user 120n) might include, without limitation, at least one of Lidar 130, motion/gravity sensor (or accelerometer) 135, connection/synchronization device 140, parallax device 145, processor 150, data store 155, speaker(s) or earpiece(s) 160, eye-tracking sensor(s) 165, light source(s) 170, audio sensor(s) or microphone(s) 175, front or front-facing cameras 180, display 185, and/or communication interface 190, and/or the like. In various embodiments, a controller device 125 (e.g., a hand-held controller) held in one of the hands of the user 120n is communicatively coupled with the XR device 115. The controller device 125 might include, in general and without limitation, at least one of sensors, a mode selector, I/O keys, a display, and/or the like.
[0036] In some instances, the processor 150 might be communicatively coupled (e.g., via a bus, via wired connectors, or via electrical pathways (e.g., traces and/or pads, etc.) of printed circuit boards ("PCBs") or integrated circuits ("ICs"), and/or the like) to each of one or more of the Lidar 130, the accelerometer 135, the connection/synchronization device 140, the parallax device 145, the data store 155, the audio output such as speaker(s) or earpiece(s) 160, the eye-tracking sensor(s) 165, the light source(s) 170, the audio sensor(s) or microphone(s) 175, the front camera(s) 180, the display 185 including a user interface 195, and/or the communication interface 190, and/or the like. Optionally, the sensors, including the Lidar 130, the accelerometer 135, the connection/synchronization device 140, the parallax device 145, the audio output such as speaker(s) or earpiece(s) 160, the eye-tracking sensor(s) 165, the light source(s) 170, and the audio sensor(s) or microphone(s) 175, are disposed in a combined sensor module. In various embodiments, data store 155 may include dynamic random-access memory (DRAM) and/or non-volatile memory. For example, images captured by cameras 180 may be temporarily stored in the DRAM for processing, and executable instructions (e.g., hand shape calibration and hand gesture identification algorithms) may be stored in the non-volatile memory. In various embodiments, data store 155 may be implemented as a part of the processor 150 in a system-on-chip (SoC) arrangement.
[0037] The eye-tracking sensor(s) 165 - which might include, without limitation, at least one of one or more cameras, one or more motion sensors, or one or more tracking sensors, and/or the like - track where the user's eyes are looking, which is used in conjunction with computation processing by the processor 150 to compare against images or videos taken in front of the XR device 115. When multiple images of a certain target, e.g., a limb, are captured by different cameras 180 and combined with the operation of the parallax device 145, the processor 150 can perform computation processing to yield range estimate information of the target in a certain coordinate system, for example, the distance of the limb relative to the XR device. The light source 170 might include a laser source, a white light source, an infrared light source, and the like. Optionally, the light source 170 may work together with the Lidar 130 to perform a dynamic range estimation for the user within a real-world environment during XR applications. The audio sensor(s) 175 might include, but are not limited to, microphones, sound sensors, noise sensors, and/or the like, and might be used to receive or capture voice signals, sound signals, and/or noise signals, or the like.
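For readers unfamiliar with parallax-based ranging, the following minimal sketch (not part of the patent disclosure) shows the classic rectified-stereo relation that a parallax device of this kind could rely on; the function name and the assumption of a calibrated, rectified camera pair are illustrative.

# Minimal sketch, assuming a calibrated and rectified stereo pair such as
# cameras 180A and 180B: depth follows the pinhole relation Z = f * B / d.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    # focal_px: focal length in pixels; baseline_m: camera separation in
    # meters; disparity_px: horizontal pixel offset of the target between
    # the left and right images.
    if disparity_px <= 0:
        raise ValueError("target must be visible in both cameras with positive disparity")
    return focal_px * baseline_m / disparity_px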
[0038] The front cameras 180 include their respective lenses and sensors used to capture images or video of an area in front of the XR device 115. For example, front cameras 180 include cameras 180A and 180B as shown in Figure 1B, and they are configured respectively on the left and right sides of the headset housing. In various implementations, the sensors of the front cameras 180 may be low-resolution monochrome sensors, which are not only energy-efficient (without a color filter and the color processing thereof), but also relatively inexpensive, both in terms of device size and cost. In some implementations, the images captured by the cameras 180A and 180B can be used to identify the hand (or foot) of the user, either in stretching or retracting gestures. The identification of the hand (or foot) includes collecting and tracking information about the corresponding 6DOF position and orientation of the user’s hand (or foot) in motion during the XR applications. In some implementations, this information can be utilized via a graphical user interface 195, directly viewable on the display 185 by the user, to naturally define or modify the safety boundary of the XR applications. Optionally, the motion/gravity sensor or accelerometer 135 disposed in the headset device or the controller, or directly attached to the fingers of the hand (or foot) of the user 120n, might provide data for directly determining the location of the corresponding headset device, controller, or hand (or foot). Optionally, the location is indicated by a 6DOF position and orientation of the object with respect to the XR device 115n.
[0039] The processor 150, in various embodiments, is configured to perform hand detection and hand prediction processes. In various embodiments, processor 150 includes a central processing unit (CPU), graphic processing unit (GPU), and neural processing unit (NPU). For example, hand detection processes may be performed by the NPU, and hand prediction may be performed by the CPU and/or NPU. The processor 150 is able to process all data (numerical or graphical) collected by sensors in the XR device 115 as well as the controller device 125 and produce results of object identification, location tracking, range estimation, distance measurement, orientation determination, and other information for generating the user-defined safety boundary and more.
[0040] In XR applications, the field of view of each front camera 180 overlaps with the field of view of the eyes of the user 120 in the captured images or video. The display screen(s) and/or projector(s) 185 may be used to display or project the generated image overlays (and/or to display an output image or video that combines the generated image overlays superimposed over images or video of the actual area). The communication interface 190 provides wired or wireless communication with other devices and/or networks. For example, communication interface 190 may be connected to a computer for tethered operations, where the computer provides the processing power needed for graphic-intensive applications.
[0041] In various embodiments, controller device 125 communicates wirelessly with the communication interface 190 of the XR device 115. All sensor data may be synchronized and transmitted through a connection/synchronization device for the controller device. At least in some instances, sensor data information about the 6DOF position and orientation, control information, and mode selector setting information can be loaded or retrieved by the user who holds the controller device 125 in hand or remotely controls it through voice. These sensor data signals, control signals, and setting signals may still be synchronized and transmitted dynamically to the XR headset following the motion of the user in the XR application. The 6DOF position and orientation of the hand (or foot) in stretching or retracting gestures can be measured or determined dynamically to define an initial safety boundary and to further re-edit the safety boundary based on the user’s XR activity in a real-world environment.
[0042] Other embodiments of this XR device with or without a controller device may include corresponding computer systems, apparatus, and computer programs recorded on one or more computer storage devices, each configured to perform the actions of the methods. Further details of methods for defining and modifying user safety boundaries for XR applications and related techniques are discussed with reference to the following figures.
[0043] Figure 2 is a simplified diagram illustrating a cylindrical boundary defined by a user according to the embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. The present invention is designed to provide a natural interaction that allows a user of an XR application to define or set the safety boundary, according to their real-world environment, while wearing an XR headset device. As will be explained in detail below, the user can dynamically generate a virtual safety boundary by stretching their hands and/or feet. The system tracks the position of the hands and feet, for example, by tracking hand-held controllers using LED lights, tracking bare hands using computer vision, placing sensors on fingers, etc. For example, the term “computer vision” is broadly defined to include various types of cameras and capturing devices that allow the XR apparatus to capture images that are later processed. This permits the user to build their safety boundary in a 3D space instead of on a 2D ground plane. The present invention offers the user different types of safety boundary generation interfaces.
[0044] In some embodiments, as shown in Figure 2, the XR device user wears a head-mount display (HMD) device, e.g., the XR device 115 of Figure 1B, whose three-dimensional (3D) position can be easily determined in a coordinate system defined by the user standing on a ground plane when the XR application starts. The user may initiate a boundary-setting process by using one hand to generate a cylinder as the safety boundary. A central axis of the cylinder can be defined by the 3D position of the HMD device and the direction of gravity. The HMD device measures the distance from the hand to the central axis based on the hand’s position data. The data might be collected by one of the sensors attached to the hand or on a hand-held controller. The data then might be synchronized by a connection/synchronization device in the controller and sent to the processor in the HMD device to process for measuring the distance. Then, the measured distance can be used as a radius to form a regular cylindrical safety boundary as shown in Figure 2. Therefore, when the user stretches their arm outward or inward to change the distance, the radius of the cylinder, i.e., the size of the safety boundary, may expand or shrink accordingly. A minimum size of the cylinder can be enforced (e.g., the radius is set to at least 0.5 m). If both hands are used, two distances can be collected simultaneously. The larger of the two distances is adopted as the radius of the cylinder (e.g., as shown in Figure 2).
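As a concrete and purely illustrative sketch of this radius-selection rule, the snippet below takes whatever limb distances are currently tracked and enforces the minimum radius; the function name and the list-based input are assumptions, while the 0.5 m floor comes from the example above.

# Minimal sketch of the radius rule: the radius is the largest tracked
# limb-to-axis distance, but never below the enforced minimum.
def initial_radius(limb_distances_m, minimum_radius_m=0.5):
    # limb_distances_m: distances (in meters) from each tracked hand or
    # foot to the central axis; may hold one or two entries.
    return max([minimum_radius_m, *limb_distances_m])

# Example: left hand at 0.8 m, right hand at 0.6 m -> radius 0.8 m;
# a single hand at 0.3 m -> radius clamped to the 0.5 m minimum.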
[0045] There are multiple ways to measure the distance between the hand and the central axis. For example, the HMD device may be coupled with one or more hand-held controllers (e.g., controller device 125 in Figures 1A and 1B). The controller, coupled with the processor in the HMD device, is configured to measure the 6DOF position and orientation of itself (or, correspondingly, of the hand holding the controller) with respect to the same coordinate system as the HMD device. In other examples, the HMD device processes one of the alternative sensor signals, such as head and hand positions tracked by computer vision algorithms, by a motion/gravity or accelerometer sensing program, by Lidar range estimation, or by other detecting and tracking methods, to generate a dynamic user-defined safety boundary. Note that during this process, the cylindrical safety boundary (or a boundary of another shape) will follow the movement of the HMD device (i.e., the central axis always passes through the location of the HMD and is parallel to the gravity direction). Optionally, once the safety boundary is confirmed by the user, it will be fixed in shape and size. This provides the user with a natural interaction to generate a user-defined safety boundary, here, a simple cylindrical safety boundary.
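Geometrically, the hand-to-axis measurement reduces to the perpendicular distance from a 3D point to the gravity-aligned line through the HMD. A minimal sketch follows, assuming 3D positions expressed in a common world frame and a unit-length gravity direction; all names are illustrative.

import math

# Sketch: perpendicular distance from a tracked hand position to the
# central axis, i.e., the line through the HMD position along gravity.
def distance_to_central_axis(hand_pos, hmd_pos, gravity_dir):
    # Offset from the axis origin (HMD position) to the hand.
    off = [h - o for h, o in zip(hand_pos, hmd_pos)]
    # Component of the offset along gravity (gravity_dir assumed unit-length).
    along = sum(o * g for o, g in zip(off, gravity_dir))
    # Remove the along-gravity component; what remains is perpendicular.
    perp = [o - along * g for o, g in zip(off, gravity_dir)]
    return math.sqrt(sum(c * c for c in perp))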
[0046] In the previous operation mode, the cylindrical boundary (with the radius being fixed) can extend infinitely in the vertical direction. In another embodiment, a base at each end of the cylinder can also be designated so that the safety boundary is bounded by two end surfaces. The two bases are perpendicular to the gravity direction, based on which the height of the cylinder, initially set to a pre-defined value, can be measured by the processor in the HMD device. In some embodiments, the user may want to edit the height of the cylindrical safety boundary in addition to the radius of the initial cylinder boundary. Once the user is satisfied with the radius, a height adjustment mode can be initiated (e.g., by clicking a button on the controller). Then, in some instances, the user can use the controller to adjust the two ends of the cylinder. They can touch their feet using the controller to indicate the bottom end of the cylinder. They can also use HMD cameras to look at the feet or ground and use computer vision and parallax devices to estimate the positions of the feet or ground. In some other instances, a pre-configured body height from the user can also be a feasible alternative. To compute the bottom end, the system can subtract the adjusted body height from the HMD device location. In yet other instances, the user can also use sensors attached to the raised hand of the stretched-up arm to indicate the top base of the cylinder. As hand or feet sensor signals change, the safety boundary will change accordingly. Again, once the safety boundary is confirmed by the user, it will be fixed, here a cylinder shape with a fixed radius and a fixed height. In other embodiments, the HMD device coupled with the controller device can be configured to generate user-defined safety boundaries in other fixed shapes, such as a capsule shape, a sphere, an ellipsoid, or a combination of several geometric primitives, without being limited to a cylindrical radius and height. The present invention allows the user to change the parameters of these geometric primitives using tracked locations of hands, controllers, feet, and HMD devices.
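The vertical bounding described above can likewise be sketched in a few lines. In this illustrative snippet (not part of the patent disclosure), the bottom base is computed by subtracting a body height from the HMD's vertical coordinate, and the top base comes either from a raised-hand measurement or from an assumed default headroom above the HMD.

# Sketch: compute the two horizontal bases of the cylinder. Heights are
# vertical coordinates (meters) in a gravity-aligned world frame.
def cylinder_ends(hmd_height_m, body_height_m, raised_hand_height_m=None,
                  default_headroom_m=0.7):
    # Bottom base: subtract the (adjusted) body height from the HMD height.
    bottom_m = hmd_height_m - body_height_m
    # Top base: a raised-hand measurement if available, otherwise an
    # assumed headroom above the HMD (the 0.7 m default is illustrative).
    top_m = raised_hand_height_m if raised_hand_height_m is not None \
        else hmd_height_m + default_headroom_m
    return bottom_m, top_m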
[0047] Figure 3 is a simplified diagram illustrating an output image including a boundary overlaying an image of the environment according to embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. As an example, a mesh boundary overlays an image 303. The mesh boundary includes a cylindrical portion 301 and a modified portion 302. For example, cylindrical portion 301 is formed based on a radius value, and the modified portion 302 may be formed by manual modification from a user. In various embodiments, starting from a cylindrical safety boundary or another fixed-shape safety boundary, the present invention also allows the user to further modify it and create an irregularly shaped safety boundary. In this mode, the HMD device or XR system and methods afford the most freedom as compared with the previous two operation modes. The system first creates an initial safety boundary as in the previous two modes. Next, the user can use a user interface on the display of the HMD device to further modify the safety boundary. As an example, the HMD device can switch the display from the virtual scene to a video pass-through mode. In this mode, as shown in Figure 3, the safety boundary is graphically illustrated as a mesh-like interface overlaying the environment background image.
[0048] With the user interface displaying the mesh-like boundary, the user can place their hand (or hand-held controller) outside of the boundary so that the safety boundary will be updated based on the updated location of the hand. The process resembles the boundary being “pushed” out by the hand movement. This update can happen at each frame, every several frames, or only when a button on the hand-held controller is pressed to ensure that the update is intended by the user. A new continuous boundary surface will be generated, leaving irrelevant parts unchanged.
[0049] There are multiple ways to implement this feature. In some embodiments, for safety boundaries that are extended from a 2D horizontal contour to 3D along the vertical direction (e.g., a cylinder), if a 3D location outside of the boundary is detected, the XR device and operation methods are configured to project the 3D location onto the horizontal cross-section plane to compute a 2D position on the horizontal cross-section plane. Next, a spline curve can be leveraged to connect this 2D position with its nearest boundary points into a new smooth boundary. The new boundary on the horizontal cross-section plane is then lifted back to 3D to form a new safety boundary. In other embodiments, the user may use the intersection of a ground plane (where the user's feet are standing) and a virtual laser originating from the hand-held controller as the indication of a boundary update. Alternatively, 3D mesh deformation can also be used to modify initial boundaries that are represented as a 3D mesh.
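One plausible realization of the project-splice-smooth-lift update is sketched below; it is not the patent's prescribed implementation. It assumes NumPy and SciPy are available, that the contour is a closed polyline of 2D vertices, and that lifting back to 3D is done elsewhere by extruding the returned contour vertically.

import numpy as np
from scipy.interpolate import splprep, splev

# Sketch: splice a pushed-out point into a closed 2D boundary contour and
# re-smooth the contour with a periodic spline.
def update_contour(contour_xy, pushed_point_3d, num_samples=200):
    # Project the 3D location onto the horizontal cross-section plane by
    # dropping the vertical coordinate.
    p2d = np.asarray(pushed_point_3d[:2], dtype=float)
    contour = np.asarray(contour_xy, dtype=float)
    # Insert the new vertex next to its nearest existing boundary vertex.
    nearest = int(np.argmin(np.linalg.norm(contour - p2d, axis=1)))
    contour = np.insert(contour, nearest + 1, p2d, axis=0)
    # Fit a periodic (closed) smoothing spline through the vertices and
    # resample it into a new smooth boundary.
    tck, _ = splprep([contour[:, 0], contour[:, 1]], s=0.01, per=True)
    u = np.linspace(0.0, 1.0, num_samples)
    x, y = splev(u, tck)
    return np.stack([x, y], axis=1)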
[0050] Figure 4 is a simplified diagram illustrating a method 400 for setting an XR safety boundary using one or more hands according to the embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. As an example, one or more steps may be added, removed, repeated, replaced, rearranged, modified, and/or overlapped, and they should not unduly limit the claims.
[0051] In some embodiments, the present invention provides one of the methods for operating an extended reality (XR) device, i.e., method 400 for setting an XR safety boundary. At step 402 of method 400, a boundary-setting process is initiated. According to the embodiments, the XR device (e.g., XR device 115 shown in Figures 1A and 1B) is turned on by a user, which readies the cameras, light sources, and various sensors for capturing images or collecting data, sets the display to show a user interface overlapping a background image, and enables the processor to process the captured images and collected sensor data. The boundary-setting process is then initiated by the user. Optionally, the XR device is configured to detect an absence of an XR safety boundary, which automatically triggers the initiation of the boundary-setting process via a notification to the user. The notification may be an audio indication through an audio output or a visual indication through a user interface display. For example, the user can use the user interface on the display of the XR device for setting the safety boundary.
[0052] At step 404, the initial location of the XR device is determined. In some embodiments, the XR device is a head-mount display (HMD) device as illustrated in Figure 1A and Figure 1B. A 3D position of the HMD device at its initial location can be determined by the HMD device. The processor of the HMD device can process sensor data from the cameras and accelerometer to calculate the coordinates corresponding to the initial location of the HMD device in a coordinate system based on the user standing on a ground surface. The coordinates define the initial 3D position of the HMD device, which can be stored in its data store or retrieved for conducting the boundary-setting process. In a specific example, the initial 3D position of the HMD device can be used to calculate a virtual central axis that passes through the position, parallel to the gravity direction. The information of the virtual axis can also be stored in the data store of the HMD device.
[0053] At step 406, at least a first image is captured using a first camera disposed in the HMD device. For example, capturing the first image is triggered as the boundary-setting process is initiated. Two cameras 180A and 180B are respectively mounted on the left and right sides of the HMD device 115. Optionally, the first image captured by camera 180A on the left side of the HMD device is aimed at the user’s left hand as the left arm is stretched out. In some implementations, a second image of the left hand is also captured by a second camera, for example, by camera 180B on the right side of the HMD device. Optionally, a second image of the right hand is captured by the second camera aiming at the user’s right hand.
[0054] At step 408, a first limb is identified from the first image. In the previous example, the first limb is the left hand. The identification of the hand from the captured image is a pre-established hand-gesture identification function of the HMD device. In some implementations, hand identification is conducted by determining shapes and key points associated with the first limb based on multiple 3D key point detection, tracking, depth calculation, and other computations performed in the processor of the HMD device. At least a 3D position representing the first limb in the same coordinate system can be determined and stored in the data store of the HMD device. Optionally, a second limb, i.e., the right hand, is also identified from the second image captured, for determining shapes and key points associated with the second limb, so that a 3D position representing the right hand in the same coordinate system is also determined and stored in the data store.
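The patent does not prescribe a particular key-point detector. Purely as an illustration, an off-the-shelf hand-landmark library such as MediaPipe could supply the per-frame 2D key points that the depth/parallax computation then lifts to 3D; the snippet below is a sketch under that assumption.

import mediapipe as mp

# Sketch: detect hand landmarks in an RGB frame using MediaPipe Hands.
# rgb_image is an HxWx3 uint8 NumPy array (e.g., a frame from camera 180A).
hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=2)

def detect_hand_keypoints(rgb_image):
    results = hands.process(rgb_image)
    if not results.multi_hand_landmarks:
        return []  # no hand identified in this frame
    # Each landmark carries normalized (x, y) image coordinates; landmark 0
    # is the wrist, a reasonable single point to represent the limb.
    return [(lm.landmark[0].x, lm.landmark[0].y)
            for lm in results.multi_hand_landmarks]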
[0055] At step 410, when the first limb is identified, a first distance between the first limb and the initial location of the HMD device is determined. In an example, the first distance refers to the distance from the 3D position representing the first limb to the virtual central axis in the gravity direction that passes through the 3D position of the HMD device. The first distance determination is performed by the processor of the HMD device. Optionally, the first distance may be determined from parallax data between the first image and the second image respectively captured by a first camera and a second camera, both aiming at the first limb.

[0056] At step 412, the first distance is compared with a minimum distance. The minimum distance is a preset value (e.g., 0.5 m) stored in the data store of the HMD device. At step 414, a radius value is generated by selecting the greater value of the first distance and the minimum distance. Optionally, a second distance is also retrieved from the data store. In this case, the radius value is selected as the greatest value among the first distance, the minimum distance, and the second distance. Again, all the above steps 410, 412, and 414 can be performed easily by the processor of the HMD device. The radius value can be outputted via the user interface on the display of the HMD device. For example, the user can set a cylindrical-shaped boundary via the user interface using the radius value.
[0057] At step 416, a default height value is obtained. Optionally, the default height value is pre-stored in the data store of the HMD device. Optionally, the default height value is equal to or larger than an average human height. In some implementations, the default height value is an average human height plus an average human arm length. At step 418, a ceiling value is set to the default height value. In the previous example, the ceiling value can be used by the user to set a top-end boundary of the cylindrical-shaped boundary, with the ground plane (where the user stands) being set as the bottom-end boundary. In various embodiments, the user uses the user interface on the display of the HMD device to complete the setting of an initial cylinder boundary based on the radius around the initial location of the HMD device and a ceiling based on the ceiling value. Once the initial boundary is set, it is a fixed, cylindrical-shaped boundary for starting the operation of the XR device. In some implementations, the user may adjust the ceiling value to reset the height of the safety boundary. In some implementations, the user may adjust the radius value of the safety boundary based on a second distance of the first limb relative to the virtual axis that passes through the HMD device. In some implementations, the user may adjust the geometric primitives differently, dependent on the radius value, the first distance, the second distance, and the ceiling value obtained for the initial safety boundary, to generate an alternately shaped safety boundary, including a capsule shape, a sphere, an ellipsoid, or a combination of several geometric primitives. In some implementations, the user may use a controller to obtain the alternative geometric primitives for generating a fixed-shape safety boundary. In some implementations, the user may use the HMD device to look at one foot of the user to adjust the geometric primitives. In some implementations, the user may generate a user interface as a 3D mesh overlapping an environmental image as an output image displayed on the display of the HMD device. In some implementations, the user may use a hand or a controller in hand to interactively adjust the 3D mesh, e.g., using 3D mesh deformation, in a manner natural to the user during the XR application.
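Tying steps 410 through 418 together, the following sketch shows how a runtime could assemble the initial cylindrical boundary; the data-class shape, the 2.4 m default ceiling, and the function names are all illustrative assumptions, and the radius rule is the max-selection shown earlier.

from dataclasses import dataclass

# Sketch: the initial boundary as a simple record (illustrative only).
@dataclass
class CylinderBoundary:
    center_xy: tuple   # HMD location projected onto the ground plane
    radius_m: float    # from steps 410-414
    ceiling_m: float   # from steps 416-418

def define_initial_boundary(hmd_xy, limb_distances_m,
                            min_radius_m=0.5, default_height_m=2.4):
    # Steps 410-414: greatest of the tracked distances and the minimum.
    radius_m = max([min_radius_m, *limb_distances_m])
    # Steps 416-418: ceiling set to the (assumed) default height value.
    return CylinderBoundary(center_xy=hmd_xy, radius_m=radius_m,
                            ceiling_m=default_height_m)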
[0058] Figure 5 is a simplified diagram illustrating method 500 for setting an XR boundary using one or more limbs according to the embodiments of the present invention. This diagram is merely an example, which should not unduly limit the scope of the claims. One of ordinary skill in the art would recognize many variations, alternatives, and modifications. As an example, one or more steps may be added, removed, repeated, replaced, rearranged, modified, and/or overlapped, and they should not unduly limit the claims.
[0059] In some embodiments, the present invention provides one of the methods for operating an extended reality (XR) device, i.e., the method 500 for setting an XR safety boundary. At step 502 of method 500, a boundary-setting process is initiated. According to the embodiments, the XR device (e.g., XR device 115 shown in Figures 1A and 1B) is a headset device worn by a user and turned on, which readies the cameras, light sources, and various sensors for capturing images or collecting data, sets the display to show a user interface overlapping a background image, and enables the processor to process the captured images and collected sensor data; the boundary-setting process is then initiated by the user. Optionally, a notification is generated for the user to start a boundary-defining process. Optionally, the notification is an audio indication provided through an audio output of the headset device. Optionally, the notification is a visual indication provided via a graphical user interface on the display of the headset device. For example, the user can use the user interface on the display for inputting parameters and editing the settings for the XR safety boundary. In some implementations, the XR device is configured to detect an absence of an XR safety boundary during its operation, which automatically triggers the initiation of the boundary-setting process via a notification to the user. The notification may be an audio indication through an audio output or a visual indication through the user interface displayed as an output image on top of an environmental image around the XR device.
[0060] At step 504, the initial location of the XR device is determined. In some embodiments, the XR device, i.e., the headset device, is illustrated in Figure 1A and Figure 1B. A 3D position of the headset device at its initial location can be determined by the headset device itself in a coordinate system based on the user standing on a ground surface. The initial 3D position of the headset device can be stored in its data store or retrieved for conducting the boundary-setting process. In a specific example, the initial 3D position of the headset device can be used to calculate a virtual central axis that passes through the position, parallel to the gravity direction. The information of the virtual axis can also be stored in the data store of the headset device.
[0061] At step 506, the XR device is connected to a first controller. In various embodiments, the first controller is a controller device 125 shown in Figure 1B and contains a connection device 124 to establish wireless communication with a communication interface 190 of the XR headset device 115.
[0062] At step 508, the first location of the first controller is determined. In various embodiments, the headset is worn on the head of the user and the first controller is held by one limb (e.g., the left hand) of the user and connected wirelessly to the headset. The first location of the first controller is a location reached by the user’s left hand, first stretched out while holding the first controller. At least a 3D position corresponding to the first location of the first controller, in the same coordinate system based on the user (e.g., standing on the ground surface), can be determined via image parallax or calculated from sensor data collected from one of the accelerometer, Lidar, parallax device, and the like, and transferred to the processor in the headset device. The 3D position corresponding to the first location of the first controller is stored in the data store of the headset device. In some implementations, a second controller, held by the right hand of the user, is also connected wirelessly to the headset. Similarly, a 3D position corresponding to a second location of the second controller in the same coordinate system can be calculated by the processor of the headset device based on similar sensor data from the second controller. The 3D position corresponding to the second location of the second controller is also stored in the data store of the headset device.
[0063] At step 510, a first distance between the first location and the initial location of the headset device is determined. In an example, the first distance refers to the distance from the 3D position of the first controller at the first location to the virtual central axis in the gravity direction that passes through the 3D position of the headset device at its initial location. The first distance determination is performed by the processor of the headset device. At step 512, the first distance is compared with a minimum distance. The minimum distance can be a preset value (e.g., 0.5 m) stored in the data store of the headset device. At step 514, a radius value is generated by selecting the greater value of the first distance and the minimum distance. Optionally, a second distance is also retrieved from the data store. In this case, the radius value is selected as the greatest value among the first distance, the minimum distance, and the second distance. Again, these steps 510, 512, and 514 can be performed easily by the processor of the XR headset device. The radius value can be outputted via the user interface displayed on the display of the headset device. Then, the user can set an initial cylindrical-shaped boundary (without a ceiling) having the radius value via the user interface.
[0064] At step 516, a default height value is obtained. Optionally, the default height value is pre-stored in the data store of the headset device. Optionally, the default height value is equal to or larger than an average human height, referring to the height from the headset worn by the user to the ground on which the user stands. Optionally, the default height value is an average human height plus an average human arm length, assuming that the user raises an arm upward in a maximum stretching mode.
[0065] At step 518, a ceiling value is set to the default height value. In the previous example, the ceiling value can be used by the user to set a top-end plane for the initial cylindrical-shaped boundary, with a ground plane (where the user stands) being set as the bottom-end plane of the initial cylindrical-shaped boundary. In various embodiments, the user uses the user interface displayed on the display of the headset device to complete the setting of an XR safety boundary based on the radius around the initial location of the headset device and a ceiling based on the ceiling value. Once the initial boundary is set, it is a fixed, cylindrical-shaped boundary for starting the operation of the XR device. Optionally, the user may adjust the ceiling value to reset the height of the safety boundary. Optionally, the user may adjust the radius value of the safety boundary based on a second distance of the first controller relative to the virtual axis that passes through the HMD device, where the second distance is larger at some location/orientation than the radius value normally used. Optionally, the user may adjust one or more geometric primitives differently, dependent on the radius value, the first distance, the second distance, and the ceiling value obtained for the initial safety boundary, to generate an alternately shaped safety boundary, including a capsule shape, a sphere, an ellipsoid, or a combination of several geometric primitives. Optionally, the user may use a controller to obtain the alternative geometric primitives for generating a fixed-shape safety boundary. Optionally, the user may use the HMD device to look at one foot of the user to adjust one of the geometric primitives. Optionally, the user may generate a user interface as a 3D mesh overlapping an environmental image as an output image displayed on the display of the HMD device. Optionally, the user may use a hand or a controller in hand to interactively adjust the 3D mesh, e.g., using 3D mesh deformation, in a manner natural to the user during the XR application.
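For the alternate primitives mentioned above, the runtime ultimately needs a membership test. As one illustrative example (not prescribed by the patent), the sketch below tests whether a tracked 3D point lies inside a capsule-shaped boundary, i.e., a cylinder with hemispherical caps defined by a vertical axis segment and a radius.

import math

# Sketch: point-in-capsule test. seg_a and seg_b are the 3D endpoints of
# the capsule's axis segment; radius_m is the capsule radius.
def inside_capsule(point, seg_a, seg_b, radius_m):
    ab = [b - a for a, b in zip(seg_a, seg_b)]
    ap = [p - a for a, p in zip(seg_a, point)]
    denom = sum(c * c for c in ab)
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = 0.0 if denom == 0.0 else max(0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / denom))
    closest = [a + t * c for a, c in zip(seg_a, ab)]
    # Inside the capsule iff within radius_m of the axis segment.
    return math.dist(point, closest) <= radius_m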
[0066] While the above is a full description of the specific embodiments, various modifications, alternative constructions and equivalents may be used. Therefore, the above description and illustrations should not be taken as limiting the scope of the present invention which is defined by the appended claims.

Claims

WHAT IS CLAIMED IS:
1. A method for operating an extended reality device, comprising: initiating a boundary-setting process; determining an initial location of the extended reality (XR) device; capturing at least a first image using a first camera; identifying a first limb from the first image; determining a first distance between the first limb and the initial location; comparing the first distance to a minimum distance; generating a radius value by selecting a greater value of the first distance and the minimum distance; and defining an initial boundary, the initial boundary comprising a radius around the initial location.
2. The method of claim 1 further comprising determining the initial location using at least an accelerometer.
3. The method of claim 1 further comprising generating a notification to start a boundary-defining process for a user.
4. The method of claim 3 wherein the notification comprises at least one of a visual indication or an audio indication.
5. The method of claim 3 further comprising: detecting whether a 3D location is outside of the boundary; projecting the 3D location onto a horizontal cross-section plane to compute a 2D position on the horizontal cross-section plane; generating a spline curve to connect the 2D position with its nearest boundary to form a 2D boundary on the horizontal cross-section plane; and lifting the 2D boundary from the horizontal cross-section plane to 3D to form a new boundary.
6. The method of claim 1 further comprising determining shapes and key points associated with the first limb.
7. The method of claim 1 further comprising: obtaining a default height value; and setting a ceiling value to the default height value; wherein the initial boundary further comprises a ceiling based on the ceiling value.
8. The method of claim 1 further comprising: identifying a second limb; determining a second distance between the second limb and the initial location; comparing the first distance and the second distance; and selecting the greater value of the first distance and the second distance.
9. The method of claim 1 further comprising: detecting an absence of the initial boundary; and initiating the boundary-setting process upon detecting the absence.
10. The method of claim 1 further comprising modifying the ceiling value based on a second distance of the first limb.
11. The method of claim 1 further comprising modifying the ceiling value using a controller.
12. The method of claim 1 further comprising modifying the initial boundary based on movements of the first limb.
13. The method of claim 1 further comprising initiating and/or modifying the initial boundary using a controller.
14. A method for operating an extended reality device, comprising: initiating a boundary-setting process; determining an initial location of the extended reality (XR) device; connecting the XR device to a first controller; determining a first location of the first controller; determining a first distance between the first controller and the initial location; comparing the first distance to a minimum distance; generating a radius value by selecting the greater value of the first distance and the minimum distance; obtaining a default height value; setting a ceiling value to the default height value; and defining an initial boundary, the initial boundary comprising a radius around the initial location and a ceiling based on the ceiling value.
15. The method of claim 14 further comprising providing a user interface for modifying the initial boundary using the first controller.
16. The method of claim 14 further comprising: capturing an image of an environment surrounding the XR device; generating an output image using the initial boundary and the image; and displaying the output image.
17. An extended reality (XR) apparatus comprising: a housing having a front side and a rear side; a first camera configured on the front side, the first camera being configured to capture a plurality of two-dimensional (2D) images at a predefined frame rate, the plurality of 2D images including a first image; a sensor module configured for determining a first location of the XR apparatus; a wireless communication interface connected to a controller device; a display configured on the rear side of the housing, the display being configured to display an output image; a memory coupled to the first camera and being configured to store the plurality of 2D images; and a processor coupled to the memory; wherein the processor is configured to: determine a first distance between a first limb and the first location; compare the first distance to a minimum distance; generate a radius value by selecting the greater value of the first distance and the minimum distance; define an initial boundary, the initial boundary being substantially cylindrical and comprising a radius around the initial location and a ceiling; and generate the output image using the first image and the initial boundary.
18. The apparatus of claim 17 wherein the first distance is based on a second location of the controller device being held in the first limb.
19. The apparatus of claim 17 wherein the display is configured to show a user interface for modifying the initial boundary.
20. The apparatus of claim 17 wherein the initial boundary is characterized by a central axis passing through a location of the XR apparatus, the central axis being parallel to a gravity direction.