
WO2018142418A1 - An apparatus, method, and system for augmented and mixed reality viewing - Google Patents


Info

Publication number
WO2018142418A1
Authority
WO
WIPO (PCT)
Prior art keywords
smartphone
camera
display
head
cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IN2017/050304
Other languages
French (fr)
Inventor
Kshitij Marwah
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Publication of WO2018142418A1 publication Critical patent/WO2018142418A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention is an augmented and mixed reality viewing apparatus, method, and system: a head-mounted display that uses a Smartphone as an add-on, together with specialized optics, to display augmented and mixed reality holograms and to support a new gesture-based interaction paradigm. The specialized optics include, but are not limited to, curved mirrors or waveguides, a pair of lenses 4, and a pair of beam splitters 5. The device has a mounting point into which any Smartphone 1 can slide. The optics project two or more distinct views through this optical arrangement so that the user sees a projected hologram, tracked with single or multiple RGB and RGB-D cameras using a Visual-Inertial SLAM method. Gesture detection enables a natural user interaction paradigm for applications in gaming, media and entertainment, and workspace interaction, using the in-built hand and gesture tracking method.

Description

AN APPARATUS, METHOD, AND SYSTEM FOR
AUGMENTED AND MIXED REALITY VIEWING
BACKGROUND OF THE INVENTION
Smartphones, specifically those with touch screens, have enabled a revolution in which content is consumed in a far more natural form than before. But all media and interaction remain trapped behind a glass screen, and this is not how we see the world around us.
Hence, there is a need for a natural user experience in which the user can touch, feel, and interact with the digital world the way they do with the physical world. The augmented and mixed reality viewing device provides head-mounted gear and an unencumbered rendering of the virtual world, in which the user feels as if he or she is interacting with natural objects. By combining specialized optics and tracking methods, the device can simulate a virtual world that closely resembles the reality around us, allowing for a true mixed reality experience.
FIELD OF THE INVENTION
This invention proposes an apparatus, method, and system for augmented and mixed reality viewing using a spectacles-type add-on for a mobile phone.
DISCUSSION OF PRIOR ART
US20110157667A1, titled "Holographic image display systems", describes a head-up holographic display that consists of (i) a laser light source, (ii) a Spatial Light Modulator (SLM) to show holograms of two-dimensional images, (iii) optics to illuminate the SLM and to create an image in the observer's eye, and (iv) imaging optics required to form an SLM image plane in the eye box. For the hologram to be displayed, the image should be placed less than ten meters from the observer's eye. For the two-dimensional images to appear at distinct distances from the observer's eyes, the images are kept at different focal plane depths, with different colors according to their distance from the eyes. To increase the size of the eye box, several image planes are kept in the SLM. The fan-out optics consist of a microlens or a beam splitter. US6124954A, titled "Projection screen based on reconfigurable holographic optics for implementation in head-mounted displays", describes a head-mounted display system consisting of a frame designed to be placed on the viewer's head, means attached to the frame for generating right and left images of a scene in a predetermined direction, light-manipulation means attached to the frame for scattering the right and left images in front of the observer's right and left eyes, and viewing means coupled with the light-manipulation means for displaying the images in front of the observer's eyes. The holographic optical elements manipulate monochromatic light so that the images are diffused to the viewing means.
US8547615B2, titled "Head-mounted display device for generating reconstructions of three-dimensional representations", describes a head-mounted display device that reconstructs three-dimensional displays. It comprises a frame, resembling goggles or a helmet, with a front section and two side sections. The front section sits in front of the eyes and consists of a light source, an optical system, and a light modulator. The light modulator is placed at an observer window in the observer plane, or in front of the observer's eye if the light modulator encodes the hologram, so that the hologram can be transformed to the size of the observer window. The light modulators are also connected to an encoding device, where the holograms, or wavefronts, are calculated from three-dimensional representations of surfaces. When the light modulator is illuminated, complex wavefronts of the three-dimensional representations appear in the observer window.
US20070002412A1, titled "Head-up display apparatus", describes a head-up apparatus consisting of a light source for emitting light, a hologram-switching unit connected to switching-hologram devices, and a control unit that supplies voltage to the devices connected to the switching holograms. Diffractions in different directions are shifted in the horizontal and vertical directions. A light source control unit controls the light emitted, and a timing control unit controls the timing of the light switching and of the voltage supplied to the switching-hologram devices.
US20060250671A1, titled "Device for holographic reconstruction of three-dimensional scenes", describes a device that reconstructs three-dimensional scenes as holographic videos. The device focuses coherent light toward the observer's eyes through a Spatial Light Modulator (SLM). The device has several illumination units that illuminate the surface of the SLM; each unit consists of a focusing element that illuminates a different region with the help of the lighting means. All illuminated regions together reconstruct the scene in three dimensions using the video hologram. The illumination units emit rays that coincide with the observer window.
SUMMARY OF THE INVENTION
The present invention is a head-mounted display that uses a Smartphone as an add-on, together with specialized optics and spatial tracking, to view augmented and mixed reality holograms and to support a new interaction paradigm based on natural gestures. The specialized optics include, but are not limited to, a pair of lenses, a pair of beam splitters, curved mirrors, and waveguides. The augmented and mixed reality viewing device has a mounting point into which any Smartphone can slide. Using high field-of-view lenses, beam splitters, curved mirrors, or waveguides, the device projects two or more distinct views through this optical arrangement, so that a user looking through the headset sees a hologram projected directly in front. Gesture detection allows for a natural user interaction paradigm, enabling applications in gaming, media and entertainment, workspace interaction, and much more. All of this is made possible by in-built spatial world tracking with single or multiple RGB cameras or depth sensors, using a battery- and GPU-optimized Visual-Inertial SLAM method combined with hand tracking and gesture recognition.
In an exemplary implementation of this invention, a version of the augmented and mixed reality viewing device includes an enclosure designed so that any phone can slide in at the top, two lenses placed at a focal length distance from the phone's display, two beam splitters placed at a specified distance from the lenses, and a viewing setup through which a user sees the holograms. An additional optical configuration uses combiner optics, such as curved mirrors or waveguides, for the user to view the holograms. The augmented and mixed reality viewing device includes in-built Visual-Inertial SLAM based tracking using single or multiple RGB or RGB-Depth cameras. The application layer also has hand and gesture tracking built in, along with a holographic rendering engine that allows disparate views to be displayed at an inter-pupillary distance for comfortable holographic viewing. This invention is a head-mounted display enabling mixed-reality viewing for a user, comprising: an enclosure that can fit a Smartphone of any size, with an in-built housing that lets the Smartphone camera see and track the real world; a pair of focusing lenses placed at a focal length distance from the Smartphone screen; combiner optics adjusted for a correct holographic view based on an Inter-Pupillary Distance (IPD); and a controller. The combiner optics include one or more beam splitters placed at a required distance from the focusing lenses, one or more curved mirrors built from a combination of lenses and beam splitters that allow a reduced optical footprint with a high field of view, and one or more waveguides that combine holographic light modulators with total internal reflection. The controller is configured to track the real world around the Smartphone, to track the 3D position of the head-mounted display in the real world, to detect plane or curved surfaces and render a visual representation of the user's interaction, including holograms, and to detect one or more gestures and re-render the visual representation, including holograms, in the real world. The controller tracks the real world around the Smartphone by using a camera to detect the world in 3D with Visual-Inertial SLAM methods and markers, facilitating the recognition of surfaces, positions, and orientations so that the hologram can be placed accordingly once a 3D map is received. The camera is either an in-built Smartphone camera or one or more externally attached cameras. The controller is configured to detect plane or curved surfaces and render holograms as required; once a hologram has been rendered on a particular surface, the SLAM tracking method makes sure the hologram sticks to its position, rendering different viewpoints of the hologram based on the user's point of view.
Further, the controller is configured to detect gestures and re-render the holograms as required by the application, using one or more cameras to track hands and gestures for interaction with the holograms. The controller also detects gestures that are tracked using one or more cameras with hand tracking. The head-mounted display is used for natural interaction with digital manifestations of physical objects, with applications in e-commerce, gaming, media, and entertainment, and for interacting with holographic content by generating an appropriate visual representation of the user's interaction. The gestures include tap, pinch, and zoom. A method for enabling mixed-reality viewing with Visual-Inertial SLAM tracking comprises the steps of: initializing the visual system of the Smartphone, which includes mono or dual cameras or any attached external cameras; initializing the inertial system of the Smartphone, including an Inertial Measurement Unit (IMU) containing an accelerometer, a gyroscope, and a magnetometer; pre-processing and normalizing the camera and IMU data; detecting features in single or multiple camera streams; identifying keyframes among the camera frames and storing them for further processing; estimating one or more 3D world maps and camera poses using non-linear optimization on the keyframe and IMU data; enhancing the 3D map and camera pose estimation using Visual-Inertial alignment and a Loop Closure Model, together with a GPU-optimized implementation for real-time computation; and rendering stereo Augmented Reality content on the Smartphone display based on the camera pose, the 3D map estimation, and the Inter-Pupillary Distance. The camera is either an in-built Smartphone camera or one or more externally attached cameras. The inputs to the method include signals from one or more combiner optics.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows an isometric view of an augmented and mixed reality viewing device.
Figure 2 shows the top view of the augmented and mixed reality viewing device. Figure 3 shows a front view of the augmented and mixed reality viewing device.
Figure 4 shows a side view of the augmented and mixed reality viewing device.
Figure 5 shows a front perspective view of another configuration of the augmented and mixed reality viewing device.
Figure 6a and 6b shows the side view of another configuration of the augmented and mixed reality viewing device.
Figure 7 shows the front view of another configuration of the augmented and mixed reality viewing device.
Figure 8 shows the top side of another configuration of the augmented and mixed reality viewing device.
Figure 9 shows the isometric view of another configuration of the augmented and mixed reality viewing device.
Figure 10 shows the back view of another configuration of the augmented and mixed reality viewing device.
Figure 11 shows the optics see-through version of the augmented and mixed reality viewing device.
Figure 12 shows the scene viewed through the augmented and mixed reality viewing device. Figure 13 shows the working of the optical rays in the augmented and mixed reality viewing device.
Figure 14 shows the entire process of the augmented and mixed reality viewing device.
Figure 15 shows the detailed Visual-Inertial SLAM tracking method.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Figure 1 shows an isometric view of a version of an augmented and mixed reality viewing device. This version has a top part, shown in top view in Figure 2, that acts as a slider: a Phone holder 2 into which any Smartphone 1 can fit. It also has an angled mirror on a mirror holder 3 that allows a Smartphone camera 8 to spatially track the world using the Visual-Inertial SLAM method, along with hand, gesture, and interaction detection. The inside of the device consists of focusing lenses 4 and beam splitters 5, along with an eyepiece 6 through which to see the holograms. There is a nose cut 7 in the device so a user can wear it comfortably. Figure 3 shows a front view of the augmented and mixed reality viewing device, which consists of a mirror holder 9 for holding an angled mirror.
Figure 4 shows a side view of the augmented and mixed reality viewing device. This version has a Phone holder 10 on top into which any Smartphone can fit. It also has an angled mirror on a mirror holder 11 that allows a Smartphone camera to spatially track the world using the Visual-Inertial SLAM method, along with hand, gesture, and interaction detection. A pair of focusing lenses 12 is placed at a focal length distance from the Smartphone screen. A beam splitter 13, an optical device that splits a beam of light in two, is inside the device. There is an eyepiece 14 through which the user sees the holograms and a nose cut 15 so the user can wear the device comfortably.
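Why the focal length placement matters can be seen from the idealized thin-lens relation. The patent gives no numeric focal lengths, so the figures in the worked form below are assumed for illustration only:

```latex
% Gaussian thin-lens equation: screen at distance u from the lens,
% image formed at distance v, lens focal length f.
\[ \frac{1}{v} + \frac{1}{u} = \frac{1}{f} \]
% With the screen exactly at the focal plane (u = f), 1/v = 0 and the
% rays emerge collimated: the hologram appears at optical infinity.
% With the screen slightly inside focus, u = f - \delta, a magnified
% virtual image forms at
\[ |v| = \frac{f\,(f-\delta)}{\delta}, \qquad
   m = \frac{|v|}{u} = \frac{f}{\delta} \]
% Assumed example: f = 50\,\mathrm{mm}, \delta = 2\,\mathrm{mm} gives
% |v| = 1.2\,\mathrm{m} and m = 25.
```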
Figure 5 shows a front perspective view of another version of the augmented and mixed reality viewing device. This version consists of a Smartphone 16 that fits in the Phone holder 17. The device has an angled mirror on the mirror holder 18 along with a head grip 19. The inside of the device consists of focusing lenses and beam splitters 20, along with an eyepiece through which to see the holograms. There is a nose cut 21 in the device for comfort and a strap 22, which can be of any flexible material, used to hold the device to the user's face. Figures 6a and 6b show side views of another configuration of the augmented and mixed reality viewing device. This version has a strap (23, 23a), which can be of any flexible material, used to hold the device to the user's face. The front part, shown in front view in Figure 7, acts as a holder into which the Phone slides. The phone camera, or multiple cameras connected to the Smartphone, views the scene directly and spatially tracks the world in 3D using the Visual-Inertial SLAM method. In addition to spatially tracking the world, the cameras can also detect hands and natural gestures. The inner side of the augmented and mixed reality viewing device consists of curved mirrors and beam splitters, with an eyepiece for viewing the holograms and a nose cut 24. The nose cut 26 shown in the top side view (Figure 8) helps a user wear the device comfortably. The device has an angled mirror on the mirror holder 27 and an eyepiece for viewing the holograms. This version also shows a strap 25, which can be of any flexible material. Figure 9 shows an isometric view of another configuration of the augmented and mixed reality viewing device, indicating the strap 28 used to hold the device to the user's face.
Figure 10 shows the back view of another configuration of the augmented and mixed reality viewing device, in which the Smartphone 30 is placed in the Phone holder 29, with a back camera and an Inertial Measurement Unit (IMU) used for spatial world tracking. The device has an eyepiece through which to see the holograms. There is a nose cut 31 in the device so the user can wear it comfortably. The enclosure with combiner optics 32 assists in displaying the holographic Augmented Reality content to the viewer.
Figure 11 shows the optical see-through version of the augmented and mixed reality viewing device. This version consists of a Smartphone 34, with its display facing the combiner optics and its back camera and IMU used for spatial world tracking, which fits in the front of the device. A first-surface mirror 33 is placed at a specified angle against the Phone display, with thin Fresnel lenses 35 for focusing the hologram toward the viewer. The beam splitter 36 allows both holographic light rays and real-world light rays to pass through.
A scene as viewed in the augmented and mixed reality viewing device is shown in Figure 12. The real-world scene 37, as seen by the naked eye, is viewed with a hologram 38 from the Smartphone display overlaid on it as a virtual 3D object. The viewer sees the stereo-projected Augmented Reality hologram 41 from the phone display through the combiner optics. The Smartphone fits in the Phone holder 39 at the front of the device. The inside of the device consists of an eyepiece through which to see the holograms, and a strap 40 to hold the device to the user's face.
Figure 13 shows the working of the optical rays in the augmented and mixed reality viewing device. The Smartphone 42 display acts as the source of the holographic light rays. The holographic content is augmented 43 into the real world for the viewer to see as a virtual image. The tracked rays describing the real world and the hologram ray traces 44 are combined for the viewer's eyes. The holographic content is then displayed 45 in the real world for the viewer.
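In the paraxial regime, the display-to-eye ray path of Figure 13 can be modeled with standard ray-transfer (ABCD) matrices. The sketch below is not part of the patent: the focal length and spacings are assumed, and a flat beam splitter is treated as leaving the paraxial matrix unchanged.

```python
import numpy as np

def free_space(d):
    """ABCD matrix for propagation over distance d (paraxial optics)."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    """ABCD matrix for a thin lens of focal length f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# Assumed geometry (metres): display 50 mm from a 50 mm lens, then 30 mm
# to the beam splitter (a flat splitter leaves the paraxial matrix as-is).
f, d_display, d_splitter = 0.050, 0.050, 0.030
system = free_space(d_splitter) @ thin_lens(f) @ free_space(d_display)

# A bundle of rays leaving one display pixel (height y0) at various angles.
y0 = 0.001  # 1 mm off-axis pixel
for angle in (-0.05, 0.0, 0.05):          # input ray angles in radians
    y, theta = system @ np.array([y0, angle])
    print(f"in angle {angle:+.2f} -> out angle {theta:+.4f} rad")
# With the display exactly at the focal plane, all output angles are equal:
# the bundle is collimated, so the pixel appears at optical infinity.
```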
The hardware components of the augmented and mixed reality viewing device are individually described below:
Enclosure: A plastic or cardboard enclosure that can fit a Smartphone 1 of any size. The enclosure has an in-built angled mirror attached to the mirror holder 3 for the Smartphone camera 8 to see and track the real world, shown in Figures 1 and 4. In an additional version of the augmented and mixed reality viewing device, the Smartphone 16 is slid in the front with in-built lenses, mirrors, curved mirrors or waveguides for holographic viewing as shown in Figures 5 and 7.
Lenses: A pair of focusing lenses 4 (shown in Figure 1) is placed at a focal length distance from the Smartphone screen. Beam Splitters: A pair of beam splitters 5 is placed at a certain required distance from the focusing lenses 4, as in Figure 1.
Other Optics: A version of the augmented and mixed reality viewing device also contains combiner optics 32 such as curved mirrors and waveguides as shown in Figure 10.
Curved Mirrors: A combination of lenses (4, 12) and beam splitters (5, 13, 20) that allows a reduced optical footprint with a high field of view, as shown in Figure 11.
Waveguides: An optical element that combines holographic light modulators with total internal reflection. Smartphone with an application: Any Smartphone 1 with an application that has in-built methods for the following purposes: a. Track the real world around the Smartphone using the Visual-Inertial SLAM method.
b. Estimate either the Smartphone or head-mounted display 3D position in the real world.
c. Detect the plane or curved surfaces and render holograms as required. d. Detect the gestures and re-render the holograms as required by the application.
Individual software components of the augmented and mixed reality viewing device are:
A Method of Detection: The device uses the Smartphone camera 8, or cameras externally attached to the phone, to detect the world in 3D using the Visual-Inertial SLAM method, markers, or otherwise. The method of detection facilitates the recognition of surfaces, positions, and orientations so that the hologram can be placed accordingly once a 3D map is received.
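The patent does not disclose how planar surfaces are recognized from the received 3D map. One common technique compatible with this step is RANSAC plane fitting over the sparse SLAM map points; the following is a minimal sketch, with the iteration count, inlier tolerance, and toy point cloud all assumed:

```python
import numpy as np

def fit_plane_ransac(points, iters=200, tol=0.01, rng=None):
    """Fit a dominant plane to an (N, 3) point cloud with RANSAC.

    Returns (normal, d, inlier_mask) for the plane n.x + d = 0.
    """
    rng = rng or np.random.default_rng(0)
    best_inliers, best_plane = None, None
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                 # degenerate (collinear) sample
            continue
        n /= norm
        d = -n @ p0
        inliers = np.abs(points @ n + d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane[0], best_plane[1], best_inliers

# Toy map: a noisy floor plane (z ~ 0) plus scattered clutter points.
rng = np.random.default_rng(1)
floor = np.c_[rng.uniform(-1, 1, (500, 2)), rng.normal(0, 0.005, 500)]
clutter = rng.uniform(-1, 1, (100, 3))
n, d, mask = fit_plane_ransac(np.vstack([floor, clutter]))
print("plane normal ~", np.round(n, 2), "| inliers:", mask.sum())
```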
A Method of Spatial Tracking: Once a hologram has been rendered on a particular surface, the Visual-Inertial SLAM tracking method makes sure that the hologram sticks to its position. The method renders different viewpoints of the hologram based on the user's point of view. Figure 15 shows the detailed Visual-Inertial SLAM tracking method, which is as follows:
STEP A: The method starts 109 with the initialization of the Visual system 110 of the Smartphone, which includes mono or dual cameras or any other attached external cameras.
STEP B: Initialization of the Inertial system 111 of the Smartphone, including the Inertial Measurement Unit that contains an accelerometer, a gyroscope, and a magnetometer.
STEP C: Pre-processing and normalization 112 of all camera and IMU data.
STEP D: Pre-processing and normalization is followed by the detection of features 113 in single or multiple camera streams.
STEP E: Keyframes within the camera frames are identified 114 and stored for further processing. STEP F: Estimation of the 3D world map and camera pose using non-linear optimization on the keyframe and IMU data 115.
STEP G: The 3D map and camera pose estimation are enhanced by employing Visual-Inertial Alignment and a Loop Closure Model, along with a GPU-optimized implementation for real-time computation 116. STEP H: Stereo Augmented Reality content is rendered 117 based on the camera pose, 3D map estimation, and Inter-Pupillary Distance on the Smartphone display, and the method ends 118.
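Read together, STEPS A through H describe a per-frame pipeline. The skeleton below is a structural sketch only: every function body is a placeholder standing in for the corresponding module (feature detector, non-linear optimizer, loop closure), none of which the patent specifies.

```python
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    frame_id: int
    features: list          # 2D feature observations (STEP D output)
    imu_window: list        # IMU samples since the previous keyframe

@dataclass
class SlamState:
    world_map: list = field(default_factory=list)   # sparse 3D points
    pose: tuple = (0.0, 0.0, 0.0)                   # toy 3-DoF pose
    keyframes: list = field(default_factory=list)

def detect_features(frame):                 # STEP D (placeholder detector)
    return [(x, x % 7) for x in range(0, 100, 10)]

def is_keyframe(features, keyframes):       # STEP E: simple selection rule
    return len(keyframes) == 0 or len(features) >= 8

def optimize(state, kf):                    # STEP F (stand-in for non-linear
    state.pose = (kf.frame_id * 0.01, 0.0, 0.0)  # optimization on KF + IMU)
    state.world_map.extend(kf.features)
    return state

def refine(state):                          # STEP G: VI alignment + loop
    return state                            # closure (GPU-optimized in practice)

def render_stereo(state, ipd=0.063):        # STEP H: one view per eye,
    left = (state.pose[0] - ipd / 2, *state.pose[1:])   # offset by IPD/2
    right = (state.pose[0] + ipd / 2, *state.pose[1:])
    return left, right

state = SlamState()                         # STEPS A-C: init cameras + IMU
for frame_id, frame in enumerate(range(5)):  # then per-frame processing
    feats = detect_features(frame)
    if is_keyframe(feats, state.keyframes):
        kf = Keyframe(frame_id, feats, imu_window=[])
        state.keyframes.append(kf)
        state = refine(optimize(state, kf))
    print("frame", frame_id, "stereo poses:", render_stereo(state))
```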
A Method of Hand and Gesture Tracking: Using single or multiple cameras attached to the Smartphone, the augmented and mixed reality viewing device can track hands and gestures for interaction with the holograms.
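The patent does not name a hand-tracking method or library. As one illustrative possibility, a pinch gesture could be detected from per-frame hand landmarks; the sketch below assumes the MediaPipe Hands API, camera index 0, and an arbitrary pinch threshold:

```python
import cv2
import mediapipe as mp

# Assumed threshold and landmark indices (MediaPipe's hand model).
PINCH_THRESH = 0.05   # normalized thumb-tip to index-tip distance
THUMB_TIP, INDEX_TIP = 4, 8

hands = mp.solutions.hands.Hands(max_num_hands=2,
                                 min_detection_confidence=0.6)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    for hand in result.multi_hand_landmarks or []:
        t, i = hand.landmark[THUMB_TIP], hand.landmark[INDEX_TIP]
        dist = ((t.x - i.x) ** 2 + (t.y - i.y) ** 2) ** 0.5
        if dist < PINCH_THRESH:
            print("pinch detected")       # e.g. grab / select the hologram
    if cv2.waitKey(1) & 0xFF == 27:       # Esc to quit
        break
cap.release()
```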
A Method of Rendering: The method of rendering takes an in-built 3D model of any game, scene, movie, or animation and renders different views based on the detection and spatial tracking methods. Figure 14 shows the entire process of the augmented and mixed reality viewing device. The process of the device for a holographic experience is as follows:
STEP I: The process starts 100 with the device's mobile application being switched on 101, and the Smartphone slid into the phone holder 2 plate, either at the top or the front of the device depending on the version 102.
STEP II: The focusing lenses 4 and the beam splitters 5, or other combiner optics or waveguides, are adjusted for a correct holographic view based on the Inter-Pupillary Distance (IPD) 103.
STEP III: With a single Smartphone RGB camera 8, or multiple cameras attached to the Smartphone, using the Visual-Inertial SLAM tracking method as shown in Figure 10, the 3D world is tracked for projection of the holograms 104.
STEP IV: Playing games, watching movies, and interacting with holographic content with natural gestures, tracked using single or multiple cameras with the hand tracking method, can be initiated 104. STEP V: Tap on any hologram using natural gestures to start the holographic experience 105.
STEP VI: For further interactions, natural gestures such as tap, pinch, zoom, and more can be used. These gestures are detected, tracked, and interpreted by the phone camera, along with the tracking, to enable an immersive experience 106. STEP VII: The Smartphone screen renders the subsequent holographic frame based on the gestures and controls and, through the optical system as implemented, displays the hologram in the real world 107.
STEP VIII: For natural user interaction with subsequent applications 108, the user either adjusts the device to focus the lens as required or taps on any hologram using natural gestures to start the holographic experience. The above steps are repeated for any holographic experience.
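STEP VII's stereo rendering can be illustrated with a minimal pinhole-projection sketch: one virtual camera per eye, offset horizontally by half the IPD, producing the horizontal disparity that makes the hologram appear at its anchored depth. All intrinsics and distances below are assumed, illustrative values:

```python
import numpy as np

# Assumed display and camera parameters (illustrative only).
IPD = 0.063                          # metres
W, H, FOCAL_PX = 1280, 1440, 1000    # half-screen per eye, toy intrinsics

def project(points, eye_offset):
    """Pinhole-project 3D points (camera looks down +Z) for one eye."""
    shifted = points - np.array([eye_offset, 0.0, 0.0])
    x = FOCAL_PX * shifted[:, 0] / shifted[:, 2] + W / 2
    y = FOCAL_PX * shifted[:, 1] / shifted[:, 2] + H / 2
    return np.stack([x, y], axis=1)

# A hologram anchored 1.5 m in front of the user (e.g. three cube corners).
hologram = np.array([[0.0, 0.0, 1.5], [0.1, 0.0, 1.5], [0.0, 0.1, 1.5]])

left = project(hologram, -IPD / 2)    # left-eye view, drawn on left half
right = project(hologram, +IPD / 2)   # right-eye view, drawn on right half
print("disparity (px):", np.round(left[:, 0] - right[:, 0], 1))
# The nonzero horizontal disparity is what places the hologram at 1.5 m.
```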

Claims

A head-mounted-display for a user to enable mixed-reality viewing comprising:
a. An enclosure that can fit a Smartphone (1, 16, 30, 34, 42) of any size with an in-built housing for any Smartphone to enable the Smartphone camera 8 to see and track the real world;
b. A pair of focusing lenses (4, 12) placed at a focal length distance from the Smartphone Screen;
c. Combiner optics 32 adjusted for a correct holographic view based on an Inter-Pupillary Distance (IPD), including:
i. One or more beam splitters (5, 13, 20) placed at a certain required distance from the focusing lenses (4, 12); ii. One or more curved mirrors built using a combination of lenses (4, 12) and beam splitters (5, 13, 20) that allow a reduced optical footprint with a high field of view; and iii. One or more waveguides that combine holographic light modulators with total internal reflection; and d. A controller, wherein the controller is configured to:
i. Track the real world around the Smartphone (1, 16, 30, 34, 42);
ii. Track the head-mounted display 3D position in the real- world;
iii. Detect the plane or curved surfaces and render a visual representation of the user's interaction, including holograms; and
iv. Detect one or more gestures and re-render the visual representation, including holograms, in the real world.
2. The head-mounted-display of Claim 1 wherein the controller is configured to track the real world around the Smartphone (1, 16, 30, 34, 42) by using a camera to detect the world in 3D using Visual-Inertial SLAM methods and markers, facilitating the recognition of surfaces, positions, and orientations so that the hologram can be placed accordingly on receiving a 3D map.
3. The head-mounted display of Claim 1 wherein the camera is an in-built Smartphone camera.
4. The head-mounted display of Claim 1 wherein the camera is one or more externally attached cameras.
5. The head-mounted-display of Claim 1 wherein the controller is configured to detect the plane or curved surfaces and render holograms as required, wherein once the hologram has been rendered on a particular surface, the method of Visual-Inertial SLAM tracking makes sure that the hologram sticks to its position, rendering different viewpoints of the hologram based on the point of view of the user.
6. The head-mounted-display of Claim 1 wherein the controller is configured to detect the gestures and re-render the holograms as required by the application wherein the controller uses one or more cameras to track hands and gestures for interactions with the holograms.
7. The head-mounted-display of Claim 1 wherein the controller is configured to detect the gestures that are tracked using one or more cameras with hand tracking.
8. The head-mounted-display of Claim 1 that is used for natural interaction with the digital manifestation of physical objects, with applications in e-commerce, gaming, media, and entertainment, and for interacting with holographic content by generating an appropriate visual representation of the user's interaction.
9. The head-mounted-display of Claim 1 wherein gestures include tap, pinch, and zoom.
10. A method for a user to enable mixed-reality viewing with Visual-Inertial SLAM tracking comprising the steps of:
a. Initializing a visual system 110 of the Smartphone that includes either mono or dual cameras or any other external cameras as attached;
b. Initializing an inertial System 111 of the Smartphone, including an Inertial Measurement Unit that contains an accelerometer, a gyroscope, and a magnetometer;
c. Pre-processing and normalization 112 of cameras and IMU data; d. Detecting features 113 in either single or multiple camera streams; e. Identifying keyframes 114 in camera frames and storing them for further processing;
f. Estimating one or more 3D world maps and camera poses 115 using non-linear optimization on the keyframe and IMU data; g. Enhancing the 3D map and camera pose estimation using Visual-Inertial alignment and a Loop Closure Model, along with the GPU-optimized implementation for real-time computations 116; and h. Rendering stereo Augmented Reality content 117 based on the camera pose, 3D Map Estimation, and Inter-Pupillary Distance on the Smartphone display.
11. The method of Claim 10 wherein the camera is an in-built Smartphone camera 8.
12. The method of Claim 10 wherein the camera is one or more externally attached cameras.
13. The method of Claim 10 wherein inputs to the method include signals from one or more combiner optics 32.
PCT/IN2017/050304 2017-02-02 2017-07-26 An apparatus, method, and system for augmented and mixed reality viewing Ceased WO2018142418A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201741003807 2017-02-02
IN201741003807 2017-02-02

Publications (1)

Publication Number Publication Date
WO2018142418A1 2018-08-09

Family

ID=63040397

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2017/050304 Ceased WO2018142418A1 (en) 2017-02-02 2017-07-26 An apparatus, method, and system for augmented and mixed reality viewing

Country Status (1)

Country Link
WO (1) WO2018142418A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2990852A1 (en) * 2014-09-01 2016-03-02 Samsung Electronics Co., Ltd. Head-mounted display hosting a smartphone for providing virtual reality environment
US20160349509A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Mixed-reality headset

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109769112A (en) * 2019-01-07 2019-05-17 上海临奇智能科技有限公司 The assembling setting method of virtual screen all-in-one machine with a variety of effect screens
CN111103687A (en) * 2019-11-18 2020-05-05 邵阳学院 Museum AR text creating device

Similar Documents

Publication Publication Date Title
US11928784B2 (en) Systems and methods for presenting perspective views of augmented reality virtual object
US11330241B2 (en) Focusing for virtual and augmented reality systems
US10228564B2 (en) Increasing returned light in a compact augmented reality/virtual reality display
KR100809479B1 (en) Face Wearable Display Devices for Mixed Reality Environments
US10127725B2 (en) Augmented-reality imaging
US11574389B2 (en) Reprojection and wobulation at head-mounted display device
US20250271928A1 (en) Determining angular acceleration
KR101868405B1 (en) Augmented reality/virual reality convertible display device
US12283013B2 (en) Non-uniform stereo rendering
KR20150088355A (en) Apparatus and method for stereo light-field input/ouput supporting eye-ball movement
CN107728319B (en) Visual display system and method and head-mounted display device
EP3308539A1 (en) Display for stereoscopic augmented reality
US20230103091A1 (en) Combined birefringent material and reflective waveguide for multiple focal planes in a mixed-reality head-mounted display device
CN113272710A (en) Extending field of view by color separation
US10725301B2 (en) Method and apparatus for transporting optical images
WO2018142418A1 (en) An apparatus, method, and system for augmented and mixed reality viewing
CN109963145B (en) Visual display system and method, and head mounted display device
CN207625711U (en) Vision display system and head-wearing display device
CN109963141B (en) Visual display system and method and head-mounted display device
CN116762032A (en) Reverse see-through glasses for augmented reality and virtual reality devices
US20190162968A1 (en) Method and system for communication between a wearable display device and a portable device
Hansen et al. Rendering the Light Field

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17894772

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17894772

Country of ref document: EP

Kind code of ref document: A1