
WO2025212393A1 - Polarized augmented reality displays - Google Patents

Polarized augmented reality displays

Info

Publication number
WO2025212393A1
Authority
WO
WIPO (PCT)
Prior art keywords
waveguide
display system
reflective
image light
reflective element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/021905
Other languages
French (fr)
Inventor
Andrew John Ouderkirk
Tingling Rao
Siddharth BUDDHIRAJU
Zhaoyu NIE
Liliana Ruiz Diaz
Prathmesh DESHMUKH
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Platforms Technologies LLC
Original Assignee
Meta Platforms Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Platforms Technologies LLC
Publication of WO2025212393A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • G02B2027/0174Head mounted characterised by optical features holographic

Definitions

  • This application relates to a system for polarized augmented reality displays.
  • a display system comprising: a waveguide body extending from an input end to an output end and configured to guide light by total internal reflection from the input end to the output end; an in-coupling element located proximate to the input end and configured to direct image light into the waveguide body; an out-coupling element located proximate to the output end and configured to direct image light out of the waveguide body; and a reflective element located on a world side of the waveguide body, wherein the reflective element is configured to direct image light out- coupled from the waveguide body toward a user's eyes.
  • the waveguide body comprises an optically isotropic material.
  • the waveguide body comprises an optically anisotropic material.
  • the waveguide body comprises an organic solid crystal.
  • the out-coupling element comprises an optically isotropic material.
  • the out-coupling element comprises a surface relief grating.
  • the out-coupling element comprises a structured diffraction grating selected from the group consisting of binary, slanted, and blazed.
  • the reflective element overlies the output end of the waveguide body.
  • the reflective element is spaced away from the out-coupling element.
  • the reflective element contacts the out-coupling element.
  • the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
  • the reflective element comprises a notch dichroic mirror and quarter waveplate overlying the notch dichroic mirror.
  • the display system further comprises a notch reflective polarizer located on a user side of the waveguide body.
  • a display system comprising: a waveguide configured to propagate image light therethrough; an out-coupling element disposed over an output region of the waveguide, wherein the out-coupling element is configured to out-couple the image light from the waveguide; and a reflective element located between the waveguide and a world side of the display system, wherein the reflective element is configured to direct the out-coupled image light toward an eye of a user.
  • the waveguide comprises an optically anisotropic material.
  • the waveguide comprises an organic solid crystal.
  • the out-coupling element comprises an optically anisotropic material.
  • the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
  • a display system comprising: a waveguide configured to propagate image light therethrough; and a reflective element located between the waveguide and a world side of the display system, wherein the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer, and the reflective element is configured to direct image light out-coupled from the waveguide toward an eye of a user.
  • FIG. 1 is an illustration of an exemplary waveguide with grating structures to couple light into and out of the waveguide according to some embodiments.
  • FIG. 2 is an illustration of an exemplary waveguide with grating structures to couple light into a birefringent waveguide substrate and grating structures to outcouple polarized light from the birefringent substrate according to various embodiments.
  • FIG. 3 is an illustration of an exemplary waveguide with grating structures to couple light into an isotropic or anisotropic waveguide substrate and birefringent grating structures to outcouple polarized light from the waveguide substrate according to various embodiments.
  • FIG. 4 depicts momentum space renderings and the outcoupling of polarized light from an example waveguide according to certain embodiments.
  • FIG. 7 is an illustration of exemplary augmented-reality glasses that may be used in connection with embodiments of this disclosure.
  • a waveguide display system for VR and AR applications may include a micro-display module and waveguide optics for directing a display image to a user.
  • the micro-display module may include a light source, such as a light emitting diode (LED).
  • the waveguide optics may include input-coupling and output-coupling elements such as surface relief gratings that are configured to couple light into and out of the waveguide.
  • Example grating structures may have a one-dimensional or two-dimensional periodicity, and may include a binary, slanted, or blazed architecture.
  • Organic solid crystal (OSC) materials with high refractive index and birefringence can be used for various optical components, including surface relief gratings, meta-surfaces, waveguides, beam splitting, photonic elements such as photonic integrated circuits, and polarization selective elements.
  • an augmented reality display may include an OSC-based waveguide.
  • Organic solid crystal (OSC) materials may be incorporated into monolithic bodies, such as optical elements (e.g., lenses, waveguides, and the like) and other structures. For instance, particles of an OSC material may be manufactured and consolidated/densified to form an optical element.
  • An optical element formed from an organic solid crystal material may be configured to provide one or more advantageous characteristics, including one or more of a controllable refractive index and birefringence, optical clarity, and optical transparency.
  • organic optically anisotropic materials include anthracene, polycene, triazole, thiophene, as well as derivatives thereof.
  • Example inorganic optically anisotropic materials include SiO2, TiO2, Ga2O3, LiNbO3, SiC, and ZnS.
  • an optically anisotropic material may be characterized by a refractive index difference between at least one pair of principal axes of at least approximately 0.1, e.g., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, or 0.9, including ranges between any of the foregoing values.
  • a grating may overlie a waveguide substrate through which an electromagnetic wave may propagate.
  • the waveguide substrate includes or is formed from an organic solid crystal material.
  • the substrate may include a single phase OSC material.
  • the substrate may include a single organic solid crystal layer or an OSC multilayer.
  • the characteristic refractive indices (n1, n2, n3) may be aligned or askew with respect to the principal dimensions of the substrate.
  • the waveguide substrate may include an OSC material with either a fixed optical axis or a spatially varying optical axis.
  • FIG. 1 includes a description of an example waveguide with grating structures to in-couple and out-couple image light from a waveguide substrate.
  • FIGS. 2 and 3 includes a description of planar waveguides having light polarizing configurations.
  • electromagnetic waves are confined to propagate along the plane of the waveguide, whereas in non-planar waveguides, the waves may follow a curved path, guided by the shape of the waveguide.
  • the discussion associated with FIG. 4 includes momentum space renderings depicting the diffraction of image light from birefringent materials or structures.
  • the discussion associated with FIGS. 5 and 6 includes a description of planar waveguides having a reflective optical element for recycling leaked polarized image light.
  • the discussion associated with FIGS. 7 and 8 relates to exemplary virtual reality and augmented reality devices that may include one or more waveguide configurations as disclosed herein.
  • A schematic view of a display system 100 including a planar waveguide is shown in FIG. 1.
  • An input grating 110 is configured to couple image light into a waveguide substrate 150, and an output grating 120 is configured to couple the internally reflected image light out of the waveguide and toward both a user's eye 130 and the world side 140 of the display system.
  • a field of view reaching an eyebox of the display may be 50° x 50°, for example.
  • the reflective element 500 may include a structure selected from a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
  • the reflective element 500 may include a notch dichroic mirror and quarter waveplate (QWP) located on the world side of the waveguide.
  • the display may additionally include a notch reflective polarizer 510 disposed on the user side.
  • the dichroic mirror may be curved or flat and may have matching notch characteristics with the notch reflective polarizer 510.
  • polarized light emitted toward the world side of the display may be redirected by the QWP and dichroic mirror back to the user.
  • Light emitted toward the user side of the display may be initially reflected by the reflective polarizer but subsequently redirected by the QWP and dichroic mirror back to the user.
  • the generation of polarized image light allows for efficient redirection of leaked world side light while maintaining commercially relevant see-through transmission.
  • the waveguides in FIGS. 2, 3, 5, and 6 are depicted as having a planar surface (i.e., abutting the grating structures), non-planar waveguide surfaces are also contemplated where, for example, a curvature of the reflective element may match a curvature of the waveguide.
  • Example 1 A display system includes a waveguide body extending from an input end to an output end and configured to guide light by total internal reflection from the input end to the output end, an in-coupling element located proximate to the input end and configured to direct image light into the waveguide body, an out-coupling element located proximate to the output end and configured to direct image light out of the waveguide body, and a reflective element located on a world side of the waveguide body, where the reflective element is configured to direct image light out-coupled from the waveguide body toward a user's eyes.
  • Example 2 The display system of Example 1, where the waveguide body includes an optically isotropic material.
  • Example 3 The display system of Example 1, where the waveguide body includes an optically anisotropic material.
  • Example 4 The display system of any of Examples 1-3, where the waveguide body includes an organic solid crystal.
  • Example 5 The display system of any of Examples 1-4, where the out-coupling element includes an optically isotropic material.
  • Example 6 The display system of any of Examples 1-4, where the out-coupling element includes an optically anisotropic material.
  • Example 7 The display system of any of Examples 1-6, where the out-coupling element includes a surface relief grating.
  • Example 8 The display system of Example 7, where the out-coupling element includes a structured diffraction grating selected from a binary grating, a slanted grating, and a blazed grating.
  • Example 9 The display system of any of Examples 1-8, where the reflective element overlies the output end of the waveguide body.
  • Example 11 The display system of any of Examples 1-9, where the reflective element contacts the out-coupling element.
  • Example 12 The display system of any of Examples 1-11, where the reflective element includes an optical element selected from a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
  • Example 13 The display system of any of Examples 1-12, where the reflective element includes a notch dichroic mirror and quarter waveplate overlying the notch dichroic mirror.
  • Example 14 The display system of Example 13, further including a notch reflective polarizer located on a user side of the waveguide body.
  • Example 15 A display system includes a waveguide configured to propagate image light therethrough, an out-coupling element disposed over an output region of the waveguide, where the out-coupling element is configured to out-couple the image light from the waveguide, and a reflective element located between the waveguide and a world side of the display system, where the reflective element is configured to direct the out-coupled image light toward an eye of a user.
  • Example 16 The display system of Example 15, where the waveguide includes an optically anisotropic material.
  • Example 17 The display system of any of Examples 15 and 16, where the waveguide includes an organic solid crystal.
  • Example 18 The display system of any of Examples 15-17, where the out-coupling element includes an optically anisotropic material.
  • Example 19 The display system of any of Examples 15-18, where the reflective element includes an optical element selected from a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
  • Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial-reality systems.
  • Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof.
  • Artificial-reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content.
  • the artificial-reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer).
  • artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in an artificial reality and/or are otherwise used in (e.g., to perform activities in) an artificial reality.
  • augmented-reality system 700 may include an eyewear device 702 with a frame 710 configured to hold a left display device 715(A) and a right display device 715(B) in front of a user's eyes.
  • Display devices 715(A) and 715(B) may act together or independently to present an image or series of images to a user.
  • augmented-reality system 700 includes two displays, embodiments of this disclosure may be implemented in augmented-reality systems with a single NED or more than two NEDs.
  • augmented-reality system 700 may include one or more sensors, such as sensor 740.
  • Sensor 740 may generate measurement signals in response to motion of augmented-reality system 700 and may be located on substantially any portion of frame 710.
  • Sensor 740 may represent a position sensor, an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof.
  • augmented-reality system 700 may or may not include sensor 740 or may include more than one sensor.
  • the IMU may generate calibration data based on measurement signals from sensor 740.
  • Examples of sensor 740 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof.
  • Augmented-reality system 700 may also include a microphone array with a plurality of acoustic transducers 720(A)-720(J), referred to collectively as acoustic transducers 720.
  • Acoustic transducers 720 may be transducers that detect air pressure variations induced by sound waves.
  • Each acoustic transducer 720 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format).
  • the configuration of acoustic transducers 720 of the microphone array may vary. While augmented-reality system 700 is shown in FIG. 7 as having ten acoustic transducers 720, the number of acoustic transducers 720 may be greater or less than ten. In some embodiments, using higher numbers of acoustic transducers 720 may increase the amount of audio information collected and/or the sensitivity and accuracy of the audio information. In contrast, using a lower number of acoustic transducers 720 may decrease the computing power required by an associated controller 750 to process the collected audio information. In addition, the position of each acoustic transducer 720 of the microphone array may vary. For example, the position of an acoustic transducer 720 may include a defined position on the user, a defined coordinate on frame 710, an orientation associated with each acoustic transducer 720, or some combination thereof.
  • acoustic transducers 720(A) and 720(B) may not be used at all in conjunction with augmented-reality system 700.
  • Acoustic transducers 720 on frame 710 may be positioned along the length of the temples, across the bridge, above or below display devices 715(A) and 715(B), or some combination thereof.
  • Acoustic transducers 720 may be oriented such that the microphone array is able to detect sounds in a wide range of directions surrounding the user wearing the augmented-reality system 700.
  • an optimization process may be performed during manufacturing of augmented-reality system 700 to determine relative positioning of each acoustic transducer 720 in the microphone array.
  • augmented-reality system 700 may include or be connected to an external device (e.g., a paired device), such as neckband 705.
  • Neckband 705 generally represents any type or form of paired device.
  • the following discussion of neckband 705 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external compute devices, etc.
  • Neckband 705 may be communicatively coupled with eyewear device 702 and/or to other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to augmented-reality system 700.
  • neckband 705 may include two acoustic transducers (e.g., 720(I) and 720(J)) that are part of the microphone array (or potentially form their own microphone subarray).
  • Neckband 705 may also include a controller 725 and a power source 735.
  • Acoustic transducers 720(I) and 720(J) of neckband 705 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital).
  • acoustic transducers 720(I) and 720(J) may be positioned on neckband 705, thereby increasing the distance between the neckband acoustic transducers 720(I) and 720(J) and other acoustic transducers 720 positioned on eyewear device 702.
  • increasing the distance between acoustic transducers 720 of the microphone array may improve the accuracy of beamforming performed via the microphone array.
  • Controller 725 of neckband 705 may process information generated by the sensors on neckband 705 and/or augmented-reality system 700.
  • controller 725 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 725 may perform a direction-of-arrival (DOA) estimation to estimate a direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 725 may populate an audio data set with the information.
  • controller 725 may compute all inertial and spatial calculations from the IMU located on eyewear device 702.
  • a connector may convey information between augmented-reality system 700 and neckband 705 and between augmented-reality system 700 and controller 725. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by augmented-reality system 700 to neckband 705 may reduce weight and heat in eyewear device 702, making it more comfortable to the user.
  • Power source 735 in neckband 705 may provide power to eyewear device 702 and/or to neckband 705.
  • Power source 735 may include, without limitation, lithium ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage.
  • power source 735 may be a wired power source. Including power source 735 on neckband 705 instead of on eyewear device 702 may help better distribute the weight and heat generated by power source 735.
  • some artificial-reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience.
  • a head-worn display system such as virtual-reality system 800 in FIG. 8, that mostly or completely covers a user's field of view.
  • Virtual-reality system 800 may include a front rigid body 802 and a band 804 shaped to fit around a user's head.
  • Virtual-reality system 800 may also include output audio transducers 806(A) and 806(B).
  • front rigid body 802 may include one or more electronic elements, including one or more electronic displays, one or more inertial measurement units (IMUs), one or more tracking emitters or detectors, and/or any other suitable device or system for creating an artificial reality experience.
  • Artificial-reality systems may include a variety of types of visual feedback mechanisms.
  • display devices in augmented-reality system 700 and/or virtual-reality system 800 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, organic LED (OLED) displays, digital light projector (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen.
  • Artificial-reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user's refractive error.
  • Some artificial-reality systems may also include optical subsystems having one or more lenses (e.g., conventional concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen.
  • These optical subsystems may serve a variety of purposes, including to collimate (e.g., make an object appear at a greater distance than its physical distance), to magnify (e.g., make an object appear larger than its actual size), and/or to relay (to, e.g., the viewer's eyes) light.
  • optical subsystems may be used in a non-pupil-forming architecture (such as a single lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (such as a multi-lens configuration that produces so- called barrel distortion to nullify pincushion distortion).
  • some artificial-reality systems may include one or more projection systems.
  • display devices in augmented-reality system 700 and/or virtual-reality system 800 may include micro-LED projectors that project light (using, e.g., a waveguide) into display devices, such as clear combiner lenses that allow ambient light to pass through.
  • the display devices may refract the projected light toward a user's pupil and may enable a user to simultaneously view both artificial-reality content and the real world.
  • the display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (such as diffractive, reflective, and refractive elements and gratings), coupling elements, etc.
  • Artificial-reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.
  • Artificial-reality systems may also include various types of computer vision components and subsystems.
  • augmented-reality system 700 and/or virtual-reality system 800 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor.
  • An artificial-reality system may process data from one or more of these sensors to identify a location of a user, to map the real world, to provide a user with context about real-world surroundings, and/or to perform a variety of other functions.
  • Artificial-reality systems may also include one or more input and/or output audio transducers.
  • output audio transducers 806(A) and 806(B) may include voice coil speakers, ribbon speakers, electrostatic speakers, piezoelectric speakers, bone conduction transducers, cartilage conduction transducers, tragus-vibration transducers, and/or any other suitable type or form of audio transducer.
  • input audio transducers may include condenser microphones, dynamic microphones, ribbon microphones, and/or any other type or form of input transducer.
  • a single transducer may be used for both audio input and audio output.
  • artificial-reality systems may include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, body suits, handheld controllers, environmental devices (e.g., chairs, floormats, etc.), and/or any other type of device or system.
  • Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature.
  • Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance.
  • Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms.
  • Haptic feedback systems may be implemented independent of other artificial-reality devices, within other artificial-reality devices, and/or in conjunction with other artificial-reality devices.
  • artificial-reality systems may create an entire virtual experience or enhance a user's real-world experience in a variety of contexts and environments. For instance, artificial-reality systems may assist or extend a user's perception, memory, or cognition within a particular environment. Some systems may enhance a user's interactions with other people in the real world or may enable more immersive interactions with other people in a virtual world.
  • Artificial-reality systems may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, business enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, visual aids, etc.).
  • the embodiments disclosed herein may enable or enhance a user's artificial-reality experience in one or more of these contexts and environments and/or in other contexts and environments.
  • Reference to a numeric value "50" as "approximately 50" may, in certain embodiments, include values equal to 50 ± 5, i.e., values within the range 45 to 55.
  • the term "substantially" in reference to a given parameter, property, or condition may mean and include to a degree that one of ordinary skill in the art would understand that the given parameter, property, or condition is met with a small degree of variance, such as within acceptable manufacturing tolerances.
  • the parameter, property, or condition may be at least approximately 90% met, at least approximately 95% met, or even at least approximately 99% met.
  • Where embodiments are described using the transitional phrase “comprising,” it is to be understood that alternative embodiments, including those that may be described using the transitional phrases “consisting of” or “consisting essentially of,” are implied.
  • For example, implied alternative embodiments to a lens that comprises or includes polycarbonate include embodiments in which the lens consists essentially of polycarbonate and embodiments in which the lens consists of polycarbonate.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

A display system includes a waveguide body extending from an input end to an output end and configured to guide light by total internal reflection from the input end to the output end, an in-coupling element located proximate to the input end and configured to direct image light into the waveguide body, an out-coupling element located proximate to the output end and configured to direct image light out of the waveguide body, and a reflective element located on a world side of the waveguide body, where the reflective element is configured to direct image light out-coupled from the waveguide body toward a user's eyes.

Description

POLARIZED AUGMENTED REALITY DISPLAYS
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims benefit of and priority to U.S. provisional patent application Ser. No. 63/572875, filed April 1, 2024.
FIELD
This application relates to a system for polarized augmented reality displays.
SUMMARY
According to an aspect, there is provided a display system comprising: a waveguide body extending from an input end to an output end and configured to guide light by total internal reflection from the input end to the output end; an in-coupling element located proximate to the input end and configured to direct image light into the waveguide body; an out-coupling element located proximate to the output end and configured to direct image light out of the waveguide body; and a reflective element located on a world side of the waveguide body, wherein the reflective element is configured to direct image light out- coupled from the waveguide body toward a user's eyes.
In one embodiment, the waveguide body comprises an optically isotropic material.
In one embodiment, the waveguide body comprises an optically anisotropic material.
In one embodiment, the waveguide body comprises an organic solid crystal.
In one embodiment, the out-coupling element comprises an optically isotropic material.
In one embodiment, the out-coupling element comprises an optically anisotropic material.
In one embodiment, the out-coupling element comprises a surface relief grating.
In one embodiment, the out-coupling element comprises a structured diffraction grating selected from the group consisting of binary, slanted, and blazed.
In one embodiment, the reflective element overlies the output end of the waveguide body.
In one embodiment, the reflective element is spaced away from the out-coupling element.
In one embodiment, the reflective element contacts the out-coupling element.
In one embodiment, the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
In one embodiment, the reflective element comprises a notch dichroic mirror and quarter waveplate overlying the notch dichroic mirror.
In one embodiment, the display system further comprises a notch reflective polarizer located on a user side of the waveguide body.
According to another aspect, there is provided a display system comprising: a waveguide configured to propagate image light therethrough; an out-coupling element disposed over an output region of the waveguide, wherein the out-coupling element is configured to out-couple the image light from the waveguide; and a reflective element located between the waveguide and a world side of the display system, wherein the reflective element is configured to direct the out-coupled image light toward an eye of a user.
In one embodiment, the waveguide comprises an optically anisotropic material.
In one embodiment, the waveguide comprises an organic solid crystal.
In one embodiment, the out-coupling element comprises an optically anisotropic material.
In one embodiment, the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
According to a further aspect, there is provided a display system comprising: a waveguide configured to propagate image light therethrough; and a reflective element located between the waveguide and a world side of the display system, wherein the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer, and the reflective element is configured to direct image light out-coupled from the waveguide toward an eye of a user.
It will be appreciated that any features described herein as being suitable for incorporation into one or more aspects or embodiments of the present disclosure are intended to be generalizable across any and all aspects and embodiments of the present disclosure. Other aspects of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure. The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
FIG. 1 is an illustration of an exemplary waveguide with grating structures to couple light into and out of the waveguide according to some embodiments.
FIG. 2 is an illustration of an exemplary waveguide with grating structures to couple light into a birefringent waveguide substrate and grating structures to outcouple polarized light from the birefringent substrate according to various embodiments.
FIG. 3 is an illustration of an exemplary waveguide with grating structures to couple light into an isotropic or anisotropic waveguide substrate and birefringent grating structures to outcouple polarized light from the waveguide substrate according to various embodiments.
FIG. 4 depicts momentum space renderings and the outcoupling of polarized light from an example waveguide according to certain embodiments.
FIG. 5 is an illustration of the waveguide of FIG. 2 including a reflective element for redirecting leaked polarized light according to some embodiments.
FIG. 6 is an illustration of the waveguide of FIG. 3 including a reflective element for redirecting leaked polarized light according to further embodiments.
FIG. 7 is an illustration of exemplary augmented-reality glasses that may be used in connection with embodiments of this disclosure.
FIG. 8 is an illustration of an exemplary virtual-reality headset that may be used in connection with embodiments of this disclosure.
Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed.
DETAILED DESCRIPTION
Virtual reality (VR) and augmented reality (AR) eyewear devices or headsets may enable users to experience events, such as interactions with people in a computer-generated simulation of a three-dimensional world or viewing data superimposed on a real-world view. By way of example, superimposing information onto a field of view may be achieved through an optical head-mounted display (OHMD) or by using embedded wireless glasses with a transparent heads-up display (HUD) or augmented reality (AR) overlay. VR/AR eyewear devices and headsets may be used for a variety of purposes. For example, governments may use such devices for military training, medical professionals may use such devices to simulate surgery, and engineers may use such devices as design visualization aids.
A waveguide display system for VR and AR applications may include a micro-display module and waveguide optics for directing a display image to a user. The micro-display module may include a light source, such as a light emitting diode (LED). The waveguide optics may include input-coupling and output-coupling elements such as surface relief gratings that are configured to couple light into and out of the waveguide. Example grating structures may have a one-dimensional or two-dimensional periodicity, and may include a binary, slanted, or blazed architecture. In some embodiments, a vertical grating coupler, for instance, may be configured to change an out-of-plane wave-vector direction of light to an in-plane waveguide direction, or vice versa, and accordingly direct the passage of light through the waveguide display. An input-coupling grating may determine the angular uniformity and coupling efficiency of image light.
In exemplary systems, the waveguide optics may be advantageously configured to create illuminance uniformity and a wide field of view (FOV). The FOV relates to the angular range of an image observable by a user, whereas illuminance uniformity may include both the uniformity of image light over an expanded exit pupil (exit pupil uniformity) and the uniformity of image light over the FOV (angular uniformity). In various waveguide configurations, the supported field of view (FOV) is directly proportional to the refractive index of the waveguide material.
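To make the relationship between waveguide refractive index and supported field of view concrete, the following sketch (not taken from this disclosure) checks, for each in-air input angle, whether first-order diffraction by an in-coupling grating places the in-plane wavevector between the air light line and the waveguide light line, which is the condition for guiding by total internal reflection. The 532 nm wavelength, 380 nm grating pitch, and index values are illustrative assumptions.

```python
import numpy as np

def supported_fov_deg(n_waveguide, pitch_nm, wavelength_nm=532.0):
    """Estimate the 1D in-air field of view supported by a diffractive
    waveguide, assuming first-order in-coupling by a grating of the given
    pitch. After diffraction, kx = k0*sin(theta) + G must lie between the
    air light line (k0) and the waveguide light line (n*k0) for the ray
    to be trapped by total internal reflection."""
    k0 = 2 * np.pi / wavelength_nm               # free-space wavenumber
    G = 2 * np.pi / pitch_nm                     # first-order grating vector
    thetas = np.radians(np.linspace(-45.0, 45.0, 9001))
    kx = k0 * np.sin(thetas) + G
    guided = (kx > k0) & (kx < n_waveguide * k0)
    if not guided.any():
        return 0.0
    return np.degrees(thetas[guided].max() - thetas[guided].min())

# A higher-index waveguide supports a wider guided angular band for the
# same grating, consistent with the FOV scaling described above.
for n in (1.5, 1.8, 2.4):
    print(f"n = {n}: ~{supported_fov_deg(n, pitch_nm=380.0):.0f} deg (1D) FOV")
```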
As used herein, "image light" refers to light that carries the visual information or images being displayed to a user. Image light in augmented reality (AR) or mixed reality (MR) systems is the visual content displayed or projected into the eyes of a user through an optical system, allowing them to see real world and virtual elements simultaneously.
Organic solid crystal (OSC) materials with high refractive index and birefringence can be used for various optical components, including surface relief gratings, meta-surfaces, waveguides, beam splitting, photonic elements such as photonic integrated circuits, and polarization selective elements. For instance, an augmented reality display may include an OSC-based waveguide.
Organic solid crystals include organic compounds that form solid crystalline structures. The molecular structure of the organic solids allows for flexibility, facile modification, and a broad range of potential functionalities and applications. Organic solid crystals possess a range of properties that make them attractive for use in various technologies.
One characteristic of organic solid crystals is their electrical properties. Many organic crystals are semiconductors, which makes them suitable for use in electronic devices such as transistors, sensors, and memory storage systems. Another important attribute of organic solid crystals is their optical properties. These materials may have a large refractive index (n > 1.5) and often exhibit fluorescence and absorption in the visible and infrared spectra, allowing them to be used in light-emitting devices, displays, and photodetectors. Their ability to emit light efficiently makes them valuable in applications such as organic light-emitting diodes.
In addition, organic crystals are often more mechanically compliant than inorganic materials, which may render them suitable for flexible electronics and wearable devices. Their low density also contributes to their lightweight nature making them ideal for portable and wearable technologies.
The properties of organic solid crystals may be tunable. By adjusting the molecular structure of these materials, their electronic and optical characteristics can be pre-set or modified in real time to meet specific needs, allowing for greater customization in various applications.
Organic solid crystal (OSC) materials may be incorporated into monolithic bodies, such as optical elements (e.g., lenses, waveguides, and the like) and other structures. For instance, particles of an OSC material may be manufactured and consolidated/densified to form an optical element. An optical element formed from an organic solid crystal material may be configured to provide one or more advantageous characteristics, including one or more of a controllable refractive index and birefringence, optical clarity, and optical transparency.
Due to their optical and mechanical properties, organic solid crystals may enable high- performance devices, and may be incorporated into passive or active optics, including AR/VR headsets, and may replace comparative material systems such as polymers, inorganic materials, and liquid crystals. In certain aspects, organic solid crystals may have optical properties that rival those of inorganic crystals while exhibiting the processability and electrical response of liquid crystals.
Structurally, the disclosed organic materials may be glassy, polycrystalline, or single crystal. In some embodiments, the organic crystalline phase may include amorphous regions. In some embodiments, the organic crystalline phase may be substantially crystalline. Organic solid crystals may include closely packed structures (e.g., organic molecules) that exhibit desirable optical properties such as a high and tunable refractive index, and high birefringence. Anisotropic organic solid materials may include a preferred packing of molecules, i.e., a preferred orientation or alignment of molecules. Example devices may include a waveguide substrate and/or a grating structure formed from an optically anisotropic material.
The organic crystalline phase may be characterized by a refractive index along at least one principal axis of at least approximately 1.5 at 589 nm. By way of example, the refractive index of the organic crystalline phase along at least one principal axis may be at least approximately 1.5, at least approximately 1.6, at least approximately 1.7, at least approximately 1.8, at least approximately 1.9, at least approximately 2.0, at least approximately 2.1, at least approximately 2.2, at least approximately 2.3, at least approximately 2.4, at least approximately 2.5, or at least approximately 2.6, including ranges between any of the foregoing values.
In some embodiments, the organic crystalline phase may be characterized by a birefringence (Δn), where n1 ≠ n2 ≠ n3, n1 = n2 ≠ n3, n1 ≠ n2 = n3, or n1 = n3 ≠ n2, of at least approximately 0.01, e.g., at least approximately 0.01, at least approximately 0.02, at least approximately 0.05, at least approximately 0.1, at least approximately 0.2, at least approximately 0.3, at least approximately 0.4, or at least approximately 0.5, including ranges between any of the foregoing values. In some embodiments, a birefringent organic crystalline phase may be characterized by a birefringence of less than approximately 0.05, e.g., less than approximately 0.05, less than approximately 0.02, less than approximately 0.01, less than approximately 0.005, less than approximately 0.002, or less than approximately 0.001, including ranges between any of the foregoing values.
Particular example organic optically anisotropic materials include anthracene, polycene, triazole, thiophene, as well as derivatives thereof. Example inorganic optically anisotropic materials include SiO2, TiO2, Ga2O3, LiNbO3, SiC, and ZnS. As used herein, an optically anisotropic material may be characterized by a refractive index difference between at least one pair of principal axes of at least approximately 0.1, e.g., 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, or 0.9, including ranges between any of the foregoing values. An optically isotropic material may be characterized by a refractive index difference between each respective pair of principal axes of less than approximately 0.05, e.g., 0.05, 0.02, 0.01, 0.005, 0.002, or 0.001, including ranges between any of the foregoing values. In particular examples, the principal refractive indices of an optically isotropic material may be equivalent.
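As a rough illustration of the thresholds quoted above, the following helper (a sketch, not part of the disclosure) classifies a material from its three principal refractive indices, treating a maximum pairwise difference of at least about 0.1 as optically anisotropic and below about 0.05 as optically isotropic. The example index values are assumptions.

```python
def classify_optical_anisotropy(n1, n2, n3,
                                aniso_threshold=0.1, iso_threshold=0.05):
    """Classify a material from its principal refractive indices using the
    approximate thresholds quoted in the text: anisotropic if at least one
    pair of principal axes differs by ~0.1 or more, isotropic if every
    pair differs by less than ~0.05."""
    max_diff = max(abs(n1 - n2), abs(n1 - n3), abs(n2 - n3))
    if max_diff >= aniso_threshold:
        return f"optically anisotropic (birefringence ~{max_diff:.2f})"
    if max_diff < iso_threshold:
        return "optically isotropic"
    return "intermediate under these thresholds"

print(classify_optical_anisotropy(1.6, 1.6, 2.4))     # OSC-like example indices
print(classify_optical_anisotropy(1.52, 1.52, 1.52))  # isotropic glass-like indices
```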
Organic solid crystals with high refractive index and birefringence have a unique value proposition for use in diffractive optical elements, such as a planar diffractive waveguide. An example waveguide includes a longitudinally extending high-index optical medium, which is transversely encased by low-index media or cladding. During use, a guided optical wave propagates in the waveguide through the high-index core along the longitudinal direction. In accordance with various embodiments, the high-index core of such a waveguide may be formed from an organic solid crystal (OSC). Such a construction may beneficially impact one or more of the display field of view, uniformity, efficiency, and cost of manufacture.
A grating may overlie a waveguide substrate through which an electromagnetic wave may propagate. According to various embodiments, the waveguide substrate includes or is formed from an organic solid crystal material. In some examples, the substrate may include a single phase OSC material. In some examples, the substrate may include a single organic solid crystal layer or an OSC multilayer. Each OSC layer or other optically anisotropic layer may be characterized by three principal refractive indices, where n1 ≠ n2 ≠ n3, n1 = n2 ≠ n3, n1 ≠ n2 = n3, or n1 = n3 ≠ n2. The characteristic refractive indices (n1, n2, n3) may be aligned or askew with respect to the principal dimensions of the substrate. The waveguide substrate may include an OSC material with either a fixed optical axis or a spatially varying optical axis.
The grating may include a plurality of raised structures and may constitute a surface relief grating, for example. Example gratings may be configured with a polar angle (θ) and an azimuthal angle (φ), where 0 < θ < π and 0 < φ < π. As used herein, a grating is an optical element having a periodic structure that is configured to disperse or diffract light into plural component beams. The direction or diffraction angles of the diffracted light may depend on the wavelength of the light incident on the grating, the orientation of the incident light with respect to a grating surface, and the refractive index and spacing between adjacent diffracting elements. In certain embodiments, grating architectures may be tunable along one, two, or three dimensions.
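The dependence of diffraction angle on wavelength, incidence, pitch, and refractive index described above follows the standard grating equation. The short sketch below (illustrative only; the numeric values are assumptions, not taken from the disclosure) solves that equation for a single order.

```python
import numpy as np

def diffraction_angle_deg(wavelength_nm, pitch_nm, order,
                          incident_angle_deg=0.0, n_in=1.0, n_out=1.0):
    """Solve the grating equation
        n_out * sin(theta_m) = n_in * sin(theta_i) + m * wavelength / pitch
    for the diffraction angle of order m, returning None if that order is
    evanescent (no propagating solution)."""
    s = (n_in * np.sin(np.radians(incident_angle_deg))
         + order * wavelength_nm / pitch_nm) / n_out
    if abs(s) > 1.0:
        return None
    return np.degrees(np.arcsin(s))

# First-order diffraction of normally incident green light into a
# high-index medium; the resulting angle exceeds the critical angle
# arcsin(1/1.8) ~ 34 deg, so such a ray would be guided by TIR.
print(diffraction_angle_deg(532.0, 380.0, order=1, n_out=1.8))
```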
In some embodiments, the substrate and the grating may be formed from an isotropic material and the grating may be backfilled with an anisotropic material such as an organic solid crystal or a liquid crystal. The anisotropic material may be aligned crystallographically with respect to the grating structure, where the extraordinary refractive index of the anisotropic material is matched to the extraordinary refractive index of the grating and an ordinary refractive index of the anisotropic material is less than the ordinary refractive index of the grating. By way of example, a backfilled OSC layer may be crystallized in situ on grating surfaces by controlled cooling from a melt phase or by controlled crystallization from solution. An imposed temperature gradient during cooling or solvent evaporation may be used to control the kinetics of nucleation and growth and the formation of a backfilled OSC layer having a desired crystalline orientation. In further embodiments, the orientation of a backfilled liquid crystal layer may be controlled using an engineered thermal profile during crystallization, optionally in conjunction with the formation of an alignment layer over the grating structure prior to backfilling with the liquid crystal material.
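To illustrate why the index matching described above yields a polarization-selective grating, the sketch below (a simplification with assumed index values) reports the index contrast seen by light polarized along each axis of an aligned anisotropic backfill: a near-zero contrast means that polarization passes largely undiffracted, while a finite contrast means that polarization is diffracted.

```python
def index_contrast(n_grating, n_fill_e, n_fill_o):
    """Refractive-index contrast between an (isotropic) grating and an
    aligned anisotropic backfill, resolved along the backfill's
    extraordinary and ordinary axes."""
    return {"extraordinary": abs(n_grating - n_fill_e),
            "ordinary": abs(n_grating - n_fill_o)}

# Extraordinary index of the backfill matched to the grating, ordinary
# index lower: only one polarization "sees" the grating. Values assumed.
print(index_contrast(n_grating=1.9, n_fill_e=1.9, n_fill_o=1.55))
# -> extraordinary contrast 0.0 (grating invisible), ordinary ~0.35 (diffracts)
```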
During operation of an AR display, for example, diffractive gratings configured to decouple image light from a waveguide substrate can direct the light toward a user's eyes as well as toward the world side of the display, which may create a socially unacceptable appearance and/or introduce privacy issues where an external audience is able to view content intended only for the user. Notwithstanding recent developments, it would be advantageous to configure an AR display where world side light emission is blocked or even recycled back to the user.
In accordance with some embodiments, an AR display is configured with a reflective element that directs world side emission from the waveguide back to the user. In certain embodiments, a polarization state of light decoupled from the waveguide may be controlled to improve its world side attenuation and improve recycling. Light outcoupled from the waveguide may be linearly polarized or circularly polarized, for example.
In particular embodiments, one or both of the waveguide substrate and the diffractive gratings may include an optically anisotropic (i.e., birefringent) material, such as an organic solid crystal, although further optically anisotropic media are contemplated, including glass, ceramic, and polymer materials. Through interaction with the birefringent material, image light within the waveguide may be polarized along a particular optical axis. Light outcoupled to both the user and the world side retains the induced polarization. In turn, a reflective element located on the world side of the waveguide may be configured to redirect leaked image light back to the user.
Located on the world side of the waveguide, the reflective element may include a partial reflector, a spectral notch reflector, a reflective polarizer, a spectral notch reflective polarizer, etc. In some embodiments, the reflective element may include a notch dichroic mirror and quarter waveplate overlying the dichroic mirror. The reflective element may be configured to direct leaked image light back to a user's eye.
In some embodiments, a partial reflector may be configured to reflect from 5 to 95% of incident light while allowing the remaining 95 to 5% of the light to pass. For example, a partial reflector may be configured to reflect 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80, 85, 90, or 95% of incident light, including ranges between any of the foregoing values.
A spectral notch reflector may be configured to reflect a selected band of wavelengths while allowing non-reflected light to pass. A notch in the reflection spectrum may be created by the interaction of incident light with an engineered layer or coating in the notch reflector, which causes destructive interference at targeted wavelength(s), effectively blocking or reflecting that specific range of light. A spectral notch reflector may be configured to reflect 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55, 60, 65, 70, 75, 80, 85, 90, or 95% of incident light, including ranges between any of the foregoing values.
In some embodiments, a reflective polarizer may be configured to selectively reflect light of a particular polarization while transmitting light of the opposite polarization. A spectral notch reflective polarizer may have the combined functionality of a spectral notch filter and a reflective polarizer. A spectral notch reflective polarizer may be configured to reflect light based on both its polarization and wavelength, effectively creating a notch in the reflection spectrum at a specific wavelength or band, while reflecting or transmitting other wavelengths depending on their polarization.
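A toy model of the spectral notch reflective polarizer described above is sketched below (purely illustrative; the notch center, width, and polarization labels are assumptions): light is reflected only when it both falls inside the notch band and carries the blocked polarization, and is transmitted otherwise.

```python
def notch_reflective_polarizer(wavelength_nm, polarization,
                               notch_center_nm=532.0, notch_width_nm=30.0,
                               blocked_polarization="s"):
    """Reflect light only if its wavelength lies inside the notch band and
    its polarization matches the blocked state; transmit everything else."""
    in_notch = abs(wavelength_nm - notch_center_nm) <= notch_width_nm / 2
    reflected = in_notch and polarization == blocked_polarization
    return "reflect" if reflected else "transmit"

print(notch_reflective_polarizer(532.0, "s"))  # in the notch, blocked pol -> reflect
print(notch_reflective_polarizer(532.0, "p"))  # in the notch, other pol   -> transmit
print(notch_reflective_polarizer(620.0, "s"))  # outside the notch         -> transmit
```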
In certain embodiments, a reflective element may include a stacked structure including a notch dichroic mirror and a quarter waveplate. A notch dichroic filter may be configured to selectively block or attenuate light in a narrow wavelength range (the "notch") while allowing light outside of this range to pass. Dichroic filters may be designed to selectively reflect certain wavelengths based on their angle of incidence, while transmitting others.
If incident light is linearly polarized, the quarter waveplate can modify the polarization state, converting it into circular or elliptical polarization depending on the angle of the incident light relative to the waveplate's optical axis. After polarization modification, the light then passes through the notch dichroic filter, which selectively reflects light within a specific wavelength range (the notch) and transmits light outside that range. Since the filter's behavior is generally independent of polarization, the light that passes will be primarily affected by the spectral notch characteristic, blocking specific wavelengths.
The following will provide, with reference to FIGS. 1-8, detailed descriptions of polarized augmented reality (AR) displays. The discussion associated with FIG. 1 includes a description of an example waveguide with grating structures to in-couple and out-couple image light from a waveguide substrate. The discussion associated with FIGS. 2 and 3 includes a description of planar waveguides having light polarizing configurations. In a planar waveguide, electromagnetic waves are confined to propagate along the plane of the waveguide, whereas in non-planar waveguides, the waves may follow a curved path, guided by the shape of the waveguide.
The discussion associated with FIG. 4 includes momentum space renderings depicting the diffraction of image light from birefringent materials or structures. The discussion associated with FIGS. 5 and 6 includes a description of planar waveguides having a reflective optical element for recycling leaked polarized image light. The discussion associated with FIGS. 7 and 8 relates to exemplary virtual reality and augmented reality devices that may include one or more waveguide configurations as disclosed herein.
A schematic view of a display system 100 including a planar waveguide is shown in FIG. 1. An input grating 110 is configured to couple image light into a waveguide substrate 150, and an output grating 120 is configured to couple the internally reflected image light out of the waveguide and toward both a user's eye 130 and the world side 140 of the display system.
Referring to FIGS. 2 and 3, in example systems, one or both of the waveguide substrate 151 and the output grating 121 may include an optically anisotropic (i.e., birefringent) material. A momentum space rendering showing the optical path of image light through exemplary display systems having a birefringent component (waveguide substrate and/or output grating) is shown in FIG. 4. In an example embodiment, an optically anisotropic waveguide substrate may be characterized by refractive indices (nx, ny, nz) where nx = 1.6, ny = 1.6, and nz = 2.4. A field of view reaching an eyebox of the display may be 50° x 50°, for example.
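As a rough, purely illustrative check (not the actual grating design), the in-plane index value quoted above sets the critical angle for total internal reflection, and simple refraction indicates how a 50° air-side field of view compresses inside the substrate; in practice the in-coupling grating supplies the additional momentum needed to carry image rays beyond the critical angle.

```python
# Illustrative sketch only: back-of-the-envelope total internal reflection
# (TIR) numbers using the in-plane index (nx = ny = 1.6) quoted above. The
# simple refraction mapping below is not the grating design.
import numpy as np

n_air = 1.0
n_waveguide = 1.6   # in-plane refractive index from the example above

# Rays hitting the surface at less than the critical angle (measured from the
# normal) refract out of the substrate; steeper rays are guided by TIR.
theta_c = np.degrees(np.arcsin(n_air / n_waveguide))
print(f"critical angle: {theta_c:.1f} deg")   # ~38.7 deg

# A 25-degree half field of view in air refracts to a smaller internal angle.
half_fov_air_deg = 25.0
half_fov_internal_deg = np.degrees(
    np.arcsin(np.sin(np.radians(half_fov_air_deg)) / n_waveguide))
print(f"internal half-FOV after refraction: {half_fov_internal_deg:.1f} deg")  # ~15.3 deg
```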
FIGS. 5 and 6 schematically show, for the display systems of FIGS. 2 and 3, respectively, the optical path of polarized image light that leaks toward the world side of the display and is redirected by a world side reflective element 500 back to the eye of a user. The reflective element 500 may include a structure selected from a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
In certain example embodiments, in the polarized waveguide display of either FIG. 5 or FIG. 6, the reflective element 500 may include a notch dichroic mirror and quarter waveplate (QWP) located on the world side of the waveguide. The display may additionally include a notch reflective polarizer 510 disposed on the user side. The dichroic mirror may be curved or flat and may have matching notch characteristics with the notch reflective polarizer 510. In such display systems, polarized light emitted toward the world side of the display may be redirected by the QWP and dichroic mirror back to the user. Light emitted toward the user side of the display may be initially reflected by the reflective polarizer but subsequently redirected by the QWP and dichroic mirror back to the user.
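For illustration only, the recycling behavior can be verified with a Jones-calculus double pass through the QWP and mirror: leaked light returns with its linear polarization rotated by 90°, so a user-side reflective polarizer tuned to the original polarization no longer blocks it. The sketch below models an idealized in-band mirror as the identity matrix in a fixed lab frame and assumes a 45° QWP orientation; these conventions are assumptions for the example, not limitations of the disclosed stack.

```python
# Illustrative sketch only: Jones-calculus double pass through a 45-degree
# quarter waveplate (QWP) and an idealized in-band mirror. The mirror is
# modeled as the identity matrix (fixed lab frame, global phase ignored).
import numpy as np

def qwp(theta):
    """QWP Jones matrix with fast axis at angle `theta` (radians) from x."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, -s], [s, c]])
    return rot @ np.array([[1.0, 0.0], [0.0, 1j]]) @ rot.T

MIRROR = np.eye(2)   # idealized in-band dichroic mirror reflection

round_trip = qwp(np.pi / 4) @ MIRROR @ qwp(np.pi / 4)

e_leak = np.array([1.0, 0.0])   # x-polarized light leaked toward the world side
e_back = round_trip @ e_leak

# Up to a global phase the returned light is y-polarized (rotated 90 degrees),
# so a user-side reflective polarizer that reflects x-polarized light now
# transmits the recycled light toward the eye.
print(np.round(np.abs(e_back), 3))   # [0. 1.]
```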
As will be appreciated, the generation of polarized image light allows for efficient redirection of leaked world side light while maintaining commercially relevant see-through transmission. Although the waveguides in FIGS. 2, 3, 5, and 6 are depicted as having a planar surface (i.e., abutting the grating structures), non-planar waveguide surfaces are also contemplated where, for example, a curvature of the reflective element may match a curvature of the waveguide.
Example Embodiments
Example 1: A display system includes a waveguide body extending from an input end to an output end and configured to guide light by total internal reflection from the input end to the output end, an in-coupling element located proximate to the input end and configured to direct image light into the waveguide body, an out-coupling element located proximate to the output end and configured to direct image light out of the waveguide body, and a reflective element located on a world side of the waveguide body, where the reflective element is configured to direct image light out-coupled from the waveguide body toward a user's eyes.
Example 2: The display system of Example 1, where the waveguide body includes an optically isotropic material.
Example 3: The display system of Example 1, where the waveguide body includes an optically anisotropic material.
Example 4: The display system of any of Examples 1-3, where the waveguide body includes an organic solid crystal.
Example 5: The display system of any of Examples 1-4, where the out-coupling element includes an optically isotropic material.
Example 6: The display system of any of Examples 1-4, where the out-coupling element includes an optically anisotropic material.
Example 7: The display system of any of Examples 1-6, where the out-coupling element includes a surface relief grating.
Example 8: The display system of Example 7, where the out-coupling element includes a structured diffraction grating selected from a binary grating, a slanted grating, and a blazed grating.
Example 9: The display system of any of Examples 1-8, where the reflective element overlies the output end of the waveguide body.
Example 10: The display system of any of Examples 1-9, where the reflective element is spaced away from the out-coupling element.
Example 11: The display system of any of Examples 1-9, where the reflective element contacts the out-coupling element.
Example 12: The display system of any of Examples 1-11, where the reflective element includes an optical element selected from a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
Example 13: The display system of any of Examples 1-12, where the reflective element includes a notch dichroic mirror and quarter waveplate overlying the notch dichroic mirror.
Example 14: The display system of Example 13, further including a notch reflective polarizer located on a user side of the waveguide body.
Example 15: A display system includes a waveguide configured to propagate image light therethrough, an out-coupling element disposed over an output region of the waveguide, where the out-coupling element is configured to out-couple the image light from the waveguide, and a reflective element located between the waveguide and a world side of the display system, where the reflective element is configured to direct the out-coupled image light toward an eye of a user.
Example 16: The display system of Example 15, where the waveguide includes an optically anisotropic material.
Example 17: The display system of any of Examples 15 and 16, where the waveguide includes an organic solid crystal.
Example 18: The display system of any of Examples 15-17, where the out-coupling element includes an optically anisotropic material.
Example 19: The display system of any of Examples 15-18, where the reflective element includes an optical element selected from a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
Example 20: A display system includes a waveguide configured to propagate image light therethrough, and a reflective element located between the waveguide and a world side of the display system, where the reflective element includes an optical element selected from a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer, and the reflective element is configured to direct image light out-coupled from the waveguide toward an eye of a user.
Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial-reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof. Artificial-reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content. The artificial-reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in an artificial reality and/or are otherwise used in (e.g., to perform activities in) an artificial reality.
Artificial-reality systems may be implemented in a variety of different form factors and configurations. Some artificial-reality systems may be designed to work without near-eye displays (NEDs). Other artificial-reality systems may include an NED that also provides visibility into the real world (e.g., augmented-reality system 700 in FIG. 7) or that visually immerses a user in an artificial reality (e.g., virtual-reality system 800 in FIG. 8). While some artificial-reality devices may be self-contained systems, other artificial-reality devices may communicate and/or coordinate with external devices to provide an artificial-reality experience to a user. Examples of such external devices include handheld controllers, mobile devices, desktop computers, devices worn by a user, devices worn by one or more other users, and/or any other suitable external system.
Turning to FIG. 7, augmented-reality system 700 may include an eyewear device 702 with a frame 710 configured to hold a left display device 715(A) and a right display device 715(B) in front of a user's eyes. Display devices 715(A) and 715(B) may act together or independently to present an image or series of images to a user. While augmented-reality system 700 includes two displays, embodiments of this disclosure may be implemented in augmented-reality systems with a single NED or more than two NEDs.
In some embodiments, augmented-reality system 700 may include one or more sensors, such as sensor 740. Sensor 740 may generate measurement signals in response to motion of augmented-reality system 700 and may be located on substantially any portion of frame 710. Sensor 740 may represent a position sensor, an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some embodiments, augmented-reality system 700 may or may not include sensor 740 or may include more than one sensor. In embodiments in which sensor 740 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 740. Examples of sensor 740 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof.
Augmented-reality system 700 may also include a microphone array with a plurality of acoustic transducers 720(A)-720(J), referred to collectively as acoustic transducers 720. Acoustic transducers 720 may be transducers that detect air pressure variations induced by sound waves. Each acoustic transducer 720 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). The microphone array in FIG. 7 may include, for example, ten acoustic transducers: 720(A) and 720(B), which may be designed to be placed inside a corresponding ear of the user, acoustic transducers 720(C), 720(D), 720(E), 720(F), 720(G), and 720(H), which may be positioned at various locations on frame 710, and/or acoustic transducers 720(I) and 720(J), which may be positioned on a corresponding neckband 705.
In some embodiments, one or more of acoustic transducers 720(A)-(F) may be used as output transducers (e.g., speakers). For example, acoustic transducers 720(A) and/or 720(B) may be earbuds or any other suitable type of headphone or speaker.
The configuration of acoustic transducers 720 of the microphone array may vary. While augmented-reality system 700 is shown in FIG. 7 as having ten acoustic transducers 720, the number of acoustic transducers 720 may be greater or less than ten. In some embodiments, using higher numbers of acoustic transducers 720 may increase the amount of audio information collected and/or the sensitivity and accuracy of the audio information. In contrast, using a lower number of acoustic transducers 720 may decrease the computing power required by an associated controller 750 to process the collected audio information. In addition, the position of each acoustic transducer 720 of the microphone array may vary. For example, the position of an acoustic transducer 720 may include a defined position on the user, a defined coordinate on frame 710, an orientation associated with each acoustic transducer 720, or some combination thereof.
Acoustic transducers 720(A) and 720(B) may be positioned on different parts of the user's ear, such as behind the pinna, behind the tragus, and/or within the auricle or fossa. Or, there may be additional acoustic transducers 720 on or surrounding the ear in addition to acoustic transducers 720 inside the ear canal. Having an acoustic transducer 720 positioned next to an ear canal of a user may enable the microphone array to collect information on how sounds arrive at the ear canal. By positioning at least two of acoustic transducers 720 on either side of a user's head (e.g., as binaural microphones), augmented-reality device 700 may simulate binaural hearing and capture a 3D stereo sound field around a user's head. In some embodiments, acoustic transducers 720(A) and 720(B) may be connected to augmented-reality system 700 via a wired connection 730, and in other embodiments acoustic transducers 720(A) and 720(B) may be connected to augmented-reality system 700 via a wireless connection (e.g., a Bluetooth connection). In still other embodiments, acoustic transducers 720(A) and 720(B) may not be used at all in conjunction with augmented-reality system 700. Acoustic transducers 720 on frame 710 may be positioned along the length of the temples, across the bridge, above or below display devices 715(A) and 715(B), or some combination thereof. Acoustic transducers 720 may be oriented such that the microphone array is able to detect sounds in a wide range of directions surrounding the user wearing the augmented-reality system 700. In some embodiments, an optimization process may be performed during manufacturing of augmented-reality system 700 to determine relative positioning of each acoustic transducer 720 in the microphone array.
In some examples, augmented-reality system 700 may include or be connected to an external device (e.g., a paired device), such as neckband 705. Neckband 705 generally represents any type or form of paired device. Thus, the following discussion of neckband 705 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external compute devices, etc.
As shown, neckband 705 may be coupled to eyewear device 702 via one or more connectors. The connectors may be wired or wireless and may include electrical and/or nonelectrical (e.g., structural) components. In some cases, eyewear device 702 and neckband 705 may operate independently without any wired or wireless connection between them. While FIG. 7 illustrates the components of eyewear device 702 and neckband 705 in example locations on eyewear device 702 and neckband 705, the components may be located elsewhere and/or distributed differently on eyewear device 702 and/or neckband 705. In some embodiments, the components of eyewear device 702 and neckband 705 may be located on one or more additional peripheral devices paired with eyewear device 702, neckband 705, or some combination thereof.
Pairing external devices, such as neckband 705, with augmented-reality eyewear devices may enable the eyewear devices to achieve the form factor of a pair of glasses while still providing sufficient battery and computation power for expanded capabilities. Some or all of the battery power, computational resources, and/or additional features of augmented-reality system 700 may be provided by a paired device or shared between a paired device and an eyewear device, thus reducing the weight, heat profile, and form factor of the eyewear device overall while still retaining desired functionality. For example, neckband 705 may allow components that would otherwise be included on an eyewear device to be included in neckband 705 since users may tolerate a heavier weight load on their shoulders than they would tolerate on their heads. Neckband 705 may also have a larger surface area over which to diffuse and disperse heat to the ambient environment. Thus, neckband 705 may allow for greater battery and computation capacity than might otherwise have been possible on a stand-alone eyewear device. Since weight carried in neckband 705 may be less invasive to a user than weight carried in eyewear device 702, a user may tolerate wearing a lighter eyewear device and carrying or wearing the paired device for greater lengths of time than a user would tolerate wearing a heavy standalone eyewear device, thereby enabling users to more fully incorporate artificial-reality environments into their day-to-day activities.
Neckband 705 may be communicatively coupled with eyewear device 702 and/or to other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to augmented-reality system 700. In the embodiment of FIG. 7, neckband 705 may include two acoustic transducers (e.g., 720(I) and 720(J)) that are part of the microphone array (or potentially form their own microphone subarray). Neckband 705 may also include a controller 725 and a power source 735.
Acoustic transducers 720(I) and 720(J) of neckband 705 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital). In the embodiment of FIG. 7, acoustic transducers 720(I) and 720(J) may be positioned on neckband 705, thereby increasing the distance between the neckband acoustic transducers 720(I) and 720(J) and other acoustic transducers 720 positioned on eyewear device 702. In some cases, increasing the distance between acoustic transducers 720 of the microphone array may improve the accuracy of beamforming performed via the microphone array. For example, if a sound is detected by acoustic transducers 720(C) and 720(D) and the distance between acoustic transducers 720(C) and 720(D) is greater than, e.g., the distance between acoustic transducers 720(D) and 720(E), the determined source location of the detected sound may be more accurate than if the sound had been detected by acoustic transducers 720(D) and 720(E).
Controller 725 of neckband 705 may process information generated by the sensors on neckband 705 and/or augmented-reality system 700. For example, controller 725 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 725 may perform a direction-of-arrival (DOA) estimation to estimate a direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 725 may populate an audio data set with the information. In embodiments in which augmented-reality system 700 includes an inertial measurement unit, controller 725 may compute all inertial and spatial calculations from the IMU located on eyewear device 702. A connector may convey information between augmented-reality system 700 and neckband 705 and between augmented-reality system 700 and controller 725. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by augmented-reality system 700 to neckband 705 may reduce weight and heat in eyewear device 702, making it more comfortable to the user.
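As a simplified illustration of direction-of-arrival estimation (not the actual algorithm performed by controller 725), a far-field source direction can be estimated from the time difference of arrival between a single pair of microphones with known spacing; the spacing and delay values below are hypothetical.

```python
# Illustrative sketch only (not the controller's actual algorithm): far-field
# direction-of-arrival (DOA) estimation from the time difference of arrival
# (TDOA) between two microphones. The spacing and delay are hypothetical.
import numpy as np

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air

def doa_from_tdoa(tdoa_s, mic_spacing_m):
    """Angle of arrival in degrees from broadside for a far-field source."""
    # Far-field geometry: extra path length = spacing * sin(angle).
    sin_angle = np.clip(SPEED_OF_SOUND_M_S * tdoa_s / mic_spacing_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_angle)))

# Example: microphones 15 cm apart; sound arrives 0.2 ms earlier at one mic.
print(f"{doa_from_tdoa(2.0e-4, 0.15):.1f} deg")   # ~27.2 deg from broadside
```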
Power source 735 in neckband 705 may provide power to eyewear device 702 and/or to neckband 705. Power source 735 may include, without limitation, lithium ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage. In some cases, power source 735 may be a wired power source. Including power source 735 on neckband 705 instead of on eyewear device 702 may help better distribute the weight and heat generated by power source 735.
As noted, some artificial-reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system, such as virtual-reality system 800 in FIG. 8, that mostly or completely covers a user's field of view. Virtual-reality system 800 may include a front rigid body 802 and a band 804 shaped to fit around a user's head. Virtual-reality system 800 may also include output audio transducers 806(A) and 806(B). Furthermore, while not shown in FIG. 8, front rigid body 802 may include one or more electronic elements, including one or more electronic displays, one or more inertial measurement units (IMUs), one or more tracking emitters or detectors, and/or any other suitable device or system for creating an artificial reality experience.
Artificial-reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in augmented-reality system 700 and/or virtual-reality system 800 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, organic LED (OLED) displays, digital light projector (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. Artificial-reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user's refractive error. Some artificial-reality systems may also include optical subsystems having one or more lenses (e.g., conventional concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including to collimate (e.g., make an object appear at a greater distance than its physical distance), to magnify (e.g., make an object appear larger than its actual size), and/or to relay (to, e.g., the viewer's eyes) light. These optical subsystems may be used in a non-pupil-forming architecture (such as a single lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (such as a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
In addition to or instead of using display screens, some artificial-reality systems may include one or more projection systems. For example, display devices in augmented-reality system 700 and/or virtual-reality system 800 may include micro-LED projectors that project light (using, e.g., a waveguide) into display devices, such as clear combiner lenses that allow ambient light to pass through. The display devices may refract the projected light toward a user's pupil and may enable a user to simultaneously view both artificial-reality content and the real world. The display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (such as diffractive, reflective, and refractive elements and gratings), coupling elements, etc. Artificial-reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.
Artificial-reality systems may also include various types of computer vision components and subsystems. For example, augmented-reality system 700 and/or virtual-reality system 800 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial-reality system may process data from one or more of these sensors to identify a location of a user, to map the real world, to provide a user with context about real-world surroundings, and/or to perform a variety of other functions.
Artificial-reality systems may also include one or more input and/or output audio transducers. In the examples shown in FIG. 8, output audio transducers 806(A) and 806(B) may include voice coil speakers, ribbon speakers, electrostatic speakers, piezoelectric speakers, bone conduction transducers, cartilage conduction transducers, tragus-vibration transducers, and/or any other suitable type or form of audio transducer. Similarly, input audio transducers may include condenser microphones, dynamic microphones, ribbon microphones, and/or any other type or form of input transducer. In some embodiments, a single transducer may be used for both audio input and audio output.
While not shown in FIG. 7, artificial-reality systems may include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, body suits, handheld controllers, environmental devices (e.g., chairs, floormats, etc.), and/or any other type of device or system. Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms. Haptic feedback systems may be implemented independent of other artificial-reality devices, within other artificial-reality devices, and/or in conjunction with other artificial-reality devices.
By providing haptic sensations, audible content, and/or visual content, artificial-reality systems may create an entire virtual experience or enhance a user's real-world experience in a variety of contexts and environments. For instance, artificial-reality systems may assist or extend a user's perception, memory, or cognition within a particular environment. Some systems may enhance a user's interactions with other people in the real world or may enable more immersive interactions with other people in a virtual world. Artificial-reality systems may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, business enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, visual aids, etc.). The embodiments disclosed herein may enable or enhance a user's artificial-reality experience in one or more of these contexts and environments and/or in other contexts and environments.
The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
The preceding description has been provided to enable others skilled in the art to best utilize various aspects of the exemplary embodiments disclosed herein. This exemplary description is not intended to be exhaustive or to be limited to any precise form disclosed. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. Reference should be made to the appended claims in determining the scope of the present disclosure.
Unless otherwise noted, the terms "connected to" and "coupled to" (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms "a" or "an," as used in the specification and claims, are to be construed as meaning "at least one of." Finally, for ease of use, the terms "including" and "having" (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word "comprising."
It will be understood that when an element such as a layer or a region is referred to as being formed on, deposited on, or disposed "on" or "over" another element, it may be located directly on at least a portion of the other element, or one or more intervening elements may also be present. In contrast, when an element is referred to as being "directly on" or "directly over" another element, it may be located on at least a portion of the other element, with no intervening elements present.
As used herein, the term "approximately" in reference to a particular numeric value or range of values may, in certain embodiments, mean and include the stated value as well as all values within 10% of the stated value. Thus, by way of example, reference to the numeric value "50" as "approximately 50" may, in certain embodiments, include values equal to 50±5, i.e., values within the range 45 to 55.
As used herein, the term "substantially" in reference to a given parameter, property, or condition may mean and include to a degree that one of ordinary skill in the art would understand that the given parameter, property, or condition is met with a small degree of variance, such as within acceptable manufacturing tolerances. By way of example, depending on the particular parameter, property, or condition that is substantially met, the parameter, property, or condition may be at least approximately 90% met, at least approximately 95% met, or even at least approximately 99% met. While various features, elements or steps of particular embodiments may be disclosed using the transitional phrase "comprising," it is to be understood that alternative embodiments, including those that may be described using the transitional phrases "consisting of" or "consisting essentially of," are implied. Thus, for example, implied alternative embodiments to a lens that comprises or includes polycarbonate include embodiments where a lens consists essentially of polycarbonate and embodiments where a lens consists of polycarbonate.

Claims

1. A display system comprising: a waveguide body extending from an input end to an output end and configured to guide light by total internal reflection from the input end to the output end; an in-coupling element located proximate to the input end and configured to direct image light into the waveguide body; an out-coupling element located proximate to the output end and configured to direct image light out of the waveguide body; and a reflective element located on a world side of the waveguide body, wherein the reflective element is configured to direct image light out-coupled from the waveguide body toward a user's eyes.
2. The display system of claim 1, wherein the waveguide body comprises: an optically isotropic material, or an optically anisotropic material.
3. The display system of any preceding claim, wherein the waveguide body comprises an organic solid crystal.
4. The display system of any preceding claim, wherein the out-coupling element comprises: an optically isotropic material, or an optically anisotropic material.
5. The display system of any preceding claim, wherein the out-coupling element comprises a surface relief grating, and optionally wherein the out-coupling element comprises a structured diffraction grating selected from the group consisting of binary, slanted, and blazed.
6. The display system of any preceding claim, wherein the reflective element overlies the output end of the waveguide body.
7. The display system of any preceding claim, wherein the reflective element is spaced away from the out-coupling element.
8. The display system of any of claims 1 to 6, wherein the reflective element contacts the out-coupling element.
9. The display system of any preceding claim, wherein the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
10. The display system of any preceding claim, wherein the reflective element comprises a notch dichroic mirror and quarter waveplate overlying the notch dichroic mirror, and optionally wherein the display system further comprises a notch reflective polarizer located on a user side of the waveguide body.
11. A display system comprising: a waveguide configured to propagate image light therethrough; an out-coupling element disposed over an output region of the waveguide, wherein the out-coupling element is configured to out-couple the image light from the waveguide; and a reflective element located between the waveguide and a world side of the display system, wherein the reflective element is configured to direct the out-coupled image light toward an eye of a user.
12. The display system of claim 11, wherein the waveguide comprises an optically anisotropic material.
13. The display system of claim 11 or claim 12, wherein the waveguide comprises an organic solid crystal.
14. The display system of any of claims 11 to 13, wherein: the out-coupling element comprises an optically anisotropic material, and/or wherein the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer.
15. A display system comprising: a waveguide configured to propagate image light therethrough; and a reflective element located between the waveguide and a world side of the display system, wherein the reflective element comprises an optical element selected from the group consisting of a partial reflector, a spectral notch reflector, a reflective polarizer, and a spectral notch reflective polarizer, and the reflective element is configured to direct image light out-coupled from the waveguide toward an eye of a user.