
US20190212482A1 - Angle selective filter for near eye displays - Google Patents


Info

Publication number
US20190212482A1
Authority
US
United States
Prior art keywords
ned
eye
angle
selective filter
light beams
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/867,652
Inventor
Evan M. Richards
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Platforms Technologies LLC
Original Assignee
Facebook Technologies LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Facebook Technologies LLC filed Critical Facebook Technologies LLC
Priority to US15/867,652 priority Critical patent/US20190212482A1/en
Assigned to OCULUS VR, LLC reassignment OCULUS VR, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: RICHARDS, EVAN M.
Priority to US15/884,293 priority patent/US10809429B1/en
Assigned to FACEBOOK TECHNOLOGIES, LLC reassignment FACEBOOK TECHNOLOGIES, LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: OCULUS VR, LLC
Publication of US20190212482A1 publication Critical patent/US20190212482A1/en
Assigned to META PLATFORMS TECHNOLOGIES, LLC reassignment META PLATFORMS TECHNOLOGIES, LLC CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: FACEBOOK TECHNOLOGIES, LLC

Classifications

    • G — PHYSICS
    • G02 — OPTICS
    • G02B — OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/208 — Filters for use with infrared or ultraviolet radiation, e.g. for separating visible light from infrared and/or ultraviolet radiation
    • G02B 27/0093 — Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B 27/0172 — Head-up displays; head mounted, characterised by optical features
    • G02B 3/08 — Simple or compound lenses with non-spherical faces with discontinuous faces, e.g. Fresnel lens
    • G02B 2027/012 — Head-up displays comprising devices for improving the contrast of the display / brilliance control visibility, comprising devices for attenuating parasitic image effects
    • G02B 2027/0187 — Display position adjusting means not related to the information to be displayed, slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G02B 2207/123 — Optical louvre elements, e.g. for directional light blocking
    • G02B 27/0018 — Optical systems or apparatus with means for preventing ghost images
    • G02B 5/281 — Interference filters designed for the infrared light
    • G02B 5/285 — Interference filters comprising deposited thin solid films

Definitions

  • Embodiments of the present invention relate generally to near eye displays, and, more specifically, to an angle selective filter for reducing stray light in near eye displays.
  • VR: virtual reality
  • NED: near eye display
  • stereoscopic images can be displayed on an electronic display inside the NED to simulate the illusion of depth.
  • head tracking sensors can be used to estimate what portion of the virtual environment is being viewed by the user.
  • stray light within the NED can interfere with the viewing experience of user of the NED. Stray light can be caused by surface defects, dust, or any other object in the imaging path that may cause light to deviate from the intended imaging path.
  • a Fresnel lens can be used in the NED for increased optical performance.
  • a Fresnel lens in the NED can introduce stray light as a result of the faceted and discontinuous nature of the intended refracting surface of the lens. Additionally, stray light may be caused by unwanted reflections off various optical or mechanical surfaces.
  • Stray light is distracting for the user of the NED and thus breaks VR immersion.
  • stray light can reduce the contrast of an image being viewed by the user and, in some cases, causes glare dots or patterns to become visible on the image.
  • the presence of stray light in the NED thus decreases the quality of the images presented to the user and, consequently, negatively impacts the overall VR viewing experience.
  • the NED includes an electronic display configured to output image light. Further, the NED includes an optical element configured to receive the image light, direct the image light, and form an image at an eye-box.
  • the NED also includes an angle selective filter having a curved surface. The angle selective filter is configured to filter out light beams of the image light with an angle of incidence on the curved surface larger than a cut-off angle of incidence.
  • the angle selective filter blocks stray light beams in the NED from reaching the eye box.
  • the reduction in the stray light beams results in the reduction of glare dots or glare patterns on the images generated at the eye.
  • the angle selective filter is configured such that the amount of stray light beams that are blocked when the optical axis of the eye aligns with the optical axis of the NED is substantially similar to the amount of stray light beams that are blocked when the optical axis of the eye is not aligned with the optical axis of the NED. In this way, as the eye of the user rotates around the center of rotation, the images being viewed remain at about a similar level of quality, with a similar level of reduction of the stray light beams. Accordingly, the user of the NED has a consistent and comfortable immersive viewing experience while using the NED.
  • FIG. 2A is an illustration of a cross-sectional view of the NED including the optics block, according to an embodiment.
  • FIG. 2B is an illustration of glare on an image resulting from the stray light beams of FIG. 2A , according to an embodiment.
  • FIGS. 3A-3C are illustrations of a cross-sectional view of the NED including the optics block and an angle selective filter configured to block stray light beams, according to various embodiments.
  • FIG. 4 is an illustration of a perspective view of the angle selective filter having a curved surface, according to an embodiment.
  • FIG. 7 is a flow diagram of a method for reducing stray light beams in the NED by using an angle selective filter having a curved surface, according to various embodiments of the present invention.
  • FIG. 8 is a block diagram of an embodiment of a NED system in which a console operates.
  • a near eye display includes an angle selective filter that is configured to filter out light beams of image light.
  • the angle selective filter is configured to allow only light beams having angles of incidences on the filter that are less than a cut-off angle to pass through the filter.
  • the angle selective filter includes a curved surface, for example, a spherical surface, where a center of the spherical surface corresponds to a center of rotation of the eye.
  • the angle selective filter with the curved surface can significantly reduce the stray light beams in the NED, thus reducing the number of glare dots or glare patterns on the images generated at the eye.
  • the angle selective filter is configured such that the amount of stray light beams that are blocked when the optical axis of the eye aligns with the optical axis of the NED is substantially similar to the amount of stray light beams that are blocked when the optical axis of the eye is not aligned with the optical axis of the NED. In this way, as the eye of the user rotates around the center of rotation, the images being viewed remain at about a similar level of quality, with a similar level of reduction of the stray light beams.
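The pass/block behavior described above can be sketched as a simple predicate. This is an illustrative sketch, not the patent's implementation; the 20-degree default is just one of the example cut-off values listed later in the text:

```python
def transmit(angle_of_incidence_deg: float, cutoff_deg: float = 20.0) -> bool:
    """Return True if a light beam passes the angle selective filter.

    Beams whose angle of incidence on the filter surface is greater than
    or equal to the cut-off angle are blocked; all others pass.
    """
    return angle_of_incidence_deg < cutoff_deg

# Image-forming beams arrive near normal incidence and pass:
assert transmit(5.0)
# Stray beams arrive at steep angles and are blocked:
assert not transmit(35.0)
```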
  • FIG. 1A is a wire diagram of a near eye display (NED) 100 , according to an embodiment.
  • the NED 100 includes a front rigid body 105 and a band 110 .
  • the front rigid body 105 includes one or more electronic display elements of an electronic display (not shown), an IMU 115 , one or more position sensors 120 , and locators 125 .
  • the position sensors 120 are located within the IMU 115 , and neither the IMU 115 nor the position sensors 120 are visible to the user.
  • the IMU 115 , the position sensors 120 , and the locators 125 are discussed in detail below with regard to FIG. 8 . Note that in embodiments where the NED 100 acts as an AR or MR device, portions of the NED 100 and its internal components are at least partially transparent.
  • FIG. 1B is a cross section 160 of the front rigid body 105 of the embodiment of the NED 100 shown in FIG. 1A .
  • the front rigid body 105 includes an electronic display 130 and an optics block 135 that together provide image light to an exit pupil 145 .
  • the exit pupil 145 is the location of the front rigid body 105 where a user's eye 140 is positioned.
  • FIG. 1B shows a cross section 160 associated with a single eye 140 , but another optics block, separate from the optics block 135 , provides altered image light to another eye of the user.
  • the NED 100 includes an eye tracking system (not shown in FIG. 1B ).
  • the eye tracking system may include one or more sources that illuminate one or both eyes of the user.
  • the eye tracking system may also include one or more cameras that capture images of one or both eyes of the user to track the positions of the eyes.
  • the electronic display 130 displays images to the user.
  • the electronic display 130 may comprise a single electronic display or multiple electronic displays (e.g., a display for each eye of a user).
  • Examples of the electronic display 130 include: a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an active-matrix organic light-emitting diode display (AMOLED), a QOLED, a QLED, some other display, or some combination thereof.
  • the optics block 135 adjusts an orientation of image light emitted from the electronic display 130 such that the electronic display 130 appears at particular virtual image distances from the user.
  • the optics block 135 is configured to receive image light emitted from the electronic display 130 and direct the image light to an eye-box associated with the exit pupil 145 .
  • the image light directed to the eye-box forms an image at a retina of the eye 140 .
  • the eye-box is a region defining how much the eye 140 can move up/down/left/right from the exit pupil 145 without significant degradation in the image quality.
  • a field of view (FOV) 150 is the extent of the observable world that is seen by the eye 140 at any given moment.
  • the optics block 135 magnifies received light, corrects optical errors associated with the image light, and presents the corrected image light to the eye 140 of a user of the NED 100 .
  • the optics block 135 may include one or more optical elements 155 in optical series.
  • An optical element 155 may be an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, or any other suitable optical element 155 that affects the image light.
  • the optics block 135 may include combinations of different optical elements.
  • one or more of the optical elements in the optics block 135 may have one or more coatings, such as anti-reflective coatings. The optics block 135 is discussed in greater detail in conjunction with FIGS. 2A-7 .
  • FIG. 2A is an illustration of a cross-sectional view of the NED 100 including the optics block 135 , according to an embodiment.
  • the eye 140 of the user can have a field of view 215 .
  • a Fresnel lens 205 included in the optics block 135 can be configured to receive the image light emitted from the electronic display 130 and direct the image light to an eye-box associated with the exit pupil 145 .
  • the Fresnel lens 205 is a compact lens with a rotational symmetric structure having multiple annular segments.
  • the Fresnel lens 205 may be manufactured by segmenting a continuous curved surface of an ordinary lens into a plurality of segments and, after reducing the thickness of each segment, arranging the segments on a surface.
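The segmenting step described above can be illustrated with a small sketch: wrapping the sag of an ordinary spherical surface modulo a groove depth yields the thin, faceted Fresnel profile. The radius-of-curvature and groove-depth values below are hypothetical, and real Fresnel tooling is considerably more involved:

```python
import math

def fresnel_sag(r: float, radius_of_curvature: float, groove_depth: float) -> float:
    """Collapse a spherical lens sag into thin Fresnel segments.

    The full spherical sag at radial distance r is wrapped modulo a fixed
    groove depth, mirroring how a Fresnel lens keeps the local curvature of
    each annular segment while discarding the bulk thickness.
    """
    full_sag = radius_of_curvature - math.sqrt(radius_of_curvature**2 - r**2)
    return full_sag % groove_depth

# The collapsed sag never exceeds the groove depth, however large the
# full lens sag grows toward the edge of the lens:
assert all(fresnel_sag(r, 60.0, 0.5) < 0.5 for r in (0.0, 5.0, 10.0, 20.0))
```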
  • the Fresnel lens 205 has a smooth surface on the side facing the exit pupil 145 and a discontinuous and faceted surface on the side facing the electronic display 130 , i.e., the intended refracting surface of the lens.
  • the Fresnel lens 205 can be much thinner relative to a comparable conventional lens, thus allowing a substantial reduction in relative thickness, mass, and volume.
  • the faceted and discontinuous nature of the intended refracting surface of the Fresnel lens 205 can contribute to stray light beams, such as stray light 210 , within the NED 100 .
  • the stray light 210 can cause glare on the image formed at the eye of the user and can reduce the overall image contrast. In general, glare occurs when non-image forming stray light beams are incident on a focal surface.
  • the stray light 210 can be caused by surface defects, dust, or any other object that might cause light to deviate from its intended image path.
  • the Fresnel lens 205 can emit many such non-image forming light beams as a result of the faceted and discontinuous nature of its intended refracting surface. Therefore, glare generated by stray light can be an issue for the NED 100 including the Fresnel lens 205 .
  • the Fresnel lens 205 can include many annular segments; each of the annular segments can include a refracting optical portion and a non-imaging portion referred to as a back-cut. These back-cuts are deliberate surface defects that enable the Fresnel lens 205 to be made very thin but contribute to the amount of glare in the NED 100 .
  • as shown in FIG. 2A , stray light 210 not from the electronic display 130 and incident on the back-cut portion of the Fresnel lens 205 scatters and falls upon the image formed at the eye 140 as off-field noise.
  • the off-field noise contributes to the glare on the image.
  • FIG. 2B is an illustration of glare on an image resulting from the stray light 210 of FIG. 2A , according to an embodiment.
  • the light beams of the image light exiting the Fresnel lens 205 can include stray light 210 .
  • the stray light corresponds to glare dots 220 on the image, as shown in FIG. 2B .
  • FIGS. 3A-3C are illustrations of a cross-sectional view of the NED 100 including the optics block 135 and an angle selective filter 310 configured to block stray light beams, according to various embodiments.
  • the angle selective filter 310 is disposed between the optics block 135 and the exit pupil 145 of the NED 100 .
  • the angle selective filter 310 is configured to filter out or block light beams of the image light, for example, stray light beams. Each of the blocked light beams has an angle of incidence on the curved surface of the angle selective filter 310 that is larger than or equal to a cut-off angle of incidence. Filtering out stray light beams reduces the glare on the image formed at the eye 140 .
  • the cut-off angle of incidence can be configured to correspond to a visual field of the eye having a field of view (FOV) 315 .
  • the visual field of the eye corresponds to an instantaneous view of the eye, e.g., how much a user sees instantaneously looking forward.
  • the cut-off angle can be configured to correspond to the set of small viewing angles. Examples of the cut-off angle include 5 degrees, 10 degrees, 15 degrees, 20 degrees, 25 degrees, 30 degrees, etc.
  • the angle selective filter 310 is thus configured to pass light beams incident on the angle selective filter 310 at angles smaller than the set of small viewing angles.
  • the light beams incident on the angle selective filter 310 at angles larger than or equal to the cut-off angle are blocked by the angle selective filter 310 .
  • stray light beams having large angles of incidence are blocked by the angle selective filter 310 . Therefore, the number of stray light beams reaching the eye 140 is significantly reduced when the angle selective filter 310 is disposed between the optics block 135 and the exit pupil 145 .
  • when the optical axis 320 of the eye 140 is aligned with the optical axis 345 of the NED 100 , light beams corresponding to light 325 emitted from the electronic display 130 pass through the angle selective filter 310 .
  • the angle of incidence of the light beams exiting the optics block 135 and corresponding to the light 325 is the angle 335 between the normal to the tangent plane 330 and the light beams.
  • the angle of incidence of the light beams corresponding to the light 325 is smaller than the cut-off angle of the angle selective filter 310 , and, therefore, the light beams corresponding to the light 325 are allowed to pass through the angle selective filter 310 .
  • when the optical axis 320 of the eye 140 is aligned with the optical axis 345 of the NED 100 , light beams corresponding to stray light 350 are blocked by the angle selective filter 310 .
  • the angle of incidence of the light beams exiting the optics block 135 and corresponding to the stray light 350 is the angle 355 between the normal to the tangent plane 330 and the light beams.
  • the angle of incidence of the light beams is larger than the cut-off angle of the angle selective filter 310 , and, therefore, the light beams corresponding to the stray light 350 are blocked by the angle selective filter 310 .
  • the angle selective filter 310 can include a spherical surface, where a center of the spherical surface corresponds to a center of rotation 340 of the eye 140 . In this way, for each point on the surface of the angle selective filter 310 , a tangent plane of the point is configured to be normal to an optical axis 320 of the eye 140 when the eye 140 rotates such that the optical axis 320 of the eye 140 aligns with the point.
  • the angle selective filter 310 is configured such that the angle of incidence of light beams incident on the curved surface at that point is equal to an angle between the light beams and the optical axis 320 of the eye. Therefore, the angle selective filter 310 is configured to have substantially the same cut-off angle at each point.
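A small geometric sketch of this property: when the filter surface is a sphere centered at the eye's center of rotation, the surface normal at any hit point is the radial direction from that center, so the angle of incidence can be computed directly. The coordinates and helper name below are hypothetical:

```python
import numpy as np

def incidence_angle_deg(hit_point, ray_dir, center_of_rotation) -> float:
    """Angle of incidence of a ray on a spherical filter surface.

    Because the surface is a sphere centered at the eye's center of
    rotation, the normal at the hit point is the radial direction from
    that center, so the same cut-off applies at every point.
    """
    normal = np.asarray(hit_point, float) - np.asarray(center_of_rotation, float)
    normal = normal / np.linalg.norm(normal)
    d = np.asarray(ray_dir, float)
    d = d / np.linalg.norm(d)
    cos_theta = abs(np.dot(d, normal))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# A ray travelling along the radial direction hits at normal incidence,
# wherever on the sphere the eye happens to be looking:
center = [0.0, 0.0, 0.0]
point_on_sphere = [0.0, 0.0, 50.0]   # hypothetical point 50 mm from the center
assert incidence_angle_deg(point_on_sphere, [0.0, 0.0, -1.0], center) < 1e-6
```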
  • the angle selective filter 310 is configured to reduce a similar amount of stray light beams when the eye 140 rotates to change the orientation of the optical axis 320 of the eye 140 .
  • a similar reduction means that the difference between the levels of reduction of the stray light beams when the eye 140 rotates to different orientations is less than 5%, less than 10%, less than 15%, less than 20%, etc.
  • the images resulting from image light emitted from the optics block 135 have a similar level of quality and the user can have a comfortable immersion viewing experience.
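The notion of a "similar" reduction can be expressed as a simple tolerance check. This is only a sketch; the 10% default mirrors one of the example bounds listed above:

```python
def reduction_is_similar(blocked_on_axis: float,
                         blocked_off_axis: float,
                         tolerance: float = 0.10) -> bool:
    """Check that stray-light reduction stays consistent across gaze angles.

    The arguments are fractions of stray beams blocked when the eye's
    optical axis is aligned / not aligned with the NED's optical axis.
    """
    return abs(blocked_on_axis - blocked_off_axis) < tolerance

# A 95% vs 90% reduction across gaze angles counts as similar;
# a drop to 70% off-axis would not:
assert reduction_is_similar(0.95, 0.90)
assert not reduction_is_similar(0.95, 0.70)
```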
  • the eye 140 rotates to an orientation different than the illustrations in FIGS. 3A and 3B .
  • the optical axis 320 of the eye 140 is thus no longer the same as the optical axis 345 of the NED 100 .
  • the cut-off angle of the angle selective filter 310 at the point on the curved surface where the optical axis 320 intersects is substantially the same as when the optical axis 320 of the eye 140 is the same as the optical axis 345 of the NED 100 .
  • light beams corresponding to light 365 emitted from the electronic display 130 pass through the angle selective filter 310 .
  • the angle of incidence of the light beams exiting the optics block 135 and corresponding to the light 365 is the angle 375 between the normal to the tangent plane 370 and the light beams.
  • the angle of incidence of the light beams corresponding to the light 365 is smaller than the cut-off angle of the angle selective filter 310 , and, therefore, the light beams corresponding to the light 365 are allowed to pass through the angle selective filter 310 .
  • Light beams corresponding to stray light 380 are blocked by the angle selective filter 310 .
  • the angle of incidence of the light beams exiting the optics block 135 and corresponding to the stray light 380 is the angle 385 between the normal to the tangent plane 370 and the light beams.
  • the angle of incidence of the light beams corresponding to the stray light 380 is larger than the cut-off angle of the angle selective filter 310 , and, therefore, the light beams corresponding to the stray light 380 are blocked by the angle selective filter 310 .
  • FIG. 4 is an illustration of a perspective view of the angle selective filter 310 having a curved surface 405 , according to an embodiment.
  • the angle selective filter 310 can include a thin film multilayer filter.
  • the angle selective filter 310 can be a thin film or laminated filter configured to pass light beams with an angle of incidence less than a cut-off angle.
  • the angle selective filter 310 can include a set of louvers arranged concentrically. For example, a set of louvers could be used in concentric fashion to create the same effect of reducing light beams with an angle of incidence larger than the cut-off angle.
  • the illumination source 505 is configured to emit infrared light beams or near infrared light beams.
  • the camera 510 is configured to receive or detect reflected light beams from the eye 140 .
  • the reflected light beams are received or detected by the camera 510 and analyzed to extract information about eye rotation, for example, from changes in the infrared light beams reflected by the eye 140 .
  • the center of the curved surface of the angle selective filter 310 is adjustable and is determined based on one or more measurements performed by the eye tracking system.
  • the angle selective filter 310 can be configured to pass infrared light beams or near infrared light beams for all or a much wider group of angles than visible light beams.
  • the angle selective filter 310 is configured to filter out light beams of the image light in the visible light wavelength range with an angle of incidence on the curved surface larger than a first cut-off angle of incidence.
  • the angle selective filter 310 can further be configured to allow light beams of the image light in the infrared or near infrared light wavelength range with an angle of incidence on the curved surface smaller than a second cut-off angle of incidence and larger than the first cut-off angle of incidence.
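One way to sketch this wavelength-dependent selectivity, with hypothetical cut-off values and a hypothetical 750 nm boundary between visible and infrared light:

```python
def passes_filter(angle_deg: float, wavelength_nm: float,
                  visible_cutoff_deg: float = 20.0,
                  ir_cutoff_deg: float = 60.0) -> bool:
    """Wavelength-dependent angle selectivity (illustrative only).

    Visible image light is blocked beyond the first cut-off angle, while
    infrared / near-infrared beams (used for eye tracking) are passed up
    to a larger second cut-off angle.
    """
    if wavelength_nm < 750:               # visible light: narrow acceptance
        return angle_deg < visible_cutoff_deg
    return angle_deg < ir_cutoff_deg      # IR / near-IR: much wider acceptance

# A steep visible stray beam is blocked, but an IR eye-tracking beam
# at the same angle of incidence can still reach the camera:
assert not passes_filter(40.0, 550)
assert passes_filter(40.0, 850)
```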
  • FIG. 6 is an illustration of a cross-sectional view of the NED 100 including the optics block 135 and a flat filter 605 , according to an embodiment.
  • the flat filter 605 is configured to filter out light beams of the image light, for example, stray light beams, with an angle of incidence on the surface of the flat filter 605 that is larger than or equal to a cut-off angle of incidence. As shown in FIG. 6 , light beams corresponding to stray light 610 are blocked by the flat filter 605 . In the illustration, the angle of incidence of the light beams exiting the optics block 135 and corresponding to the stray light 610 is larger than the cut-off angle.
  • the NED 805 may be a head-mounted display that presents content to a user.
  • the content may include virtual and/or augmented views of a physical, real-world environment including computer-generated elements (e.g., two-dimensional or three-dimensional images, two-dimensional or three-dimensional video, sound, etc.).
  • the NED 805 may also present audio content to a user.
  • the NED 805 and/or the console 810 may transmit the audio content to an external device via the I/O interface 815 .
  • the external device may include various forms of speaker systems and/or headphones.
  • the audio content is synchronized with visual content being displayed by the NED 805 .
  • the NED 805 may comprise one or more rigid bodies, which may be rigidly or non-rigidly coupled together.
  • a rigid coupling between rigid bodies causes the coupled rigid bodies to act as a single rigid entity.
  • a non-rigid coupling between rigid bodies allows the rigid bodies to move relative to each other.
  • the NED 805 may include a depth camera assembly (DCA) 820 , a display 825 , an optical assembly 830 , one or more position sensors 835 , an inertial measurement unit (IMU) 840 , an eye tracking system 845 , and a varifocal module 850 .
  • the display 825 and the optical assembly 830 can be integrated together into a projection assembly.
  • Various embodiments of the NED 805 may have additional, fewer, or different components than those listed above. Additionally, the functionality of each component may be partially or completely encompassed by the functionality of one or more other components in various embodiments.
  • the DCA 820 captures sensor data describing depth information of an area surrounding the NED 805 .
  • the sensor data may be generated by one or a combination of depth imaging techniques, such as triangulation, structured light imaging, time-of-flight imaging, laser scan, and so forth.
  • the DCA 820 can compute various depth properties of the area surrounding the NED 805 using the sensor data. Additionally or alternatively, the DCA 820 may transmit the sensor data to the console 810 for processing.
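As one illustration of the depth-imaging techniques listed, a time-of-flight measurement converts a round-trip time into depth. This is a generic sketch of the principle, not the DCA 820's actual implementation:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth_m(round_trip_time_s: float) -> float:
    """Depth from a time-of-flight measurement.

    Emitted light travels to the object and back, so the depth is half
    of the total round-trip distance.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of depth:
assert abs(tof_depth_m(10e-9) - 1.499) < 0.01
```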
  • the DCA 820 includes an illumination source, an imaging device, and a controller.
  • the illumination source emits light onto an area surrounding the NED 805 .
  • the emitted light is structured light.
  • the illumination source includes a plurality of emitters that each emits light having certain characteristics (e.g., wavelength, polarization, coherence, temporal behavior, etc.). The characteristics may be the same or different between emitters, and the emitters can be operated simultaneously or individually.
  • the plurality of emitters could be, e.g., laser diodes (such as edge emitters), inorganic or organic light-emitting diodes (LEDs), a vertical-cavity surface-emitting laser (VCSEL), or some other source.
  • a single emitter or a plurality of emitters in the illumination source can emit light having a structured light pattern.
  • the imaging device captures ambient light in the environment surrounding the NED 805 , in addition to light reflected off of objects in the environment that is generated by the plurality of emitters.
  • the imaging device may be an infrared camera or a camera configured to operate in a visible spectrum.
  • the controller coordinates how the illumination source emits light and how the imaging device captures light. For example, the controller may determine a brightness of the emitted light. In some embodiments, the controller also analyzes detected light to detect objects in the environment and position information related to those objects.
  • the display 825 displays two-dimensional or three-dimensional images to the user in accordance with pixel data received from the console 810 .
  • the display 825 comprises a single display or multiple displays (e.g., separate displays for each eye of a user).
  • the display 825 comprises a single or multiple waveguide displays.
  • Light can be coupled into the single or multiple waveguide displays via, e.g., a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an inorganic light emitting diode (ILED) display, an active-matrix organic light-emitting diode (AMOLED) display, a transparent organic light emitting diode (TOLED) display, a laser-based display, one or more waveguides, other types of displays, a scanner, a one-dimensional array, and so forth.
  • combinations of these display types may be incorporated in the display 825 .
  • the optical assembly 830 magnifies image light received from the display 825 , corrects optical errors associated with the image light, and presents the corrected image light to a user of the NED 805 .
  • the optical assembly 830 includes a plurality of optical elements. For example, one or more of the following optical elements may be included in the optical assembly 830 : an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, a reflecting surface, or any other suitable optical element that deflects, reflects, refracts, and/or in some way alters image light.
  • the optical assembly 830 may include combinations of different optical elements.
  • one or more of the optical elements in the optical assembly 830 may have one or more coatings, such as partially reflective or antireflective coatings.
  • the optical assembly 830 can be integrated into a projection assembly.
  • the optical assembly 830 includes the optics block 135 , the angle selective filter 310 , and/or the flat filter 605 .
  • the optical assembly 830 magnifies and focuses image light generated by the display 825 .
  • the optical assembly 830 enables the display 825 to be physically smaller, weigh less, and consume less power than displays that do not use the optical assembly 830 .
  • magnification may increase the field of view of the content presented by the display 825 .
  • the field of view of the displayed content partially or completely encompasses a user's field of view.
  • the field of view of a displayed image may meet or exceed 110 degrees.
  • the amount of magnification may be adjusted by adding or removing optical elements.
  • the optical assembly 830 may be designed to correct one or more types of optical errors.
  • optical errors include barrel or pincushion distortions, longitudinal chromatic aberrations, or transverse chromatic aberrations.
  • Other types of optical errors may further include spherical aberrations, chromatic aberrations or errors due to the lens field curvature, astigmatisms, in addition to other types of optical errors.
  • visual content transmitted to the display 825 is pre-distorted, and the optical assembly 830 corrects the distortion as image light from the display 825 passes through various optical elements of the optical assembly 830 .
  • optical elements of the optical assembly 830 are integrated into the display 825 as a projection assembly that includes at least one waveguide coupled with one or more optical elements.
  • the IMU 840 is an electronic device that generates data indicating a position of the NED 805 based on measurement signals received from one or more of the position sensors 835 and from depth information received from the DCA 820 .
  • the IMU 840 may be a dedicated hardware component.
  • the IMU 840 may be a software component implemented in one or more processors.
  • the IMU 840 is the same component as the IMU 115 of FIG. 1A and the position sensors 835 are the same components as the position sensors 120 .
  • In operation, a position sensor 835 generates one or more measurement signals in response to a motion of the NED 805.
  • position sensors 835 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, one or more altimeters, one or more inclinometers, and/or various types of sensors for motion detection, drift detection, and/or error detection.
  • the position sensors 835 may be located external to the IMU 840 , internal to the IMU 840 , or some combination thereof.
  • Based on the one or more measurement signals from one or more position sensors 835, the IMU 840 generates data indicating an estimated current position of the NED 805 relative to an initial position of the NED 805.
  • the position sensors 835 include multiple accelerometers to measure translational motion (forward/back, up/down, left/right) and multiple gyroscopes to measure rotational motion (e.g., pitch, yaw, and roll).
  • the IMU 840 rapidly samples the measurement signals and calculates the estimated current position of the NED 805 from the sampled data.
  • the IMU 840 integrates the measurement signals received from the accelerometers over time to estimate a velocity vector and integrates the velocity vector over time to determine an estimated current position of a reference point on the NED 805 .
  • the IMU 840 provides the sampled measurement signals to the console 810 , which analyzes the sample data to determine one or more measurement errors.
  • the console 810 may further transmit one or more of control signals and/or measurement errors to the IMU 840 to configure the IMU 840 to correct and/or reduce one or more measurement errors (e.g., drift errors).
  • the reference point is a point that may be used to describe the position of the NED 805 .
  • the reference point may generally be defined as a point in space or a position related to a position and/or orientation of the NED 805 .
  • the IMU 840 receives one or more parameters from the console 810 .
  • the one or more parameters are used to maintain tracking of the NED 805 .
  • the IMU 840 may adjust one or more IMU parameters (e.g., a sample rate).
  • certain parameters cause the IMU 840 to update an initial position of the reference point so that it corresponds to a next position of the reference point. Updating the initial position of the reference point as the next calibrated position of the reference point helps reduce drift errors in detecting a current position estimate of the IMU 840 .
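The dead reckoning described above — integrating accelerometer measurements into a velocity vector, then integrating velocity into a position estimate for a reference point — can be sketched as follows. This is a minimal illustration, not the IMU 840's actual implementation; the function name, the sample layout, and the assumption of gravity-compensated samples are all illustrative.

```python
import numpy as np

def dead_reckon(accel_samples, dt, p0=None, v0=None):
    """Estimate positions by double-integrating accelerometer samples.

    accel_samples: (N, 3) array of accelerations in the device frame (m/s^2),
    assumed already gravity-compensated. dt: sample period in seconds.
    Returns an (N, 3) array of position estimates for the reference point.
    """
    v = np.zeros(3) if v0 is None else np.asarray(v0, float).copy()
    p = np.zeros(3) if p0 is None else np.asarray(p0, float).copy()
    positions = []
    for a in np.asarray(accel_samples, float):
        v = v + a * dt   # integrate acceleration -> velocity vector
        p = p + v * dt   # integrate velocity -> position estimate
        positions.append(p.copy())
    return np.array(positions)
```

In practice the sampled signals would also be fused with gyroscope data, and the accumulated drift would be corrected using the measurement errors the console reports back to the IMU.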
  • the eye tracking system 845 is integrated into the NED 805 .
  • the eye-tracking system 845 may comprise one or more illumination sources and an imaging device (camera).
  • the eye tracking system 845 generates and analyzes tracking data related to a user's eyes as the user wears the NED 805 .
  • the eye tracking system 845 may further generate eye tracking information that may comprise information about a position of the user's eye, i.e., information about an angle of an eye-gaze.
  • the varifocal module 850 is further integrated into the NED 805 .
  • the varifocal module 850 may be communicatively coupled to the eye tracking system 845 in order to enable the varifocal module 850 to receive eye tracking information from the eye tracking system 845 .
  • the varifocal module 850 may further modify the focus of image light emitted from the display 825 based on the eye tracking information received from the eye tracking system 845 . Accordingly, the varifocal module 850 can reduce vergence-accommodation conflict that may be produced as the user's eyes resolve the image light.
  • the varifocal module 850 can be interfaced (e.g., either mechanically or electrically) with at least one optical element of the optical assembly 830 .
  • the varifocal module 850 may adjust the position and/or orientation of one or more optical elements in the optical assembly 830 in order to adjust the virtual image projected by the optical assembly 830 .
  • the varifocal module 850 may use eye tracking information obtained from the eye tracking system 845 to determine how to adjust one or more optical elements in the optical assembly 830 .
  • the varifocal module 850 may perform foveated rendering of the image light based on the eye tracking information obtained from the eye tracking system 845 in order to adjust the resolution of the image light emitted by the display 825 . In this case, the varifocal module 850 configures the display 825 to display a high pixel density in a foveal region of the user's eye-gaze and a low pixel density in other regions of the user's eye-gaze.
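Foveated rendering of the kind described above amounts to mapping each display region's angular distance from the gaze direction to a rendering-resolution scale. A minimal one-dimensional sketch follows; the threshold and scale values are illustrative assumptions, not values from the disclosure.

```python
def foveation_scale(pixel_angle_deg, gaze_angle_deg,
                    foveal_radius_deg=5.0, peripheral_scale=0.25):
    """Return a resolution scale for a pixel at a given angular position.

    Pixels within foveal_radius_deg of the gaze direction render at full
    resolution (scale 1.0); pixels outside render at peripheral_scale.
    """
    eccentricity = abs(pixel_angle_deg - gaze_angle_deg)
    return 1.0 if eccentricity <= foveal_radius_deg else peripheral_scale
```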
  • the I/O interface 815 facilitates the transfer of action requests from a user to the console 810 .
  • the I/O interface 815 facilitates the transfer of device feedback from the console 810 to the user.
  • An action request is a request to perform a particular action.
  • an action request may be an instruction to start or end capture of image or video data or an instruction to perform a particular action within an application, such as pausing video playback, increasing or decreasing the volume of audio playback, and so forth.
  • the I/O interface 815 may include one or more input devices.
  • Example input devices include: a keyboard, a mouse, a game controller, a joystick, and/or any other suitable device for receiving action requests and communicating the action requests to the console 810 .
  • the I/O interface 815 includes an IMU 840 that captures calibration data indicating an estimated current position of the I/O interface 815 relative to an initial position of the I/O interface 815 .
  • the I/O interface 815 receives action requests from the user and transmits those action requests to the console 810. Responsive to receiving an action request, the console 810 performs a corresponding action; for example, the console 810 may configure the I/O interface 815 to deliver haptic feedback to an arm of the user. Additionally or alternatively, the console 810 may configure the I/O interface 815 to generate haptic feedback when the console 810 performs an action responsive to receiving an action request.
  • the console 810 provides content to the NED 805 for processing in accordance with information received from one or more of: the DCA 820 , the NED 805 , and the I/O interface 815 .
  • the console 810 includes an application store 855 , a tracking module 860 , and an engine 865 .
  • the console 810 may have additional, fewer, or different modules and/or components than those described in conjunction with FIG. 8 .
  • the functions further described below may be distributed among components of the console 810 in a different manner than described in conjunction with FIG. 8 .
  • the application store 855 stores one or more applications for execution by the console 810 .
  • An application is a group of instructions that, when executed by a processor, performs a particular set of functions, such as generating content for presentation to the user. For example, an application may generate content in response to receiving inputs from a user (e.g., via movement of the NED 805 as the user moves his/her head, via the I/O interface 815 , etc.). Examples of applications include: gaming applications, conferencing applications, video playback applications, or other suitable applications.
  • the tracking module 860 calibrates the NED system 800 using one or more calibration parameters.
  • the tracking module 860 may further adjust one or more calibration parameters to reduce error in determining a position and/or orientation of the NED 805 or the I/O interface 815 .
  • the tracking module 860 may transmit a calibration parameter to the DCA 820 in order to adjust the focus of the DCA 820 .
  • the DCA 820 may more accurately determine positions of structured light elements reflecting off of objects in the environment.
  • the tracking module 860 may also analyze sensor data generated by the IMU 840 in determining various calibration parameters to modify.
  • the tracking module 860 may re-calibrate some or all of the components in the NED system 800 . For example, if the DCA 820 loses line of sight of at least a threshold number of structured light elements projected onto the user's eye, the tracking module 860 may transmit calibration parameters to the varifocal module 850 in order to re-establish eye tracking.
  • the tracking module 860 tracks the movements of the NED 805 and/or of the I/O interface 815 using information from the DCA 820 , the one or more position sensors 835 , the IMU 840 or some combination thereof. For example, the tracking module 860 may determine a reference position of the NED 805 from a mapping of an area local to the NED 805 . The tracking module 860 may generate this mapping based on information received from the NED 805 itself. The tracking module 860 may also utilize sensor data from the IMU 840 and/or depth data from the DCA 820 to determine reference positions for the NED 805 and/or I/O interface 815 . In various embodiments, the tracking module 860 generates an estimation and/or prediction for a subsequent position of the NED 805 and/or the I/O interface 815 . The tracking module 860 may transmit the predicted subsequent position to the engine 865 .
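The tracking module's prediction of a subsequent position can be illustrated with a constant-acceleration extrapolation of the tracked state. The patent does not specify the predictor, so this is only one plausible sketch; the function name and signature are assumptions.

```python
def predict_position(position, velocity, acceleration, dt):
    """Predict a device position dt seconds ahead by extrapolating the
    current tracking state under a constant-acceleration model:
    p' = p + v*dt + 0.5*a*dt^2, applied per axis."""
    return tuple(p + v * dt + 0.5 * a * dt * dt
                 for p, v, a in zip(position, velocity, acceleration))
```

A real tracker would typically wrap such a motion model inside a filter (e.g., a Kalman filter) that also weighs the depth and IMU measurements.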
  • the engine 865 generates a three-dimensional mapping of the area surrounding the NED 805 (i.e., the “local area”) based on information received from the NED 805 .
  • the engine 865 determines depth information for the three-dimensional mapping of the local area based on depth data received from the DCA 820 (e.g., depth information of objects in the local area).
  • the engine 865 calculates a depth and/or position of the NED 805 by using depth data generated by the DCA 820 .
  • the engine 865 may implement various techniques for calculating the depth and/or position of the NED 805 , such as stereo based techniques, structured light illumination techniques, time-of-flight techniques, and so forth.
  • the engine 865 uses depth data received from the DCA 820 to update a model of the local area and to generate and/or modify media content based in part on the updated model.
  • the engine 865 also executes applications within the NED system 800 and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the NED 805 from the tracking module 860 . Based on the received information, the engine 865 determines various forms of media content to transmit to the NED 805 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the engine 865 generates media content for the NED 805 that mirrors the user's movement in a virtual environment or in an environment augmenting the local area with additional media content. Accordingly, the engine 865 may generate and/or modify media content (e.g., visual and/or audio content) for presentation to the user. The engine 865 may further transmit the media content to the NED 805 .
  • the engine 865 may perform an action within an application executing on the console 810 .
  • the engine 865 may further provide feedback when the action is performed.
  • the engine 865 may configure the NED 805 to generate visual and/or audio feedback and/or the I/O interface 815 to generate haptic feedback to the user.
  • the engine 865 determines a resolution of the media content provided to the NED 805 for presentation to the user on the display 825 .
  • the engine 865 may adjust a resolution of the visual content provided to the NED 805 by configuring the display 825 to perform foveated rendering of the visual content, based at least in part on a direction of the user's gaze received from the eye tracking system 845 .
  • the engine 865 provides the content to the NED 805 having a high resolution on the display 825 in a foveal region of the user's gaze and a low resolution in other regions, thereby reducing the power consumption of the NED 805 .
  • the engine 865 can further use the eye tracking information to adjust a focus of the image light emitted from the display 825 in order to reduce vergence-accommodation conflicts.
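Using eye tracking information to reduce vergence-accommodation conflict requires estimating the depth at which the two eyes' gaze directions converge. One common approach, shown here as an illustrative sketch rather than the engine's actual method, triangulates that depth from the interpupillary distance and the two horizontal gaze angles.

```python
import math

def vergence_depth_m(ipd_m, left_gaze_deg, right_gaze_deg):
    """Estimate the convergence depth (meters) from the interpupillary
    distance and each eye's horizontal gaze angle (positive = toward the
    nose). For symmetric vergence, tan(theta) = (ipd/2) / depth per eye."""
    total = math.radians(left_gaze_deg) + math.radians(right_gaze_deg)
    if total <= 0:
        return float('inf')  # parallel or diverging gaze: effectively at infinity
    return ipd_m / (2.0 * math.tan(total / 2.0))
```

The varifocal module could then drive the focal distance of the optical assembly toward this estimate so that accommodation matches vergence.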
  • the NED system 800 may be part of, e.g., a virtual reality (VR) system, an augmented reality (AR) system, a mixed reality (MR) system, or some combination thereof.
  • the NED may also be referred to as a head-mounted display (HMD).
  • the projection assembly of the NED includes a source assembly, a waveguide, a main optic, and an optional focusing element.
  • the source assembly generates image light that is coupled into the waveguide.
  • the image light is expanded in at least one dimension and out-coupled from the waveguide.
  • the focusing element (e.g., a liquid crystal lens) may be located between the waveguide and the main optic.
  • the focusing element can, e.g., add or subtract optical power to adjust focus of the image light.
  • the main optic receives light from a local area surrounding the NED and combines that light with the image light received either directly from the waveguide or from the focusing element. The combined light is provided to an eye-box of a user.
  • a near eye display comprises an electronic display configured to output image light, an optical element configured to direct a plurality of light beams associated with the image light to an eye-box, and an angle selective filter comprising a curved surface, the angle selective filter configured to filter out one or more light beams of the plurality of light beams, each of the one or more light beams having an angle of incidence on the curved surface that is larger than a first cut-off angle of incidence.
  • the eye tracking system comprises a light source in the near infrared wavelength range, wherein the angle selective filter is further configured to allow a light beam in the near infrared wavelength range with an angle of incidence on the curved surface larger than the first cut-off angle of incidence.
  • a near eye display comprises an electronic display configured to output image light, a Fresnel lens configured to direct a plurality of light beams associated with the image light to an eye-box, wherein the plurality of light beams exiting the Fresnel lens includes one or more stray light beams, and an angle selective filter comprising a curved surface, the angle selective filter configured to filter out at least one of the one or more stray light beams, the at least one of the one or more stray light beams having an angle of incidence on the curved surface that is larger than a first cut-off angle of incidence.
  • a method comprises receiving, from an electronic display, image light associated with a content scene, filtering out, by an angle selective filter, one or more light beams of the image light having an angle of incidence larger than a first cut-off angle of incidence, and forming an image of the content scene based on filtered light beams exiting the angle selective filter.
  • the angle selective filter comprises a spherical surface, and a center of the spherical surface corresponds to a center of rotation of an eye.
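The claimed filtering criterion can be expressed geometrically: on a spherical filter surface, the surface normal at any point is radial, so a beam's angle of incidence is the angle between the beam direction and the line from the sphere's center (the eye's center of rotation) to the hit point. The helper functions below are an illustrative sketch of that test, not the filter's physical mechanism; the names and signatures are assumptions.

```python
import numpy as np

def angle_of_incidence_deg(hit_point, ray_dir, sphere_center):
    """Angle between an incoming ray and the spherical filter's surface
    normal at the hit point. On a sphere the normal is radial: hit - center."""
    n = np.asarray(hit_point, float) - np.asarray(sphere_center, float)
    n /= np.linalg.norm(n)
    d = np.asarray(ray_dir, float)
    d /= np.linalg.norm(d)
    cos_theta = abs(np.dot(d, n))  # orientation-independent
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def transmitted(hit_point, ray_dir, sphere_center, cutoff_deg):
    """A beam passes the angle selective filter only if its angle of
    incidence is below the cut-off angle."""
    return angle_of_incidence_deg(hit_point, ray_dir, sphere_center) < cutoff_deg
```

A beam aimed directly at the center of rotation strikes the surface at normal incidence regardless of where it hits, which is why the filter behaves similarly for any gaze direction.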
  • Embodiments of the disclosure may also relate to a product that is produced by a computing process described herein.
  • a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
  • aspects of the present embodiments may be embodied as a system, method, or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "module" or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • the computer readable medium may be a computer readable signal medium or a computer readable storage medium.
  • a computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Abstract

One embodiment sets forth a near eye display (NED). The NED includes an electronic display configured to output image light to an optical element. The optical element is configured to receive the image light, direct the image light, and form an image at the eye. The NED also includes an angle selective filter having a curved surface. The angle selective filter is configured to filter out light beams exiting the optical element that have an angle of incidence on the curved surface larger than a cut-off angle of incidence.

Description

    BACKGROUND OF THE INVENTION
  • Field of the Invention
  • Embodiments of the present invention relate generally to near eye displays, and, more specifically, to an angle selective filter for reducing stray light in near eye displays.
  • Description of the Related Art
  • Virtual reality (VR) near eye displays (NEDs) can be used to simulate virtual environments. For example, stereoscopic images can be displayed on an electronic display inside the NED to simulate the illusion of depth. Further, head tracking sensors can be used to estimate what portion of the virtual environment is being viewed by the user.
  • For NEDs, stray light within the NED can interfere with the viewing experience of a user of the NED. Stray light can be caused by surface defects, dust, or any other object in the imaging path that may cause light to deviate from the intended imaging path. For example, a Fresnel lens can be used in the NED for increased optical performance. A Fresnel lens in the NED, however, can introduce stray light as a result of the faceted and discontinuous nature of the intended refracting surface of the lens. Additionally, stray light may be caused by unwanted reflections off various optical or mechanical surfaces.
  • Stray light is distracting for the user of the NED and thus breaks VR immersion. In addition, stray light can reduce the contrast of an image being viewed by the user and, in some cases, causes glare dots or patterns to become visible on the image. The presence of stray light in the NED thus decreases the quality of the images presented to the user and, consequently, negatively impacts the overall VR viewing experience.
  • As the foregoing illustrates, what is needed in the art is a technique for reducing stray light in NEDs.
  • SUMMARY OF THE INVENTION
  • One embodiment of the present invention sets forth a near eye display (NED). The NED includes an electronic display configured to output image light. Further, the NED includes an optical element configured to receive the image light, direct the image light, and form an image at an eye-box. The NED also includes an angle selective filter having a curved surface. The angle selective filter is configured to filter out light beams of the image light with an angle of incidence on the curved surface larger than a cut-off angle of incidence.
  • One advantage of the disclosed techniques is that the angle selective filter blocks stray light beams in the NED from reaching the eye box. The reduction in the stray light beams results in the reduction of glare dots or glare patterns on the images generated at the eye. Further, the angle selective filter is configured such that the amount of stray light blocked when the optical axis of the eye aligns with the optical axis of the NED is substantially similar to the amount blocked when the optical axis of the eye is not aligned with the optical axis of the NED. In this way, as the eye of the user rotates around its center of rotation, the images being viewed remain at about a similar level of quality with a similar level of reduction of the stray light beams. Accordingly, the user of the NED has a consistent and comfortable, immersive viewing experience while using the NED.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
  • FIG. 1A is a wire diagram of a near eye display (NED), according to an embodiment.
  • FIG. 1B is a cross section of the front rigid body of the embodiment of the NED shown in FIG. 1A.
  • FIG. 2A is an illustration of a cross-sectional view of the NED including the optics block, according to an embodiment.
  • FIG. 2B is an illustration of glare on an image resulting from the stray light beams of FIG. 2A, according to an embodiment.
  • FIGS. 3A-3C are illustrations of a cross-sectional view of the NED including the optics block and an angle selective filter configured to block stray light beams, according to various embodiments.
  • FIG. 4 is an illustration of a perspective view of the angle selective filter having a curved surface, according to an embodiment.
  • FIG. 5 is an illustration of a cross-sectional view of the NED including the optics block, the angle selective filter, and an eye tracking system, according to an embodiment.
  • FIG. 6 is an illustration of a cross-sectional view of the NED including the optics block and a flat filter, according to an embodiment.
  • FIG. 7 is a flow diagram of a method for reducing stray light beams in the NED by using an angle selective filter having a curved surface, according to various embodiments of the present invention.
  • FIG. 8 is a block diagram of an embodiment of a NED system in which a console operates.
  • DETAILED DESCRIPTION
  • In the following description, numerous specific details are set forth to provide a more thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced without one or more of these specific details.
  • Configuration Overview
  • A near eye display (NED) includes an angle selective filter that is configured to filter out light beams of image light. The angle selective filter is configured to allow only light beams having angles of incidence on the filter that are less than a cut-off angle to pass through the filter. Thus, light beams having angles of incidence on the filter that are larger than the cut-off angle are filtered out and do not reach the eye of the user of the NED. The angle selective filter includes a curved surface, for example, a spherical surface, where a center of the spherical surface corresponds to a center of rotation of the eye. Thus, when the eye rotates such that the optical axis of the eye aligns with a given point on the spherical surface, a tangent plane of the point on the spherical surface is normal to the optical axis of the eye.
  • Advantageously, the angle selective filter with the curved surface can significantly reduce the stray light beams in the NED, thus reducing the number of glare dots or glare patterns on the images generated at the eye. More specifically, the angle selective filter is configured such that the amount of stray light blocked when the optical axis of the eye aligns with the optical axis of the NED is substantially similar to the amount blocked when the optical axis of the eye is not aligned with the optical axis of the NED. In this way, as the eye of the user rotates around its center of rotation, the images being viewed remain at about a similar level of quality with a similar level of reduction of the stray light beams.
  • System Overview
  • FIG. 1A is a wire diagram of a near eye display (NED) 100, according to an embodiment. The NED 100 includes a front rigid body 105 and a band 110. The front rigid body 105 includes one or more electronic display elements of an electronic display (not shown), an IMU 115, one or more position sensors 120, and locators 125. In the embodiment shown by FIG. 1A, the position sensors 120 are located within the IMU 115, and neither the IMU 115 nor the position sensors 120 are visible to the user. The IMU 115, the position sensors 120, and the locators 125 are discussed in detail below with regard to FIG. 8. Note in embodiments, where the NED 100 acts as an AR or MR device, portions of the NED 100 and its internal components are at least partially transparent.
  • FIG. 1B is a cross section 160 of the front rigid body 105 of the embodiment of the NED 100 shown in FIG. 1A. As shown in FIG. 1B, the front rigid body 105 includes an electronic display 130 and an optics block 135 that together provide image light to an exit pupil 145. The exit pupil 145 is the location of the front rigid body 105 where a user's eye 140 is positioned. For purposes of illustration, FIG. 1B shows a cross section 160 associated with a single eye 140, but another optics block, separate from the optics block 135, provides altered image light to another eye of the user. Additionally, the NED 100 includes an eye tracking system (not shown in FIG. 1B). The eye tracking system may include one or more sources that illuminate one or both eyes of the user. The eye tracking system may also include one or more cameras that capture images of one or both eyes of the user to track the positions of the eyes.
  • The electronic display 130 displays images to the user. In various embodiments, the electronic display 130 may comprise a single electronic display or multiple electronic displays (e.g., a display for each eye of a user). Examples of the electronic display 130 include: a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an active-matrix organic light-emitting diode display (AMOLED), a QOLED, a QLED, some other display, or some combination thereof.
  • The optics block 135 adjusts an orientation of image light emitted from the electronic display 130 such that the electronic display 130 appears at particular virtual image distances from the user. The optics block 135 is configured to receive image light emitted from the electronic display 130 and direct the image light to an eye-box associated with the exit pupil 145. The image light directed to the eye-box forms an image at a retina of the eye 140. The eye-box is a region defining how much the eye 140 can move up/down/left/right without significant degradation in the image quality. As shown in FIG. 1B, a field of view (FOV) 150 is the extent of the observable world that is seen by the eye 140 at any given moment.
  • Additionally, in some embodiments, the optics block 135 magnifies received light, corrects optical errors associated with the image light, and presents the corrected image light to the eye 140 of a user of the NED 100. The optics block 135 may include one or more optical elements 155 in optical series. An optical element 155 may be an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, or any other suitable optical element 155 that affects the image light. Moreover, the optics block 135 may include combinations of different optical elements. In some embodiments, one or more of the optical elements in the optics block 135 may have one or more coatings, such as anti-reflective coatings. The optics block 135 is discussed in greater detail in conjunction with FIGS. 2A-7.
  • Filtering Stray Light in a Near Eye Display
  • FIG. 2A is an illustration of a cross-sectional view of the NED 100 including the optics block 135, according to an embodiment. The eye 140 of the user can have a field of view 215.
  • As shown in FIG. 2A, a Fresnel lens 205 included in the optics block 135 can be configured to receive the image light emitted from the electronic display 130 and direct the image light to an eye-box associated with the exit pupil 145. The Fresnel lens 205 is a compact lens with a rotational symmetric structure having multiple annular segments. In one embodiment, the Fresnel lens 205 may be manufactured by segmenting a continuous curved surface of an ordinary lens into a plurality of segments and, after reducing the thickness of each segment, arranging the segments on a surface. As shown in FIG. 2A, the Fresnel lens 205 has a smooth surface on the side facing the exit pupil 145 and a discontinuous and faceted surface on the side facing the electronic display 130, i.e., the intended refracting surface of the lens.
  • The Fresnel lens 205 can be much thinner relative to a comparable conventional lens, thus allowing a substantial reduction in relative thickness, mass, and volume. For a VR NED system, it is advantageous to use the Fresnel lens 205 in order to make the optics block 135 thinner and lighter. The faceted and discontinuous nature of the intended refracting surface of the Fresnel lens 205, however, can contribute to stray light beams, such as stray light 210, within the NED 100. The stray light 210 can cause glare on the image formed at the eye of the user and can reduce the overall image contrast. In general, glare occurs when non-image forming stray light beams are incident on a focal surface. The stray light 210 can be caused by surface defects, dust, or any other object that might cause light to deviate from its intended image path.
  • The Fresnel lens 205 can emit many such non-image forming light beams as a result of the faceted and discontinuous nature of its intended refracting surface. Therefore, glare generated by stray light can be an issue for the NED 100 including the Fresnel lens 205. In particular, and as discussed above, the Fresnel lens 205 can include many annular segments, each of which can include a refracting optical portion and a non-imaging portion referred to as a back-cut. These back-cuts are deliberate surface defects that enable the Fresnel lens 205 to be made very thin but contribute to the amount of glare in the NED 100. As shown in FIG. 2A, stray light 210, not from the electronic display 130 and incident on the back-cut portion of the Fresnel lens 205, scatters and falls upon the image formed at the eye 140 as off-field noise. The off-field noise contributes to the glare on the image.
  • FIG. 2B is an illustration of glare on an image resulting from the stray light 210 of FIG. 2A, according to an embodiment. As discussed above, the light beams of the image light exiting the Fresnel lens 205 can include stray light 210. The stray light corresponds to glare dots 220 on the image, as shown in FIG. 2B.
  • FIGS. 3A-3C are illustrations of a cross-sectional view of the NED 100 including the optics block 135 and an angle selective filter 310 configured to block stray light beams, according to various embodiments. The angle selective filter 310 is disposed between the optics block 135 and the exit pupil 145 of the NED 100.
  • The angle selective filter 310 is configured to filter out or block light beams of the image light, for example, stray light beams. Each of the blocked light beams has an angle of incidence on the curved surface of the angle selective filter 310 that is larger than or equal to a cut-off angle of incidence. Filtering out stray light beams reduces the glare on the image formed at the eye 140.
  • The cut-off angle of incidence can be configured to correspond to a visual field of the eye having a field of view (FOV) 315. In some embodiments, the visual field of the eye corresponds to an instantaneous view of the eye. For example, the instantaneous view of the eye (e.g. how much a user sees instantaneously looking forward) may correspond to a set of small viewing angles. The cut-off angle can be configured to correspond to the set of small viewing angles. Examples of the cut-off angle include 5 degrees, 10 degrees, 15 degrees, 20 degrees, 25 degrees, 30 degrees, etc. The angle selective filter 310 is thus configured to pass light beams incident on the angle selective filter 310 at angles smaller than the set of small viewing angles. The light beams incident on the angle selective filter 310 at angles larger than or equal to the cut-off angle are blocked by the angle selective filter 310. In particular, stray light beams having large angles of incidence are blocked by the angle selective filter 310. Therefore, the number of stray light beams reaching the eye 140 is significantly reduced when the angle selective filter 310 is disposed between the optics block 135 and the exit pupil 145.
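The pass/block decision described above can be sketched in a few lines. This is an illustrative model, not the patent's implementation: the vectors and the 20-degree cut-off are assumed example values.

```python
import math

# Hypothetical sketch of the angle selective filter's pass/block rule:
# a beam passes only if its angle of incidence (measured from the local
# surface normal) is below the cut-off angle. Values are illustrative.
def passes_filter(beam_dir, surface_normal, cutoff_deg):
    """Return True if the beam's angle of incidence is below the cut-off."""
    dot = sum(b * n for b, n in zip(beam_dir, surface_normal))
    mag_b = math.sqrt(sum(b * b for b in beam_dir))
    mag_n = math.sqrt(sum(n * n for n in surface_normal))
    incidence = math.degrees(math.acos(min(1.0, abs(dot) / (mag_b * mag_n))))
    return incidence < cutoff_deg

# An on-axis image beam passes; a steep stray beam is blocked (cut-off 20 deg).
print(passes_filter((0.0, 0.0, 1.0), (0.0, 0.0, 1.0), 20.0))  # on-axis beam
print(passes_filter((0.0, 0.8, 0.6), (0.0, 0.0, 1.0), 20.0))  # stray beam
```

The stray beam here arrives at about 53 degrees from the normal, so with any of the example cut-off angles listed above it would be filtered out before reaching the eye-box.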
  • As shown in FIG. 3A, when the optical axis 320 of the eye 140 is aligned with the optical axis 345 of the NED 100, light beams corresponding to light 325 emitted from the electronic display 130 pass through the angle selective filter 310. In the illustration of FIG. 3A, the angle of incidence of the light beams exiting the optics block 135 and corresponding to the light 325 is the angle 335 between the normal to the tangent plane 330 and the light beams. In the illustration, the angle of incidence of the light beams corresponding to the light 325 is smaller than the cut-off angle of the angle selective filter 310, and, therefore, the light beams corresponding to the light 325 are allowed to pass through the angle selective filter 310.
  • As shown in FIG. 3B, when the optical axis 320 of the eye 140 is aligned with the optical axis 345 of the NED 100, light beams corresponding to stray light 350 are blocked by the angle selective filter 310. In the illustration of FIG. 3B, the angle of incidence of the light beams exiting the optics block 135 and corresponding to the stray light 350 is the angle 355 between the normal to the tangent plane 330 and the light beams. In the illustration, the angle of incidence of the light beams is larger than the cut-off angle of the angle selective filter 310, and, therefore, the light beams corresponding to the stray light 350 are blocked by the angle selective filter 310.
  • The eye 140 rotates roughly around the center of rotation 340. When the eye 140 rotates to view a content scene off-axis of the optical axis 345 of NED 100, the optical axis 320 of eye changes orientation. In some embodiments, the angle selective filter 310 can include a spherical surface, where a center of the spherical surface corresponds to a center of rotation 340 of the eye 140. In this way, for each point on the surface of the angle selective filter 310, a tangent plane of the point is configured to be normal to an optical axis 320 of the eye 140 when the eye 140 rotates such that the optical axis 320 of the eye 140 aligns with the point. Consequently, for each point on the surface of the angle selective filter 310, when the optical axis 320 of the eye 140 is aligned with the point, the angle selective filter 310 is configured such that the angle of incidence of light beams incident on the curved surface at that point is equal to an angle between the light beams and the optical axis 320 of the eye. Therefore, the angle selective filter 310 is configured to have substantially the same cut-off angle at each point.
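The geometric property described above can be checked numerically: on a spherical surface centered at the eye's center of rotation, the local surface normal is radial, so a beam traveling along the eye's (rotated) optical axis always strikes the filter at zero incidence. The coordinates below are assumed example values, not dimensions from the patent.

```python
import math

# Sketch (hypothetical values): on a spherical filter centered at the
# eye's center of rotation, the surface normal at each point is radial,
# so the angle of incidence equals the angle between a beam and the
# eye's optical axis whenever the eye looks toward that point.
def incidence_angle(point, center, beam_dir):
    """Angle (degrees) between a beam and the radial normal at a surface point."""
    normal = [p - c for p, c in zip(point, center)]
    dot = sum(n * b for n, b in zip(normal, beam_dir))
    mag = math.sqrt(sum(n * n for n in normal)) * math.sqrt(sum(b * b for b in beam_dir))
    return math.degrees(math.acos(min(1.0, abs(dot) / mag)))

center = (0.0, 0.0, 0.0)             # assumed center of rotation of the eye
on_axis_point = (0.0, 0.0, 30.0)     # filter point straight ahead
rotated_point = (0.0, 15.0, 25.98)   # same radius, eye rotated ~30 degrees

# A beam along the local optical axis hits at ~0 degrees in both
# orientations, so the cut-off behaves the same across the surface.
print(incidence_angle(on_axis_point, center, (0.0, 0.0, 1.0)))
print(incidence_angle(rotated_point, center, (0.0, 0.5, 0.866)))
```

This is why the curved filter presents substantially the same cut-off angle at every point as the eye rotates, whereas a flat filter's effective cut-off would vary with gaze direction.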
  • Accordingly, the angle selective filter 310 is configured to reduce a similar amount of stray light beams when the eye 140 rotates to change the orientation of the optical axis 320 of the eye 140. Similar reduction means that the difference between the levels of reduction of the stray light beams when the eye 140 rotates to different orientations is less than 5%, less than 10%, less than 15%, less than 20%, etc. Advantageously, as the eye 140 rotates around the center of rotation 340, the images resulting from image light emitted from the optics block 135 have a similar level of quality, and the user can have a comfortable, immersive viewing experience.
  • As shown in FIG. 3C, the eye 140 rotates to an orientation different than the illustrations in FIGS. 3A and 3B. The optical axis 320 of the eye 140 is thus no longer the same as the optical axis 345 of the NED 100. As discussed above, when the eye 140 rotates to the new orientation, the cut-off angle of the angle selective filter 310 at the point on the curved surface where the optical axis 320 intersects is substantially the same as when the optical axis 320 of the eye 140 is the same as the optical axis 345 of the NED 100.
  • In the illustration of FIG. 3C, light beams corresponding to light 365 emitted from the electronic display 130 pass through the angle selective filter 310. The angle of incidence of the light beams exiting the optics block 135 and corresponding to the light 365 is the angle 375 between the normal to the tangent plane 370 and the light beams. In the illustration, the angle of incidence of the light beams corresponding to the light 365 is smaller than the cut-off angle of the angle selective filter 310, and, therefore, the light beams corresponding to the light 365 are allowed to pass through the angle selective filter 310.
  • Light beams corresponding to stray light 380 are blocked by the angle selective filter 310. In the illustration of FIG. 3C, the angle of incidence of the light beams exiting the optics block 135 and corresponding to the stray light 380 is the angle 385 between the normal to the tangent plane 370 and the light beams. In the illustration, the angle of incidence of the light beams corresponding to the stray light 380 is larger than the cut-off angle of the angle selective filter 310, and, therefore, the light beams corresponding to the stray light 380 are blocked by the angle selective filter 310.
  • FIG. 4 is an illustration of a perspective view of the angle selective filter 310 having a curved surface 405, according to an embodiment. In some embodiments, the angle selective filter 310 can include a thin film multilayer filter. For example, the angle selective filter 310 can be a thin film or laminated filter configured to pass light beams with an angle of incidence less than a cut-off angle. In some embodiments, the angle selective filter 310 can include a set of louvers arranged concentrically. For example, a set of louvers could be used in concentric fashion to create the same effect of reducing light beams with an angle of incidence larger than the cut-off angle.
  • FIG. 5 is an illustration of a cross-sectional view of the NED 100 including the optics block 135, the angle selective filter 310, and an eye tracking system, according to an embodiment. In some embodiments, the eye tracking system includes an illumination source 505 and a camera 510.
  • The illumination source 505 is configured to emit infrared light beams or near infrared light beams. The camera 510 is configured to receive or detect reflected light beams from the eye 140. The reflected light beams are received or detected by the camera 510 and analyzed to extract information about eye rotation, for example, from changes in the infrared light beams reflected by the eye 140. In one embodiment, the center of the curved surface of the angle selective filter 310 is adjustable and is determined based on one or more measurements performed by the eye tracking system.
  • Further, in the case of eye tracking, the angle selective filter 310 can be configured to pass infrared light beams or near infrared light beams for all or a much wider group of angles than visible light beams. As discussed above, the angle selective filter 310 is configured to filter out light beams of the image light in the visible light wavelength range with an angle of incidence on the curved surface larger than a first cut-off angle of incidence. The angle selective filter 310 can further be configured to allow light beams of the image light in the infrared or near infrared light wavelength range with an angle of incidence on the curved surface smaller than a second cut-off angle of incidence and larger than the first cut-off angle of incidence.
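The dual-band behavior described above can be summarized as a simple rule: visible image light is cut at a tight first cut-off angle, while infrared eye-tracking light is passed up to a much wider second cut-off angle. The specific angles and the 700 nm band boundary below are assumed example values.

```python
# Hypothetical sketch of the dual-band filter rule. The cut-off angles
# and the visible/IR boundary wavelength are illustrative assumptions,
# not values specified in the patent.
VISIBLE_CUTOFF_DEG = 20.0   # first cut-off angle (visible image light)
IR_CUTOFF_DEG = 70.0        # second cut-off angle (eye-tracking IR light)

def filter_passes(wavelength_nm, incidence_deg):
    """Return True if a beam of the given wavelength passes the filter."""
    if wavelength_nm < 700:                   # visible band
        return incidence_deg < VISIBLE_CUTOFF_DEG
    return incidence_deg < IR_CUTOFF_DEG      # infrared / near-infrared band

# A 45-degree beam is blocked in the visible band (stray image light)
# but passed in the near-infrared band (eye-tracking illumination).
print(filter_passes(550, 45.0))   # visible, blocked
print(filter_passes(850, 45.0))   # near-infrared, passed
```

This lets the filter suppress visible glare while the eye tracking system's oblique infrared illumination and reflections still reach the camera 510.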
  • FIG. 6 is an illustration of a cross-sectional view of the NED 100 including the optics block 135 and a flat filter 605, according to an embodiment. The flat filter 605 is configured to filter out light beams of the image light, for example, stray light beams, with an angle of incidence on the surface of the flat filter 605 that is larger than or equal to a cut-off angle of incidence. As shown in FIG. 6, light beams corresponding to stray light 610 are blocked by the flat filter 605. In the illustration, the angle of incidence of the light beams exiting the optics block 135 and corresponding to the stray light 610 is larger than the cut-off angle.
  • FIG. 7 is a flow diagram of a method 700 for reducing stray light beams in the NED 100 by using an angle selective filter having a curved surface, according to various embodiments of the present invention. Although the method steps are described in conjunction with the systems of FIGS. 1-6 and 8, persons skilled in the art will understand that any system configured to perform the method steps, in any order, is within the scope of the present invention.
  • The method 700 begins at step 705, where the optics block 135 receives image light of a content scene from an electronic display 130 of the NED 100. At step 710, the optics block 135 directs the image light to the eye-box associated with the exit pupil of the NED 100. At step 715, the angle selective filter 310, disposed between the optics block 135 and the exit pupil 145, filters out light beams of the image light having an angle of incidence larger than a cut-off angle of incidence. The filtered image light forms an image of the content scene at the eye of a user of the NED 100.
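The steps of method 700 can be sketched as a minimal pipeline over a set of beams. The beam representation and the 25-degree cut-off are illustrative assumptions, not the patent's data structures.

```python
# Minimal sketch of method 700 (hypothetical beam representation): the
# optics block directs beams toward the eye-box, then the angle
# selective filter drops any beam whose incidence angle meets or
# exceeds the cut-off. The cut-off value is illustrative.
CUTOFF_DEG = 25.0

beams = [
    {"source": "display", "incidence_deg": 5.0},
    {"source": "display", "incidence_deg": 18.0},
    {"source": "stray",   "incidence_deg": 40.0},  # e.g. back-cut scatter
]

# Step 715: filter out beams at or above the cut-off angle of incidence.
filtered = [b for b in beams if b["incidence_deg"] < CUTOFF_DEG]
print(len(filtered))  # only the image-forming display beams remain
```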
  • Example Architecture of a Near Eye Display System
  • FIG. 8 is a block diagram of an embodiment of a near eye display (NED) system 800 in which a console 810 operates. The NED system 800 may operate in a virtual reality (VR) system environment, an augmented reality (AR) system environment, a mixed reality (MR) system environment, or some combination thereof. The NED system 800 shown in FIG. 8 comprises a NED 805 and an input/output (I/O) interface 815 that is coupled to the console 810. In one embodiment, the NED 805 may be the NED 100 discussed above in conjunction with FIGS. 1-7.
  • While FIG. 8 shows an example NED system 800 including one NED 805 and one I/O interface 815, in other embodiments any number of these components may be included in the NED system 800. For example, there may be multiple NEDs 805, each having an associated I/O interface 815, where each NED 805 and I/O interface 815 communicates with the console 810. In alternative configurations, different and/or additional components may be included in the NED system 800. Additionally, various components included within the NED 805, the console 810, and the I/O interface 815 may be distributed in a different manner than is described in conjunction with FIG. 8 in some embodiments. For example, some or all of the functionality of the console 810 may be provided by the NED 805.
  • The NED 805 may be a head-mounted display that presents content to a user. The content may include virtual and/or augmented views of a physical, real-world environment including computer-generated elements (e.g., two-dimensional or three-dimensional images, two-dimensional or three-dimensional video, sound, etc.). In some embodiments, the NED 805 may also present audio content to a user. The NED 805 and/or the console 810 may transmit the audio content to an external device via the I/O interface 815. The external device may include various forms of speaker systems and/or headphones. In various embodiments, the audio content is synchronized with visual content being displayed by the NED 805.
  • The NED 805 may comprise one or more rigid bodies, which may be rigidly or non-rigidly coupled together. A rigid coupling between rigid bodies causes the coupled rigid bodies to act as a single rigid entity. In contrast, a non-rigid coupling between rigid bodies allows the rigid bodies to move relative to each other.
  • As shown in FIG. 8, the NED 805 may include a depth camera assembly (DCA) 820, a display 825, an optical assembly 830, one or more position sensors 835, an inertial measurement unit (IMU) 840, an eye tracking system 845, and a varifocal module 850. In some embodiments, the display 825 and the optical assembly 830 can be integrated together into a projection assembly. Various embodiments of the NED 805 may have additional, fewer, or different components than those listed above. Additionally, the functionality of each component may be partially or completely encompassed by the functionality of one or more other components in various embodiments.
  • The DCA 820 captures sensor data describing depth information of an area surrounding the NED 805. The sensor data may be generated by one or a combination of depth imaging techniques, such as triangulation, structured light imaging, time-of-flight imaging, laser scan, and so forth. The DCA 820 can compute various depth properties of the area surrounding the NED 805 using the sensor data. Additionally or alternatively, the DCA 820 may transmit the sensor data to the console 810 for processing.
  • The DCA 820 includes an illumination source, an imaging device, and a controller. The illumination source emits light onto an area surrounding the NED 805. In an embodiment, the emitted light is structured light. The illumination source includes a plurality of emitters that each emits light having certain characteristics (e.g., wavelength, polarization, coherence, temporal behavior, etc.). The characteristics may be the same or different between emitters, and the emitters can be operated simultaneously or individually. In one embodiment, the plurality of emitters could be, e.g., laser diodes (such as edge emitters), inorganic or organic light-emitting diodes (LEDs), a vertical-cavity surface-emitting laser (VCSEL), or some other source. In some embodiments, a single emitter or a plurality of emitters in the illumination source can emit light having a structured light pattern. The imaging device captures ambient light in the environment surrounding NED 805, in addition to light reflected off of objects in the environment that is generated by the plurality of emitters. In various embodiments, the imaging device may be an infrared camera or a camera configured to operate in a visible spectrum. The controller coordinates how the illumination source emits light and how the imaging device captures light. For example, the controller may determine a brightness of the emitted light. In some embodiments, the controller also analyzes detected light to detect objects in the environment and position information related to those objects.
  • The display 825 displays two-dimensional or three-dimensional images to the user in accordance with pixel data received from the console 810. In various embodiments, the display 825 comprises a single display or multiple displays (e.g., separate displays for each eye of a user). In some embodiments, the display 825 comprises a single or multiple waveguide displays. Light can be coupled into the single or multiple waveguide displays via, e.g., a liquid crystal display (LCD), an organic light emitting diode (OLED) display, an inorganic light emitting diode (ILED) display, an active-matrix organic light-emitting diode (AMOLED) display, a transparent organic light emitting diode (TOLED) display, a laser-based display, one or more waveguides, other types of displays, a scanner, a one-dimensional array, and so forth. In addition, combinations of these display types may be incorporated in the display 825 and used separately, in parallel, and/or in combination.
  • The optical assembly 830 magnifies image light received from the display 825, corrects optical errors associated with the image light, and presents the corrected image light to a user of the NED 805. The optical assembly 830 includes a plurality of optical elements. For example, one or more of the following optical elements may be included in the optical assembly 830: an aperture, a Fresnel lens, a convex lens, a concave lens, a filter, a reflecting surface, or any other suitable optical element that deflects, reflects, refracts, and/or in some way alters image light. Moreover, the optical assembly 830 may include combinations of different optical elements. In some embodiments, one or more of the optical elements in the optical assembly 830 may have one or more coatings, such as partially reflective or antireflective coatings. The optical assembly 830 can be integrated into a projection assembly. In one embodiment, the optical assembly 830 includes the optics block 135, the angle selective filter 310, and/or the flat filter 605.
  • In operation, the optical assembly 830 magnifies and focuses image light generated by the display 825. In so doing, the optical assembly 830 enables the display 825 to be physically smaller, weigh less, and consume less power than displays that do not use the optical assembly 830. Additionally, magnification may increase the field of view of the content presented by the display 825. For example, in some embodiments, the field of view of the displayed content partially or completely fills a user's field of view. For example, the field of view of a displayed image may meet or exceed 110 degrees. In various embodiments, the amount of magnification may be adjusted by adding or removing optical elements.
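To make the magnification/field-of-view relationship concrete, a simple magnifier model gives the half field of view as the arctangent of the display half-width over the focal length. The display width and focal length below are assumed example numbers, not dimensions of the NED 805.

```python
import math

# Hedged worked example (assumed numbers): estimate the field of view
# produced when a magnifying optical assembly is placed one focal
# length from a flat display panel.
display_width_mm = 90.0   # illustrative display width
focal_length_mm = 40.0    # illustrative focal length of the assembly

# FOV = 2 * atan(half display width / focal length)
fov_deg = 2 * math.degrees(math.atan((display_width_mm / 2) / focal_length_mm))
print(round(fov_deg, 1))
```

Under these assumed numbers the model yields a field of view of roughly 97 degrees; shortening the focal length (stronger magnification) widens the field of view, which is the trade the paragraph above describes.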
  • In some embodiments, the optical assembly 830 may be designed to correct one or more types of optical errors. Examples of optical errors include barrel or pincushion distortions, longitudinal chromatic aberrations, or transverse chromatic aberrations. Other types of optical errors may further include spherical aberrations, chromatic aberrations or errors due to the lens field curvature, astigmatisms, in addition to other types of optical errors. In some embodiments, visual content transmitted to the display 825 is pre-distorted, and the optical assembly 830 corrects the distortion as image light from the display 825 passes through various optical elements of the optical assembly 830. In some embodiments, optical elements of the optical assembly 830 are integrated into the display 825 as a projection assembly that includes at least one waveguide coupled with one or more optical elements.
  • The IMU 840 is an electronic device that generates data indicating a position of the NED 805 based on measurement signals received from one or more of the position sensors 835 and from depth information received from the DCA 820. In some embodiments of the NED 805, the IMU 840 may be a dedicated hardware component. In other embodiments, the IMU 840 may be a software component implemented in one or more processors. In one embodiment, the IMU 840 is the same component as the IMU 115 of FIG. 1A and the position sensors 835 are the same components as the position sensors 120.
  • In operation, a position sensor 835 generates one or more measurement signals in response to a motion of the NED 805. Examples of position sensors 835 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, one or more altimeters, one or more inclinometers, and/or various types of sensors for motion detection, drift detection, and/or error detection. The position sensors 835 may be located external to the IMU 840, internal to the IMU 840, or some combination thereof.
  • Based on the one or more measurement signals from one or more position sensors 835, the IMU 840 generates data indicating an estimated current position of the NED 805 relative to an initial position of the NED 805. For example, the position sensors 835 include multiple accelerometers to measure translational motion (forward/back, up/down, left/right) and multiple gyroscopes to measure rotational motion (e.g., pitch, yaw, and roll). In some embodiments, the IMU 840 rapidly samples the measurement signals and calculates the estimated current position of the NED 805 from the sampled data. For example, the IMU 840 integrates the measurement signals received from the accelerometers over time to estimate a velocity vector and integrates the velocity vector over time to determine an estimated current position of a reference point on the NED 805. Alternatively, the IMU 840 provides the sampled measurement signals to the console 810, which analyzes the sampled data to determine one or more measurement errors. The console 810 may further transmit one or more control signals and/or measurement errors to the IMU 840 to configure the IMU 840 to correct and/or reduce one or more measurement errors (e.g., drift errors). The reference point is a point that may be used to describe the position of the NED 805. The reference point may generally be defined as a point in space or a position related to a position and/or orientation of the NED 805.
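The double integration described above (acceleration to velocity, velocity to position) can be sketched in one dimension. The sample values, time step, and simple Euler scheme are illustrative assumptions; a real IMU pipeline would also fuse gyroscope data and correct drift.

```python
# Hedged sketch of the IMU position estimate: integrate accelerometer
# samples once for velocity and again for position. The constant sample
# rate and 1-D motion are simplifying assumptions for illustration.
def integrate_position(accels, dt, v0=0.0, p0=0.0):
    """Euler-integrate 1-D acceleration samples into a position estimate."""
    v, p = v0, p0
    for a in accels:
        v += a * dt   # velocity from acceleration
        p += v * dt   # position from velocity
    return p

# 1 m/s^2 for 1 s in 10 ms steps: the estimate lands near 0.5 m, and
# small per-sample errors would accumulate as drift over longer spans.
print(round(integrate_position([1.0] * 100, 0.01), 3))
```

The drift errors mentioned above arise because any bias in the acceleration samples is integrated twice, so the position error grows quadratically with time; hence the console's role in correcting them.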
  • In various embodiments, the IMU 840 receives one or more parameters from the console 810. The one or more parameters are used to maintain tracking of the NED 805. Based on a received parameter, the IMU 840 may adjust one or more IMU parameters (e.g., a sample rate). In some embodiments, certain parameters cause the IMU 840 to update an initial position of the reference point so that it corresponds to a next position of the reference point. Updating the initial position of the reference point as the next calibrated position of the reference point helps reduce drift errors in detecting a current position estimate of the IMU 840.
  • In some embodiments, the eye tracking system 845 is integrated into the NED 805. The eye-tracking system 845 may comprise one or more illumination sources and an imaging device (camera). In operation, the eye tracking system 845 generates and analyzes tracking data related to a user's eyes as the user wears the NED 805. The eye tracking system 845 may further generate eye tracking information that may comprise information about a position of the user's eye, i.e., information about an angle of an eye-gaze.
  • In some embodiments, the varifocal module 850 is further integrated into the NED 805. The varifocal module 850 may be communicatively coupled to the eye tracking system 845 in order to enable the varifocal module 850 to receive eye tracking information from the eye tracking system 845. The varifocal module 850 may further modify the focus of image light emitted from the display 825 based on the eye tracking information received from the eye tracking system 845. Accordingly, the varifocal module 850 can reduce vergence-accommodation conflict that may be produced as the user's eyes resolve the image light. In various embodiments, the varifocal module 850 can be interfaced (e.g., either mechanically or electrically) with at least one optical element of the optical assembly 830.
  • In operation, the varifocal module 850 may adjust the position and/or orientation of one or more optical elements in the optical assembly 830 in order to adjust the virtual image projected by the optical assembly 830. In various embodiments, the varifocal module 850 may use eye tracking information obtained from the eye tracking system 845 to determine how to adjust one or more optical elements in the optical assembly 830. In some embodiments, the varifocal module 850 may perform foveated rendering of the image light based on the eye tracking information obtained from the eye tracking system 845 in order to adjust the resolution of the image light emitted by the display 825. In this case, the varifocal module 850 configures the display 825 to display a high pixel density in a foveal region of the user's eye-gaze and a low pixel density in other regions.
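The foveated-rendering policy above amounts to selecting a pixel-density factor from the angular distance between a screen region and the tracked gaze point. The foveal radius and density factors below are illustrative assumptions, not values from the patent.

```python
# Illustrative sketch of a foveated-rendering policy: full pixel density
# inside an assumed foveal radius around the gaze point, reduced density
# outside it. Radius and density factors are hypothetical.
FOVEAL_RADIUS_DEG = 10.0

def pixel_density(angle_from_gaze_deg):
    """Return a relative pixel-density factor for a screen region."""
    return 1.0 if angle_from_gaze_deg <= FOVEAL_RADIUS_DEG else 0.25

print(pixel_density(5.0))    # foveal region: full density
print(pixel_density(30.0))   # periphery: reduced density
```

A production system would typically use several graduated density rings rather than one hard step, but the principle is the same: rendering effort follows the eye tracking system's gaze estimate.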
  • The I/O interface 815 facilitates the transfer of action requests from a user to the console 810. In addition, the I/O interface 815 facilitates the transfer of device feedback from the console 810 to the user. An action request is a request to perform a particular action. For example, an action request may be an instruction to start or end capture of image or video data or an instruction to perform a particular action within an application, such as pausing video playback, increasing or decreasing the volume of audio playback, and so forth. In various embodiments, the I/O interface 815 may include one or more input devices. Example input devices include: a keyboard, a mouse, a game controller, a joystick, and/or any other suitable device for receiving action requests and communicating the action requests to the console 810. In some embodiments, the I/O interface 815 includes an IMU 840 that captures calibration data indicating an estimated current position of the I/O interface 815 relative to an initial position of the I/O interface 815.
  • In operation, the I/O interface 815 receives action requests from the user and transmits those action requests to the console 810. Responsive to receiving the action request, the console 810 performs a corresponding action. For example, responsive to receiving an action request, the console 810 may configure the I/O interface 815 to deliver haptic feedback to an arm of the user. Additionally or alternatively, the console 810 may configure the I/O interface 815 to generate haptic feedback when the console 810 performs an action, responsive to receiving an action request.
  • The console 810 provides content to the NED 805 for processing in accordance with information received from one or more of: the DCA 820, the NED 805, and the I/O interface 815. In the embodiment shown in FIG. 8, the console 810 includes an application store 855, a tracking module 860, and an engine 865. In some embodiments, the console 810 may have additional, fewer, or different modules and/or components than those described in conjunction with FIG. 8. Similarly, the functions further described below may be distributed among components of the console 810 in a different manner than described in conjunction with FIG. 8.
  • The application store 855 stores one or more applications for execution by the console 810. An application is a group of instructions that, when executed by a processor, performs a particular set of functions, such as generating content for presentation to the user. For example, an application may generate content in response to receiving inputs from a user (e.g., via movement of the NED 805 as the user moves his/her head, via the I/O interface 815, etc.). Examples of applications include: gaming applications, conferencing applications, video playback applications, or other suitable applications.
  • The tracking module 860 calibrates the NED system 800 using one or more calibration parameters. The tracking module 860 may further adjust one or more calibration parameters to reduce error in determining a position and/or orientation of the NED 805 or the I/O interface 815. For example, the tracking module 860 may transmit a calibration parameter to the DCA 820 in order to adjust the focus of the DCA 820. Accordingly, the DCA 820 may more accurately determine positions of structured light elements reflecting off of objects in the environment. The tracking module 860 may also analyze sensor data generated by the IMU 840 in determining various calibration parameters to modify. Further, in some embodiments, if the NED 805 loses tracking of the user's eye, then the tracking module 860 may re-calibrate some or all of the components in the NED system 800. For example, if the DCA 820 loses line of sight of at least a threshold number of structured light elements projected onto the user's eye, the tracking module 860 may transmit calibration parameters to the varifocal module 850 in order to re-establish eye tracking.
  • The tracking module 860 tracks the movements of the NED 805 and/or of the I/O interface 815 using information from the DCA 820, the one or more position sensors 835, the IMU 840, or some combination thereof. For example, the tracking module 860 may determine a reference position of the NED 805 from a mapping of an area local to the NED 805. The tracking module 860 may generate this mapping based on information received from the NED 805 itself. The tracking module 860 may also utilize sensor data from the IMU 840 and/or depth data from the DCA 820 to determine reference positions for the NED 805 and/or the I/O interface 815. In various embodiments, the tracking module 860 generates an estimation and/or prediction for a subsequent position of the NED 805 and/or the I/O interface 815. The tracking module 860 may transmit the predicted subsequent position to the engine 865.
  • The engine 865 generates a three-dimensional mapping of the area surrounding the NED 805 (i.e., the “local area”) based on information received from the NED 805. In some embodiments, the engine 865 determines depth information for the three-dimensional mapping of the local area based on depth data received from the DCA 820 (e.g., depth information of objects in the local area). In some embodiments, the engine 865 calculates a depth and/or position of the NED 805 by using depth data generated by the DCA 820. In particular, the engine 865 may implement various techniques for calculating the depth and/or position of the NED 805, such as stereo based techniques, structured light illumination techniques, time-of-flight techniques, and so forth. In various embodiments, the engine 865 uses depth data received from the DCA 820 to update a model of the local area and to generate and/or modify media content based in part on the updated model.
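Of the depth techniques listed above, time-of-flight is the simplest to state: light travels to the object and back, so depth is half the round trip at the speed of light. A minimal sketch (not the DCA's actual pipeline):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth(round_trip_time_s: float) -> float:
    """Depth from a time-of-flight measurement: the emitted pulse covers
    the distance twice (out and back), so depth = c * t / 2."""
    return C * round_trip_time_s / 2.0
```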
  • The engine 865 also executes applications within the NED system 800 and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the NED 805 from the tracking module 860. Based on the received information, the engine 865 determines various forms of media content to transmit to the NED 805 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the engine 865 generates media content for the NED 805 that mirrors the user's movement in a virtual environment or in an environment augmenting the local area with additional media content. Accordingly, the engine 865 may generate and/or modify media content (e.g., visual and/or audio content) for presentation to the user. The engine 865 may further transmit the media content to the NED 805. Additionally, in response to receiving an action request from the I/O interface 815, the engine 865 may perform an action within an application executing on the console 810. The engine 865 may further provide feedback when the action is performed. For example, the engine 865 may configure the NED 805 to generate visual and/or audio feedback and/or the I/O interface 815 to generate haptic feedback to the user.
  • In some embodiments, based on the eye tracking information (e.g., orientation of the user's eye) received from the eye tracking system 845, the engine 865 determines a resolution of the media content provided to the NED 805 for presentation to the user on the display 825. The engine 865 may adjust a resolution of the visual content provided to the NED 805 by configuring the display 825 to perform foveated rendering of the visual content, based at least in part on a direction of the user's gaze received from the eye tracking system 845. The engine 865 provides content that the display 825 renders at a high resolution in the foveal region of the user's gaze and at a lower resolution in other regions, thereby reducing the power consumption of the NED 805. In addition, using foveated rendering reduces a number of computing cycles used in rendering visual content without compromising the quality of the user's visual experience. In some embodiments, the engine 865 can further use the eye tracking information to adjust a focus of the image light emitted from the display 825 in order to reduce vergence-accommodation conflicts.
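A foveated rendering policy of the kind described above can be sketched as a mapping from angular distance to the gaze direction onto a shading rate. The region boundaries (5 degrees for the fovea, a 3x transition band) are illustrative assumptions, not values from the disclosure:

```python
def render_resolution(angle_from_gaze_deg: float,
                      foveal_radius_deg: float = 5.0) -> str:
    """Pick a rendering resolution for a screen region based on its
    angular distance from the user's gaze direction."""
    if angle_from_gaze_deg <= foveal_radius_deg:
        return "high"    # full resolution inside the foveal region
    elif angle_from_gaze_deg <= 3 * foveal_radius_deg:
        return "medium"  # transition band around the fovea
    return "low"         # periphery: lowest resolution, least power
```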
  • A projection assembly integrated into a near-eye-display (NED) is presented herein. The NED may be part of, e.g., a virtual reality (VR) system, an augmented reality (AR) system, a mixed reality (MR) system, or some combination thereof. The NED may also be referred to as a head-mounted display (HMD). In some embodiments, the projection assembly of the NED includes a source assembly, a waveguide, a main optic, and an optional focusing element. The source assembly generates image light that is coupled into the waveguide. The image light is expanded in at least one dimension and out-coupled from the waveguide. The focusing element (e.g., liquid crystal lens) may be located between the waveguide and the main optic. The focusing element can, e.g., add or subtract optical power to adjust focus of the image light. The main optic receives light from a local area surrounding the NED and combines that light with the image light received either directly from the waveguide or from the focusing element. The combined light is provided to an eye-box of a user.
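The focusing element's "add or subtract optical power" behavior follows the thin-lens approximation, in which optical powers (in diopters) of closely spaced elements simply add. The function below is a hedged illustration of that relationship, not a model of the disclosed optics:

```python
def combined_focal_length(main_optic_power_d: float,
                          lc_lens_power_d: float) -> float:
    """Thin-lens approximation: diopter powers of the main optic and the
    liquid crystal focusing element add; a negative liquid crystal power
    subtracts. Returns the resulting focal length in meters."""
    total_power = main_optic_power_d + lc_lens_power_d
    if total_power == 0:
        raise ValueError("combined system has no optical power")
    return 1.0 / total_power
```

For example, adding +0.5 D to a 2 D main optic shortens the combined focal length from 0.5 m to 0.4 m, shifting the perceived focus of the image light.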
  • 1. In some embodiments, a near eye display (NED) comprises an electronic display configured to output image light, an optical element configured to direct a plurality of light beams associated with the image light to an eye-box, and an angle selective filter comprising a curved surface, the angle selective filter configured to filter out one or more light beams of the plurality of light beams, each of the one or more light beams having an angle of incidence on the curved surface that is larger than a first cut-off angle of incidence.
  • 2. The NED of clause 1, wherein the curved surface comprises a spherical surface.
  • 3. The NED of clauses 1 or 2, wherein a center of the spherical surface corresponds to a center of rotation of an eye.
  • 4. The NED of any of clauses 1-3, wherein, for each point on the curved surface, a tangent plane of the point is configured to be normal to an optical axis of an eye when the optical axis of the eye aligns with the point.
  • 5. The NED of any of clauses 1-4, wherein the angle selective filter comprises a thin film multilayer filter.
  • 6. The NED of any of clauses 1-5, wherein the angle selective filter comprises a set of louvers arranged concentrically.
  • 7. The NED of any of clauses 1-6, wherein the one or more light beams include at least one stray light beam.
  • 8. The NED of any of clauses 1-7, wherein the first cut-off angle of incidence is between a range comprising 45-55 degrees.
  • 9. The NED of any of clauses 1-8, wherein the angle selective filter is disposed between the optical element and an exit pupil of the NED.
  • 10. The NED of any of clauses 1-9, wherein the first cut-off angle of incidence corresponds to a visual field of an eye of a user of the NED.
  • 11. The NED of any of clauses 1-10, further comprising an eye tracking system, the eye tracking system comprises a light source in the near infrared wavelength range, wherein the angle selective filter is further configured to allow a light beam in the near infrared wavelength range with an angle of incidence on the curved surface larger than the first cut-off angle of incidence.
  • 12. The NED of any of clauses 1-11, wherein a center of the curved surface is adjustable and determined based on a measurement from the eye tracking system.
  • 13. In some embodiments, a near eye display (NED) comprises an electronic display configured to output image light, a Fresnel lens configured to direct a plurality of light beams associated with the image light to an eye-box, wherein the plurality of light beams exiting the Fresnel lens includes one or more stray light beams, and an angle selective filter comprising a curved surface, the angle selective filter configured to filter out at least one of the one or more stray light beams, the at least one of the one or more stray light beams having an angle of incidence on the curved surface that is larger than a first cut-off angle of incidence.
  • 14. The NED of clause 13, wherein the curved surface comprises a spherical surface.
  • 15. The NED of clauses 13 or 14, wherein a center of the spherical surface corresponds to a center of rotation of an eye.
  • 16. The NED of any of clauses 13-15, wherein, for each point on the curved surface, a tangent plane of the point is configured to be normal to an optical axis of an eye when the optical axis of the eye aligns with the point.
  • 17. The NED of any of clauses 13-16, wherein the angle selective filter is disposed between the Fresnel lens and an exit pupil of the NED.
  • 18. In some embodiments, a method comprises receiving, from an electronic display, image light associated with a content scene, filtering out, by an angle selective filter, one or more light beams of the image light having an angle of incidence larger than a first cut-off angle of incidence, and forming an image of the content scene based on filtered light beams exiting the angle selective filter.
  • 19. The method of clause 18, wherein the angle selective filter comprises a spherical surface, and a center of the spherical surface corresponds to a center of rotation of an eye.
  • 20. The method of clauses 18 or 19, wherein, for each point on the curved surface, a tangent plane of the point is configured to be normal to an optical axis of an eye when the optical axis of the eye aligns with the point.
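The geometry running through the clauses above can be sketched numerically: for a spherical filter centered on the eye's center of rotation, the surface normal at any point is radial, so the angle of incidence is the angle between the beam direction and that radial normal. This is an illustrative sketch only; the 50-degree cut-off is one value taken from the 45-55 degree range in clause 8, and the vector math is an assumption about how such a filter might be modeled:

```python
import math

def incidence_angle_deg(point, direction, center):
    """Angle of incidence of a beam on a spherical filter surface.
    The normal at `point` is radial, i.e. along (point - center)."""
    normal = tuple(p - c for p, c in zip(point, center))
    n_len = math.sqrt(sum(n * n for n in normal))
    d_len = math.sqrt(sum(d * d for d in direction))
    cos_theta = abs(sum(n * d for n, d in zip(normal, direction))) / (n_len * d_len)
    return math.degrees(math.acos(min(1.0, cos_theta)))

def passes_filter(point, direction, center, cutoff_deg=50.0):
    """Transmit the beam only when its incidence angle does not exceed
    the cut-off; steeper beams (e.g. stray light) are filtered out."""
    return incidence_angle_deg(point, direction, center) <= cutoff_deg
```

A beam arriving along the radial normal (0 degrees) passes, while a grazing beam at 90 degrees is filtered out.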
  • Any and all combinations of any of the claim elements recited in any of the claims and/or any elements described in this application, in any fashion, fall within the contemplated scope of the present invention and protection.
  • The foregoing description of the embodiments of the disclosure has been presented for the purpose of illustration; it is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.
  • Some portions of this description describe the embodiments of the disclosure in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
  • Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
  • Embodiments of the disclosure may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
  • Embodiments of the disclosure may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
  • Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the disclosure be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the disclosure, which is set forth in the following claims.
  • The descriptions of the various embodiments have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments.
  • Aspects of the present embodiments may be embodied as a system, method or computer program product. Accordingly, aspects of the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "module" or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
  • Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Aspects of the present disclosure are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine. The instructions, when executed via the processor of the computer or other programmable data processing apparatus, enable the implementation of the functions/acts specified in the flowchart and/or block diagram block or blocks. Such processors may be, without limitation, general purpose processors, special-purpose processors, application-specific processors, or field-programmable gate arrays.
  • The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
  • While the preceding is directed to embodiments of the present disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims (20)

What is claimed is:
1. A near eye display (NED) comprising:
an electronic display configured to output image light;
an optical element configured to direct a plurality of light beams associated with the image light to an eye-box; and
an angle selective filter comprising a curved surface, the angle selective filter configured to filter out one or more light beams of the plurality of light beams, each of the one or more light beams having an angle of incidence on the curved surface that is larger than a first cut-off angle of incidence.
2. The NED of claim 1, wherein the curved surface comprises a spherical surface.
3. The NED of claim 2, wherein a center of the spherical surface corresponds to a center of rotation of an eye.
4. The NED of claim 1, wherein, for each point on the curved surface, a tangent plane of the point is configured to be normal to an optical axis of an eye when the optical axis of the eye aligns with the point.
5. The NED of claim 1, wherein the angle selective filter comprises a thin film multilayer filter.
6. The NED of claim 1, wherein the angle selective filter comprises a set of louvers arranged concentrically.
7. The NED of claim 1, wherein the one or more light beams include at least one stray light beam.
8. The NED of claim 1, wherein the first cut-off angle of incidence is between a range comprising 45-55 degrees.
9. The NED of claim 1, wherein the angle selective filter is disposed between the optical element and an exit pupil of the NED.
10. The NED of claim 1, wherein the first cut-off angle of incidence corresponds to a visual field of an eye of a user of the NED.
11. The NED of claim 1, further comprising an eye tracking system, the eye tracking system comprises a light source in the near infrared wavelength range, wherein the angle selective filter is further configured to allow a light beam in the near infrared wavelength range with an angle of incidence on the curved surface larger than the first cut-off angle of incidence.
12. The NED of claim 1, wherein a center of the curved surface is adjustable and determined based on a measurement from the eye tracking system.
13. A near eye display (NED) comprising:
an electronic display configured to output image light;
a Fresnel lens configured to direct a plurality of light beams associated with the image light to an eye-box, wherein the plurality of light beams exiting the Fresnel lens includes one or more stray light beams; and
an angle selective filter comprising a curved surface, the angle selective filter configured to filter out at least one of the one or more stray light beams, the at least one of the one or more stray light beams having an angle of incidence on the curved surface that is larger than a first cut-off angle of incidence.
14. The NED of claim 13, wherein the curved surface comprises a spherical surface.
15. The NED of claim 14, wherein a center of the spherical surface corresponds to a center of rotation of an eye.
16. The NED of claim 13, wherein, for each point on the curved surface, a tangent plane of the point is configured to be normal to an optical axis of an eye when the optical axis of the eye aligns with the point.
17. The NED of claim 13, wherein the angle selective filter is disposed between the Fresnel lens and an exit pupil of the NED.
18. A method comprising:
receiving, from an electronic display, image light associated with a content scene;
filtering out, by an angle selective filter, one or more light beams of the image light having an angle of incidence larger than a first cut-off angle of incidence; and
forming an image of the content scene based on filtered light beams exiting the angle selective filter.
19. The method of claim 18, wherein the angle selective filter comprises a spherical surface, and a center of the spherical surface corresponds to a center of rotation of an eye.
20. The method of claim 18, wherein, for each point on a curved surface of the angle selective filter, a tangent plane of the point is configured to be normal to an optical axis of an eye when the optical axis of the eye aligns with the point.
US15/867,652 2018-01-10 2018-01-10 Angle selective filter for near eye displays Abandoned US20190212482A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/867,652 US20190212482A1 (en) 2018-01-10 2018-01-10 Angle selective filter for near eye displays
US15/884,293 US10809429B1 (en) 2018-01-10 2018-01-30 Angle selective filter having curved surface for near eye displays

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US15/867,652 US20190212482A1 (en) 2018-01-10 2018-01-10 Angle selective filter for near eye displays

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/884,293 Continuation US10809429B1 (en) 2018-01-10 2018-01-30 Angle selective filter having curved surface for near eye displays

Publications (1)

Publication Number Publication Date
US20190212482A1 true US20190212482A1 (en) 2019-07-11

Family

ID=67140763

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/867,652 Abandoned US20190212482A1 (en) 2018-01-10 2018-01-10 Angle selective filter for near eye displays
US15/884,293 Expired - Fee Related US10809429B1 (en) 2018-01-10 2018-01-30 Angle selective filter having curved surface for near eye displays

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/884,293 Expired - Fee Related US10809429B1 (en) 2018-01-10 2018-01-30 Angle selective filter having curved surface for near eye displays

Country Status (1)

Country Link
US (2) US20190212482A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021045865A1 (en) * 2019-09-05 2021-03-11 Facebook Technologies, Llc Magnetic field driven liquid crystal patterning control system
EP3809185A1 (en) * 2019-10-15 2021-04-21 Nokia Technologies Oy Rendering glare content
US11099393B2 (en) * 2019-11-22 2021-08-24 Facebook Technologies, Llc Surface emitting light source with lateral variant refractive index profile
WO2021173377A1 (en) * 2020-02-25 2021-09-02 Facebook Technologies, Llc Angularly selective diffusive combiner
WO2022193880A1 (en) * 2021-03-17 2022-09-22 Oppo广东移动通信有限公司 Near-to-eye display optical system, optical filter and near-to-eye display device
JP2022542750A (en) * 2019-08-07 2022-10-07 メタ プラットフォームズ テクノロジーズ, リミテッド ライアビリティ カンパニー Stray light suppression in eye tracking imaging
JP2023060578A (en) * 2021-10-18 2023-04-28 キヤノン株式会社 Louver, head-mounted display, optical equipment, method of manufacturing louver
US11749964B2 (en) 2020-06-24 2023-09-05 Meta Platforms Technologies, Llc Monolithic light source with integrated optics based on nonlinear frequency conversion
US20230358928A1 (en) * 2021-06-30 2023-11-09 Fujikura Ltd. Optical computing device and optical computing method
WO2025058645A1 (en) * 2023-09-14 2025-03-20 Magic Leap, Inc. Compact extended depth of field lenses for wearable display devices
JP2025061739A (en) * 2021-03-19 2025-04-11 キヤノン株式会社 Optical and display devices

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12298505B2 (en) * 2020-02-06 2025-05-13 Apple Inc. Optical systems having angle-selective transmission filters
US11953941B2 (en) * 2021-10-25 2024-04-09 Universal City Studios Llc Interactive device of an attraction system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI539230B (en) * 2007-05-09 2016-06-21 杜比實驗室特許公司 3D image projection and viewing system (1)
US7589901B2 (en) * 2007-07-10 2009-09-15 Microvision, Inc. Substrate-guided relays for use with scanned beam light sources
US9632315B2 (en) * 2010-10-21 2017-04-25 Lockheed Martin Corporation Head-mounted display apparatus employing one or more fresnel lenses
US8998414B2 (en) * 2011-09-26 2015-04-07 Microsoft Technology Licensing, Llc Integrated eye tracking and display system
US8611015B2 (en) * 2011-11-22 2013-12-17 Google Inc. User interface
DE102014119550B4 (en) * 2014-12-23 2022-05-12 tooz technologies GmbH Imaging optics for generating a virtual image and data glasses
CN107003526A (en) * 2015-03-19 2017-08-01 松下知识产权经营株式会社 Head-up display
US10146054B2 (en) * 2015-07-06 2018-12-04 Google Llc Adding prescriptive correction to eyepieces for see-through head wearable displays

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022542750A (en) * 2019-08-07 2022-10-07 メタ プラットフォームズ テクノロジーズ, リミテッド ライアビリティ カンパニー Stray light suppression in eye tracking imaging
US12270992B2 (en) 2019-08-07 2025-04-08 Meta Platforms Technologies, Llc Beam shaping optic for light sources
WO2021045865A1 (en) * 2019-09-05 2021-03-11 Facebook Technologies, Llc Magnetic field driven liquid crystal patterning control system
US11249365B2 (en) 2019-09-05 2022-02-15 Facebook Technologies, Llc Magnetic field driven liquid crystal patterning control system
US11681194B2 (en) 2019-09-05 2023-06-20 Meta Platforms Technologies, Llc Magnetic field driven liquid crystal patterning control system
EP3809185A1 (en) * 2019-10-15 2021-04-21 Nokia Technologies Oy Rendering glare content
WO2021074759A1 (en) * 2019-10-15 2021-04-22 Nokia Technologies Oy Rendering glare content
CN114746791A (en) * 2019-10-15 2022-07-12 诺基亚技术有限公司 Rendering glare content
US12346992B2 (en) 2019-10-15 2025-07-01 Nokia Technologies Oy Rendering glare content
US11099393B2 (en) * 2019-11-22 2021-08-24 Facebook Technologies, Llc Surface emitting light source with lateral variant refractive index profile
WO2021173377A1 (en) * 2020-02-25 2021-09-02 Facebook Technologies, Llc Angularly selective diffusive combiner
CN115038996A (en) * 2020-02-25 2022-09-09 元平台技术有限公司 Angle selective diffusion combiner
US11749964B2 (en) 2020-06-24 2023-09-05 Meta Platforms Technologies, Llc Monolithic light source with integrated optics based on nonlinear frequency conversion
WO2022193880A1 (en) * 2021-03-17 2022-09-22 Oppo广东移动通信有限公司 Near-to-eye display optical system, optical filter and near-to-eye display device
JP2025061739A (en) * 2021-03-19 2025-04-11 キヤノン株式会社 Optical and display devices
US20230358928A1 (en) * 2021-06-30 2023-11-09 Fujikura Ltd. Optical computing device and optical computing method
JP2023060578A (en) * 2021-10-18 2023-04-28 キヤノン株式会社 Louver, head-mounted display, optical equipment, method of manufacturing louver
JP7707026B2 (en) 2021-10-18 2025-07-14 キヤノン株式会社 Louver, head mounted display, optical device, and method for manufacturing louver
WO2025058645A1 (en) * 2023-09-14 2025-03-20 Magic Leap, Inc. Compact extended depth of field lenses for wearable display devices

Also Published As

Publication number Publication date
US10809429B1 (en) 2020-10-20

Similar Documents

Publication Publication Date Title
US10809429B1 (en) Angle selective filter having curved surface for near eye displays
US10878594B1 (en) Boundary region glint tracking
US10416766B1 (en) Varifocal head-mounted display including modular air spaced optical assembly
US10481687B1 (en) Waveguide integrated eye tracking
US10257507B1 (en) Time-of-flight depth sensing for eye tracking
US10466484B1 (en) Compact head-mounted display for artificial reality
US9910282B2 (en) Increasing field of view of head-mounted display using a mirror
US10690929B1 (en) Beamsplitter assembly for eye tracking in head-mounted displays
US10957059B1 (en) Multi-pattern depth camera assembly
US10120442B2 (en) Eye tracking using a light field camera on a head-mounted display
US11953688B1 (en) High-resolution liquid crystal displays
US20180157320A1 (en) Air spaced optical assembly with integrated eye tracking
US10914956B1 (en) Tiled display assemblies for artificial reality headset
US10789777B1 (en) Generating content for presentation by a head mounted display based on data captured by a light field camera positioned on the head mounted display
US10996514B1 (en) Offsetting non-uniform brightness using a backlight assembly
US10698218B1 (en) Display system with oscillating element
WO2024147919A1 (en) Generating tile-based region of interest representation of video frames for video encoding
US10848753B1 (en) Eye-tracking system using a scanning laser assembly
US12078803B1 (en) Expanding field-of-view in direct projection augmented reality and virtual reality systems
US10359845B1 (en) Display assembly using dynamic liquid crystal array
US20230300470A1 (en) Techniques for producing glints and iris illumination for eye tracking
US11163159B1 (en) Display system with extended display area
US10572731B1 (en) Infrared transparent backlight device for eye tracking applications
US10859702B1 (en) Positional tracking using retroreflectors
US12124623B1 (en) Techniques for gaze-contingent sensing and processing

Legal Events

Date Code Title Description
AS Assignment

Owner name: OCULUS VR, LLC, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RICHARDS, EVAN M.;REEL/FRAME:044640/0936

Effective date: 20180116

AS Assignment

Owner name: FACEBOOK TECHNOLOGIES, LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:OCULUS VR, LLC;REEL/FRAME:049900/0142

Effective date: 20180904

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION

AS Assignment

Owner name: META PLATFORMS TECHNOLOGIES, LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:FACEBOOK TECHNOLOGIES, LLC;REEL/FRAME:060637/0858

Effective date: 20220318