WO2025153823A1 - Display system with steerable eye box - Google Patents
- Publication number
- WO2025153823A1 (PCT/GB2025/050077)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- eye
- light beam
- illumination
- light guide
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0081—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. enlarging, the entrance or exit pupil
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0149—Head-up displays characterised by mechanical features
- G02B2027/0154—Head-up displays characterised by mechanical features with movable elements
Definitions
- the invention relates to a display system producing a steerable eye box, referred to as a display exit pupil, for use in near eye displays (NEDs), such as those included in mixed reality (MR) and virtual reality (VR) head mounted displays (HMDs).
- MR and VR devices allow a user to experience and/or interact with virtual information, without losing situational awareness of the real world in the case of MR.
- Such devices typically include a display device positioned close to the eye of a user, which displays images to the user, but which may still allow the user to see past the displayed images to remain aware of the external environment.
- VR and MR devices take the form of a head mounted display (HMD), which is worn on the head of the user in the same way as a pair of glasses or a helmet.
- the display is positioned in front of one or both eyes of the user and, in the case of MR devices, typically takes the form of partially transparent optics through which an image is projected. The user can then see the image projected through the optics, overlaid onto the outside world.
- MR headset devices can also operate by providing an opaque display setup and using an externally facing camera to incorporate images of the outside world into the image displayed to the user.
- This type of device is known as a digital pass-through device.
- HMD applications require the MR display to be compact, unintrusive, comfortable, and capable of producing high-quality images.
- Each HMD device has an eye box, defined as the area in which a user's eye can be physically located relative to the display and still see the complete projected image. It is desirable to have a large eye box so that: (a) the user, as they move their eye within the eye box, is still able to see the image from different eye positions; (b) the eye box size can compensate for any misalignments of the HMD; and (c) a greater proportion of users can view the projected image without readjusting the distance between the two displays of the HMD, taking into account the natural variation in Interpupil Distance (IPD) due to face shapes and eye separation distances between users. While it is defined as an area, the eye box must exist in many different planes, and thus it extends in three dimensions.
- a display's field of view is defined as the angle subtended at the eye over which the user can see the displayed content.
- a large FOV increases the realism of the virtual content and the sense of immersion provided by a VR or MR device, as well as enabling larger images to be displayed.
- a small FOV is one of the most common complaints about existing VR and MR displays, with many currently available HMDs achieving less than a 56° diagonal FOV.
- the depth of field (DOF) of a display measures how quickly the image becomes defocused when moving away from the focal plane along the optical axis of the display optics. When the F-number of the optics is larger, which corresponds to a smaller beam diameter in the optics, the depth of field is larger.
- a larger depth of field means that the image remains in focus for a larger distance away from the focal plane.
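The depth-of-field argument above can be sketched numerically. In this hypothetical helper (not from the patent), the angular defocus blur seen by the eye is approximated as the beam diameter at the eye multiplied by the vergence error between the object distance and the display's focal plane; the function name and the geometric-optics approximation are illustrative assumptions.

```python
def blur_angle_mrad(beam_diameter_mm, object_dist_m, focal_plane_m=2.0):
    """Approximate defocus blur angle (milliradians) for a beam of the
    given diameter when the display's focal plane is at focal_plane_m.
    Geometric-optics sketch: blur ~ beam diameter x vergence error."""
    vergence_error = abs(1.0 / object_dist_m - 1.0 / focal_plane_m)  # dioptres
    return beam_diameter_mm * vergence_error  # mm * (1/m) = mrad

# A 1 mm beam blurs five times less than a 5 mm beam at every object
# distance, so the image stays acceptably sharp over a larger depth range.
for d in (0.3, 1.0, 2.0, 10.0):
    print(d, blur_angle_mrad(1.0, d), blur_angle_mrad(5.0, d))
```

This reproduces the trend of Fig. 3: shrinking the beam diameter from 5 mm to 1 mm scales the blur down proportionally, extending the in-focus range around the 2 m focal plane.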
- Catadioptric optics are optics using both reflective and refractive components.
- Catadioptric optics are used for their ability to form a large FOV while minimising the Total Track Length (TTL) of the display.
- TTL is defined here as the length along the optical path from the light-emitting surface of the display device (i.e. the first surface of the optics) to the last optical surface closest to the user's eye.
- Well-known examples of this architecture include the pancake design by 3M (RTM) and the Apple (RTM)/Limbak (RTM) lens ThinEyes (RTM). Another example is the Apple Vision Pro (RTM).
- the illumination engine comprises: a segmented light source comprising an array of pixels; and an optical coupling element arranged to receive light from the segmented light source and to output the input light beam to the light guide.
- the illumination engine is configured to control an output angle of the input light beam by controlling which pixels of the segmented light source illuminate the optical coupling element.
- the illumination engine further comprises relay optics configured to receive light reflected by the scanning mirror and to convert the reflected light into the input light beam.
- the display system is configured such that out-coupled beams corresponding to successive internal reflections overlap one another by an amount greater than 0 mm and less than or equal to 1 mm.
- the successive internal reflections correspond to propagation of a beam propagating within the light guide in response to in-coupling of the input light beam.
- the condition that the overlap is greater than 0 mm and less than or equal to 1 mm may apply across the operational range of angles of the input light beam.
- the display device takes the form of a transmissive liquid crystal display device.
- the transmissive liquid crystal display device operates by modulating the amplitude of the illumination light beam.
- the transmissive liquid crystal display device comprises one or more polarizers
- the input light beam should be incoherent or partially coherent.
- Incoherent or partially coherent may correspond to a minimum spectral bandwidth of 2 nm at any point in time.
- Incoherent or partially coherent may correspond to a minimum spectral bandwidth of 2 nm within a 1 second integration period.
- a display system comprising an illumination engine configured to generate an input light beam having a first width.
- the display system also includes a diffractive light guide optically coupled to the illumination engine and configured to receive the input light beam.
- the light guide is configured to convert the input light beam into an illumination light beam having a second width larger than the first width.
- the display system also includes a transmissive liquid crystal display device configured to be illuminated by the illumination light beam and to convert the illumination light beam into an image light beam.
- the display system also includes a camera configured to capture images of a user's eye and output the images.
- the illumination engine comprises a segmented light source comprising an array of pixels, and an optical coupling element arranged to receive light from the segmented light source and to output the input light beam to the light guide.
- the illumination engine is configured to control an output angle of the input light beam by controlling which pixels of the segmented light source illuminate the optical coupling element.
- the display system according to the second aspect of the invention may comprise features corresponding to any features of the display system according to the first aspect of the invention.
- Definitions applicable to the display system according to the first aspect of the invention (and/or features thereof) may preferably be applicable to the display system according to the second aspect of the invention (and/or features thereof).
- a light guide for use in a head mounted display, the light guide comprising: a transparent substrate having upper and lower surfaces; an in-coupling element on the upper surface of the substrate configured to couple light incident on the in-coupling element into the light guide; and an exit pupil expander comprising an out-coupling element on the upper surface of the substrate configured to couple light incident on the out-coupling element out of the light guide; wherein the upper surface is planar, and the thickness of the transparent substrate in a direction normal to the upper surface varies across the plane of the upper surface.
- the lower surface has at least one of a curved profile and a profile that is inclined relative to the upper surface when viewed in a direction parallel to the upper surface.
- the transparent substrate has a thickness variation of more than 1%.
- the transparent substrate has a thickness variation of more than 5%.
- the transparent substrate has a thickness variation of more than 10%.
- the transparent substrate has a thickness variation of more than 25%.
- the transparent substrate has a thickness variation of more than 50%.
- the transparent substrate comprises an upper layer including the upper surface and a lower layer including the lower surface, the upper layer and the lower layer being stacked in a vertical direction, wherein the upper and lower layers are formed from different optical materials.
- each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a metasurface, a prism and a prism array.
- each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a partially reflective mirror, a metasurface, a prism and a prism array.
- a light guide for use in a head mounted display, the light guide comprising: a transparent substrate having upper and lower surfaces; an in-coupling element on one of the upper and lower surfaces of the substrate configured to couple light incident on the in-coupling element into the light guide; and an out-coupling element on the other of the upper and lower surfaces of the substrate configured to couple light incident on the out-coupling element out of the light guide; wherein the upper surface is planar, and the lower surface is reflective and has a curved profile when viewed in a direction parallel to the upper surface.
- a display system comprising: a display device comprising an array of pixels and configured to emit image light representing an image; a converging lens configured to collect image light from the display device; a shutter mechanism configured to partially block light that has passed through the converging lens from the display device, wherein the shutter mechanism defines an adjustable aperture through which light can pass; a dynamic lens configured to collect light that has passed through the aperture to form a display exit pupil of the display system; a telescopic optical system configured to image the display exit pupil in an eye box of the display system; and a control device configured to control the shutter mechanism and the dynamic lens, wherein the control device is configured to control the position and size of the aperture and the power of the dynamic lens so as to locate the image of the display exit pupil in the eye box.
- the display system further comprises an eye-tracking system including: an eye tracking camera configured to image an eye of a user of the display system and output image data; and an eye position calculating unit configured to calculate a position of the user's eye based on the image data from the eye tracking camera; wherein the eye position calculating unit is configured to output the calculated position of the user's eye to the control device and the control device is configured to control the position and size of the aperture and the power of the dynamic lens so as to locate the image of the display exit pupil at the calculated position of the user's eye.
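The shutter/dynamic-lens control described above can be sketched as a simple mapping from the tracked eye position to an aperture placement and a lens power. Everything here is an illustrative assumption: the function name, the linear gains, and the fixed 1 mm aperture are placeholders, not values or methods stated in the patent.

```python
def steer_exit_pupil(eye_xy_mm, eye_z_mm, nominal_z_mm=20.0,
                     aperture_gain=0.5, power_per_mm=0.05):
    """Return (aperture centre (mm), aperture diameter (mm), lens power
    (dioptres)) chosen so the image of the display exit pupil lands on
    the tracked eye pupil. Gains are hypothetical calibration values."""
    ex, ey = eye_xy_mm
    # Shift the aperture laterally in proportion to the eye's lateral offset.
    aperture_centre = (aperture_gain * ex, aperture_gain * ey)
    # Keep the exit pupil small (large F-number) but large enough for the eye.
    aperture_diameter = 1.0
    # Adjust the dynamic lens power to refocus at the eye's axial depth.
    lens_power = power_per_mm * (eye_z_mm - nominal_z_mm)
    return aperture_centre, aperture_diameter, lens_power
```

In a real system these gains would come from a calibration of the telescopic optical system rather than fixed constants.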
- the display does not take the form of a phase modulating device.
- Fig. 1 shows an example of the pancake-type optics used in a conventional VR or MR display
- Fig. 2 shows a display system according to an embodiment of the invention
- Fig. 3 is a diagram showing how the focus of a light beam from an object varies for two different beam diameters as the object moves from a distance of 0.3 m from the eye to infinity through optics with a focal plane at a distance of 2 m.
- the diagram shows that the depth of field of the optics is larger when using a beam diameter of 1 mm than when using a beam diameter of 5 mm;
- Fig. 4 is a spot diagram showing the performance of pancake lens optics with an eye box of 12 mm diameter
- Fig. 6 is a spot diagram showing the performance of pancake lens optics with an eye box of 1 mm diameter
- Fig. 26 schematically illustrates using the fourth illumination engine to generate a steerable input illumination light beam
- Figure 29 shows a third example of light guide propagation and out-coupling.
- the invention can be applied to any Near Eye Display (NED) that uses a backlit pixelated display device.
- an amplitude modulating backlit pixelated display device such as a transmissive liquid crystal device utilizing one or more polarizers.
- Possible architectures for the NED optical system include pancake lenses, segmented lens architectures (segmented lens architectures are lenses having discontinuity at the first derivative of the shape of the lens), birdbath optics, beam splitter architectures, curved mirrors, holographic reflectors and TIR prisms. This list is not exhaustive, and other NED optical arrangements are possible and compatible with the invention.
- the invention provides a NED display with a larger FOV, greatly reduced or eliminated VAC, and reduced need for prescription glasses.
- these enhancements are achieved by combining an illumination light guide with a reflective/refractive NED to create a smaller display exit pupil within the eye box and centered on the eye pupil of the user. This reduction in the display exit pupil diameter leads to a display with a larger DOF and improved contrast.
- the eye pupil is the opening of the user's eye through which light enters.
- the iris determines the diameter of the eye pupil, which usually varies between 3 mm and 5 mm.
- the display exit pupil is the pupil formed by the NED in the eye box.
- the display exit pupil of the optical system moves within the eye box and follows the eye pupil during use of the NED, as will be described below.
- the term "diameter" is used here to mean a width or more generally a dimension of the beam in a direction perpendicular to the propagation direction of the beam. It does not imply that the beam is necessarily circular in cross-section. For example, the beam diameter could be the width dimension of a beam having a rectangular cross-section.
- the display system relaxes manufacturing tolerances for the optics, which allows the cost of the display system to be reduced.
- the use of a narrow-angle beam allows the optics to be simplified and consequently also has the effect of reducing the overall size of the optical system.
- using the display system of the embodiment, the user does not need prescription glasses and will not experience VAC; the system's optics are simplified, and contrast improves.
- the display system reduces or eliminates VAC by creating an image that is in focus for all depths.
- the inventors have found that the problems described above can be reduced or eliminated using optics with a large F-number.
- Increasing the F-number of the optics of the display system increases the depth of field and reduces the effect of optical aberrations on the displayed image in a NED.
- the F-number of the optical system can be increased by dynamically reducing the illumination cone emitted by the display device onto the optics at any one time.
- An embodiment of the invention comprises the following components:
- a NED display device (1).
- An illumination light guide also referred to as a waveguide.
- the light guide can be flat (as shown in Fig. 2) or have one of its surfaces curved (see Figs. 10 and 11). Its purpose is to act as a backlight and illuminate the display of the NED.
- the ET system may utilise an image sensor and a plurality of light sources emitting light towards the user's eye. Images captured by the image sensor can be used to determine the position of the user's eye. Such an ET system may detect reflections from the eye to determine the 3D position and direction of the eye.
- Examples of light sources for the illumination engine include Light Emitting Diodes (LEDs), lasers, VCSELs (vertical-cavity surface-emitting lasers), and arrays of such sources, like inorganic LED displays.
- Examples of components controlling the emission angle include scanning mirrors and arrays of emitters at different angles that are selectively switched.
- the display device 26 can, in other embodiments, be any image-forming device that utilises a back light unit (BLU).
- suitable display devices include liquid crystal on silicon (LCOS) and liquid crystal display (LCD) devices.
- Display device 26 is preferably an amplitude modulating display device such as a transmissive liquid crystal device utilizing one or more polarizers.
- the light guide's output angle determines the position of the display exit pupil 38 in the eye box 40, and the position of the display exit pupil 38 will move as the angle of the scanning mirror 30 changes.
- An eye-tracking system including an eye-tracking image sensor 36, is also provided in the display system. The image sensor or camera 36 is used to determine the eye's exact position and thus correctly set the angle of the light from the illumination engine 32 so that the display exit pupil 38 is formed where the user's pupil is.
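A minimal sketch of how the tracked eye position could set the scanning mirror angle in the illumination engine. The geometry and the angular gain of the light guide are assumed placeholders, not values from the patent: the exit pupil position in the eye box is taken to shift with the out-coupled beam direction over the eye relief distance.

```python
import math

def mirror_angle_deg(eye_offset_mm, eye_relief_mm=20.0, angular_gain=0.5):
    """Mirror angle (degrees) that steers the display exit pupil to a
    position eye_offset_mm from the eye-box centre. angular_gain models
    how the light guide maps mirror angle to output beam angle
    (hypothetical calibration value)."""
    # Required out-coupled beam direction toward the eye pupil.
    output_angle = math.atan2(eye_offset_mm, eye_relief_mm)
    # Invert the (assumed linear) mirror-to-output angular mapping.
    return math.degrees(output_angle / angular_gain)
```

A centred eye needs no steering; larger offsets require proportionally larger mirror deflections.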
- the eye-tracking system includes an array of infrared (IR) light emitting diodes (LEDs) positioned on a part of the display system facing the eye.
- when the display system is incorporated into an HMD resembling a pair of glasses, the LEDs are located around the frame of the spectacles facing the user's eye.
- the IR LEDs are located inside the headset.
- the LEDs can be placed around the edge of the aperture through which image light from the display device passes to reach the user's eye.
- Infrared light from the LEDs is reflected from the eye and sensed by the eye-tracking camera 36, which can be a standard digital camera of the kind commonly used in smartphones, provided that the camera sensor is sensitive to infrared light.
- the information is then analyzed to extract eye rotation from changes in the reflections of the infrared light from the eye, which lead to corresponding changes in the images captured by the camera.
- the 2D image from the camera is fitted into a 3D model by an eye position calculating unit in the display system. Within a few frames of image data from the camera, the eye position calculating unit creates a 3D model of the eye representing the eye's position and orientation in 3D space.
- the eye position calculating unit can calculate the eye position by tracking various features of the eye using the image of the reflected light, including the corneal reflection (the first Purkinje image) and the center of the eye pupil in this example.
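The pupil-centre / corneal-reflection ("pupil-CR") principle used by the eye position calculating unit can be sketched as below: the vector from the first Purkinje glint to the pupil centre in the camera image varies with eye rotation. The function name and the per-pixel calibration gains are hypothetical placeholders, not values from the patent.

```python
def gaze_angles_deg(pupil_px, glint_px, gain_deg_per_px=(0.1, 0.1)):
    """Estimate horizontal and vertical gaze angles (degrees) from the
    image coordinates (pixels) of the pupil centre and the corneal
    reflection (first Purkinje image)."""
    dx = pupil_px[0] - glint_px[0]
    dy = pupil_px[1] - glint_px[1]
    # Linear mapping from the pupil-CR vector to gaze angle; the gains
    # would come from a per-user calibration in practice.
    return gain_deg_per_px[0] * dx, gain_deg_per_px[1] * dy

# When the glint and pupil centre coincide, the eye looks along the
# camera axis and both gaze angles are zero.
```

The pupil-CR vector is largely insensitive to small head translations, which is why tracking both features is more robust than tracking the pupil alone.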
- the eye-tracking system of the invention may also track both the reflection from the front of the cornea (the first Purkinje image) and the back of the lens (fourth Purkinje image). This type of eye-tracking system is known as a dual-Purkinje eye tracker.
- Another alternative or additional feature of the eye-tracking system is to image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates. This leads to a more accurate determination of the eye orientation.
- the display system includes an illumination light guide.
- the light guide acts as an exit pupil expander (EPE) in one or two dimensions, thus creating a large illumination area with compact optics.
- This light guide (unlike more conventional waveguides used for AR devices) is used for illumination purposes and is not used as an image-conveying device, i.e. the illumination light guide is positioned before the display device in the optical path of the display system.
- the illumination light guide 1 benefits from a greater degree of design freedom compared to more conventional light guides, resulting from its use for illumination purposes and the properties of the display system of the invention.
- the illumination light guide 1 includes an in-coupling element 2, an exit pupil expander (EPE) 3 and an out-coupling element 4 such as a DOE grating.
- the following features can be included in the illumination light guide 1 as appropriate for the particular application of the display system:
- the out-coupling grating (item 4 in Figure 7) may be configured to have variable orientation and spatial frequency.
- the spatial frequency of the diffractive grating (the equivalent of optical power in refractive optics) can change the diffraction angle for each position and add an angular offset required for the angles on the pixelated device backlight.
- Variable thickness adds another degree of freedom in matching the target illumination angle for each position in the eye box and each image pixel.
- the light guide thickness can be varied to achieve the desired central and range of illumination angles.
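The effect of varying the out-coupler's spatial frequency on the diffraction angle follows the standard grating equation, n_out·sin(θ_out) = n_in·sin(θ_in) + m·λ/Λ. The sketch below uses this textbook relation with illustrative parameter values (refractive indices, wavelengths, periods); none are taken from the patent.

```python
import math

def diffraction_angle_deg(wavelength_nm, period_nm, incidence_deg=0.0,
                          order=1, n_in=1.5, n_out=1.0):
    """Diffraction angle (degrees) in the output medium for a grating of
    period period_nm (spatial frequency 1/period). Returns None when the
    requested order is evanescent (no propagating diffracted beam)."""
    s = (n_in * math.sin(math.radians(incidence_deg))
         + order * wavelength_nm / period_nm) / n_out
    if abs(s) > 1.0:
        return None  # evanescent order
    return math.degrees(math.asin(s))
```

Shrinking the period (raising the spatial frequency) steepens the out-coupling angle at each position, which is the degree of freedom the variable-frequency out-coupling grating exploits.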
- Low-refractive index (RI) materials: the range of angles within the illumination light guide will be reduced compared to the range required in the waveguide of a conventional AR device. The reduced range of angles allows for the use of low-RI materials like plastic in the light guide of the invention.
- the transmissive LCD panel is preferably an amplitude modulating device utilizing one or more polarizers.
- the display device causes the beam from the illumination light guide to diverge in this embodiment.
- the light from the display device enters the pancake lens, which is located at one focal length from the display device and one focal length from the eye box.
- the angular spread of the pixel output beam from the display device will determine the size of the display exit pupil.
- the display exit pupil needs to be large enough to reliably form an image in the user's eye. If the angular spread of the pixel output beam and hence the display exit pupil would otherwise be too small, a diffusing function can be added on one of the surfaces in the optical path of the beam. For example, a diffusing optical surface can be added prior to the display so that the angular spread of the pixel output beam increases and the size of the display exit pupil also increases.
- the pancake lens is formed of a plurality of lenses that are used in both reflection and refraction.
- a polarizer and a quarter waveplate are used to selectively reflect the first bounce while transmitting the second.
- a half-silvered mirror is used to reflect the backwards propagating ray towards the user's eye.
- Suitable alternatives include achromatic converging lenses, Fresnel lenses, freeform reflectors and birdbath lenses.
- the invention reduces or eliminates VAC by reducing the beam diameter in the optics of the display system. By keeping the beam diameter narrower, the image is more in focus and the user can experience focused images without requiring the significant eye accommodation that can lead to VAC.
- Figs 9A and 9B illustrate further how the display exit pupil 108 is moved across the eye box 107 of an optical system according to an embodiment of the invention.
- Fig. 9A shows the optical system in cross-section, including an illumination engine 100, a light guide 102, a transmissive display device 104, a projection optical system 106, and the eye box 107.
- An input illumination light beam 101 is output by the illumination engine 100 into the light guide, which outputs an illumination light beam 103 into the display device 104 via an out-coupling element 105.
- the display device modifies the light passing through it at pixels including pixels A, B and C labelled 119, 120 and 121 respectively.
- Image light beams from pixels A, B and C are labelled 122, 123 and 124 respectively, and the chief image light rays within those beams are labelled 125, 126 and 127 respectively.
- Fig. 9B shows the possible variation of the angles of the Pixel A image light chief ray 125, Pixel B image light chief ray 126 and Pixel C image light chief ray 127.
- the variation of the chief ray angle is shown as a function of the display exit pupil 108 position in the eye box 107, i.e. along the local vertical coordinate axis y 111
- these model lines each need two independent variables to be independently defined for a fixed pixel position. This means that two degrees of freedom are required to control θy sufficiently for each chief ray.
- the second illumination engine 200 includes an array of emissive light sources 202 and a collimating lens 203.
- the emissive light sources 202 are arranged in a 2D cartesian array, although this is not essential and the emissive light sources 202 may be arranged using different types of 2D grid (e.g. hexagonal), or may instead be arranged in a linear array and so forth.
- the following discussion will assume a 2D, N by M square or rectangular grid, so that each emissive light source 202 effectively provides one "pixel" of a segmented light source.
- the angle θout of the input illumination light beam 201 is controlled by illuminating one emissive light source 202 at a time, or a small group of adjacent emissive light sources 202, depending on their number, size and spacing.
- Figure 16 is a schematic, geometric optics diagram for a collimating lens 203 in the form of a convex lens.
- the mth column of the array of emissive light sources 202 has been shown as pixels P(1,m), ..., P(11,m) on the focal plane of the collimating lens 203.
- the emissive light source 202 corresponding to the pixel P(10,m) is deactivated (switched to an "OFF" state) and the emissive light source 202 corresponding to the pixel P(4,m) is illuminated.
- a second pair of rays 204b, 205b are drawn with chained lines: a first ray 204b of light travelling parallel to the optic axis 206 and a second ray 205b of light which passes through the optical centre of collimating lens 203.
- for pixel P(4,m), it may be observed that the direction of the input illumination light beam 201 is shifted from θout to θ'out.
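The geometric-optics relation behind Figure 16 is simply that a pixel displaced by x from the optic axis on the focal plane of a collimating lens of focal length f emits a collimated beam at θ = atan(x/f). The array pitch, focal length, and centre-pixel index below are illustrative assumptions, not values from the patent.

```python
import math

def output_angle_deg(pixel_index, centre_index=6, pitch_mm=0.25, f_mm=10.0):
    """Collimated beam angle (degrees from the optic axis) produced by
    pixel P(n, m) in one column of the emissive array, given the array
    pitch and the collimating lens focal length (assumed values)."""
    # Lateral displacement of the pixel from the optic axis.
    x = (pixel_index - centre_index) * pitch_mm
    return math.degrees(math.atan2(x, f_mm))

# Switching from P(10, m) to P(4, m) moves the pixel to the other side of
# the optic axis, flipping the beam from one output angle to the other.
```

This is the mapping the illumination engine exploits: selecting which pixel is lit selects θout without any moving parts.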
- the in-coupling element 207, light guide 208, out-coupling element 209, display 211 and projection optical system 213 may be as described in any of embodiments hereinbefore.
- a display system including the second illumination engine 200 may additionally include any other element described hereinbefore, such as variable thickness light guides and so forth.
- the collimating lens 203 may include two or more lenses and/or additional optical elements.
- the collimation function may be provided by an off-axis lens or a section of a lens.
- the collimation function may be provided by freeform optics. Freeform optics may be useful when the optical system is not rotationally symmetric or when the focal plane is not on the optical axis.
- Each in-coupling elements 207a, 207b, 207c, 207d receives an input illumination light beam 201 from a respective paired collimating lens 203a, 203b, 203c, 203d and array of emissive light sources 202a, 202b, 202c, 202d. All the input illumination light beams 201 may be controlled to have the same angle, for example by illuminating the pixels P(n,m) from the same row n and column m in each array of emissive light sources 202a, 202b, 202c, 202d.
- angles of some or all of the input illumination light beams 201 may be varied by illuminating different pixel P(n,m) coordinates (n,m) within each array of emissive light sources 202a, 202b, 202c, 202d.
- using differently angled input illumination light beams 201 may allow the effects described hereinbefore in relation to variable thickness light guides to be obtained to even greater degrees.
- the use of light guides 208 having multiple in-coupling elements 207 is not restricted to illumination using the second illumination engine 200, and the first illumination engine 32 could also be used with such light guides 208 (for example, a set of relay optics 28 and scanning mirror 30 may be provided for each in-coupling element 207).
- Figure 18 corresponds to the configuration shown in Figure 14, in which the array of emissive light sources 202 and collimating lens 203 are arranged with an optical axis oriented substantially perpendicular to the in-coupling element 207 of the light guide 208.
- the optic axis 206 of the second illumination engine 200 may be angled relative to the light guide 208. This is applicable whether the second illumination engine 200 includes one, two or more pairs of collimating lens(es) 203 and respective array(s) of emissive light sources 202. When multiple pairs of collimating lens(es) 203 and respective array(s) of emissive light sources 202 are used, they are not required to have parallel optic axes, and may make different angles to the light guide 208.
- this may be combined with variable thickness light guides and/or illuminating different pixel coordinates (n,m) in different arrays to provide even greater control of the distribution of output light beams 212 provided to and focussed by the projection optical system.
- Further advantages include having a single part for manufacturing and having all electronics and LEDs on substantially the same plane.
- FIGS 14, 16, 18 and 19 include generally symmetric collimating lenses 203 arranged concentrically with the respective arrays of emissive light sources 202. However, this is not essential, and as shown in Figure 20, a collimating lens 203 may be optically asymmetric with respect to a centroid of a respective array of emissive light sources 202.
- the respective collimating lenses 203 and arrays of emissive light sources 202 may be separate units. However, they may equally be integrated as a single unit.
- In FIG. 21, an example is shown in which three arrays of emissive light sources 202a, 202b, 202c are supported on a common substrate 214 (for example a printed circuit board).
- the arrays of emissive light sources 202a, 202b, 202c illuminate three corresponding in-coupling elements 207a, 207b, 207c via collimating lenses 203a, 203b, 203c which are provided as a single piece 215, for example an injection moulded plastic element.
- discontinuous in-coupling elements 207a, 207b, 207c could be merged into one continuous in-coupling element 207.
- Arrays of in-coupling elements may be arranged along two or more edges of a light guide, each receiving an input illumination light beam 201.
- Each out-coupled beam 505a, 505b, 505c is longer than the bounce distance Dbounce, so that each out-coupled beam 505a, 505b, 505c overlaps by an amount Dover with the preceding and following out-coupled beams 505a, 505b, 505c.
- The display 26, 60, 104, 211 may take the form of an amplitude modulating display such as, for example, a transmissive liquid crystal device utilizing one or more polarizers.
- The input illumination light beam, for example input illumination light beam 101, 201, 502, need not be coherent.
- an input light beam may be incoherent or partially coherent. Incoherent or partially coherent may correspond herein to a minimum spectral bandwidth of 2 nm at any point in time or within a 1 second integration period.
- a light guide may include an out-coupling element in the form of a number of partial bulk reflectors, for example disposed as a linear array.
- The partial bulk reflectors may take the form of partial dielectric mirrors, each having an angular bandwidth of ±15 degrees for a central or design wavelength.
- the design wavelength may be selected based on the application. For example, for a white light backlight, the design wavelength may be selected as the average across the emission spectrum. Variabilities in the out-coupling efficiency with wavelength may be compensated by modulating the colours of an image output using the display 26, 60, 104, 211.
Abstract
A display system includes an illumination engine configured to generate an input light beam having a first width. The illumination engine includes a segmented light source including an array of pixels. The illumination engine also includes an optical coupling element arranged to receive light from the segmented light source and to output the input light beam to a light guide. The display system also includes the light guide optically coupled to the illumination engine and configured to receive the input light beam. The light guide is configured to convert the input light beam into an illumination light beam having a second width larger than the first width. The display system also includes a display device configured to be illuminated by the illumination light beam and to convert the illumination light beam into an image light beam. The illumination engine is configured to rotate the angle of the input light beam controllably relative to the orientation of the light guide by controlling which pixels of the segmented light source illuminate the optical coupling element.
Description
DISPLAY SYSTEM WITH STEERABLE EYE BOX
FIELD OF THE INVENTION
The invention relates to a display system producing a steerable eye box, referred to as a display exit pupil, for use in near eye displays (NEDs), such as those included in mixed reality (MR) and virtual reality (VR) head mounted displays (HMDs).
BACKGROUND
MR and VR devices allow a user to experience and/or interact with virtual information, without losing situational awareness of the real world in the case of MR. Such devices typically include a display device positioned close to the eye of a user, which displays images to the user, but which may still allow the user to see past the displayed images to remain aware of the external environment. Often VR and MR devices take the form of a head mounted display (HMD), which is worn on the head of the user in the same way as a pair of glasses or a helmet. The display is positioned in front of one or both eyes of the user and in the case of MR devices typically takes the form of partially transparent optics through which an image is projected. The user can then see the image projected through the optics, overlaid onto the outside world. This type of device is known as an optical see-through device. MR headset devices can also operate by providing an opaque display setup and using an externally facing camera to incorporate images of the outside world into the image displayed to the user. This type of device is known as a digital pass-through device.
There are many existing HMD products commercially available. HMD applications require the MR display to be compact, unintrusive, comfortable, and capable of producing high-quality images.
Each HMD device has an eye box, which is defined as the area in which a user's eye can be physically located relative to the display and still see the complete projected image. It is desirable to have a large eye box so that: (a) the user, as they move their eye in the eye box, is still able to see the image from different eye positions, (b) the eye box size can compensate for any misalignments of the HMD, and (c) a greater proportion of users can view the projected image without readjusting the distance between the two displays of the HMD, taking into account the natural variation in Interpupil Distance (IPD) due to face shapes and eye separation distances between users. While it is defined as an area, the eye box must exist in many different planes, and thus it extends in three dimensions.
A display's field of view (FOV) is defined as the angle subtended at the eye over which the user can see the displayed content. A large FOV increases the realism of the virtual content and the sense of immersion provided by a VR or MR device, as well as enabling larger images to be displayed. A small FOV is one of the most common complaints about existing VR and MR displays, with many currently available HMDs achieving less than a 56° diagonal FOV.
The depth of field (DOF) of a display measures how fast the image becomes defocused, moving away from the focal plane along the optical axis of the display optics. When the F-number of the optics is larger, which corresponds to a smaller beam diameter in the optics, the depth of field is larger. A larger depth of field means that the image remains in focus for a larger distance away from the focal plane.
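The relationship between beam diameter and depth of field can be illustrated with a short geometric sketch. Under the small-angle approximation, the angular defocus blur at the eye is roughly the beam (aperture) diameter at the pupil multiplied by the defocus expressed in diopters; the distances below match those of Figure 3, while the model itself is a simplified illustration rather than the exact optical design of the description:

```python
def defocus_blur_mrad(beam_diameter_mm, object_distance_m, focal_plane_m):
    """Angular defocus blur (milliradians) for an eye focused at the
    display's focal plane while viewing content rendered at another depth.
    Small-angle geometric approximation: blur angle ~ aperture x defocus (D)."""
    defocus_diopters = abs(1.0 / object_distance_m - 1.0 / focal_plane_m)
    return beam_diameter_mm * 1e-3 * defocus_diopters * 1e3  # mrad

# Content at 0.3 m viewed through optics focused at 2 m (as in Fig. 3):
for d in (1.0, 5.0):
    print(f"{d:.0f} mm beam: {defocus_blur_mrad(d, 0.3, 2.0):.2f} mrad blur")
```

The blur scales linearly with beam diameter, so a 1 mm beam yields one fifth of the blur of a 5 mm beam at the same defocus, consistent with the larger depth of field shown in Figure 3.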
One of the prevalent architectures for VR/MR displays uses catadioptric optics, i.e. optics using both reflective and refractive components. Catadioptric optics are used for their ability to form a large FOV while minimising the Total Track Length (TTL) of the display. The TTL is defined here as the length along the optical path from the light-emitting surface of the display device (i.e. the first surface of the optics) to the last optical surface closest to the user's eye. Well-known examples of this architecture include the pancake design by 3M (RTM) and the Apple (RTM)/Limbak (RTM) lens ThinEyes (RTM). Another example is the Apple Vision Pro (RTM).
Figure 1 shows a typical example of "pancake" optics for a Near Eye Display (NED) used in a VR or MR display. In this design, light is emitted by a display device 10, e.g. an LCD panel, and passes through a circular polarizer 12. The light passes through a pancake lens having a half-mirror 14 and a quarter wave plate 16 formed on its front and back surfaces respectively and is then reflected by a surface, in this case a reflective polariser 18. The light is then reflected again by the half-mirror 14 and passes through the quarter wave plate 16 before arriving at the eye box. Other configurations include surfaces having different reflectivities, different shapes for the surfaces and different polarization configurations.
The pancake VR optics will be used as an exemplar in this document, but other optical architectures can use the principles described. A feature of the pancake optics design is that there is a one-to-one relationship between the angle of the light leaving the display and its arrival position in the eye box. The one-to-one relationship applies to each specific pixel of the display. Given that most of the time, the eye will not be in the central position of the eye box, the light finally arriving at the user's eye will usually leave the display at an oblique angle rather than a normal angle, especially at the edges of the FOV and eye box. Given that most displays emit light in a Lambertian-like power distribution across emission angles, only a small fraction of the display's optical power will arrive into the eye's pupil, reducing efficiency and contrast. In pancake optics, efficiency is especially important as a large amount of light (more than 75%) is lost in the partially reflective mirror of the optics. Light lost within the optical system reduces efficiency and contrast due to scattering or leaking through the reflective polariser.
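The efficiency penalty of Lambertian-like emission can be quantified: the fraction of a Lambertian emitter's total power radiated into a cone of half-angle θ is sin²θ. The ±5° acceptance half-angle used below is an illustrative value, not a figure from the description:

```python
import math

def lambertian_fraction(half_angle_deg):
    """Fraction of a Lambertian emitter's total power radiated into a cone
    of the given half-angle; integrating cos(t)*sin(t) gives sin^2(theta)."""
    return math.sin(math.radians(half_angle_deg)) ** 2

# If only rays within ~±5 degrees of normal ultimately reach the eye pupil,
# well under 1% of the pixel's emitted power is used (illustrative angle):
print(f"{lambertian_fraction(5.0):.4f}")
```

This small fraction, compounded with the more than 75% loss at the partially reflective mirror, is why illumination efficiency is a central concern in pancake-type optics.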
Another challenge with the pancake optics design realised by the inventors is the significant aberration introduced by the optics. The relatively steep reflection and refraction angles involved in directing light from the display to the user's eye using pancake optics will introduce aberrations even without any manufacturing errors. These aberrations arise due to the relatively small F-number of pancake optics and large eye box. Current approaches to reducing aberrations include using multiple high-quality glass elements. These optical elements increase cost and weight due to their material and number. In contrast, a single plastic optical element is more suitable for a consumer head-mounted VR display due to its lower cost and weight.
Finally, the pancake optics design will cause users to suffer from Vergence Accommodation Conflict (VAC), which will be explained further below. The large eye box creates a narrow DOF, so the user's focus will be fixed on one plane while their vergence will change according to the displayed content, leading to VAC.
VAC is a particularly significant problem for MR displays in general. The conflict between accommodation and vergence is a significant reason for users' discomfort. This conflict occurs because human brains link "vergence" and "accommodation", as defined below, and expect both of these parameters to match the distance of an object being viewed. In a NED, this link breaks, and the user's brain tries to focus on one plane matching the user's vergence while the optical system displays the image on another plane.
Vergence is the movement of the eyes in opposite directions when an object moves closer to them. The effect of vergence is best described when the object is on the optical axis of the user. As the object moves closer to the eyes, the eyes look more towards the nose. On the other hand, accommodation is the ability of the eye to change focus from more distant to nearer objects by changing the shape of the eye's lens. Muscles in the eye change the shape of the lens and hence its optical power to keep an image of an object on the retina sharp as the distance of the object from the eye varies.
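The coupling between the two cues can be made concrete with a short numerical sketch: the total convergence angle of the eyes and the accommodation demand (in diopters) both vary with viewing distance; the 63 mm IPD is a typical adult value assumed for illustration, not a figure from the description:

```python
import math

def vergence_deg(distance_m, ipd_m=0.063):
    """Total convergence angle of the two eyes fixating a point at the
    given distance on the midline (63 mm IPD is a typical adult value)."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

def accommodation_diopters(distance_m):
    """Accommodation demand in diopters for an object at the given distance."""
    return 1.0 / distance_m

# In natural viewing both cues track distance together; in a NED with a
# fixed 2 m focal plane, content rendered at 0.5 m drives vergence to ~7.2
# degrees while accommodation stays at 0.5 D, producing the conflict (VAC):
for d in (0.5, 2.0):
    print(f"{d} m: vergence {vergence_deg(d):.1f} deg, "
          f"accommodation {accommodation_diopters(d):.1f} D")
```
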
The movement of the muscles controlling vergence is highly linked to the eye's accommodation and the ciliary muscles bringing an object to focus. The human brain controls the ocular muscles in such a way that vergence and accommodation are linked because they normally occur in unison, i.e. as the eye muscles are controlled to increase vergence, they are also controlled to perform accommodation to focus closer to the eye. When vergence and accommodation are inconsistent or in conflict, this normal connection between the two in the brain is broken, and there is a vergence accommodation conflict (VAC). This conflict is believed to be a cause of fatigue and sickness in NEDs.
An equally challenging issue for VR and MR displays is the necessity for users to use prescription glasses. Users with prescription glasses have the option to:
(a) wear the display on top of their spectacles (this increases the size of the headset, pushes the centre of gravity away from the nose, and causes discomfort)
(b) purchase custom-made clip-ons (which tend to be expensive), or
(c) partially correct for the user's optical prescription, usually only spherical correction, by using the optics of the display (this leads to inferior correction and a more complex headset containing expensive optomechanics).
The currently proposed solutions for minimising both the VAC problem and the prescription glasses problem in NEDs are inadequate, as described below.
(i) Fixed accommodation plane display: minimising the display's vergence variation about a single fixed accommodation plane. The single accommodation plane is positioned in the working space to minimise the user's overall discomfort. If the working space covers all the depth a user can focus on (e.g. 30cm to infinity), then the intermediate plane is between one meter and two meters away. The display is then designed to minimise the discrepancy between the accommodation plane (fixed) and the vergence plane (software-controlled) and thus lessen the VAC-associated discomfort. However, with this approach, users still need prescription glasses to correct their short- or long-sightedness.
(ii) Dynamic lenses: using dynamic lenses that change the accommodation plane in real-time. The principle is to have a dynamic lens between the user and the display and change the power of the dynamic lens according to the content being displayed and the need to correct the user's optical prescription. However, current dynamic lenses (e.g. Alvarez lenses, liquid crystal lenses and liquid lenses) correct for spherical aberration only over a very small FOV, introduce scattering, are bulky, and are too slow for fast-moving experiences such as video games.
SUMMARY
According to a first aspect of the invention, there is provided a display system comprising: an illumination engine configured to generate an input light beam having a first diameter; a light guide optically coupled to the illumination engine and configured to receive the input light beam, the light guide being configured to convert the input light beam into an illumination light beam having a second diameter larger than the first diameter; and a display device configured to be illuminated by the illumination light beam and to convert the illumination light beam into an image light beam, wherein the illumination engine is configured to rotate the angle of the input light beam relative to the orientation of the light guide.
Preferably, the light guide is a planar waveguide.
Preferably, the light guide comprises an in-coupling element and an out-coupling element.
Preferably, each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a metasurface, a prism and a prism array.
Preferably, each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a metasurface, a partial reflector, a partially reflecting dielectric mirror, a partially reflecting dielectric coating, a prism and a prism array.
Preferably, each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a partially reflective mirror, a metasurface, a prism and a prism array.
Preferably, each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a partially reflective mirror, a metasurface, a partial reflector, a partially reflecting dielectric mirror, a partially reflecting dielectric coating, a prism and a prism array.
Preferably, the illumination engine comprises: a light source configured to output light; a collimation module configured to collimate light from the light source to generate a collimated beam; a rotatable scanning mirror configured to reflect the collimated beam; an actuator configured to rotate the scanning mirror; and a control device configured to control operation of the actuator to rotate the scanning mirror.
Preferably, the illumination engine comprises: a segmented light source comprising an array of pixels; and an optical coupling element arranged to receive light from the segmented light source and to output the input light beam to the light guide. Preferably, the illumination engine is configured to control an output angle of the input light beam by controlling which pixels of the segmented light source illuminate the optical coupling element.
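The pixel-selection steering described above may be illustrated with a short paraxial sketch: a source pixel offset x from the axis of the collimating optical coupling element of focal length f produces a collimated beam at atan(x/f) to the optic axis. The pixel pitch and focal length below are illustrative values, not taken from the description:

```python
import math

def output_angle_deg(pixel_index, centre_index, pixel_pitch_mm, focal_length_mm):
    """Output angle of the collimated input light beam when the pixel at
    pixel_index is lit: a source offset x from the collimating element's
    axis yields a beam at atan(x / f) to the optic axis (paraxial model;
    pitch and focal length are illustrative assumptions)."""
    offset = (pixel_index - centre_index) * pixel_pitch_mm
    return math.degrees(math.atan2(offset, focal_length_mm))

# Stepping one pixel away from centre steers the beam by ~0.57 degrees
# for a 50 um pitch behind a 5 mm focal-length coupling element:
print(f"{output_angle_deg(51, 50, 0.05, 5.0):.2f} deg")
```

Selecting which pixel (or group of pixels) is lit thus steers the input light beam without any moving parts.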
Each pixel of the segmented light source may take the form of an emissive pixel. Each pixel of the segmented light source may take the form of an inorganic light-emitting diode. Each pixel of the segmented light source may take the form of an organic light-emitting diode.
Alternatively, the segmented light source may include a backlight and an array of transmissive pixels disposed between the backlight and the optical coupling element. Each pixel of the segmented light source may take the form of a liquid crystal pixel. Each pixel of the segmented light source may take the form of an electrochromatographic pixel.
The backlight unit may include, or take the form of, any type known in relation to conventional liquid crystal displays (transmissive or reflective), liquid crystal on silicon displays, and/or electrochromatographic displays.
The optical coupling element may include, or take the form of, a lens. The optical coupling element may be a refractive lens. The optical coupling element may be a pancake lens. The optical coupling element may be a Fresnel lens. The optical coupling element may be a diffractive lens. The optical coupling element may be a meta-lens. The optical coupling element may be a holographic lens.
Each optical coupling element may include, or take the form of, one or more arrays of transmissive shutter pixels stacked in sequence between the segmented light source and the light guide. Each transmissive shutter pixel may include, or take the form of, a transmissive pixel such as a liquid crystal pixel or an electrochromatographic pixel. The illumination engine may be configured to control the one or more arrays of transmissive shutter pixels such that light from the segmented light source can only illuminate the light guide by passing through a sequence of apertures formed using the one or more arrays of transmissive shutter pixels.
The illumination engine may include two or more arrays of transmissive shutter pixels. The apertures formed in the one or more arrays of transmissive shutter pixels may be disposed along a line originating at an illuminated pixel (or centroid of a group of illumination pixels) of the segmented light source and extending along the desired angle of the input light beam. In this way, light originating from the illuminated pixel(s) of the segmented light source and directed along the angle set for the input light beam passes through the apertures to illuminate the light guide. Similarly, light originating from the illuminated pixel(s) of the segmented light source and directed along other angles will be blocked (unless at a small enough angle to the angle set for the input light beam). Consequently, in combination with selecting which pixel(s) of the segmented light source will be illuminated, the one or more arrays of transmissive shutter pixels may be controlled to produce the collimated input light beam which is incident on the light guide at a desired angle.
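The geometry of the aperture line can be sketched numerically: for a source pixel at height y₀ and a desired beam angle θ, the shutter pixel to open in a layer at distance z from the source lies at y₀ + z·tan(θ). The layer spacings, pitch and indexing convention below are hypothetical, chosen only for illustration:

```python
import math

def open_shutter_indices(source_y_mm, beam_angle_deg, layer_z_mm, shutter_pitch_mm):
    """For each shutter-array layer at distance z from the segmented light
    source, return the index of the shutter pixel to open so that only rays
    leaving source_y_mm at beam_angle_deg pass (hypothetical geometry:
    layers parallel to the source plane, index 0 at y = 0)."""
    t = math.tan(math.radians(beam_angle_deg))
    return [round((source_y_mm + z * t) / shutter_pitch_mm) for z in layer_z_mm]

# Two shutter layers at 1 mm and 2 mm from the source, 0.1 mm pitch,
# source pixel at y = 0.5 mm, desired beam angle 10 degrees: the open
# apertures lie along the 10-degree line through the source pixel.
print(open_shutter_indices(0.5, 10.0, [1.0, 2.0], 0.1))
```
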
Preferably, the illumination engine is configured to generate one or more further input light beams, each further input light beam having a respective width. Preferably, the light guide is configured to receive the one or more further input light beams, the light guide being configured to convert each further input light beam into a further illumination light beam having a width larger than the width of the respective further input light beam. Preferably, each of the illumination light beam and the one or more further illumination beams illuminates a different portion of the display device.
The portions of the display device illuminated by each of the illumination light beam and the one or more further illumination beams may overlap one another.
Preferably, the display system further comprises: a camera configured to capture images of a user's eye and output the images; and a processing unit configured to receive the images output by the camera, calculate a position of the user's eye based on the images, and calculate a rotation angle of the input light beam required to direct the image light beam onto an eye pupil of the user's eye based on the calculated position, wherein the processing unit is configured to output the calculated rotation angle to the illumination engine, and the illumination engine is configured to rotate the angle of the input light beam by the calculated rotation angle so as to direct the image light beam onto the eye pupil of the user's eye.
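The rotation-angle calculation performed by the processing unit can be sketched under a simple assumption: if the projection optics map beam angle to eye-box position roughly as y = f_eff·tan(θ), the required steering angle follows from the tracked eye offset. The effective focal length f_eff is a hypothetical parameter of this sketch, not a value from the description:

```python
import math

def steering_angle_deg(eye_offset_mm, effective_focal_length_mm=30.0):
    """Rotation to apply to the input light beam so that the display exit
    pupil lands on the tracked eye pupil. Assumes the projection optics map
    beam angle to eye-box position as y = f_eff * tan(theta); f_eff is a
    hypothetical effective focal length chosen for illustration."""
    return math.degrees(math.atan2(eye_offset_mm, effective_focal_length_mm))

# Eye tracked 3 mm off-centre -> steer the input beam by ~5.7 degrees:
print(f"{steering_angle_deg(3.0):.1f} deg")
```

In practice the mapping would be obtained from the actual optical design or a calibration, but the closed loop is the same: camera image in, eye position out, steering angle to the illumination engine.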
Preferably, the illumination engine further comprises relay optics configured to receive light reflected by the scanning mirror and to convert the reflected light into the input light beam.
Preferably, the display system further comprises an imaging optical system configured to project the image light beam generated by the display device onto a user's eye.
Preferably, the imaging optical system comprises catadioptric optics.
Preferably, the imaging optical system has an effective F-number greater than 4.
Preferably, the display system has a display exit pupil diameter in the range of 0.5mm to 3mm.
Preferably, the input light beam has a divergence or convergence angle in the range of 0° to 15°.
Preferably, the light guide comprises an exit pupil expander including the out-coupling element, wherein the out-coupling element is a diffractive optical element configured to expand the beam diameter of the input light beam along at least a first axis.
Preferably, the exit pupil expander further comprises another diffractive optical element configured to expand the beam diameter of the input light beam along a second axis perpendicular to the first axis.
The display does not take the form of a phase modulating device.
Preferably, the display system is configured such that out-coupled beams corresponding to successive internal reflections overlap one another by an amount greater than 0 mm and less than or equal to 1 mm. The successive internal reflections correspond to propagation of a beam propagating within the light guide in response to in-coupling of the input light beam. The condition that the overlap is greater than 0 mm and less than or equal to 1 mm may apply across the operational range of angles of the input light beam.
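The overlap condition can be checked with simple planar-guide geometry: a ray at internal angle θ to the surface normal advances 2·t·tan(θ) along a guide of thickness t between reflections, while a beam of width w leaves a footprint w/cos(θ) on the out-coupling surface; the overlap is the difference. The beam width, thickness and angle below are illustrative values, not taken from the description:

```python
import math

def bounce_distance_mm(thickness_mm, internal_angle_deg):
    """Distance along the guide between successive reflections of a ray
    propagating at internal_angle_deg to the surface normal."""
    return 2 * thickness_mm * math.tan(math.radians(internal_angle_deg))

def overlap_mm(beam_width_mm, thickness_mm, internal_angle_deg):
    """Overlap Dover between successive out-coupled beams: the beam's
    footprint on the out-coupling surface minus the bounce distance
    (simple planar model with illustrative parameters)."""
    footprint = beam_width_mm / math.cos(math.radians(internal_angle_deg))
    return footprint - bounce_distance_mm(thickness_mm, internal_angle_deg)

# A 2.6 mm beam in a 1.5 mm guide at 50 degrees internal angle gives an
# overlap of a fraction of a millimetre, satisfying 0 mm < Dover <= 1 mm:
print(f"Dover = {overlap_mm(2.6, 1.5, 50.0):.2f} mm")
```

Because both the footprint and the bounce distance vary with the internal angle, the check would be repeated across the operational range of input beam angles.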
Preferably, the display device takes the form of a transmissive liquid crystal display device. Preferably, the transmissive liquid crystal display device operates by modulating the amplitude of the illumination light beam. Preferably, the transmissive liquid crystal display device comprises one or more polarizers.
Preferably, the input light beam should be incoherent or partially coherent. Incoherent or partially coherent may correspond to a minimum spectral bandwidth of 2 nm at any point in time. Incoherent or partially coherent may correspond to a minimum spectral bandwidth of 2 nm within a 1 second integration period.
Preferably, the out-coupling element of the light guide comprises a plurality of partial bulk reflectors. Preferably, each partial bulk reflector comprises a partial dielectric mirror having an angular bandwidth of ±15 degrees for a central or design wavelength.
According to a second aspect of the invention, there is provided a display system comprising an illumination engine configured to generate an input light beam having a first width. The display system also includes a diffractive light guide optically coupled to the illumination engine and configured to receive the input light beam. The light guide is configured to convert the input light beam into an illumination light beam having a second width larger than the first width. The display system also includes a transmissive liquid crystal display device configured to be illuminated by the illumination light beam and to convert the illumination light beam into an image light beam. The display system also includes a camera configured to capture images of a user's eye and output the images. The display system also includes a processing unit configured to receive the images output by the camera, calculate a position of the user's eye based on the images, and calculate a rotation angle of the input light beam required to direct the image light beam onto an eye pupil of the user's eye based on the calculated position. The display system also includes an imaging optical system configured to project the image light beam generated by the display device onto a user's eye. The processing unit is configured to output the calculated rotation angle to the illumination engine, and the illumination engine is configured to rotate the angle of the input light beam controllably relative to the orientation of the light guide by the calculated rotation angle so as to direct the image light beam onto the eye pupil of the user's eye.
Preferably, the transmissive liquid crystal display device operates by modulating the amplitude of the illumination light beam. Preferably, the transmissive liquid crystal display device comprises one or more polarizers.
Preferably, the illumination engine comprises a light source configured to output light, a collimation module configured to collimate light from the light source to generate a collimated beam, a rotatable scanning mirror configured to reflect the collimated beam, an actuator configured to rotate the scanning mirror; and a control device configured to control operation of the actuator to rotate the scanning mirror.
Preferably, the illumination engine comprises a segmented light source comprising an array of pixels, and an optical coupling element arranged to receive light from the segmented light source and to output the input light beam to the light guide. Preferably, the illumination engine is configured to control an output angle of the input light beam by controlling which pixels of the segmented light source illuminate the optical coupling element.
Preferably, the display system according to the second aspect of the invention may comprise features corresponding to any features of the display system according to the first aspect of the invention. Definitions applicable to the display system according to the first aspect of the invention (and/or features thereof) may preferably be applicable to the display system according to the second aspect of the invention (and/or features thereof).
According to a third aspect of the invention, there is provided a light guide for use in a head mounted display, the light guide comprising: a transparent substrate having upper and lower surfaces; an in-coupling element on the upper surface of the substrate configured to couple light incident on the in-coupling element into the light guide; and an exit pupil expander comprising an out-coupling element on the upper surface of the substrate configured to couple light incident on the out-coupling element out of the light guide; wherein the upper surface is planar, and the thickness of the transparent substrate in a direction normal to the upper surface varies across the plane of the upper surface.
Preferably, the lower surface has at least one of a curved profile and a profile that is inclined relative to the upper surface when viewed in a direction parallel to the upper surface.
Preferably, the transparent substrate has a thickness variation of more than 1%. Suitably, the transparent substrate has a thickness variation of more than 5%. Suitably, the transparent substrate has a thickness variation of more than 10%. Suitably, the transparent substrate has a thickness variation of more than 25%. Suitably, the transparent substrate has a thickness variation of more than 50%.
Preferably, the transparent substrate comprises an upper layer including the upper surface and a lower layer including the lower surface, the upper layer and the lower layer being stacked in a vertical direction, wherein the upper layer and lower layers are formed from different optical materials.
Preferably, the upper surface has a partially reflective coating.
Preferably, each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a metasurface, a prism and a prism array.
Preferably, each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a partially reflective mirror, a metasurface, a prism and a prism array.
According to a fourth aspect of the invention, there is provided a light guide for use in a head mounted display, the light guide comprising: a transparent substrate having upper and lower surfaces; an in-coupling element on one of the upper and lower surfaces of the substrate configured to couple light incident on the in-coupling element into the light guide; and an out-coupling element on the other of the upper and lower surfaces of the substrate configured to couple light incident on the out-coupling element out of the light guide; wherein the upper surface is planar, and the lower surface is reflective and has a curved profile when viewed in a direction parallel to the upper surface.
According to a fifth aspect of the invention, there is provided a display system comprising: a display device comprising an array of pixels and configured to emit image light representing an image; a converging lens configured to collect image light from the display device; a shutter mechanism configured to partially block light that has passed through the converging lens from the display device, wherein the shutter mechanism defines an adjustable aperture through which light can pass; a dynamic lens configured to collect light that has passed through the aperture to form a display exit pupil of the display system; a telescopic optical system configured to image the display exit pupil in an eye box of the display system; and a control device configured to control the shutter mechanism and the dynamic lens, wherein the control device is configured to control the position and size of the aperture and the power of the dynamic lens so as to locate the image of the display exit pupil in the eye box.
Preferably, the display system further comprises an eye-tracking system including: an eye tracking camera configured to image an eye of a user of the display system and output image data; and an eye position calculating unit configured to calculate a position of the user's eye based on the image data from the eye tracking camera; wherein the eye position calculating unit is configured to output the calculated position of the user's eye to the control device and the control device is configured to control the position and size of the aperture and the power of the dynamic lens so as to locate the image of the display exit pupil at the calculated position of the user's eye.
The display does not take the form of a phase modulating device.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present invention will now be described by way of further example only and with reference to the accompanying drawings, in which:
Fig. 1 shows an example of the pancake-type optics used in a conventional VR or MR display;
Fig. 2 shows a display system according to an embodiment of the invention;
Fig. 3 is a diagram showing how the focus of a light beam from an object varies for two different beam diameters as the object moves from a distance of 0.3m from the eye to infinity through optics with a focal plane at a distance of 2m. The diagram shows that the depth of field of the optics is larger when using a beam diameter of 1mm than when using a beam diameter of 5mm;
Fig. 4 is a spot diagram showing the performance of pancake lens optics with an eye box of 12mm diameter;
Fig. 5 is a spot diagram showing the performance of pancake lens optics with an eye box of 5mm diameter;
Fig. 6 is a spot diagram showing the performance of pancake lens optics with an eye box of 1mm diameter;
Fig. 7 shows an illumination light guide having an exit pupil expander used in an embodiment of the invention;
Fig. 8 shows the path of light rays from the display device through a pancake-type lens to the eye in the optical system of the invention for three different eye positions within the eye box;
Fig. 9A shows the path light takes from the display device to the display exit pupil. The relationships between the pixel position, the angle of light leaving the display device θy, and the position in the eye box y are illustrated for three different pixels. Fig. 9B shows how the angle of light leaving the display device θy and the position in the eye box y are related for rays travelling from the display device to the display exit pupil, for the three different pixels of Fig. 9A.
Figs. 10A and 10B show variable thickness illumination light guides according to embodiments of the invention;
Fig. 11 shows a variable thickness illumination light guide according to another embodiment of the invention;
Fig. 12 shows a 6f (six focal length) optical system that can be used to generate a steerable exit display pupil in a display system according to an alternative embodiment of the invention;
Fig. 13 shows a display system according to the alternative embodiment of the invention incorporating the 6f optical system;
Fig. 14 shows a side view of a second illumination engine;
Fig. 15 shows a top view of the second illumination engine;
Fig. 16 schematically illustrates using the second illumination engine to generate a steerable input illumination light beam;
Fig. 17 shows a modification of the second illumination engine;
Figs. 18 to 20 show alternative configurations of the second illumination engine;
Fig. 21 shows an example in which three arrays of emissive light sources are supported on a common substrate;
Fig. 22 shows a plan view of light guides for illuminating a display;
Fig. 23 shows a display system including a third illumination engine; and
Fig. 24 schematically illustrates using the third illumination engine to generate a steerable input illumination light beam;
Fig. 25 shows a display system including a fourth illumination engine;
Fig. 26 schematically illustrates using the fourth illumination engine to generate a steerable input illumination light beam;
Figure 27 shows a first example of light guide propagation and out-coupling;
Figure 28 shows a second example of light guide propagation and out-coupling; and
Figure 29 shows a third example of light guide propagation and out-coupling.
DETAILED DESCRIPTION
Introduction
The invention can be applied to any Near Eye Display (NED) that uses a backlit pixelated display device, preferably an amplitude modulating backlit pixelated display device such as a transmissive liquid crystal device utilizing one or more polarizers. Possible architectures for the NED optical system include pancake lenses, segmented lens architectures (lenses having a discontinuity in the first derivative of the lens shape), birdbath optics, beam splitter architectures, curved mirrors, holographic reflectors and TIR prisms. This list is not exhaustive, and other NED optical arrangements are possible and compatible with the invention.
For simplicity, this description will focus on embodiments using a pancake lens optical architecture. The skilled person will appreciate how the invention can be applied to other optical architectures.
The invention provides a NED display with a larger FOV, greatly reduced or eliminated VAC, and reduced need for prescription glasses. In some embodiments, these enhancements are achieved by combining an illumination light guide with a reflective/refractive NED to create a smaller display exit pupil within the eye box and centered on the eye pupil of the user. This reduction in the display exit pupil diameter leads to a display with a larger DOF and improved contrast.
The eye pupil is the opening of the user's eye through which light enters. The iris determines the diameter of the eye pupil, which usually varies between 3mm and 5mm. The display exit pupil is the pupil formed by the NED in the eye box. The display exit pupil of the optical system moves within the eye box and follows the eye pupil in the use of the NED, as will be described below.
A display system as described below uses a small steerable display exit pupil that only illuminates the user's pupil. Eye tracking can be used to relay the exact position of the eye pupil within the eye box. Constant steering or constant discrete switching is achieved by using a light guide that illuminates the pixelated display device with a steerable light beam. Light leaving the pixelated display device forms only a narrow display exit pupil within the eye box that coincides with the user's eye pupil. While the eye box size remains large, the display exit pupil is small. The display system keeps the diameter of the display exit pupil narrow (for example approximately 1 mm in diameter). This reduction in the beam diameter and hence the optics' diameter reduces the effect on the displayed image of aberrations in both the eye and the optical system.
The term "diameter" is used here to mean a width or more generally a dimension of the beam in a direction perpendicular to the propagation direction of the beam. It does not imply that the beam is necessarily circular in cross-section. For example, the beam diameter could be the width dimension of a beam having a rectangular cross-section.
By reducing the effect of aberrations in the optical system, the display system relaxes manufacturing tolerances for the optics, which allows the cost of the display system to be reduced. The use of a narrow-angle beam allows the optics to be simplified and consequently also has the effect of reducing the overall size of the optical system.
Using the display system of the embodiment, the user does not need prescription glasses and will not experience VAC, the system's optics are simplified, and contrast improves. The display system reduces or eliminates VAC by creating an image that is in focus for all depths.
The F-number of an optical system is the ratio of its focal length to its aperture diameter. Estimates for the F-number of the eye vary from about 2 to 8, depending on the light conditions (the darker the conditions, the larger the pupil diameter and hence the lower the F-number). A pancake lens can have an F-number below 1.
The inventors have found that the problems described above can be reduced or eliminated using optics with a large F-number. Increasing the F-number of the optics of the display system increases the depth of field and reduces the effect of optical aberrations on the displayed image in a NED. By increasing the optical system's F-number above that of the human eye, aberrations introduced by either the optical system or the eye are reduced, and both the depth of field and the resolution increase. The F-number of the optical system can be increased by dynamically reducing the illumination cone emitted by the display device onto the optics at any one time.
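As a hedged numerical sketch (not part of the application itself), the F-number relationship above can be checked directly. The 17 mm effective focal length of the eye used below is an assumed textbook value:

```python
def f_number(focal_length_mm: float, aperture_diameter_mm: float) -> float:
    """F-number = focal length / aperture diameter."""
    return focal_length_mm / aperture_diameter_mm

# Human eye: effective focal length ~17 mm (assumed), pupil 2-8 mm.
bright_eye = f_number(17.0, 2.0)  # small pupil in bright light -> ~8.5
dark_eye = f_number(17.0, 8.0)    # large pupil in the dark -> ~2.1
```

These two values bracket the 2 to 8 range quoted above; raising the system F-number above the eye's own F-number (by narrowing the beam) is what yields the larger depth of field described here.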
Components of the display system
An embodiment of the invention comprises the following components:
(1) A NED display device.
(2) An illumination light guide (also referred to as a waveguide). The light guide can be flat (as shown in Fig. 2) or can have one curved surface (see Figs. 10 and 11). Its purpose is to act as a backlight and illuminate the display of the NED.
(3) An illumination engine. The illumination engine includes a light source and a beam steering unit that steers light emitted from the light source to illuminate an in-coupling element of the light guide or waveguide. The details of the relay optics between the beam steering unit and the in-coupling element (such as a 4f, i.e. four focal length, system for example) are not shown here for simplicity and clarity. Suitable 4f relay optics are known.
(4) An eye-tracking (ET) system. The ET system may utilise an image sensor and a plurality of light sources emitting light towards the user's eye. Images captured by the image sensor can be used to determine the position of the user's eye. Such an ET system may detect reflections from the eye to determine the 3D position and direction of the eye.
(5) A controller that receives the position of the eye from the ET system and sets the illumination engine to the appropriate illumination angle by controlling the beam steering unit, such that light from the illumination engine enters the eye via the display exit pupil and the eye pupil. For example, if the beam steering unit uses a rotatable scanning mirror, this illumination angle will correspond to the angle of the mirror.
The illumination engine injects collimated light, or light with a low beam divergence, into the illumination light guide at a controlled angle by controlling the orientation of a scanning mirror operating as the beam steering unit. The divergence or convergence of the collimated beam is preferably less than ±2 degrees. This control over the angle of the illumination light also controls the position of the display exit pupil produced by the display system after the light from the illumination engine has passed through the display device and the subsequent NED imaging optics. The technique of moving the display exit pupil to different positions to illuminate the user's eye pupil is known as pupil steering.
The illumination engine may comprise any suitable number or combination of components that emit light at visible wavelengths, components for controlling the emission angle, and optical elements to relay the collimated beam into the light guide.
Examples of light sources for the illumination engine include Light Emitting Diodes (LEDs), lasers, VCSELs, and arrays of such sources, like inorganic LED displays. Examples of components controlling the emission angle include scanning mirrors and arrays of emitters at different angles that are selectively switched.
When a scanning mirror is used as the beam steering unit, its rotation needs to be controlled at a frequency on the order of 100 Hz to ensure that the beam angle remains aligned with the user's eye. This frequency is lower than the frequency at which a scanning laser is used to form an image by scanning a beam across the FoV in known display devices, making the embodiment's scanning mirror easier to drive.
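The ~100 Hz mirror update described above can be sketched as a simple control loop. The function names, the linear angle mapping, and its gain are illustrative assumptions, not details from the application:

```python
import time

def mirror_angle_for_pupil(pupil_offset_mm: float, gain_deg_per_mm: float = 0.5) -> float:
    """Map the tracked eye-pupil offset in the eye box to a scanning-mirror angle.
    A linear mapping is assumed here; the real mapping depends on the relay
    optics and the illumination light guide geometry."""
    return gain_deg_per_mm * pupil_offset_mm

def steer_exit_pupil(get_pupil_offset_mm, set_mirror_angle_deg,
                     rate_hz: float = 100.0, steps: int = 1000):
    """Update the mirror at ~100 Hz so the display exit pupil follows the eye."""
    period_s = 1.0 / rate_hz
    for _ in range(steps):
        offset = get_pupil_offset_mm()  # reported by the eye-tracking system
        set_mirror_angle_deg(mirror_angle_for_pupil(offset))
        time.sleep(period_s)
```

The loop only repositions the exit pupil; it does not draw the image, which is why the 100 Hz rate is far below the pixel rate of a scanned-laser display.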
First embodiment
Figure 2 shows one embodiment of the invention. In this example, the illumination engine 32 of the display system comprises a light source, which is an LED, a laser or a superluminescent diode, and a scanning mirror 30 with relay optics 28, such as a relay lens, as a beam steering unit. The illumination engine illuminates an in-coupling diffractive optical element (DOE) 22 of a planar light guide 20, forming the illumination light guide, which also acts as an exit pupil expander (EPE). "In-coupling" means that the DOE provides optical coupling into the light guide 20, in this case from the illumination engine 32. The function of the illumination light guide 20 is to receive a narrower beam of light and output a larger-area backlight illumination for the display device 26. Note that the term illumination light guide includes both planar waveguides, i.e. waveguides having a substantially constant thickness, and freeform light guides, i.e. light guides having a variable thickness across their length and/or width. The thickness direction of the light guide 20 is defined as the direction substantially along the optical path in the optical system.
In other examples, the in-coupling diffractive optical element (DOE) 22 may be replaced with a comparable in-coupling element such as, for example, a partial reflector, a partially reflecting dielectric mirror, a partially reflecting dielectric coating, a prism, a prism array, and so forth.
The in-coupling DOE 22 diffracts the beam sufficiently that the beam is totally internally reflected within the illumination light guide 20. Light trapped in the light guide 20 by total internal reflection (TIR) will eventually hit the out-coupling element 24 provided on the light guide 20 and diffract towards the display 26. The out-coupling element 24 is also a grating, for example another DOE or a metasurface. The out-coupling element 24 has the function of coupling light out of the light guide 20 and into the next stage of the display optics, in this case the display device 26 itself, which is a transmissive liquid crystal display panel in the embodiment.
In other examples, the out-coupling element 24 may be replaced with a comparable out-coupling element such as, for example, a partial reflector, a partially reflecting dielectric mirror, a partially reflecting dielectric coating, a prism, a prism array, and so forth.
The display device 26 can, in other embodiments, be any image-forming device that utilises a back light unit (BLU). Examples of suitable display devices include liquid crystal on silicon (LCOS) and liquid crystal display (LCD) devices. Display device 26 is preferably an amplitude modulating display device such as a transmissive liquid crystal device utilizing one or more polarizers.
The light guide's output angle determines the position of the display exit pupil 38 in the eye box 40, and the position of the display exit pupil 38 will move as the angle of the scanning mirror 30 changes.
An eye-tracking system, including an eye-tracking image sensor 36, is also provided in the display system. The image sensor or camera 36 is used to determine the eye's exact position and thus correctly set the angle of the light from the illumination engine 32 so that the display exit pupil 38 is formed where the user's pupil is.
In this embodiment, the eye-tracking system includes an array of infrared (IR) light emitting diodes (LEDs) positioned on a part of the display system facing the eye. For example, if the display system is incorporated into a HMD resembling a pair of glasses, the LEDs are located around the frame of the spectacles facing the user's eye. In a larger headset, such as a typical VR helmet-type display, the IR LEDs are located inside the headset. For example, the LEDs can be placed around the edge of the aperture through which image light from the display device passes to reach the user's eye.
Infrared light from the LEDs is reflected from the eye and sensed by the eye-tracking camera 36, which can be a standard digital camera of the kind commonly used in smartphones, provided that the camera sensor is sensitive to infrared light. The information is then analyzed to extract eye rotation from changes in the reflections of the infrared light from the eye, which lead to corresponding changes in the images captured by the camera. The 2D image from the camera is fitted into a 3D model by an eye position calculating unit in the display system. Within a few frames of image data from the camera, the eye position calculating unit creates a 3D model of the eye representing the eye's position and orientation in 3D space.
The eye position calculating unit is provided to perform the analysis of image data from the camera and calculate the position of the eye. The eye position calculating unit can be implemented as software running on a general purpose processor or as a dedicated hardware component.
The eye position calculating unit can calculate the eye position by tracking various features of the eye using the image of the reflected light, including the corneal reflection (the first Purkinje image) and the center of the eye pupil in this example. The eye-tracking system of the invention may also track both the reflection from the front of the cornea (the first Purkinje image) and the back of the lens (fourth Purkinje image). This type of eye-tracking system is known as a dual-Purkinje eye tracker. Another alternative or additional feature of the eye-tracking system is to image features from inside the eye, such as the retinal blood vessels, and follow these features as the eye rotates. This leads to a more accurate determination of the eye orientation.
Illumination light guide
The display system includes an illumination light guide. The light guide acts as an exit pupil expander (EPE) in one or two dimensions, thus creating a large illumination area with compact optics. This light guide (unlike more conventional waveguides used for AR devices) is used for illumination purposes and is not used as an image-conveying device, i.e. the illumination light guide is positioned before the display device in the optical path of the display system.
An example of the illumination light guide is shown in Figure 7. This illumination light guide 1 benefits from a greater degree of design freedom compared to more conventional light guides, resulting from its use for illumination purposes and the properties of the display system of the invention. The illumination light guide 1 includes an in-coupling element 2 having an exit pupil expander (EPE) 3 and an out-coupling element 4 such as a DOE grating. The following features can be included in the illumination light guide 1 as appropriate for the particular application of the display system:
(1) Out-coupling optical power. The out-coupling grating (item 4 in Figure 7) may be configured to have variable orientation and spatial frequency. The spatial frequency of the diffractive grating (the equivalent of optical power in refractive optics) can change the diffraction angle at each position and add any angular offset required for the backlight illumination angles at the pixelated display device.
(2) Variable thickness. Variable thickness adds another degree of freedom in matching the target illumination angle for each position in the eye box and each image pixel. The light guide thickness can be varied to achieve the desired central and range of illumination angles.
(3) Including both refractive and reflective components.
(4) Low-refractive index (RI) materials. The range of angles within the illumination light guide will be reduced compared to the range required in the waveguide of a conventional AR device. The reduced range of angles allows for the use of low-RI materials like plastic in the light guide of the invention.
Light from the illumination light guide 1 passes through the transmissive LCD panel forming the display device. The transmissive LCD panel is preferably an amplitude modulating device utilizing one or more polarizers. The display device causes the beam from the illumination light guide to diverge in this embodiment. The light from the display device enters the pancake lens, which is located at one focal length from the display device and one focal length from the eye box.
The angular spread of the pixel output beam from the display device will determine the size of the display exit pupil. The smaller the angular spread of the pixel output beam, the smaller the display exit pupil in the eyebox and hence the larger the depth of field of the image. On the other hand, the display exit pupil needs to be large enough to reliably form an image in the user's eye. If the angular spread of the pixel output beam and hence the display exit pupil would otherwise be too small, a diffusing function can be added on one of the surfaces in the optical path of the beam. For example, a diffusing optical surface can be added prior to the display so that the angular spread of the pixel output beam increases and the size of the display exit pupil also increases.
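As an illustrative sketch under stated assumptions, the link between the angular spread of the pixel output beam and the display exit pupil diameter can be estimated with the standard relation for a lens placed one focal length from the display (as described above): a pixel beam with half-angle spread θ maps to an exit pupil of diameter D ≈ 2·f·tan(θ). The 25 mm focal length and ±1 degree spread below are assumed values for illustration only:

```python
import math

def exit_pupil_diameter_mm(lens_focal_mm: float, half_angle_deg: float) -> float:
    """Exit pupil diameter for a lens one focal length from the display:
    a pixel beam with half-angle spread theta maps to D = 2 * f * tan(theta)."""
    return 2.0 * lens_focal_mm * math.tan(math.radians(half_angle_deg))

# Assumed 25 mm focal length and a +/-1 degree pixel output beam spread:
print(round(exit_pupil_diameter_mm(25.0, 1.0), 2))  # ~0.87 mm
```

This shows how a diffusing surface that widens the pixel beam spread directly enlarges the display exit pupil, as described in the preceding paragraph.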
The pancake lens is formed of a plurality of lenses that are used in both reflection and refraction. A polarizer and a quarter waveplate are used to selectively reflect the first bounce while transmitting the second. A half-silvered mirror is used to reflect the backwards propagating ray towards the user's eye.
Alternative types of lens can be used in place of the pancake lens in other embodiments. Suitable alternatives include achromatic converging lenses, Fresnel lenses, freeform reflectors and birdbath lenses.
Eliminating VAC
The invention reduces or eliminates VAC by reducing the beam diameter in the optics of the display system. By keeping the beam diameter narrower, the image is more in focus and the user can experience focused images without requiring the significant eye accommodation that can lead to VAC.
There are two opposing factors in deciding the beam diameter within the display system. On the one hand, the larger the beam is, the more "defocus" the eye will experience due to the narrower DOF that results. This means that more accommodation will be required in the eye to produce a sharp image. On the other hand, making the beam diameter too narrow will introduce aperture diffraction and thus reduce the resolution of the display system. The optimal value of the beam diameter depends on the specific optics used in a given embodiment, but it is usually in the range 0.5 mm to 1 mm.
Fig. 3 shows how the spot diameter of an image changes on the retina of an idealised eye model. The eye was modelled as a paraxial lens at a perfect focus at 2 meters. The object is placed at five different positions, from infinity to 30cm. The spot radius on the retina is then calculated for an eye with a 1mm eye pupil (top row) and a 5mm eye pupil (bottom row).
The five columns represent an object at different distances, with the eye's focus fixed at 2 meters. When the pupil diameter is only 1mm, the geometric spot diameter is small for the full depth range (shown by black dots in the diagram). On the other hand, when the pupil diameter is 5mm the DOF is much narrower. This leads to much larger geometric spot diameters at depths (i.e. distances) further away from the focal distance of 2m. This is shown by the large and diffuse discs represented by "+" symbols shown at infinity and 0.3m in the diagram for the 5mm pupil diameter. This means that there is significant defocus at distances away from the focal plane when the pupil diameter is 5mm, and the user will likely experience VAC.
By reducing the beam diameter of the display system, the invention creates the same larger DOF that is normally associated with a small pupil diameter as shown in Fig. 3. Even if the user's pupil diameter is in fact 5mm, using the display system of the invention the spot diameters at different distances will be the same as if the user's pupil diameter was the same as the smaller beam diameter output by the display system (e.g. around 1mm).
While the diffraction spot diameter (shown by a circular black outline in the diagram) is larger when the pupil diameter is 1mm, the large reduction in geometric aberrations relative to a pupil diameter of 5mm keeps the small eye box in focus for all depths. The increase in the spot diameter due to diffraction is significantly less than the spot diameter reduction due to the lower geometrical aberrations in the optical system of the embodiment.
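The trade-off between geometric defocus and diffraction can be sketched with a paraxial eye model of the kind used for Fig. 3. The 17 mm eye focal length, 550 nm wavelength, and the thin-lens blur formulae are standard assumptions, not values from the application:

```python
import math

EYE_FOCAL_M = 0.017    # assumed effective focal length of the eye (~17 mm)
WAVELENGTH_M = 550e-9  # green light, assumed

def geometric_blur_um(pupil_m: float, object_m: float, focus_m: float) -> float:
    """Geometric defocus blur diameter on the retina (thin-lens approximation):
    blur = pupil * |1/d_object - 1/d_focus| * f_eye."""
    defocus_diopters = abs(1.0 / object_m - 1.0 / focus_m)
    return pupil_m * defocus_diopters * EYE_FOCAL_M * 1e6

def diffraction_blur_um(pupil_m: float) -> float:
    """Airy-disc diameter on the retina: 2.44 * wavelength * f_eye / pupil."""
    return 2.44 * WAVELENGTH_M * EYE_FOCAL_M / pupil_m * 1e6

# Object at 0.3 m with the eye focused at 2 m (the scenario of Fig. 3):
for pupil_mm in (1.0, 5.0):
    d_m = pupil_mm * 1e-3
    print(f"{pupil_mm} mm pupil: geometric {geometric_blur_um(d_m, 0.3, 2.0):.0f} um, "
          f"diffraction {diffraction_blur_um(d_m):.1f} um")
```

Under these assumptions the 1 mm pupil trades a modest diffraction blur (~23 μm) for a fivefold reduction in geometric defocus blur relative to the 5 mm pupil, matching the conclusion drawn from Fig. 3.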
The inventors have found that a projection optical system with a small display exit pupil also decreases the aberrations associated with the optical system. The display exit pupil is the exit pupil formed by the display optics. The display exit pupil can be replicated at multiple places in the eye box. For example, this replication can be achieved by using fan-out gratings. As the beam diameter output by the illumination engine, and hence the display exit pupil diameter, becomes smaller, the beam samples a smaller area of the display optics and thus collects fewer aberrations as it passes through the optics. The display exit pupil diameter in the optical system is preferably in the range 0.5-3 mm, more preferably in the range 2-3 mm.
Fig. 4 shows the optical diagram of a simple pancake optics-based VR display with a horizontal FoV (shown vertically on the diagram) of 110 degrees. The user's eye is on the left, and the display device emitting light is on the right. Each line type corresponds to a different angle in the eye box and a different image pixel, which is often referred to as a different "field". In this case, each field corresponds to a single image pixel. In the example shown in this diagram, the eye box formed is fixed and has a length of 12mm, which in many cases will not be sufficiently large.
The rays are traced from the eye to the display in the diagram. The system was simulated using OpticStudio (RTM) in this example, which is an example of optical design software. This reversal of rays allows us to compare the resolution of the display system optics to the device pixel size. The right-hand image shows the spot size on the display. The total size of the display is 50.6mm. In the design of Fig. 4, the spot diameter for the central 22 degrees of the FoV is approximately 80 μm. Dividing the display size by the spot size for the example of Fig. 4 leads to a maximum central display resolution of approximately 632 pixels, which is very low for a 110-degree FoV VR display.
Fig. 5 shows the same optical system as Fig. 4 except that the eye box was cropped to 5mm diameter, which corresponds to the usual maximum diameter of a human eye pupil. This simulation enables us to determine the image produced by the optical system as perceived by the user. This means that in theory all the rays from the display system shown in Fig. 5 could enter the user's eye. With a 5mm eye box, the spot diameter for the range of 0-22 degrees away from the central position of the display is approximately 45 μm. This leads to a total display resolution of around 1224 pixels, which is still low for a 110-degree FoV VR display.
Fig. 6 shows the same optical system as Figs. 4 and 5, except that the eye box is now cropped down to 1mm. This 1mm diameter disk now corresponds to the display exit pupil that moves within the eye box in use of the optical system. The consequent improvement in spot size (and thus resolution) on the display device is significant. Using a 1mm display exit pupil, the spot diameter for the range of 0-22 degrees away from the central position of the display is approximately 10 μm. For a 50.6mm display, this represents a resolution of approximately 5,000 pixels. This is more than enough for a 110-degree FoV VR display, meaning that with this optical performance the limiting factor for display resolution is likely to be the pixel size of the display device rather than the maximum resolution of the optics.
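The resolution estimates above follow from dividing the display size by the spot diameter. A minimal sketch of that arithmetic, using the dimensions quoted for Figs. 4 to 6:

```python
def max_resolution_pixels(display_size_mm: float, spot_diameter_um: float) -> int:
    """Upper bound on resolvable pixels across the display: size / spot diameter."""
    return int(display_size_mm * 1000.0 / spot_diameter_um)

# 50.6 mm display with the central spot diameters quoted for Figs. 4-6:
print(max_resolution_pixels(50.6, 80))  # 12 mm eye box -> ~632 pixels
print(max_resolution_pixels(50.6, 45))  # 5 mm eye box
print(max_resolution_pixels(50.6, 10))  # 1 mm display exit pupil -> ~5060 pixels
```

Note that the 5 mm case yields about 1124 pixels by this arithmetic; the figure of around 1224 quoted above corresponds to a spot diameter nearer 41 μm than the approximate 45 μm value.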
In the same way that the display system reduces the effect of aberrations in the optical system, it also reduces the effect of eye aberrations. Fig. 3 shows how the effect of defocus is reduced by reducing the beam diameter. In the same way, a narrow beam diameter will sample fewer aberrations in the eye and affect the image quality less when users suffer from ocular aberrations. In addition, creating a focus-free image can help people with presbyopia (the progressive loss of near focusing ability of the eye due to ageing).
Contrast is often reduced in an optical system by scattering on the surfaces or in the bulk of the optics, producing scattered light that subsequently arrives in the eye to wash out the image produced by the intended image light. In pancake optics, this is particularly challenging as a large portion of the light is reflected by the half-silvered mirror. Contrast can be improved by reducing the amount of scattered light. The current conventional approach to achieving this is to make better optics with more complex coatings to eliminate scattering.
In the display system of the invention on the other hand, scattering is reduced by reducing the amount of unutilised light produced, i.e. light that will never arrive at the user's eye. This is achieved by using only a narrow beam of light that is directed accurately through the optics to the user's eye. The invention reduces unutilised light by one or two orders of magnitude in this way. Therefore, there is less light to cause scattering and contrast improves without requiring more expensive coatings and lenses.
Rays from different pixels on the display device should arrive in the same position as the eye moves across the eye box. Effectively, for any specific pixel (i,j) on the pixelated display device with coordinates (Hi, Hj), there should be a target emission angle (θi, θj) that will make the rays from the specific pixel arrive at the target eye box position (Pm, Pn). For a fixed eye position, the light input into the illumination light guide by the illumination engine is a beam directed substantially at a single angle (φm, φn) (with a relatively small beam divergence around that angle). On the other hand, light emerging from the illumination light guide is a two-dimensional matrix that must satisfy specific angular conditions for all i's and j's. The requirement to control a two-dimensional matrix using a single value is an overconstrained problem, which becomes an optimisation problem. Effectively, with a single value input to the illumination light guide (the angle of incident illumination light), all the conditions for emission angles (θi, θj) must be satisfied for all i's and j's (i.e. all pixels).
There will inevitably be some degree of error in satisfying the conditions for the illumination angles (θi, θj) for all pixels (i,j) in this way. The first way to reduce the error is by choosing an input angle (φm, φn) that approximates the curve of emission angles (θi, θj) by a single plane of fixed value; effectively, the emission angle across the display device is fixed for all pixels. The exact shape of the curve relating target emission angle (θi, θj) to a specific display exit position (Pm, Pn) depends on the optics of the projection optical system. One way of approximating the 2D relationship between input angle and display exit position in the eye box is to minimise the root mean square error between the optimal curve for (θi, θj) and the plane approximation corresponding to a fixed emission angle. However, the error will remain significant for some angles, resulting in the beam not arriving at the user's eye.
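The RMS minimisation described in this first approach can be sketched in one dimension. For a quadratic (RMS) cost, the best single fixed angle is simply the mean of the per-pixel targets; the target angles below are hypothetical values for illustration:

```python
import math

def best_fixed_angle(target_angles_deg):
    """For a quadratic (RMS) cost, the best single fixed emission angle is
    the mean of the per-pixel target angles."""
    return sum(target_angles_deg) / len(target_angles_deg)

def rms_error_deg(target_angles_deg, fixed_angle_deg):
    """Root mean square error between the target angles and the fixed angle."""
    n = len(target_angles_deg)
    return math.sqrt(sum((t - fixed_angle_deg) ** 2 for t in target_angles_deg) / n)

# Hypothetical per-pixel target emission angles along one axis of the display:
targets_deg = [-3.0, -1.5, 0.0, 2.0, 3.5]
phi_deg = best_fixed_angle(targets_deg)
print(round(phi_deg, 2), round(rms_error_deg(targets_deg, phi_deg), 2))  # 0.2 2.34
```

The residual error of this plane approximation is what motivates the second and third approaches (non-uniform out-coupling gratings and variable light guide thickness) described next.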
The second approach is to have different out-coupling gratings (different in direction and/or spatial frequency) at different areas on the illumination light guide. The resulting non-uniform out-coupling grating will effectively enable the central angle of the waveguide to be shifted to any desired angle. Therefore, there will be at least one position of the display exit pupil in the eye box where the conditions on the angle (θi, θj) of light from each pixel are perfectly satisfied. At this position in the eye box, the light is emitted from all the pixels at the target angles and the conditions are met for the display exit pupil to be formed at the same target position in the eye box. In the example shown in Fig. 8, the out-coupling grating would be varied across the illumination light guide so that for each position on the display device the light is emitted at the target angle (θi, θj) and therefore satisfies the conditions for forming the display exit pupil at the target position.
The third approach is to create a light guide with a variable thickness, i.e. an illumination light guide 5 in which one of its surfaces is free-form (see Figs. 10A and 10B). A free-form light guide adds more degrees of freedom to the optics at each pixel position. The thickness variation allows the beam to be magnified or demagnified to different extents across the surface of the illumination light guide, effectively increasing or decreasing the angle of light output by the light guide at each position relative to the input angle of light into the illumination light guide.
The variable thickness illumination light guides 5 of Figs. 10A and 10B can be made from plastic, glass, or a combination of both. In some embodiments, the light guide can be moulded in a shape that allows for additional functions to be carried out in it, like focusing and angle magnification. Such light guides may be wedge-shaped, have a concave shape with a constant radius of curvature or have a freeform shape.
In addition, a stack of different materials can be used to form the light guide. For example, the light guide nanostructures (i.e. the in-coupling 2 and out-coupling 4 gratings on the input side of the light guide) can be made in a substrate suitable for nanofabrication such as glass or quartz. Fig. 10B shows a light guide having a layer 6 of glass or quartz on which the in-coupling 2 and out-coupling 4 gratings are formed. The part of the light guide 5 on the output side forming the inclined, curved and/or freeform output surface can be formed from plastics, for example using injection moulding. This allows for greater freedom in the shape of the output surface and reduces cost compared to a light guide formed entirely of glass or quartz.
Fig. 11 shows another illumination light guide 50 of variable thickness including in-coupling 54 and out-coupling 52 gratings. Like the waveguide shown in Fig. 7, the light guide 50 of Fig. 11 also includes another grating (not shown) to perform the EPE in the second dimension. The light guide 50 of Fig. 11 has a concave curved reflective lower surface, while the upper surface, on which the in-coupling and out-coupling gratings 54 and 52 are formed, is flat. The lower surface may have a reflective coating or may utilise the refractive index (RI) difference between the light guide material (which can be made of materials such as optical glass or optical plastic) and the air to reflect light by TIR.
Light enters the light guide 50 from an illumination engine via the in-coupling grating 54 and the upper surface and is then reflected from the lower surface. The reflected light is incident on the out-coupling grating 52 on the upper surface and then exits the light guide at different angles across the light guide as shown. The exit angles of the light across the light guide 50 are controlled by the variation in the angle of the lower surface across the light guide. The in-coupling and out-coupling gratings 54 and 52 may each independently have any shape in plan view (i.e. viewed in a direction perpendicular to the upper surface of the light guide), including for example a circle, ellipse, rectangle or square.
Compared to a uniform-thickness planar waveguide, a variable-thickness illumination light guide provides additional design freedom, allowing the out-coupled rays output by the light guide to change in angle across the light guide for a constant angle of input light. Starting from the angle range of illumination light required by the optical system of a NED (for example the pancake lens NED described above), an optical engineer designing the optical system can select the shape of the lower surface of a wedge-shaped, curved and/or freeform light guide to produce the desired output angles across the light guide.
In particular, the curved light guide 50 shown in Fig. 11 allows an in-coupled ray at a single angle to exit at different angles at different positions across the light guide, breaking the one-to-one relationship between the input and output angles of the waveguide that exists in planar waveguides. A single input angle is entered into the waveguide, while the output angle is varied across the out-coupling element. The rays of interest are shown as bold dashed lines for clarity.
Figs 9A and 9B illustrate further how the display exit pupil 108 is moved across the eye box 107 of an optical system according to an embodiment of the invention. Fig. 9A shows the optical system in cross-section, including an illumination engine 100, a light guide 102, a transmissive display device 104, a projection optical system 106, and the eye box 107. An input illumination light beam 101 is output by the illumination engine 100 into the light guide, which outputs an illumination light beam 103 into the display device 104 via an out-coupling element 105. The display device modifies the light passing through it at pixels including pixels A, B and C labelled 119, 120 and 121 respectively. Image light beams from pixels A, B and C are labelled 122, 123 and 124 respectively, and the chief image light rays within those beams are labelled 125, 126 and 127 respectively.
The image light beams from the display device are incident on the projection optical system 106, which converts these beams into pixel A projection output light beam 116, pixel B projection output light beam 117, and pixel C projection output light beam 118. The projection output light beams from the projection optical system 106 are all incident on the display exit pupil 108 within the eye box 107.
There is a requirement that the Pixel A projection output light 116, the Pixel B projection output light 117, the Pixel C projection output light 118, and the projection output light from all other pixels on the display device 104 arrive at the position of the display exit pupil 108. This requirement ensures that the user's eye lens 109 will collect light from all the pixels of the display device 104 so that the entire displayed image is viewable. The display exit pupil 108 can be at any position in the eye box 107. The position of the display exit pupil 108 is defined by the local vertical coordinate axis y 111 shown in Fig. 9A. In this document the local vertical coordinate axis y 111 is assumed to be in the range between 0 and 1, where 0 is the lowest point in the eye box 107 and 1 is the highest point. It will be understood that the user's eye 112 can also move in the horizontal dimension x within the eye box 107, but for simplicity this is not shown in Fig. 9A.
The Pixel A projection output light 116, Pixel B projection output light 117, Pixel C projection output light 118, and projection output light from all other pixels on the display device 104 will coincide on the display exit pupil 108 if the Pixel A image light chief ray 125, Pixel B image light chief ray 126 and Pixel C image light chief ray 127 each have a respective correct angle θy(i=0), θy(i=0.5), and θy(i=1). The correct angle for each chief ray is defined by the optics of the system, and particularly the projection optical system 106. The correct angle for each chief ray is a function of the eye 112 position in the eye box 107 on the eye box local vertical coordinate axis y 111 and the pixel position on the display device 104, given by the display device local vertical coordinate axis i 110.
Fig. 9B shows the possible variation of the angles of the Pixel A image light chief ray 125, Pixel B image light chief ray 126 and Pixel C image light chief ray 127. As a first approximation, the variation of the chief ray angle as a function of the display exit pupil 108 position in the eye box 107, i.e. the local vertical coordinate axis y 111, can be modelled as a straight line with a given offset θ0(i) and a slope of dθy/dy. Therefore, these model lines each need two independent variables to be defined for a fixed pixel position. This means that two degrees of freedom are required to control θy sufficiently for each chief ray.
The first degree of freedom can be the grating period of the out-coupling element 105. However, the local grating frequency of the out-coupling element 105 links the slope and the offset of the θy(y) function. Therefore, another variable is required to create fully independent control of the slope and offset of the θy(y) function for each chief ray.
A second degree of freedom can be provided by using a light guide 102 with variable thickness, as described above in relation to Figs. 10A, 10B and 11. The variable thickness of the light guide 102 can shift the angle along its thickness to the desired offset so that the θy angle offset 128 for Pixel A and the θy angle offset 129 for Pixel C shown in Fig. 9B have the required target values. Combining adjustment of the grating period of the out-coupling element 105 and the thickness of the light guide 102 can provide sufficient control of the θy(y) function for each chief ray.
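The first-approximation model above can be written as a minimal sketch: for each pixel position i, the required chief-ray angle varies linearly with the display exit pupil position y in [0, 1], with an offset and a slope as the two degrees of freedom. The numerical offsets and slope below are illustrative assumptions, not values from the specification.

```python
# Linear chief-ray-angle model: theta_y(y) = theta_0(i) + (d theta_y / d y) * y.
def chief_ray_angle(y, offset_deg, slope_deg_per_y):
    """Chief-ray angle (degrees) at eye-box position y in [0, 1]."""
    return offset_deg + slope_deg_per_y * y

# Two independent degrees of freedom per pixel position: e.g. the local
# out-coupling grating period sets one, and the local light-guide thickness
# shifts the other.  Values below are assumed for illustration.
pixel_models = {
    "A (i=0)":   dict(offset_deg=-3.0, slope_deg_per_y=4.0),
    "B (i=0.5)": dict(offset_deg=0.0,  slope_deg_per_y=4.0),
    "C (i=1)":   dict(offset_deg=3.0,  slope_deg_per_y=4.0),
}

for name, params in pixel_models.items():
    lo = chief_ray_angle(0.0, **params)
    hi = chief_ray_angle(1.0, **params)
    print(f"pixel {name}: theta_y ranges {lo:+.1f} to {hi:+.1f} deg across the eye box")
```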
In the embodiments above, a method and system have been described for steering the display exit pupil of a display system, using a directional illumination engine producing a low-divergence light beam and an illumination light guide located behind a pixelated display device. In such a display system, the display device is typically 2f (two focal distances) from
the user's eye. However, it is also possible to steer the display exit pupil by using a 6f display system in which the display device is six focal distances from the user's eye.
Fig. 12 schematically illustrates different options for controlling the position of the display exit pupil of the invention. In the case of the directional backlight created by the illumination engine and the illumination light guide of the embodiment discussed above, the position of the display exit pupil is controlled by changing the angle of light on Plane E of Fig. 12. In an alternative embodiment, the position of the display exit pupil can be controlled instead by a shutter at Plane C and a pixelated display device at Plane A of Fig. 12, in a 6f optical system.
Fig. 13 shows this alternative embodiment using a 6f optical system. A pixelated display device 60 at Plane A (position 6f from the eye) emits light that is then collimated by a lens 62 at Plane B (position 5f). The collimated light is then selectively blocked by a shutter mechanism 64 having an adjustable aperture at Plane C (4f) to create an exit pupil having the required size at a desired position on plane C. A converging dynamic lens 66 is also provided after the shutter 64 in the optical path in this embodiment, to adjust the path of light passing out of the aperture of the shutter 64. This lens 66 can be omitted in other embodiments.
In this embodiment, the shutter 64 can be a liquid crystal shutter. For example, a liquid crystal shutter modulating polarizations between two crossed polarizers can be used. As a voltage is selectively applied to each pixel of the shutter 64, the light passing through the first of the two polarizers is either transmitted or blocked through the second polarizer according to the voltage applied to that pixel, forming an aperture and a light blocking part. In this way, the liquid crystal shutter provides an aperture whose shape and size can be freely controlled. The shape and size of the aperture can also be modified quickly by changing the voltages applied to pixels across the shutter 64.
Using a telescope comprising two lenses 68 and 72 at Plane D (3f) and Plane F (1f), the aperture created by the shutter 64 as the display exit pupil is re-imaged from the shutter plane C into the eye box 74 at 0f. The lenses 68 and 72 at Plane D and Plane F can be formed in the same solid piece of plastic and/or formed from diffractive surfaces as an alternative to using conventional separate converging lenses. In this example, a reflective surface 70 is provided in the optical path between the lens 68 on Plane D and a diffractive element or metasurface 72 as the lens at Plane F. The reflective surface 70 can be used to produce an optical system with the desired form factor. The reflective surface 70 may also be curved to assist with imaging the display exit pupil in the eye box 74.
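The re-imaging of the shutter plane onto the eye box can be checked with a thin-lens ray-transfer (ABCD) calculation. The sketch below assumes ideal thin lenses of focal length f at Planes D and F and ignores the folding reflective surface 70; B = 0 in the system matrix confirms that Plane C (4f) is imaged onto the eye-box plane (0f), with element A giving the transverse magnification.

```python
import numpy as np

f = 1.0  # normalised focal length of each telescope lens (assumed thin lenses)

def propagate(d):
    """Free-space propagation over distance d."""
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(fl):
    """Thin lens of focal length fl."""
    return np.array([[1.0, 0.0], [-1.0 / fl, 1.0]])

# Shutter plane C (4f) -> f -> lens at Plane D (3f) -> 2f -> lens at
# Plane F (1f) -> f -> eye box (0f).  Matrices compose right-to-left.
M = propagate(f) @ thin_lens(f) @ propagate(2 * f) @ thin_lens(f) @ propagate(f)
A, B = M[0]
print(f"A (magnification) = {A:+.2f}, B = {B:+.2f}")  # B == 0 -> imaging condition
```

With these assumptions the shutter aperture is relayed to the eye box at unit magnification (inverted), so moving the aperture across Plane C moves the display exit pupil correspondingly across the eye box.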
With this arrangement, the position of the eye box 74 can be adjusted by changing the position of the aperture in the shutter plane C by controlling the shutter mechanism 64, thus moving the display exit pupil.
To produce a larger eye box using the optical system of Fig. 13 without increasing the burden on the display engine, display exit pupil replication can be used. Display exit pupil replication can be implemented by providing a DOE on Plane F that splits each incident ray into multiple output rays. By using display exit pupil replication, an array of display exit
pupils is formed in the eye box 74. In this arrangement, the DOE on Plane F is configured to produce an array of display exit pupils with a separation distance larger than the maximum usual eye pupil diameter, i.e. larger than 5 mm. This ensures that the user only sees a single display exit pupil at any point in the eye box.
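The separation condition can be estimated geometrically. If the DOE on Plane F splits each ray into orders separated by an angle delta_theta, replicated pupils at an eye relief L are spaced by roughly L·tan(delta_theta). Both numerical inputs below are illustrative assumptions, not values from the specification.

```python
import math

eye_relief_mm = 60.0     # assumed distance from Plane F to the eye box
delta_theta_deg = 6.0    # assumed angular separation between DOE diffraction orders

# Spacing of the replicated display exit pupils in the eye box.
separation_mm = eye_relief_mm * math.tan(math.radians(delta_theta_deg))
print(f"pupil separation: {separation_mm:.1f} mm "
      f"({'OK' if separation_mm > 5.0 else 'too small'} vs 5 mm max eye pupil)")
```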
Second illumination engine
An example of an illumination engine 32 is described hereinbefore in relation to Figure 2, which includes a light source, relay optics 28 and a scanning mirror 30. This illumination engine, referred to hereinafter as the first illumination engine 32, could be used to provide the illumination engine 100 of the example illustrated in Figures 9A and 9B, or indeed any other embodiments described hereinbefore.
However, illumination engines are not restricted to the first illumination engine 32, and referring also to Figures 14 to 16, a second illumination engine 200 is shown. The second illumination engine 200 may be used as the illumination engine 100, may be swapped for the first illumination engine 32, and/or may be used to provide light input to any of the embodiments and/or modifications thereof described hereinbefore.
Figure 14 shows a side-view of the second illumination engine 200 in the context of a display system. Figure 15 shows a top view. Figure 16 illustrates the mechanism through which the second illumination engine 200 generates a steerable input illumination light beam 201.
The second illumination engine 200 includes an array of emissive light sources 202 and a collimating lens 203. In the example illustrated, the emissive light sources 202 are arranged in a 2D Cartesian array, although this is not essential and the emissive light sources 202 may be arranged using different types of 2D grid (e.g. hexagonal), or may instead be arranged in a linear array and so forth. The following discussion will assume a 2D, N by M square or rectangular grid, so that each emissive light source 202 effectively provides one "pixel" of a segmented light source. The emissive light source 202 providing the pixel in the nth of N rows and mth of M columns of the array shall be denoted P(n,m) for the purposes of the description hereinafter. The emissive light sources 202 take the form of LEDs, organic LEDs, or any other emissive light sources capable of being provided in an array and illuminated individually (or in small groups). For the purposes of describing the example shown in Figures 14 to 16, the emissive light sources 202 should be understood to be LEDs or OLEDs.
The collimating lens 203 is positioned at or close to a focal length f from the array of emissive light sources 202. The collimating lens 203 may be of any type described hereinbefore in relation to other lenses of the preceding embodiments.
The angle θout of the input illumination light beam 201 is controlled by illuminating one emissive light source 202 at a time, or a small group of adjacent emissive light sources 202, depending on their number, size and spacing.
Referring in particular to Figure 16, the operation of the second illumination engine 200 is illustrated.
Figure 16 is a schematic, geometric optics diagram for a collimating lens 203 in the form of a convex lens. For the sake of visual clarity, the mth column of the array of emissive light sources 202 has been shown as pixels P(1,m), ..., P(11,m) on the focal plane of the collimating lens 203.
For a first case, in which the emissive light source 202 corresponding to the pixel P(10,m) is illuminated (switched to an "ON" state), a first pair of rays 204a, 205a are drawn with solid lines: a first ray 204a of light travelling parallel to an optic axis 206 of the collimating lens 203 and a second ray 205a of light which passes through the optical centre of the collimating lens 203. These rays 204a, 205a are drawn for visual clarity, though all other rays originating from the ON pixel P(10,m) will be deflected to the same angle θout to form the input illumination light beam 201.
In a second case, the emissive light source 202 corresponding to the pixel P(10,m) is deactivated (switched to an "OFF" state) and the emissive light source 202 corresponding to the pixel P(4,m) is illuminated. A second pair of rays 204b, 205b are drawn with chained lines: a first ray 204b of light travelling parallel to the optic axis 206 and a second ray 205b of light which passes through the optical centre of the collimating lens 203. However, for the second case, with pixel P(4,m), it may be observed that the direction of the input illumination light beam 201 is shifted from θout to θ'out.
In this way, by controlling which pixel(s) P(n,m) of the array of emissive light sources 202 are illuminated, the angle θout of the input illumination light beam 201 may be controlled.
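The pixel-to-angle relationship sketched in Figure 16 follows from basic lens geometry: an emissive pixel at a signed height h from the optic axis, on the focal plane of a collimating lens of focal length f, produces a collimated beam at θout = arctan(h/f). The focal length, pixel pitch and row count below are assumed purely for illustration.

```python
import math

f_mm = 10.0      # assumed collimating-lens focal length
pitch_mm = 0.2   # assumed pitch of the emissive array
n_rows = 11      # rows P(1,m) .. P(11,m), with the centre row on the optic axis

results = {}
for n in (4, 10):  # the two cases illustrated in Figure 16
    h = (n - (n_rows + 1) / 2) * pitch_mm        # signed height from the axis
    theta = math.degrees(math.atan2(h, f_mm))    # collimated beam angle
    results[n] = theta
    print(f"P({n},m): h = {h:+.1f} mm -> theta_out = {theta:+.2f} deg")
```

Switching the ON pixel from P(10,m) to P(4,m) flips the beam to the other side of the optic axis, matching the shift from θout to θ'out described above.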
In the same way as the optical systems described hereinbefore, the input illumination light beam 201 is directed to an in-coupling element 207 of a light guide 208. As described hereinbefore, an out-coupling element 209 of light guide 208 extracts an illumination light beam 210 which illuminates display 211. Output light beams 212 are collected and focused to the eye box (not shown in Figures 14 to 16) by a projection optical system 213.
The in-coupling element 207, light guide 208, out-coupling element 209, display 211 and projection optical system 213 may be as described in any of embodiments hereinbefore. A display system including the second illumination engine 200 may additionally include any other element described hereinbefore, such as variable thickness light guides and so forth.
Although described as a collimating lens 203, in some examples the collimating lens 203 may include two or more lenses and/or additional optical elements. In some examples, the collimation function may be provided by an off axis lens or a section of a lens. In some other examples, the collimation function may be provided by freeform optics. Freeform optics may be useful when the optical system is not rotationally symmetric or when the focal plane is not on the optical axis.
Referring also to Figure 17, a modification of the second illumination engine 200 is illustrated.
Although described as including a single pairing of an array of emissive light sources 202 and collimating lens 203, the second illumination engine 200 may include any number of such pairings, each corresponding to a respective in-coupling element 207 of a light guide 208.
In the example shown in Figure 17, a linear array of four in-coupling elements 207a, 207b, 207c, 207d is arranged along one side of a light guide 208. A single out-coupling element 209 corresponds to all the in-coupling elements 207a, 207b, 207c, 207d, though it could alternatively be sub-divided correspondingly.
Each in-coupling element 207a, 207b, 207c, 207d receives an input illumination light beam 201 from a respective paired collimating lens 203a, 203b, 203c, 203d and array of emissive light sources 202a, 202b, 202c, 202d. All the input illumination light beams 201 may be controlled to have the same angle, for example by illuminating the pixels P(n,m) from the same row n and column m in each array of emissive light sources 202a, 202b, 202c, 202d.
Alternatively, the angles of some or all of the input illumination light beams 201 may be varied by illuminating different pixel P(n,m) coordinates (n,m) within each array of emissive light sources 202a, 202b, 202c, 202d. For example, using differently angled input illumination light beams 201 may allow the effects described hereinbefore in relation to variable thickness light guides 208 to be obtained to even greater degrees.
The use of light guides 208 having multiple in-coupling elements 207 is not restricted to illumination using the second illumination engine 200, and the first illumination engine 32 could also be used with such light guides 208 (for example a set of relay optics 28 and scanning mirror 30 may be provided for each in-coupling element 207).
Referring also to Figures 18 to 20, further examples of possible configurations of the second illumination engine 200 are shown. Out-coupling elements 209 and further elements of a display system are not repeated in Figures 18 to 20 for visual clarity.
The example shown in Figure 18 corresponds to the configuration shown in Figure 14, in which the array of emissive light sources 202 and collimating lens 203 are arranged with an optical axis oriented substantially perpendicular to the in-coupling element 207 of the light guide 208.
However, this is not essential, and as shown in Figure 19, the optic axis 206 of the second illumination engine 200 may be angled relative to the light guide 208. This is applicable whether the second illumination engine 200 includes one, two or more pairs of collimating lens(es) 203 and respective array(s) of emissive light sources 202. When multiple pairs of collimating lens(es) 203 and respective array(s) of emissive light sources 202 are used, they are not required to have parallel optic axes, and may make different angles to the light guide 208. This may be combined with other features such as variable thickness light guides and/or illuminating different pixel coordinates (n,m) in different arrays to provide even greater control of the distribution of output light beams 212 provided to and focussed by the projection optical system. Further advantages include having a single part for manufacturing and having all electronics and LEDs on substantially the same plane.
The examples shown in Figures 14, 16, 18 and 19 include generally symmetric collimating lenses 203 arranged concentrically with the respective arrays of emissive light sources 202. However, this is not essential, and as shown in Figure 20, a collimating lens 203 may be optically asymmetric with respect to a centroid of a respective array of emissive light sources 202.
When multiple in-coupling elements 207 are used, the respective collimating lenses 203 and arrays of emissive light sources 202 may be separate units. However, they may equally be integrated as a single unit.
Referring also to Figure 21, an example is shown in which three arrays of emissive light sources 202a, 202b, 202c are supported on a common substrate 214 (for example a printed circuit board). The arrays of emissive light sources 202a, 202b, 202c illuminate three corresponding in-coupling elements 207a, 207b, 207c via collimating lenses 203a, 203b, 203c which are provided as a single piece 215, for example an injection moulded plastic element.
In a modification of the example shown in Figure 21, the discontinuous in-coupling elements 207a, 207b, 207c could be merged into one continuous in-coupling element 207.
The example described in relation to Figure 17 used multiple in-coupling elements 207a, 207b, 207c, 207d arranged along a single edge of a single light guide 208 to each receive an input illumination light beam 201, combined with a single out-coupling element 209.
However, variations are possible. For example, arrays of in-coupling elements may be arranged along two or more edges of a light guide, each receiving an input illumination light beam 201.
Equally, to reduce cross-talk it may be beneficial for multiple in-coupling elements 207 not to share a single light guide 208. For example, referring also to Figure 22, a further embodiment of a display system is shown.
Figure 22 is a plan view of light guides 208a, ..., 208h used for illuminating a display 211. The projected area 216 of the display 211 is superimposed in Figure 22 as a dash-dot-dot line.
Half of the display 211 (the left hand side as illustrated) is illuminated by directional backlighting provided by out-coupling elements 209a, 209b, 209c, 209d of an array of strip-like light guides 208a, 208b, 208c, 208d. The in-coupling elements 207a, 207b, 207c, 207d receive respective directional input illumination light beams 201 from a single, combined first 32 or second 200 illumination engine. Alternatively, each in-coupling element 207a, 207b, 207c, 207d may receive the respective directional input illumination light beam 201 from a separate first 32 or second 200 illumination engine.
The other half (the right hand side) is illuminated by directional backlighting provided by further light guides 208e, 208f, 208g, 208h which are identically configured and mirrored about the mid-line of the projected area 216 of the display 211.
The in-coupling elements 207a, ..., 207h may all receive an input illumination light beam 201 at substantially the same angle. Alternatively, different in-coupling elements 207a, ..., 207h may receive respective input illumination light beams 201 at individual angles.
Third illumination engine
Referring also to Figure 23 a further embodiment of a display system is shown, which is identical to the display system shown in Figure 14, except that the second illumination engine 200 is replaced by a third illumination engine 300.
The third illumination engine 300 includes a collimating lens 303 which is the same as the collimating lens 203 of the second illumination engine 200. However, the array of emissive light sources 202 is replaced by a backlight source 301 separated from the collimating lens 303 by an array of switchable transmissive pixels 302. For example, the switchable transmissive pixels 302 may take the form of LC pixels, or any other type of transmissive display. Unlike the display 26, 104, 211 used to output to the eye box, the array of switchable transmissive pixels 302 does not need colour pixels, and need only be monochrome. In many examples, the array of switchable transmissive pixels 302 may be switched between binary "ON" (maximum transmission) and "OFF" (minimum transmission) states. The backlight source 301 may be of any suitable type, for example any type known for use with prior art backlit displays.
In the example illustrated, the transmissive pixels 302 are arranged in a 2D Cartesian array, although this is not essential and the transmissive pixels 302 may be arranged using different types of 2D grid (e.g. hexagonal), or may instead be arranged in a linear array and so forth. The following discussion will assume a 2D, N by M square or rectangular grid, so that each transmissive pixel 302, in combination with the backlight source 301, effectively provides one "pixel" of a segmented light source. The transmissive pixel in the nth of N rows and mth of M columns of the array shall again be denoted P(n,m) for the purposes of the description hereinafter.
Referring also to Figure 24, the operation of the third illumination engine 300 is illustrated.
Figure 24 is a schematic, geometric optics diagram for a collimating lens 303 in the form of a convex lens. For the sake of visual clarity, the mth column of the array of transmissive pixels 302 has been shown as pixels P(1,m), ..., P(11,m) on the focal plane of the collimating lens 303.
For a first case, the transmissive pixel 302 corresponding to the pixel P(10,m) is controlled to transmit the undirected light 304 from the backlight source 301 (switched to the "ON" state) whilst every other pixel is kept opaque (switched to the "OFF" state). In the first case, a first pair of rays 304a, 305a are drawn with solid lines: a first ray 304a of light travelling parallel to an optic axis 306 of the collimating lens 303 and a second ray 305a of light which passes through the optical centre of the collimating lens 303. These rays 304a, 305a are drawn for visual clarity, though all other rays originating from the ON pixel P(10,m) will be deflected to the same angle θout to form the input illumination light beam 201.
In a second case, the transmissive pixel 302 corresponding to the pixel P(10,m) is made opaque (switched to the "OFF" state) and the transmissive pixel 302 corresponding to the pixel P(4,m) is made transparent (switched to the "ON" state). A second pair of rays 304b, 305b are drawn with chained lines: a first ray 304b of light travelling parallel to the optic axis 306 and a second ray 305b of light which passes through the optical centre of the collimating lens 303. However, for the second case, with pixel P(4,m), it may be observed that the direction of the input illumination light beam 201 is shifted from θout to θ'out.
In this way, by controlling which pixel(s) P(n,m) of the array of transmissive pixels 302 are made transparent, the angle θout of the input illumination light beam 201 may be controlled in the same way as for the second illumination engine 200. Consequently, the third illumination engine 300 may be used in any way, and in any example, described in relation to the second illumination engine 200.
Fourth illumination engine
The second 200 and third 300 illumination engines use collimating lenses 203, 303. However, similar illumination engines may be constructed using different optical elements to provide collimation.
For example, referring also to Figure 25 a further embodiment of a display system is shown, which is identical to the display systems shown in Figures 14 or 23, except that the second illumination engine 200 or third illumination engine 300 is respectively replaced by a fourth illumination engine 400.
The fourth illumination engine 400 includes a backlight source 401 which is the same as the backlight source 301 of the third illumination engine 300, and an array of switchable transmissive pixels 402 which is the same as the array of switchable transmissive pixels 302 of the third illumination engine 300. However, unlike the third illumination engine 300, the collimating lens 303 is replaced by one or more arrays of transmissive shutter pixels 403, stacked in sequence between the array of switchable transmissive pixels 402 and the light guide 208. In the example shown in Figure 25, there is a single layer of transmissive shutter pixels 403. Each shutter pixel 403 is a transmissive pixel, for example a liquid crystal pixel. For example, each array of shutter pixels 403 may be structurally identical to the array of switchable transmissive pixels 402.
Referring also to Figure 26, operation of the fourth illumination engine 400 to produce a steerable input illumination light beam 201 shall be described.
The arrays of transmissive shutter pixels 403 and the array of switchable transmissive pixels 402 are arranged parallel to one another, spaced apart one after the other along an optic axis with a separation s. The separation may be between about 0.1 mm and several mm, for example about 0.5 mm.
The transmissive pixel 402 illustrated without hatching in Figure 26 is switched to be transmissive (or "ON") whilst the other (hatched in Figure 26) transmissive pixels 402 are kept opaque (or "OFF") to form an aperture through which undirected backlight (see 304 in Figure 24) from the backlight source 401 passes to form a first cone of light 404 centred roughly perpendicular to the array of transmissive pixels 402.
The shutter pixels 403 of the first array of shutter pixels 403 which lie along a line originating from the transmissive pixel 402 set to the "ON" state and oriented at the desired output angle θout are switched to the "ON" state (not hatched in Figure 26), whilst the remaining shutter pixels 403 are kept in the "OFF" state (hatched in Figure 26). The shutter pixel 403 in the "ON" state, and those surrounding it in the "OFF" state, are illuminated by the first cone of light 404. However, only incident light from the first cone of light 404 having a reduced range of angles can pass through the "ON" shutter pixel 403 to form a second cone of light 405. The second cone of light 405 will already be collimated, and depending on the application this could provide the collimated and directional input illumination light beam (as shown in Figure 25).
If tighter collimation is required, then a further, second array of shutter pixels 403 may be added to the stack. Each successive array of shutter pixels 403 through which the input illumination light beam 201 must pass provides tighter collimation, at the cost of reduced luminance. The number of arrays of shutter pixels 403 used to form and steer the input illumination light beam 201 may be selected for a particular display system by balancing the required degree of collimation against the luminance which may be provided by the backlight source 401.
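The geometry of selecting which shutter pixel to switch "ON" for a desired output angle can be sketched as follows. This is an illustrative simplification only: it assumes a single shutter layer, a uniform pixel pitch, and ignores refraction at the layer interfaces; the function and parameter names (`shutter_index`, `pitch_mm`) are hypothetical, not taken from the specification.

```python
import math

def shutter_index(theta_out_deg, separation_mm, pitch_mm, source_index=0):
    """Index of the shutter pixel lying along a ray from the 'ON' source
    pixel at the desired output angle (illustrative geometry only)."""
    # Lateral offset of the ray after crossing the gap between the layers.
    offset_mm = separation_mm * math.tan(math.radians(theta_out_deg))
    # Nearest shutter pixel to that offset, relative to the source pixel.
    return source_index + round(offset_mm / pitch_mm)

# Example: 0.5 mm layer separation, 50 um pitch, 10 degree output angle.
print(shutter_index(10.0, 0.5, 0.05))  # -> 2 (offset of about 0.088 mm)
```

Finer angular steps would follow from a smaller pitch or a larger separation s, at the cost of luminance, consistent with the trade-off described above.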
Overlap of out-coupled beams
In some AR/VR applications, gaps between adjacent out-coupled beams forming an image may be acceptable, if undesirable. However, for the present application to directional backlight display systems, any gap in the backlight illumination will mean that the display 26, 60, 104, 211, for example an amplitude modulating display such as a transmissive liquid crystal device utilizing one or more polarizers, will contain pixels that are not illuminated at all. Preferably there will be no gaps between the out-coupled beams.
Referring also to Figure 27, a first example 500 of light guide propagation and out-coupling is shown.
Light guide 501 (for example a planar waveguide) receives an input illumination beam 502 via in-coupling element 503, which may be of any type described herein. A beam 504 travels internally in the light guide 501. The distance between two bounces of the beam 504 within the light guide 501, termed herein the "bounce distance" and denoted Dbounce, is a function of the thickness and refractive index of the light guide 501 and the angle of the input illumination beam 502. For a fixed input angle, and a fixed refractive index and thickness of the light guide 501, the out-coupled beams 505a, 505b, 505c and so forth produced by out-coupling element 506 may be separated by a distance Dout, as illustrated in Figure 27. The distance Dout corresponds to the difference between the bounce distance Dbounce and a projected width Wout of the beam 504. The projected width Wout may be the same as the width of the input illumination beam 502, depending on the type and configuration of the in-coupling element 503. Expressed another way, the out-coupled beams 505a, 505b, 505c corresponding to successive internal reflections of the beam 504 are separated by a distance Dout.
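The bounce distance can be made concrete with the standard planar-waveguide relation Dbounce = 2·t·tan(θ), where t is the guide thickness and θ the internal propagation angle measured from the surface normal. The sketch below uses this relation; the function names and the particular values are illustrative assumptions, not figures from the specification.

```python
import math

def bounce_distance_mm(thickness_mm, theta_internal_deg):
    """Distance between successive reflections off the same surface of a
    planar light guide: D_bounce = 2 * t * tan(theta_internal)."""
    return 2.0 * thickness_mm * math.tan(math.radians(theta_internal_deg))

def out_coupled_gap_mm(thickness_mm, theta_internal_deg, w_out_mm):
    """Gap D_out between successive out-coupled beams, i.e.
    D_out = D_bounce - W_out (positive means a visible gap)."""
    return bounce_distance_mm(thickness_mm, theta_internal_deg) - w_out_mm

# Example: 1 mm thick guide, 50 degree internal angle, 2 mm projected width.
print(round(bounce_distance_mm(1.0, 50.0), 3))   # -> 2.384
print(round(out_coupled_gap_mm(1.0, 50.0, 2.0), 3))  # -> 0.384 (mm of gap)
```

As the text notes, the gap grows with steeper internal angles and thicker guides, which is why a single static geometry cannot stay seamless across a steered range of input angles.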
Although separations Dout may be small enough to be acceptable in some applications, for example expanding an already formed image in an AR/VR application, as explained hereinbefore, in a backlighting application prior to image formation by the display 26, 60, 104, 211, gaps may be more noticeable and less acceptable.
In an ideal situation, the out-coupled beams 505a, 505b, 505c and so forth will just touch one another. For example, referring also to Figure 28, a second example 507 of light guide propagation and out-coupling is shown, in which the out-coupled beams 505a, 505b, 505c follow one another seamlessly. In other words, Dbounce = Wout, so that Dout is zero. Expressed another way, the out-coupled beams 505a, 505b, 505c corresponding to successive internal reflections of the beam 504 are not separated by any distance.
The situation of the second example 507 could be engineered in a static backlighting application. However, considering the range of angles required for display systems described herein, the seamless out-coupled beams 505a, 505b, 505c are not practical to achieve because the bounce distance Dbounce varies with the angle of the input illumination light beam 502.
Referring also to Figure 29, a third example 508 of light guide propagation and out-coupling is shown.
In the third example 508, the width Wout of each out-coupled beam 505a, 505b, 505c is greater than the bounce distance Dbounce, so that each out-coupled beam 505a, 505b, 505c overlaps the preceding and following out-coupled beams 505a, 505b, 505c by an amount Dover.
In display systems described herein, the overlap Dover is preferably configured to be between 0 mm and 1 mm across the range of angles of the input illumination beam 502. In this way, gaps may be avoided using the minimum overlap Dover. For a given display system, the bounce distance Dbounce and the projected width Wout for the light guide 5, 20, 50, 102, 208 may be configured such that the difference Wout - Dbounce remains within the range 0 mm < Dover ≤ 1 mm across the desired range of incident angles. Expressed another way, the out-coupled beams 505a, 505b, 505c corresponding to successive internal reflections of the beam 504 overlap one another by a minimum of 0 mm and a maximum of 1 mm, across the range of input angles to the waveguide 501.
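The design condition above, 0 mm < Dover ≤ 1 mm over the whole steering range, can be checked numerically. The sketch below sweeps the internal angle and tests the overlap at each step; the thickness, beam width, and angle range are made-up example values, and `overlap_ok` is a hypothetical helper, not part of the described system.

```python
import math

def overlap_ok(thickness_mm, w_out_mm, theta_min_deg, theta_max_deg, steps=50):
    """True if 0 mm < D_over <= 1 mm for every internal angle in the range,
    using D_over = W_out - D_bounce with D_bounce = 2 * t * tan(theta)."""
    for i in range(steps + 1):
        theta = theta_min_deg + (theta_max_deg - theta_min_deg) * i / steps
        d_bounce = 2.0 * thickness_mm * math.tan(math.radians(theta))
        d_over = w_out_mm - d_bounce   # overlap of successive beams
        if not (0.0 < d_over <= 1.0):
            return False
    return True

# Example: 0.8 mm guide, 2.5 mm beam width, internal angles 45-55 degrees.
print(overlap_ok(0.8, 2.5, 45.0, 55.0))  # -> True
```

With these example numbers, Dover ranges from about 0.21 mm at 55 degrees to about 0.9 mm at 45 degrees, so the condition holds; widening the beam to 3 mm would push the overlap past 1 mm at the shallow end and the check would fail.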
In the examples described herein, the display 26, 60, 104, 211 may take the form of an amplitude modulating display such as, for example a transmissive liquid crystal device utilizing one or more polarizers.
In the examples described herein which utilise a scanning mirror, the input illumination light beam, for example input illumination light beam 101, 201, 502, need not be coherent. In some examples, an input light beam may be incoherent or partially coherent. Incoherent or partially coherent may correspond herein to a minimum spectral bandwidth of 2 nm at any point in time or within a 1 second integration period.
In the examples described herein, the in-coupling elements to light guides 5, 20, 50, 102, 501, 208 have taken the form of diffractive gratings 2, 22, 54, 207, and out-coupling elements have taken the form of diffractive gratings 4, 24, 52, 209. However, in general each of the in-coupling element and/or the out-coupling element may be selected from: a diffractive optical element, a metasurface, a partial reflector, a partially reflecting dielectric mirror, a partially reflecting dielectric coating, a prism and a prism array.
In some examples (not shown), a light guide may include an out-coupling element in the form of a number of partial bulk reflectors, for example disposed as a linear array. The partial bulk reflectors may take the form of partial dielectric mirrors, each having an angular bandwidth of ±15 degrees for a central or design wavelength. The design wavelength may be selected based on the application. For example, for a white light backlight, the design wavelength may be selected as the average across the emission spectrum. Variabilities in the out-coupling efficiency with wavelength may be compensated by modulating the colours of an image output using the display 26, 60, 104, 211.
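Selecting the design wavelength as the average across the emission spectrum, as described above, can be sketched as an intensity-weighted mean. The spectrum values below are invented for illustration and the function name is hypothetical.

```python
def design_wavelength_nm(spectrum):
    """Intensity-weighted mean wavelength of an emission spectrum,
    given as (wavelength_nm, relative_intensity) pairs."""
    total = sum(intensity for _, intensity in spectrum)
    return sum(w * i for w, i in spectrum) / total

# Illustrative white-backlight spectrum (values are made up for the sketch).
spectrum = [(450, 1.0), (530, 0.8), (620, 0.6)]
print(round(design_wavelength_nm(spectrum), 1))  # -> 519.2
```

A partial dielectric mirror designed for this wavelength would then, per the text, rely on colour modulation by the display 26, 60, 104, 211 to compensate residual wavelength-dependent out-coupling efficiency.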
The foregoing description has been given by way of example only and it will be appreciated by a person skilled in the art that modifications can be made without departing from the scope of the present invention as defined by the claims.
Claims
1. A display system comprising: an illumination engine configured to generate an input light beam having a first width, the illumination engine comprising: a segmented light source comprising an array of pixels; an optical coupling element arranged to receive light from the segmented light source and to output the input light beam to the light guide; a light guide optically coupled to the illumination engine and configured to receive the input light beam, the light guide being configured to convert the input light beam into an illumination light beam having a second width larger than the first width; and a display device configured to be illuminated by the illumination light beam and to convert the illumination light beam into an image light beam, wherein the illumination engine is configured to rotate the angle of the input light beam controllably relative to the orientation of the light guide by controlling which pixels of the segmented light source illuminate the optical coupling element.
2. The display system of claim 1, wherein the light guide is a planar waveguide.
3. The display system of claim 1 or claim 2, wherein the light guide comprises an incoupling element and an out-coupling element.
4. The display system of claim 3, wherein each of the in-coupling element and the out- coupling element is one of: a diffractive optical element, a partially reflective mirror, a metasurface, a partial reflector, a partially reflecting dielectric mirror, a partially reflecting dielectric coating, a prism and a prism array.
5. The display system of any of claims 1 to 4, wherein the illumination engine is configured to generate one or more further input light beams, each further input light beam having a respective width; wherein the light guide is configured to receive the one or more further input light beams, the light guide being configured to convert each further input light beam into a further illumination light beam having a width larger than the width of the respective further input light beam; wherein each of the illumination light beam and the one or more further illumination beams illuminates a different portion of the display device.
6. The display system of any of claims 1-5, further comprising: a camera configured to capture images of a user's eye and output the images; and
a processing unit configured to receive the images output by the camera, calculate a position of the user's eye based on the images, and calculate a rotation angle of the input light beam required to direct the image light beam onto an eye pupil of the user's eye based on the calculated position, wherein the processing unit is configured to output the calculated rotation angle to the illumination engine, and the illumination engine is configured to rotate the angle of the input light beam by the calculated rotation angle so as to direct the image light beam onto the eye pupil of the user's eye.
7. The display system of any preceding claim, further comprising an imaging optical system configured to project the image light beam generated by the display device onto a user's eye.
8. The display system of claim 7, wherein the imaging optical system comprises catadioptric optics.
9. The display system of claim 7 or claim 8, wherein the imaging optical system has an effective F-number greater than 4.
10. The display system of any preceding claim having a display exit pupil diameter in the range of 0.5 mm to 3 mm.
11. The display system of any preceding claim, wherein the illumination input light beam has a divergence or convergence angle in the range of 0° to 15°.
12. The display system of claim 3 or claim 4, wherein the light guide comprises an exit pupil expander including the out-coupling element, wherein the out-coupling element is a diffractive optical element configured to expand the beam width of the input light beam along at least a first axis.
13. The display system of claim 12, wherein the exit pupil expander further comprises another diffractive optical element configured to expand the beam width of the input light beam along a second axis perpendicular to the first axis.
14. The display system of any preceding claim, configured such that out-coupled beams corresponding to successive internal reflections overlap one another by an amount greater than 0 mm and less than or equal to 1 mm.
15. A display system comprising: an illumination engine configured to generate an input light beam having a first width; a diffractive light guide optically coupled to the illumination engine and configured to receive the input light beam, the light guide being configured to convert the input light beam into an illumination light beam having a second width larger than the first width; and a transmissive liquid crystal display device configured to be illuminated by the illumination light beam and to convert the illumination light beam into an image light beam, a camera configured to capture images of a user's eye and output the images; and a processing unit configured to receive the images output by the camera, calculate a position of the user's eye based on the images, and calculate a rotation angle of the input light beam required to direct the image light beam onto an eye pupil of the user's eye based on the calculated position; an imaging optical system configured to project the image light beam generated by the display device onto a user's eye; wherein the processing unit is configured to output the calculated rotation angle to the illumination engine, and the illumination engine is configured to rotate the angle of the input light beam controllably relative to the orientation of the light guide by the calculated rotation angle so as to direct the image light beam onto the eye pupil of the user's eye.
16. The display system of claim 15, wherein the illumination engine comprises: a light source configured to output light; a collimation module configured to collimate light from the light source to generate a collimated beam; a rotatable scanning mirror configured to reflect the collimated beam; an actuator configured to rotate the scanning mirror; and a control device configured to control operation of the actuator to rotate the scanning mirror.
17. The display system of claim 15, wherein the illumination engine comprises: a segmented light source comprising an array of pixels; an optical coupling element arranged to receive light from the segmented light source and to output the input light beam to the light guide; wherein the illumination engine is configured to control an output angle of the input light beam by controlling which pixels of the segmented light source illuminate the optical coupling element.
18. A light guide for use in a head mounted display, the light guide comprising: a transparent substrate having upper and lower surfaces; an in-coupling element on the upper surface of the substrate configured to couple light incident on the in-coupling element into the light guide; and
an exit pupil expander comprising an out-coupling element on the upper surface of the substrate configured to couple light incident on the out-coupling element out of the light guide; wherein the upper surface is planar, and the thickness of the transparent substrate in a direction normal to the upper surface varies across the plane of the upper surface.
19. The light guide according to claim 18, wherein the lower surface has at least one of a curved profile and a profile that is inclined relative to the upper surface when viewed in a direction parallel to the upper surface.
20. The light guide according to claim 18 or claim 19, wherein the transparent substrate has a thickness variation of more than 1%.
21. The light guide according to any of claims 18-20, wherein the transparent substrate comprises an upper layer including the upper surface and a lower layer including the lower surface, the upper layer and the lower layer being stacked in a vertical direction, wherein the upper layer and lower layers are formed from different optical materials.
22. The light guide according to any of claims 18-21, wherein the upper surface has a partially reflective coating.
23. The light guide according to any of claims 18-22, wherein each of the in-coupling element and the out-coupling element is one of: a diffractive optical element, a partially reflective mirror, a metasurface, a prism and a prism array.
24. A light guide for use in a head mounted display, the light guide comprising: a transparent substrate having upper and lower surfaces; an in-coupling element on one of the upper and lower surfaces of the substrate configured to couple light incident on the in-coupling element into the light guide; and an out-coupling element on the other of the upper and lower surfaces of the substrate configured to couple light incident on the out-coupling element out of the light guide; wherein the upper surface is planar, and the lower surface is reflective and has a curved profile when viewed in a direction parallel to the upper surface.
25. A display system comprising: a display device comprising an array of pixels and configured to emit image light representing an image; a converging lens configured to collect image light from the display device;
a shutter mechanism configured to partially block light that has passed through the converging lens from the display device, wherein the shutter mechanism defines an adjustable aperture through which light can pass; a dynamic lens configured to collect light that has passed through the aperture to form a display exit pupil of the display system; a telescopic optical system configured to image the display exit pupil in an eye box of the display system; and a control device configured to control the shutter mechanism and the dynamic lens, wherein the control device is configured to control the position and size of the aperture and the power of the dynamic lens so as to locate the image of the display exit pupil in the eye box.
26. The display system of claim 25, further comprising an eye-tracking system including: an eye tracking camera configured to image an eye of a user of the display system and output image data; and an eye position calculating unit configured to calculate a position of the user's eye based on the image data from the eye tracking camera; wherein the eye position calculating unit is configured to output the calculated position of the user's eye to the control device and the control device is configured to control the position and size of the aperture and the power of the dynamic lens so as to locate the image of the display exit pupil at the calculated position of the user's eye.
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB2400676.9A GB2637327A (en) | 2024-01-18 | 2024-01-18 | Display system with steerable eye box |
| GB2400676.9 | 2024-01-18 | ||
| GB2413748.1 | 2024-09-18 | ||
| GB2413748.1A GB2637375A (en) | 2024-01-18 | 2024-09-18 | Display system with steerable eye box |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025153823A1 true WO2025153823A1 (en) | 2025-07-24 |
Family
ID=94382472
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/GB2025/050077 Pending WO2025153823A1 (en) | 2024-01-18 | 2025-01-17 | Display system with steerable eye box |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025153823A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10775633B1 (en) * | 2018-04-10 | 2020-09-15 | Facebook Technologies, Llc | Exit pupil steering for near-eye displays |
| WO2022254243A1 (en) * | 2021-06-03 | 2022-12-08 | Creal Sa | Light-field projector having a small form factor |
| WO2023288014A1 (en) * | 2021-07-16 | 2023-01-19 | Meta Platforms Technologies, Llc | Display with image light steering |
| US11575881B2 (en) * | 2020-11-26 | 2023-02-07 | Sun Yat-Sen University | Near-eye display module releasing the eye's focus from fixed plane |
| WO2023154241A1 (en) * | 2022-02-08 | 2023-08-17 | Meta Platforms Technologies, Llc | Lightguide based illuminator for a reflective display panel |
Non-Patent Citations (1)
| Title |
|---|
| GEORGIOU ANDREAS ET AL: "A holographic near-eye display with glass form factor and spectacle-free operation", vol. 12624, 7 August 2023 (2023-08-07), page 1262408, XP060187796, ISSN: 0277-786X, ISBN: 978-1-5106-6457-9, DOI: 10.1117/12.2679479 * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12292571B2 (en) | Tilting array based display | |
| KR102717573B1 (en) | Waveguide illuminator | |
| US20250244588A1 (en) | Augmented and virtual reality display systems with shared display for left and right eyes | |
| KR102636903B1 (en) | Augmented reality display with multi-element adaptive lens for changing depth planes | |
| US10048500B2 (en) | Directionally illuminated waveguide arrangement | |
| WO2017150631A1 (en) | Head Mounted Display Using Spatial Light Modulator To Move the Viewing Zone | |
| US20200301239A1 (en) | Varifocal display with fixed-focus lens | |
| CN113544560A (en) | Virtual and augmented reality display system with light emitting microdisplay | |
| CN110088666B (en) | Head-mounted display and optical system thereof | |
| US9829716B1 (en) | Head mounted display | |
| CN111856749A (en) | Display device and method | |
| US20240329408A1 (en) | Method and system for performing optical imaging in augmented reality devices | |
| US20220155591A1 (en) | Eyebox expanding viewing optics assembly for stereo-viewing | |
| WO2025153823A1 (en) | Display system with steerable eye box | |
| GB2637375A (en) | Display system with steerable eye box | |
| JP2025500298A (en) | Method and system for implementing optical imaging in an augmented reality device - Patents.com |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25701024; Country of ref document: EP; Kind code of ref document: A1 |