CN120604162A - High-contrast pancake lens with transmissive polarizing absorber - Google Patents
- Publication number
- CN120604162A (application CN202480007338.1A)
- Authority
- CN
- China
- Prior art keywords
- light
- user
- absorbing element
- display
- reflective polarizer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
Abstract
A device may include an optical system that produces an image for a user and a light-absorbing element that absorbs a portion of light transmitted by the optical system. The optical system may include a beam-splitting element and a reflective polarizer that reflects circularly polarized light of a first handedness and transmits polarized light of a second handedness. Various other devices, systems, and manufacturing methods are also disclosed.
Description
Cross Reference to Related Applications
The present application claims the benefit of U.S. provisional application No. 63/482,400, filed on January 31, 2023.
Technical Field
The present disclosure relates to an optical apparatus, an optical system, and a method of manufacturing the same.
Disclosure of Invention
According to an aspect of the present invention, there is provided an apparatus comprising an optical system that produces an image for a user, wherein the optical system comprises a beam splitting element and a reflective polarizer that reflects circularly polarized light of a first handedness and transmits polarized light of a second handedness, and a light absorbing element that absorbs a portion of the light transmitted by the optical system.
Optionally, the reflective polarizer comprises a cholesteric reflective polarizer.
Optionally, the reflective polarizer comprises a reflective linear polarizer and a retarder.
Optionally, the reflective linear polarizer comprises at least one of a birefringent polymeric multilayer optical film or a wire grid.
Optionally, the absorbed portion of light is greater than at least one of 10% of the light, 20% of the light, 30% of the light, 40% of the light, or 50% of the light.
Optionally, the light absorbing element in the pass polarization has a higher optical transparency for wavelengths emitted by the display than for photopic weighted white light.
Optionally, the light absorbing element in the pass polarization has a lower absorbance for wavelengths of about 460 nanometers, about 520 nanometers, and about 615 nanometers than for other wavelengths.
Optionally, the light absorbing element is located between the reflective polarizer and the position of the user's eye.
Optionally, the light absorbing element is located between the reflective polarizer and the beam splitter.
Optionally, the absorbance of the light absorbing element is higher at the center than at 50% of the outer radius by at least one of 10% or more, 20% or more, or 30% or more (an illustrative profile is sketched below).
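To make the radially varying absorbance concrete, the following sketch models one plausible profile. It is a minimal illustration assuming a smooth cosine-squared falloff; the profile shape, function name, and the particular center and edge values are assumptions for illustration only, not taken from this disclosure.

```python
import numpy as np

# Toy center-weighted absorbance profile (assumption: cosine-squared
# falloff from an assumed center value to an assumed edge value).
def absorbance(r_norm: float, a_center: float = 0.45, a_edge: float = 0.30) -> float:
    """Absorbance as a function of normalized radius r/R in [0, 1]."""
    return a_edge + (a_center - a_edge) * np.cos(np.pi * r_norm / 2) ** 2

for r in (0.0, 0.5, 1.0):
    print(f"r/R = {r:.1f}: absorbance = {absorbance(r):.3f}")
# Here the center value (0.450) is 20% higher than the value at half the
# outer radius (0.375), matching the "20% or more" option above.
```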
According to another aspect of the present invention, there is provided a system comprising a display, an optical system that produces an image from the display for a user, wherein the optical system comprises a beam splitting element and a reflective polarizer that reflects circularly polarized light of a first handedness and transmits polarized light of a second handedness, and a light absorbing element that absorbs a portion of the light transmitted by the optical system.
Optionally, the reflective polarizer comprises a cholesteric reflective polarizer.
Optionally, the reflective polarizer comprises a reflective linear polarizer and a retarder.
Optionally, the reflective linear polarizer comprises at least one of a birefringent polymeric multilayer optical film or a wire grid.
Optionally, the absorbed portion of light is greater than 10% of the light.
Optionally, the light absorbing element in the pass polarization has a higher optical transparency for wavelengths emitted by the display than for photopic weighted white light.
Optionally, the light absorbing element in the pass polarization has a lower absorbance for wavelengths of about 460 nanometers, about 520 nanometers, and about 615 nanometers than for other wavelengths.
Optionally, the light absorbing element is located between the reflective polarizer and the position of the user's eye.
Optionally, the light absorbing element is located between the reflective polarizer and the beam splitter.
According to yet another aspect of the present invention, there is provided a manufacturing method comprising providing an optical system between a display and an eyebox, the optical system producing an image from the display for a user, wherein the optical system comprises a beam splitting element and a reflective polarizer that reflects circularly polarized light of a first handedness and transmits polarized light of a second handedness, and providing a light absorbing element between the display and the eyebox, the light absorbing element absorbing a portion of the light transmitted by the optical system.
Drawings
The accompanying drawings illustrate many exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
FIG. 1 shows a schematic diagram of an example high-contrast pancake lens with a pass-polarization absorber.
FIG. 2 shows an example high-contrast pancake lens with a pass-polarization absorber.
FIG. 3 shows an example emission from a display through the high-contrast pancake lens of FIG. 2 with a pass-polarization absorber.
FIG. 4 shows an example image formed at a user's retina by the example emission of FIG. 3.
FIG. 5 shows another example image formed at a user's retina by a modified version of the example emission of FIG. 3.
FIG. 6 shows another example image formed at a user's retina by another modified version of the example emission of FIG. 3.
FIG. 7 shows an example image formed at a user's retina, as in FIG. 4, using a modified version of the high-contrast pancake lens of FIG. 2 with a pass-polarization absorber.
FIG. 8 shows an example image formed at a user's retina, as in FIG. 5, using a modified version of the high-contrast pancake lens of FIG. 2 with a pass-polarization absorber.
FIG. 9 shows an example image formed at a user's retina, as in FIG. 6, using a modified version of the high-contrast pancake lens of FIG. 2 with a pass-polarization absorber.
FIG. 10 is a flow chart of an example method of manufacturing a high-contrast pancake lens with a pass-polarization absorber.
FIG. 11 is an illustration of exemplary augmented reality glasses that may be used in connection with embodiments of the present disclosure.
FIG. 12 is an illustration of an exemplary virtual reality headset that may be used in connection with embodiments of the present disclosure.
Throughout the drawings, identical reference numerals and descriptions indicate similar, but not necessarily identical elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
Detailed Description
Pancake lenses can be compact and lightweight while offering a wide field of view and high resolution. In some applications (e.g., head-mounted displays for virtual reality and augmented reality), the use of pancake lenses may help to enhance immersion, improve user comfort, and/or increase design possibilities. However, pancake lenses may reduce display contrast: light (e.g., from the original display or from the environment) can reflect from the user's eyes and surrounding areas, and some of this light may be reflected back to the user by the pancake lens. This light reflected back to the user may form ghost images, reduce overall image contrast, or both.
The present disclosure relates generally to pancake lenses with pass-polarization absorbers. By disposing a light absorbing element between the user's eye and the display (e.g., between the user's eye and the pancake lens), light reflected from the user's eye or face may be absorbed rather than reflected back to the user. In one case, the light absorbing element may selectively absorb wavelengths not generated by the display, thereby eliminating the contrast degradation and ghosting caused by ambient light. In another case, the light absorbing element may absorb a portion of the light at wavelengths generated by the display. In that case, the benefit of the absorber may outweigh the transmission loss it imposes on the display, because the absorber attenuates light from the display in only one pass, while it attenuates light that reflects from the user's eyes (and, for example, from the facial region around the eyes) back toward the user in at least two additional passes.
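This pass-counting argument can be made concrete with a short calculation. The sketch below is a minimal model assuming an absorber with uniform single-pass transmittance T = 1 - a in the pass polarization and ignoring all other losses; the function name and printed values are illustrative, not taken from the patent.

```python
def ghost_suppression(absorption: float) -> dict:
    """Compare signal loss (one pass through the absorber) with ghost loss
    (three passes: display -> eye, eye -> lens, lens -> eye)."""
    T = 1.0 - absorption           # single-pass transmittance
    signal = T                     # image light crosses the absorber once
    ghost = T ** 3                 # eye-reflected light crosses it thrice
    return {"signal": signal, "ghost": ghost, "contrast_gain": signal / ghost}

for a in (0.10, 0.25, 0.50):
    r = ghost_suppression(a)
    print(f"absorption {a:.0%}: signal x{r['signal']:.2f}, "
          f"ghost x{r['ghost']:.3f}, contrast gain x{r['contrast_gain']:.2f}")
```

Under these assumptions the contrast gain scales as 1/T², so a 50% absorber suppresses the ghost path four times more than the signal path, consistent with the four-fold contrast improvement discussed for FIG. 6 below.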
By absorbing light reflected from the user (and thus preventing that light from reflecting back to the user), the devices and systems described herein may reduce potential causes of image degradation, such as ghost images and contrast loss. In the context of VR/AR systems, this may improve user immersion and overall experience.
FIG. 1 shows a schematic diagram of an apparatus 100 with a high-contrast pancake lens. As shown in FIG. 1, the apparatus 100 may include a pancake lens assembly 110. Pancake lens assembly 110 may include a back-side optical element 115 (e.g., a lens) and a front-side optical element 120 (e.g., a lens). In some examples, the back-side optical element 115 can include a wave plate surface 125 and a mirror 130. Front-side optical element 120 may include a wave plate surface 135 and a reflective polarizer surface 140. Light emitted from display 105 may travel along a refractive path through pancake lens assembly 110 and reach the user's eye 150.
As can be appreciated, in some examples, light from the display 105 and/or from the environment can reflect from the user's eyes 150 (or another portion of the user's face) and travel toward the pancake lens assembly 110. Typically, a portion of this light reflected from the user may be in a polarization state such that it passes through reflective polarizer surface 140 and may be reflected back toward the user (e.g., by mirror 130). However, the absorber 145 may absorb light reflected from the user's eye 150 (e.g., as it travels from the user's eye 150 toward the pancake lens assembly 110 and/or as it returns from the pancake lens assembly 110 toward the user's eye 150). Thus, any interference between light reflected from the user's eye 150 and the image of the display 105 may be mitigated.
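The polarization bookkeeping in this geometry can be sketched with Jones calculus. The snippet below is a simplified, unfolded model that assumes ideal elements at normal incidence and folds the reflections out so that all matrices act in one fixed frame; it illustrates only the classic pancake round trip, in which a double pass through a quarter-wave plate at 45 degrees acts as a half-wave plate and rotates the blocked linear polarization onto the pass axis of the reflective polarizer. Names and conventions are illustrative.

```python
import numpy as np

def rot(theta: float) -> np.ndarray:
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def retarder(phase: float, theta: float) -> np.ndarray:
    """Jones matrix of a retarder with retardance `phase`, fast axis at `theta`."""
    R = rot(theta)
    return R @ np.diag([1.0, np.exp(1j * phase)]) @ R.T

qwp45 = retarder(np.pi / 2, np.pi / 4)        # quarter-wave plate at 45 deg
pol_x = np.array([[1.0, 0.0], [0.0, 0.0]])    # reflective polarizer, pass axis x

# Light reaching the reflective polarizer the first time: linear along y,
# which the polarizer reflects rather than transmits.
E_blocked = np.array([0.0, 1.0])

# Round trip back through the quarter-wave plate, off the partial mirror,
# and through the quarter-wave plate again: two QWP passes act as one HWP.
E_return = qwp45 @ qwp45 @ E_blocked          # y rotated onto x
E_out = pol_x @ E_return                      # transmitted toward the eye

print(np.round(np.abs(E_return), 6))          # -> [1. 0.]
print("power through pass axis:", round(float(np.abs(E_out) @ np.abs(E_out)), 6))
```

In this idealized trace, all of the signal power emerges in the pass polarization; a real system adds the partial-mirror split and the absorber losses modeled elsewhere in this description.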
FIG. 2 shows an apparatus 200 with a high-contrast pancake lens. As shown in FIG. 2, the apparatus 200 may include a pancake lens system 215. Pancake lens system 215 may include lens 216 and lens 217. Pancake lens system 215 may also include a partially transmissive and partially reflective coating 220 and a retarder 225 (e.g., adjacent to and/or coupled to lens 216). Pancake lens system 215 may also include a reflective polarizer 230 and an absorber 235 (e.g., adjacent to and/or coupled to lens 217).
In some examples, absorber 235 may have high absorption at least for wavelengths of light that are not emitted by display 205. For example, the absorber 235 may have a higher absorption for at least one wavelength of light that is not emitted by the display 205 than for any wavelength of light emitted by the display 205. In some examples, the absorber may have a higher optical transparency for light in the pass polarization state at the wavelengths emitted by the display 205 than for photopic weighted white light (PWWL). For example, the percentage increase in the absorbance of pass-polarized PWWL relative to the absorbance of pass-polarized light at the display wavelengths may be 10% or more, 20% or more, 50% or more, 100% or more, 200% or more, or 500% or more.
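One way to see this figure of merit is to compare the pass-polarization absorbance at the display primaries with the absorbance weighted by the photopic curve. The sketch below uses a toy transmittance spectrum with narrow high-transmission windows near 460/520/615 nm and a Gaussian stand-in for the photopic sensitivity; all spectral shapes and numbers are illustrative assumptions, not measured data from this disclosure.

```python
import numpy as np

wl = np.arange(400, 701, dtype=float)               # wavelength grid in nm
photopic = np.exp(-0.5 * ((wl - 555) / 45.0) ** 2)  # crude V(lambda) stand-in

def transmittance(wl: np.ndarray) -> np.ndarray:
    """Toy pass-polarization transmittance: ~90% in narrow windows at the
    display primaries, ~30% elsewhere."""
    t = np.full_like(wl, 0.30)
    for peak in (460.0, 520.0, 615.0):
        t += 0.60 * np.exp(-0.5 * ((wl - peak) / 8.0) ** 2)
    return t

T = transmittance(wl)
T_display = T[np.isin(wl, (460.0, 520.0, 615.0))].mean()   # at the primaries
T_pwwl = (T * photopic).sum() / photopic.sum()             # photopic-weighted

A_display, A_pwwl = 1 - T_display, 1 - T_pwwl
print(f"absorbance at display wavelengths: {A_display:.2f}")
print(f"absorbance for photopic white:     {A_pwwl:.2f}")
print(f"relative increase: {100 * (A_pwwl - A_display) / A_display:.0f}%")
```

With these toy numbers, the photopic-weighted absorbance comes out several times higher than the absorbance at the display primaries, in line with the 100%-and-above relative increases contemplated above.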
In this way, absorber 235 may reduce or eliminate ghost images and/or contrast degradation caused by ambient light. In some examples, absorber 235 may also absorb light at the wavelengths emitted by display 205. In various examples, absorber 235 may be a pass-polarization-axis absorber and may preferentially attenuate light reflected from the user's eyes and/or facial region relative to the signal light (e.g., from display 205) that forms the image for the user. In various examples, the contrast of the device 200 may exceed 500:1, may exceed 1000:1, and/or may exceed 2000:1.
FIG. 3 shows an example emission 310 from display 205 passing through the device 200 of FIG. 2, which includes a high-contrast pancake lens with a pass-polarization absorber.
FIG. 4 shows an example image formed at a user's retina by the example emission of FIG. 3. In one example, the area of emission 310 on display 205 may be 2 millimeters by 2 millimeters. As shown in plot 400 of FIG. 4, an image 410 from emission 310 may be formed at the user's retina. The non-image light is shown by the irradiance of region 420 (e.g., the area outside of image 410). In some examples, the background light may be about 1% of emission 310, providing a contrast ratio of about 100:1.
FIG. 5 shows another example image formed at a user's retina by a modified version of the example emission of FIG. 3. In the example of FIG. 5, emission 310 may be about 33.3% brighter than in the example of FIG. 4. In this example, an absorber (e.g., absorber 235) may absorb 25% of the light in the pass polarization. As shown in plot 500 of FIG. 5, an image 510 from emission 310 may be formed at the user's retina, and the non-image light is shown by the irradiance of region 520.
FIG. 6 shows another example image formed at a user's retina by another modified version of the example emission of FIG. 3. In the example of FIG. 6, emission 310 may be about 100% brighter than in the example of FIG. 4. In this example, an absorber (e.g., absorber 235) may absorb 50% of the light in the pass polarization. The modified system may thus have four times the contrast ratio of the system shown in FIG. 4. As shown in plot 600 of FIG. 6, an image 610 from emission 310 may be formed at the user's retina, and the non-image light is shown by the irradiance of region 620.
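The numbers in FIGS. 5 and 6 follow the same single-pass/triple-pass budget described earlier: boosting the display by 1/(1 - a) keeps the image brightness constant while eye-reflected light still makes two extra passes through the absorber. A quick check, with the pass counts as the only assumption:

```python
# Verify the figure numbers: boost the display by 1/(1 - a) so the image
# brightness is unchanged, while stray light makes three passes in total.
for a in (0.25, 0.50):
    boost = 1.0 / (1.0 - a)            # 33.3% brighter, 100% brighter
    image = boost * (1.0 - a)          # one pass: unchanged (== 1.0)
    stray = boost * (1.0 - a) ** 3     # three passes through the absorber
    print(f"absorb {a:.0%}: display {boost - 1:+.1%}, "
          f"image x{image:.2f}, contrast gain x{image / stray:.2f}")
```

At 25% absorption the contrast gain is about 1.78x, and at 50% absorption it is exactly 4x, matching the four-fold improvement described for FIG. 6.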
FIG. 7 shows an example image formed at a user's retina, as in FIG. 4, using a modified version of the high-contrast pancake lens of FIG. 2 with a pass-polarization absorber. In the example of FIG. 7, a quarter-wave retarder may be added to the device 200 of FIG. 2 between the user's eye 240 and the absorber 235.
FIG. 8 shows an example image formed at a user's retina, as in FIG. 5, using a modified version of the high-contrast pancake lens of FIG. 2 with a pass-polarization absorber. In the example of FIG. 8, a quarter-wave retarder may be added to the device 200 of FIG. 2 between the user's eye 240 and the absorber 235.
FIG. 9 shows an example image formed at a user's retina, as in FIG. 6, using a modified version of the high-contrast pancake lens of FIG. 2 with a pass-polarization absorber. In the example of FIG. 9, a quarter-wave retarder may be added to the device 200 of FIG. 2 between the user's eye 240 and the absorber 235.
FIG. 10 is a flow chart of an example method 1000 of manufacturing a high-contrast pancake lens with a pass-polarization absorber. As shown in FIG. 10, at step 1010, method 1000 may include disposing an optical system between a display and an eyebox, the optical system producing an image from the display for a user, wherein the optical system includes a beam splitting element and a reflective polarizer.
At step 1020, method 1000 may include disposing a light absorbing element between the display and the eyebox, the light absorbing element absorbing a portion of the light transmitted by the optical system.
Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, and may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof. Artificial reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof that are used, for example, to create content in an artificial reality and/or are otherwise used in an artificial reality (e.g., to perform activities in an artificial reality).
Artificial reality systems may be implemented in a variety of different form factors and configurations. Some artificial reality systems may be designed to work without near-eye displays (NEDs). Other artificial reality systems may include an NED that also provides visibility into the real world (e.g., augmented reality system 1100 in FIG. 11) or that visually immerses a user in an artificial reality (e.g., virtual reality system 1200 in FIG. 12). While some artificial reality devices may be self-contained systems, other artificial reality devices may communicate and/or coordinate with external devices to provide an artificial reality experience to a user. Examples of such external devices include handheld controllers, mobile devices, desktop computers, devices worn by a user, devices worn by one or more other users, and/or any other suitable external system.
Turning to FIG. 11, the augmented reality system 1100 may include an eye-wear device 1102 having a frame 1110 configured to hold a left display device 1115(A) and a right display device 1115(B) in front of the user's eyes. Display devices 1115(A) and 1115(B) may function together or independently to present an image or series of images to a user. Although the augmented reality system 1100 includes two displays, embodiments of the present disclosure may be implemented in augmented reality systems having a single NED or more than two NEDs.
In some embodiments, the augmented reality system 1100 may include one or more sensors, such as sensor 1140. Sensor 1140 may generate measurement signals in response to motion of the augmented reality system 1100 and may be located on substantially any portion of frame 1110. Sensor 1140 may represent one or more of a variety of different sensing mechanisms, such as a position sensor, an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some embodiments, the augmented reality system 1100 may or may not include sensor 1140 or may include more than one sensor. In embodiments in which sensor 1140 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 1140. Examples of sensor 1140 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof.
In some examples, the augmented reality system 1100 may also include a microphone array having a plurality of acoustic transducers 1120(A) through 1120(J) (collectively, acoustic transducers 1120). Acoustic transducers 1120 may represent transducers that detect air pressure variations induced by sound waves. Each acoustic transducer 1120 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). The microphone array in FIG. 11 may include, for example, ten acoustic transducers: 1120(A) and 1120(B), which may be designed to be placed inside a corresponding ear of the user; 1120(C), 1120(D), 1120(E), 1120(F), 1120(G), and 1120(H), which may be positioned at various locations on frame 1110; and/or 1120(I) and 1120(J), which may be positioned on a corresponding neckband 1105.
In some embodiments, one or more of the acoustic transducers 1120(A) through 1120(J) may be used as output transducers (e.g., speakers). For example, acoustic transducers 1120(A) and/or 1120(B) may be earbuds or any other suitable type of earphone or speaker.
The configuration of the acoustic transducers 1120 of the microphone array may vary. Although the augmented reality system 1100 is shown in FIG. 11 as having ten acoustic transducers 1120, the number of acoustic transducers 1120 may be greater or less than ten. In some embodiments, using a greater number of acoustic transducers 1120 may increase the amount of audio information collected and/or the sensitivity and accuracy of that information. Conversely, using a smaller number of acoustic transducers 1120 may reduce the computing power required by an associated controller 1150 to process the collected audio information. In addition, the position of each acoustic transducer 1120 of the microphone array may vary. For example, the position of an acoustic transducer 1120 may include a defined position on the user, defined coordinates on frame 1110, an orientation associated with each acoustic transducer 1120, or some combination thereof.
Acoustic transducers 1120(A) and 1120(B) may be positioned on different parts of the user's ear, such as behind the pinna, behind the tragus, and/or within the auricle or fossa. Alternatively, there may be additional acoustic transducers 1120 on or around the ear in addition to acoustic transducers 1120 inside the ear canal. Positioning an acoustic transducer 1120 near the ear canal of a user may enable the microphone array to collect information about how sounds reach the ear canal. By positioning at least two of the acoustic transducers 1120 on either side of the user's head (e.g., as binaural microphones), the augmented reality system 1100 may simulate binaural hearing and capture a 3D stereo sound field around the user's head. In some embodiments, acoustic transducers 1120(A) and 1120(B) may be connected to the augmented reality system 1100 via a wired connection 1130, while in other embodiments acoustic transducers 1120(A) and 1120(B) may be connected to the augmented reality system 1100 via a wireless connection (e.g., a Bluetooth connection). In still other embodiments, acoustic transducers 1120(A) and 1120(B) may not be used in conjunction with the augmented reality system 1100 at all.
The acoustic transducer 1120 on the frame 1110 can be positioned in a variety of different ways along the length of the temple, across the bridge, above or below the display devices 1115 (a) and 1115 (B), or some combination thereof. The acoustic transducer 1120 may also be oriented such that the microphone array is capable of detecting sound in a wide range of directions around a user wearing the augmented reality system 1100. In some embodiments, an optimization process may be performed during manufacture of the augmented reality system 1100 to determine the relative positioning of each acoustic transducer 1120 in the microphone array.
In some examples, the augmented reality system 1100 may include or be connected to an external device (e.g., a paired device), such as neckband 1105. Neckband 1105 generally represents any type or form of paired device. Accordingly, the following discussion of neckband 1105 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external computing devices, etc.
As shown, neckband 1105 may be coupled to eye-wear device 1102 via one or more connectors. These connectors may be wired or wireless and may include electrical and/or non-electrical (e.g., structural) components. In some cases, eye-wear device 1102 and neckband 1105 may operate independently without any wired or wireless connection between them. Although FIG. 11 shows the components of eye-wear device 1102 and neckband 1105 in example locations on eye-wear device 1102 and neckband 1105, these components may be located elsewhere on, and/or distributed differently across, eye-wear device 1102 and/or neckband 1105. In some embodiments, the components of eye-wear device 1102 and neckband 1105 may be located on one or more additional peripheral devices paired with eye-wear device 1102, on neckband 1105, or some combination thereof.
Pairing an external device, such as neckband 1105, with an augmented reality eye-wear device may enable the eye-wear device to achieve the form factor of a pair of glasses while still providing sufficient battery and computing power for expanded capabilities. Some or all of the battery power, computing resources, and/or additional features of the augmented reality system 1100 may be provided by the paired device or shared between the paired device and the eye-wear device, thereby reducing the weight, heat profile, and form factor of the eye-wear device overall while still retaining the desired functionality. For example, neckband 1105 may allow components that would otherwise be included on an eye-wear device to be included in neckband 1105, since users may tolerate a heavier weight load on their shoulders than on their heads. Neckband 1105 may also have a larger surface area over which to diffuse and disperse heat to the ambient environment. Thus, neckband 1105 may allow for greater battery and computing capacity than would otherwise be possible on a stand-alone eye-wear device. Because weight carried in neckband 1105 may be less invasive to a user than weight carried in eye-wear device 1102, a user may tolerate wearing a lighter eye-wear device and carrying or wearing the paired device for greater lengths of time than the user would tolerate a heavy, stand-alone eye-wear device, thereby enabling users to more fully incorporate artificial reality environments into their day-to-day activities.
Neckband 1105 may be communicatively coupled with eye-wear device 1102 and/or with other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to the augmented reality system 1100. In the embodiment of FIG. 11, neckband 1105 may include two acoustic transducers (e.g., 1120(I) and 1120(J)) that are part of the microphone array (or potentially form their own microphone sub-array). Neckband 1105 may also include a controller 1125 and a power supply 1135.
Acoustic transducers 1120(I) and 1120(J) of neckband 1105 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital). In the embodiment of FIG. 11, acoustic transducers 1120(I) and 1120(J) may be positioned on neckband 1105, thereby increasing the distance between the neckband acoustic transducers 1120(I) and 1120(J) and the other acoustic transducers 1120 positioned on eye-wear device 1102. In some cases, increasing the distance between the acoustic transducers 1120 of the microphone array may improve the accuracy of beamforming performed via the microphone array. For example, if a sound is detected by acoustic transducers 1120(C) and 1120(D), and the distance between acoustic transducers 1120(C) and 1120(D) is greater than, e.g., the distance between acoustic transducers 1120(D) and 1120(E), the determined source location of the detected sound may be more accurate than if the sound had been detected by acoustic transducers 1120(D) and 1120(E).
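The spacing argument can be illustrated with a far-field time-difference-of-arrival (TDOA) model: a source at angle theta from broadside produces a delay dt = d * sin(theta) / c between two microphones separated by d, so for a fixed timing resolution the smallest resolvable angle shrinks as d grows. The sketch below is a simplified illustration with assumed spacings and sample rate, not values from this disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def angle_resolution_deg(spacing_m: float, timing_res_s: float) -> float:
    """Smallest broadside angle resolvable with one timing step, assuming a
    far-field source and a single microphone pair."""
    ratio = min(1.0, SPEED_OF_SOUND * timing_res_s / spacing_m)
    return math.degrees(math.asin(ratio))

dt = 1.0 / 48_000  # one sample period at an assumed 48 kHz audio rate
for d in (0.02, 0.15):  # e.g., a close frame pair vs. a frame-to-neckband pair
    print(f"spacing {d * 100:.0f} cm -> ~{angle_resolution_deg(d, dt):.1f} deg per sample")
```

At the same sample rate, the wider pair resolves roughly an order of magnitude finer angles, which is the intuition behind placing transducers 1120(I) and 1120(J) on the neckband.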
Controller 1125 of neckband 1105 may process information generated by the sensors on neckband 1105 and/or the augmented reality system 1100. For example, controller 1125 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 1125 may perform a direction-of-arrival (DOA) estimation to estimate the direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 1125 may populate an audio data set with this information. In embodiments in which the augmented reality system 1100 includes an inertial measurement unit, controller 1125 may compute all inertial and spatial calculations from the IMU located on eye-wear device 1102. A connector may convey information between the augmented reality system 1100 and neckband 1105 and between the augmented reality system 1100 and controller 1125. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by the augmented reality system 1100 to neckband 1105 may reduce weight and heat in eye-wear device 1102, making it more comfortable for the user.
Power supply 1135 in neckband 1105 may provide power to eye-wear device 1102 and/or to neckband 1105. Power supply 1135 may include, without limitation, lithium-ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage. In some cases, power supply 1135 may be a wired power supply. Including power supply 1135 on neckband 1105 instead of on eye-wear device 1102 may help to better distribute the weight and heat generated by power supply 1135.
As mentioned, some artificial reality systems may, rather than blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system, such as virtual reality system 1200 in FIG. 12, that mostly or completely covers a user's field of view. Virtual reality system 1200 may include a front rigid body 1202 and a band 1204 shaped to fit around a user's head. Virtual reality system 1200 may also include output audio transducers 1206(A) and 1206(B). Furthermore, although not shown in FIG. 12, front rigid body 1202 may include one or more electronic elements, including one or more electronic displays, one or more inertial measurement units (IMUs), one or more tracking emitters or detectors, and/or any other suitable device or system for creating an artificial reality experience.
Artificial reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in the augmented reality system 1100 and/or the virtual reality system 1200 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, microLED displays, organic LED (OLED) displays, digital light projection (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. These artificial reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow additional flexibility for varifocal adjustments or for correcting a user's refractive error. Some of these artificial reality systems may also include optical subsystems having one or more lenses (e.g., concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including collimating light (e.g., making an object appear at a greater distance than its physical distance), magnifying light (e.g., making an object appear larger than its actual size), and/or relaying light (e.g., to the viewer's eyes). These optical subsystems may be used in a non-pupil-forming architecture (e.g., a single lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (e.g., a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
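The pincushion/barrel trade-off mentioned above can be captured with the common first-order radial distortion model r' = r(1 + k * r^2), where k > 0 produces pincushion and k < 0 produces barrel distortion, so that cascading stages of opposite sign approximately cancels the distortion to first order. A minimal sketch with assumed coefficients:

```python
# First-order radial distortion model: k > 0 pincushion, k < 0 barrel.
def distort(r: float, k: float) -> float:
    return r * (1.0 + k * r * r)

k = 0.05  # illustrative coefficient, not taken from this disclosure
for r in (0.25, 0.5, 0.75, 1.0):      # normalized field radius
    pin = distort(r, +k)              # collimating stage adds pincushion
    corrected = distort(pin, -k)      # opposite-sign stage adds barrel
    print(f"r = {r:.2f}: pincushion -> {pin:.4f}, after barrel stage -> {corrected:.4f}")
```

The cancellation is only approximate (an exact inverse would require solving the cubic), but it shows why a pupil-forming multi-lens stage with barrel distortion can nullify the pincushion of a single collimating lens.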
Some of these artificial reality systems described herein may include one or more projection systems in addition to or instead of using a display screen. For example, the display device in the augmented reality system 1100 and/or the display device in the virtual reality system 1200 may include a micro LED projector that projects light (using, for example, a waveguide) into the display device, such as a transparent combiner lens that allows ambient light to pass through. The display device may refract the projected light toward the pupil of the user, and may enable the user to view both the artificial reality content and the real world simultaneously. The display device may achieve this using any of a variety of different optical components including waveguide components (e.g., holographic waveguide elements, planar waveguide elements, diffractive waveguide elements, polarizing waveguide elements, and/or reflective waveguide elements), light manipulating surfaces and elements (e.g., diffractive elements and gratings, reflective elements and gratings, and refractive elements and gratings), coupling elements, and the like. The artificial reality system may also be configured with any other suitable type or form of image projection system, such as a retinal projector for a virtual retinal display.
The artificial reality systems described herein may also include various types of computer vision components and subsystems. For example, the augmented reality system 1100 and/or the virtual reality system 1200 may include one or more optical sensors, such as two-dimensional (2D) cameras or 3D cameras, structured light emitters and detectors, time-of-flight depth sensors, single beam rangefinders or scanning laser rangefinders, 3D laser radar (LiDAR) sensors, and/or any other suitable type or form of optical sensor. The artificial reality system may process data from one or more of these sensors to identify the user's location, map the real world, provide content to the user regarding the real world surroundings, and/or perform various other functions.
The artificial reality system described herein may also include one or more input audio transducers and/or output audio transducers. The output audio transducer may include a voice coil speaker, a ribbon speaker, an electrostatic speaker, a piezoelectric speaker, a bone conduction transducer, a cartilage conduction transducer, a tragus vibration transducer, and/or any other suitable type or form of audio transducer. Similarly, the input audio transducer may include a condenser microphone, a moving coil microphone (dynamic microphone), a ribbon microphone, and/or any other type or form of input transducer. In some embodiments, a single transducer may be used for both audio input and audio output.
In some embodiments, the artificial reality systems described herein may also include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, clothing, hand-held controllers, environmental devices (e.g., chairs, floor mats, etc.), and/or any other type of device or system. Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms. Haptic feedback systems may be implemented independently of, within, and/or in combination with other artificial reality devices.
By providing haptic perception, auditory content, and/or visual content, an artificial reality system may create a complete virtual experience or enhance a user's real-world experience in various contexts and environments. For example, an artificial reality system may assist or augment a user's perception, memory, or cognition within a particular environment. Some systems may enhance user interaction with others in the real world or may enable more immersive interaction with others in the virtual world. The artificial reality system may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, commercial enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, vision aids, etc.). Embodiments disclosed herein may implement or enhance the user's artificial reality experience in one or more of these contexts and environments and/or in other contexts and environments.
The order of process parameters and steps described and/or illustrated herein is presented as an example only and may be varied as desired. For example, although the steps illustrated and/or described herein may be illustrated or discussed in a particular order, the steps need not be performed in the order illustrated or discussed. Various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
The previous description has been provided to enable any person skilled in the art to best utilize various aspects of the exemplary embodiments disclosed herein. This exemplary description is not intended to be exhaustive or limited to any precise form disclosed. Many modifications and variations are possible without departing from the scope of the present disclosure. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. In determining the scope of the present disclosure, reference should be made to any claims appended hereto and their equivalents.
The terms "connected to" and "coupled to" (and derivatives thereof) as used in the specification and/or claims should be interpreted as allowing for direct connection and indirect (i.e., via other elements or components) unless otherwise indicated. Furthermore, the terms "a" or "an" as used in the specification and/or claims are to be interpreted as meaning at least one of ". Finally, for convenience of use, the terms "comprising" and "having" (and their derivatives) are used in the description and/or claims, interchangeably with the word "comprising" and have the same meaning.
Claims (15)
1. An apparatus, comprising:
an optical system that produces an image for a user, wherein the optical system comprises:
a beam splitting element; and
a reflective polarizer that reflects circularly polarized light of a first handedness and transmits polarized light of a second handedness; and
a light absorbing element that absorbs a portion of the light transmitted by the optical system.
2. The apparatus of claim 1, wherein the reflective polarizer comprises a cholesteric reflective polarizer.
3. The apparatus of claim 1, wherein the reflective polarizer comprises:
a reflective linear polarizer; and
a retarder, optionally wherein the reflective linear polarizer comprises at least one of:
a birefringent polymeric multilayer optical film; or
a wire grid.
4. The apparatus of any preceding claim, wherein the absorbed portion of light is greater than at least one of:
10% of the light;
20% of the light;
30% of the light;
40% of the light; or
50% of the light.
5. The apparatus of any preceding claim, wherein any one or more of the following applies:
(a) the light absorbing element in the pass polarization has a higher optical transparency for wavelengths emitted by the display than for photopic weighted white light; or
(b) the light absorbing element in the pass polarization has a lower absorbance for wavelengths of about 460 nanometers, about 520 nanometers, and about 615 nanometers than for other wavelengths.
6. The apparatus of any preceding claim, wherein either:
(a) the light absorbing element is located between the reflective polarizer and the position of the user's eye; or
(b) the light absorbing element is located between the reflective polarizer and the beam splitter.
7. The apparatus of any preceding claim, wherein the absorbance of the light absorbing element is higher at the center than at 50% of the outer radius by at least one of:
10% or more;
20% or more; or
30% or more.
8. A system, comprising:
A display;
an optical system that produces an image from the display for a user, wherein the optical system comprises:
a beam splitting element; and
a reflective polarizer that reflects circularly polarized light of a first handedness and transmits polarized light of a second handedness; and
a light absorbing element that absorbs a portion of the light transmitted by the optical system.
9. The system of claim 8, wherein either:
(a) the reflective polarizer comprises a cholesteric reflective polarizer; or
(b) the reflective polarizer comprises:
a reflective linear polarizer; and
a retarder, optionally wherein the reflective linear polarizer comprises at least one of:
a birefringent polymeric multilayer optical film; or
a wire grid.
10. The system of claim 8 or 9, wherein the absorbed portion of light is greater than 10% of the light.
11. The system of claim 8, 9 or 10, wherein the light absorbing element in the pass polarization has a higher optical transparency for wavelengths emitted by the display than for photopic weighted white light.
12. The system of any of claims 8 to 11, wherein the light absorbing element in the pass polarization has a lower absorbance for wavelengths of about 460 nanometers, about 520 nanometers, and about 615 nanometers than for other wavelengths.
13. The system of any of claims 8 to 12, wherein the light absorbing element is located between the reflective polarizer and a position of a user's eye.
14. The system of any of claims 8 to 12, wherein the light absorbing element is located between the reflective polarizer and the beam splitter.
15. A method of manufacture, comprising:
providing an optical system between a display and an eyebox, the optical system producing an image from the display for a user, wherein the optical system comprises:
a beam splitting element; and
a reflective polarizer that reflects circularly polarized light of a first handedness and transmits polarized light of a second handedness; and
providing a light absorbing element between the display and the eyebox, the light absorbing element absorbing a portion of the light transmitted by the optical system.
Applications Claiming Priority (5)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363482400P | 2023-01-31 | 2023-01-31 | |
| US63/482,400 | 2023-01-31 | | |
| US18/420,128 | 2024-01-23 | | |
| US18/420,128 US20240255758A1 (en) | 2023-01-31 | 2024-01-23 | High-contrast pancake lens with pass-polarization absorber |
| PCT/US2024/013605 WO2024163515A1 (en) | 2023-01-31 | 2024-01-30 | High-contrast pancake lens with pass-polarization absorber |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN120604162A true CN120604162A (en) | 2025-09-05 |
Family
ID=90361939
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202480007338.1A Pending CN120604162A (en) | 2023-01-31 | 2024-01-30 | High-contrast pancake lens with transmissive polarizing absorber |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4659066A1 (en) |
| CN (1) | CN120604162A (en) |
| WO (1) | WO2024163515A1 (en) |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7242524B2 (en) * | 2003-11-25 | 2007-07-10 | Pc Mirage, Llc | Optical system for forming a real image in space |
| US11086161B2 (en) * | 2018-09-26 | 2021-08-10 | Facebook Technologies, Llc | Active alignment of pancake lens based display assemblies |
| US11782279B2 (en) * | 2021-04-29 | 2023-10-10 | Meta Platforms Technologies, Llc | High efficiency pancake lens |
- 2024-01-30: CN application CN202480007338.1A (CN120604162A), Pending
- 2024-01-30: WO application PCT/US2024/013605 (WO2024163515A1), Ceased
- 2024-01-30: EP application EP24709959.1A (EP4659066A1), Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2024163515A1 (en) | 2024-08-08 |
| EP4659066A1 (en) | 2025-12-10 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |