WO2025180593A1 - An optical-see-through device and a method for providing an improved extended reality interface - Google Patents
An optical-see-through device and a method for providing an improved extended reality interface
- Publication number
- WO2025180593A1 (PCT/EP2024/054789)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- prisms
- light
- ost
- user
- imagery
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/004—Optical devices or arrangements for the control of light using movable or deformable optical elements based on a displacement or a deformation of a fluid
- G02B26/005—Optical devices or arrangements for the control of light using movable or deformable optical elements based on a displacement or a deformation of a fluid based on electrowetting
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0123—Head-up displays characterised by optical features comprising devices increasing the field of view
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0081—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. enlarging, the entrance or exit pupil
Definitions
- the present invention relates to an arrangement comprising computer software modules, an optical-see-through, OST, device and a method for providing an improved optical- see-through interface, and in particular to an arrangement comprising computer software modules, a device and a method for providing an improved optical-see-through interface utilizing multiplexing of a reflective surface and a shutter, and, in some embodiments, which adapts to a direction of gaze of a user.
- Augmented reality is a technology that overlays digital information or computer-generated graphics onto the real world. It involves adding virtual elements to the user's actual environment, typically viewed through a camera (Video see-through VST) or a transparent screen (Optical see-through OST), such as in a headset, a full-body suit or through a smartphone. AR can enhance the perception of reality by adding layers of digital information that can be informative, entertaining, or both, or neither.
- VR virtual reality
- users are completely isolated from the real world and transported to a simulated environment that can be interactive and often responsive to user input.
- AR enhances the real world by overlaying digital elements onto it
- VR replaces the real world with a computer-generated environment
- AR is therefore a more natural and intuitive experience that blends the virtual with the real world
- VR is a more immersive but completely synthetic experience.
- Both AR and VR have a wide range of applications, including gaming, education, advertising, training, and simulation, among others. However, they offer distinct experiences that are best suited to different types of applications.
- MR Mixed reality
- Physical and virtual objects may co-exist in mixed reality environments and interact in real time.
- Extended reality is a catch-all term to refer to augmented reality (AR), virtual reality (VR), and mixed reality (MR).
- Head-worn devices such as optical-see-through, OST, devices, for example OST glasses, are used more and more in AR and XR systems, and are thus becoming more and more commonplace. Therefore, the public's demands regarding both the quality of the displayed image and wearability are ever-increasing.
- the efficiency of the combiner display system in an AR system today is very low, as it must compromise between transmitting light from real world objects outside of the device and reflecting light from the device light engine/display that will create the augmented content.
- the light from the light engine is either reflected in a half transparent mirror or passes through a waveguide using e.g., a holographic optical element (HOE).
- HOE holographic optical element
- Free-space combiners can reach over 10-15% but waveguide combiners send < 1% of the originally emitted display light to the eye. This loss of energy will in most implementations need to be compensated with a larger battery, which in a weight-sensitive head-worn device thus motivates a solution that improves efficiency even if costly or complicated.
- an Optical-See-Through, OST, device comprising an image presenting arrangement and a controller, wherein the image presenting arrangement comprises a plurality of electro-controlled liquid prisms, which are arranged to be in a reflective state when a first voltage is applied to the prism and a transmissive state when a second voltage is applied to the prism, and a projector configured to project light for imagery onto at least some of the prisms, wherein the controller is configured to operate the image presenting arrangement in a transmission mode, wherein the prisms are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement, operate the image presenting arrangement in a reflection mode, wherein at least some of the prisms are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and to repeatedly switch between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
- a method for use in an optical-see-through, OST, device as described herein comprising operating the image presenting arrangement in a transmission mode, wherein the prisms are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement, operating the image presenting arrangement in a reflection mode, wherein at least some of the prisms are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
- a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an OST device enables the OST device to execute a method according to the teachings herein.
- an optical-see-through, OST, device as described herein comprising a software code module arrangement, wherein the software code module arrangement comprises a software component for operating the image presenting arrangement in a transmission mode, wherein the prisms are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement, a software component for operating the image presenting arrangement in a reflection mode, wherein at least some of the prisms are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and a software component for repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
- a software code module may be replaced or supplemented by a software module.
- a software code module may alternatively be replaced or supplemented by a circuit for performing a corresponding function.
- Figures 1B, 1C and 1D each show a schematic view of an image presenting arrangement of an optical-see-through, OST, device according to some embodiments of the present invention
- Figure 2A and 2B each shows a schematic view of an image presenting arrangement of an OST device according to some embodiments of the present invention
- Figures 3A to 3H each shows a schematic view of different arrangements that will optically act as prisms and how they can be arranged in an OST device according to some embodiments of the teachings herein,
- Figures 4A and 4B each shows a schematic view of an image presenting arrangement of an OST device according to some embodiments of the teachings herein,
- Figure 5 shows a flowchart of a general method according to some embodiments of the present invention
- Figure 6 shows a component view for a software component arrangement according to some embodiments of the teachings herein.
- Figure 7 shows a schematic view of a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an arrangement enables the arrangement to implement some embodiments of the present invention.
- FIG. 1A shows a schematic view of an optical-see-through, OST, device 100 according to some embodiments of the present invention.
- the OST device 100 comprises a controller 101, a memory 102 and an image presenting arrangement 105.
- the OST device 100 also comprises a communication interface 103.
- the OST device 100 also comprises a gaze tracking sensor 104.
- the OST device 100 also comprises a light sensor 106.
- the OST device is an HMD 100. In some such embodiments the OST device 100 is a pair of (VR/AR) glasses.
- the controller 101 is configured to control the overall operation of the OST device 100.
- the controller 101 may be a general-purpose controller, wherein general purpose refers to hardware (and/or software) that performs a variety of tasks.
- controller 101 may be referred to simply as the controller 101.
- the memory 102 is configured to store instruction data, graphics data, User Interface (UI) settings, and/or communication data as well as computer-readable instructions that when loaded into the controller 101 indicate how the OST device 100 is to be controlled.
- the memory 102 may comprise several memory units or devices, but they will be perceived as being part of the same overall memory 102. There may be one memory unit for the image presenting device storing graphics data, one memory unit for a gaze tracking sensor (more on such below) for storing settings if present, one memory for a communications interface (if such is present) for storing settings, and so on. As a skilled person would understand, there are many possibilities of how to select where data should be stored.
- the OST device may comprise a general memory 102 in turn comprising any and all such memory units for the purpose of this application.
- non-volatile memory circuits such as EEPROM memory circuits
- volatile memory circuits such as RAM memory circuits.
- all such alternatives will be referred to simply as the memory 102.
- the image presenting arrangement 105 is an example of an optical combiner, which enables two sources of light to be combined.
- an optical see-through device such as an AR device
- the combiner makes it possible for light from two different origins to enter the eye, light from the environment and light from the projector, i.e. the display engine, also referred to as the imager.
- the various AR/VR devices on the market use a variety of display technologies to generate the VR content and this leads to the use of a wide terminology.
- a projector can be referred to as a subcomponent in such a system.
- Optical combiners can be arranged in a myriad of ways. A common alternative is waveguide combiners. The solution presented herein is mainly directed towards waveguide combiners, but other solutions are also available and possible.
- the image presenting arrangement 105 is thus the component that produces the visual image for the system.
- Augmented reality (AR) devices having a system allowing digital images to be superimposed onto the user's actual view of the real world can be called optical see-through displays (OSTDs).
- OSTDs use an optical combiner to merge the light from the display or projection system with the light from the real world.
- the type of combiner, i.e. the image presenting device 105, used will by and large define the type of OSTD.
- Optical combiners could be based on free-space expansion and semi-transparent mirrors or, more commonly if a slim design is the target, a thin optical waveguide. Waveguides for AR devices come in many forms.
- Different display technologies can be used for the imager, i.e. the projector.
- Some examples are LCD, OLED and projector types using LCoS, micro OLED, or a laser projector.
- the image presenting arrangement 105 thus comprises a projector 105-1 arranged to receive (or store) graphics data indicating an image or other graphical object to be displayed, and to project such imagery.
- the imagery to be displayed may be an image or a collection (one or more) of graphical objects (schematically shown and referenced OBJ) to be displayed to provide an augmented reality to the user.
- the image presenting arrangement 105 further comprises a waveguide 105-2 and optionally a shutter 105-5.
- the waveguide 105-2 is arranged to house a plurality of prisms 105-3 and an incoupling area.
- the prisms 105-3 may not be actual prisms 105-3 but arrangements that will optically act as prisms, and are hence referred to as prisms herein.
- the incoupling area is the area where light from the imager enters the "wave guide" and is directed, perhaps based on reflection by a guiding element 105-4 such as an HOE (Holographic Optical Element) or a reflector 105-4, towards the prisms 105-3.
- HOE Holographic Optical Element
- guiding elements 105-4 are simple prisms, micro-prism arrays, embedded mirror arrays, surface relief gratings, thin or thick analog holographic gratings, metasurfaces, or resonant waveguide grating.
- the exact operation of the waveguide, the prisms and the shutter will be discussed in more detail below.
- a simple explanation is that an image is projected by the projector 105-1, projecting light (hereafter referred to as the image light) onto the reflector 105-4 whereby the image light is in turn reflected through the wave guide 105-2 where the image light will reach one or more of the prisms 105-3 where again it will be reflected towards an eye of the user.
- the OST device 100 is configured (through the controller 101) to alternate between a reflection mode and a transmission mode.
- in the reflection mode the prisms 105-3 assume a reflective state and the shutter 105-5 is shut, allowing the user to only see what the prisms 105-3 reflect.
- in the transmission mode the prisms 105-3 assume a non-reflective state and the shutter 105-5 is opened, allowing the user to only see what is behind the OST device 100 (in the direction of the eye). This will be discussed in more detail below.
- the shutter 105-5 is an optical shutter.
- An optical shutter is a device that controls the amount of light that passes through an optical system, i.e. it has a controllable transmission of light.
- the optical shutter is an optical dimmer.
- AR augmented reality
- an optical shutter is used to selectively block light from the user's surroundings, creating a dark environment in which to display digital content.
- the purpose of an optical shutter in AR glasses is to provide a high-contrast display by blocking external light sources that would interfere with the user's view of the digital content. This is especially important in outdoor environments, where ambient light can make it difficult to see the digital images.
- Optical shutters for AR glasses can take various forms, but some of the most common ones are liquid crystal displays (LCDs) and electrochromic materials.
- LCDs liquid crystal displays
- In most LCD-based shutters, a layer of liquid crystal material is sandwiched between two polarizing filters. When an electric field is applied to the liquid crystal layer, its optical properties change, allowing light to pass through or blocking it. This creates a "switchable" filter that can be turned on and off to selectively block light.
- Electrochromic materials change their optical properties when an electric field is applied.
- in an electrochromic-based shutter, a thin film of the material is sandwiched between two conductive layers. When a voltage is applied, the material changes its color or opacity, blocking light.
- optical shutters for AR glasses provide a dark, controlled environment in which to display digital content for creating a high-quality AR experience.
- Electro Wetting, EW, prisms and the Liquid Crystal, LC, prisms discussed herein are both examples of electro-controlled liquid prisms 105-3, which are arranged to be in a reflective state that reflects light coming from the projector 105-1 when a first voltage is applied to the prism 105-3 and a transmissive state that is transmissive to ambient light when a second voltage is applied to the prism 105-3, or possibly in a state in-between when a third voltage is applied.
- the third voltage is a voltage between the voltages for transmissive and reflective mode.
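- As a minimal sketch of the voltage-controlled states described above, the mapping below associates each optical state of a prism with a drive voltage; the voltage values and the apply_voltage() driver interface are illustrative assumptions, not values taken from this disclosure.

```python
from enum import Enum

class PrismState(Enum):
    REFLECTIVE = "reflective"       # first voltage: light from the projector is reflected
    TRANSMISSIVE = "transmissive"   # second voltage: ambient light passes through
    INTERMEDIATE = "intermediate"   # third voltage: a state in-between the two

# Assumed, illustrative drive voltages; real values depend on the liquids,
# coatings and cell geometry of the electro-controlled liquid prism.
DRIVE_VOLTAGES = {
    PrismState.REFLECTIVE: 80.0,
    PrismState.TRANSMISSIVE: 30.0,
    PrismState.INTERMEDIATE: 55.0,
}

def set_prism_state(prism, state: PrismState) -> None:
    """Apply the voltage associated with the requested optical state.
    `prism` is a hypothetical driver object exposing apply_voltage(volts)."""
    prism.apply_voltage(DRIVE_VOLTAGES[state])
```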
- the OST 100 may also comprise or be connected to a communication interface 103.
- the communication interface 103 is arranged to enable communication with other devices, such as other devices 100 or a server (not shown) for receiving content, instructions and/or settings or other data.
- the communication interface 103 is arranged to receive data regarding the imagery that is to be displayed.
- the communication interface 103 may be wired and/or wireless.
- the communication interface may comprise several interfaces.
- the communication interface 103 may comprise a radio frequency (RF) communications interface.
- RF radio frequency
- the communication interface 103 may comprise a Bluetooth™ interface, a WiFi™ interface, a ZigBee™ interface, an RFID™ (Radio Frequency Identification) interface, a Wireless Display (WiDi) interface, a Miracast interface, and/or other RF interface commonly used for short range RF communication.
- the communication interface 103 may comprise a cellular communications interface such as a fifth generation (5G) cellular communication interface, an LTE (Long Term Evolution) interface, a GSM (Global System for Mobile Communication) interface and/or other interface commonly used for cellular communication.
- the communication interface 103 may be configured to communicate using the UPnP (Universal Plug n Play) protocol.
- UPnP Universal Plug n Play
- the communication interface 103 may be configured to communicate using the DLNA (Digital Living Network Appliance) protocol.
- the communication interface 103 may comprise a USB (Universal Serial Bus) interface.
- the communication interface 103 may comprise an HDMI (High-Definition Multimedia Interface) interface.
- the communication interface 103 comprises a Display Port interface.
- the communication interface 103 may comprise an Ethernet interface.
- the communication interface 103 may comprise a MIPI (Mobile Industry Processor Interface) interface.
- the communication interface may comprise an analog interface, a CAN (Controller Area Network) bus interface, an I3C (Inter-Integrated Circuit) interface, or other communication interface.
- the OST 100 may also comprise or be connected to a gaze tracking sensor 104 that is arranged in order to track a gaze of the user and control at least some of the prisms 105-3 in order to provide the image to the user.
- the gaze tracking sensor 104 may in some embodiments be an image sensor, such as a camera or image sensor module, arranged to provide an image (or stream of images) of the user environment when the user is utilizing the OST device 100, wherein images of the user may be analyzed using image processing techniques known in the art in order to determine a gaze of the user, and more specifically the user's eye(s) E.
- the gaze tracking sensor 104 may in some embodiments be other types of electro-optical sensors.
- the OST device 100 may comprise one controller 101 and the sensor 104 may comprise another controller, but for the purpose of the teachings herein, they will be considered to be the same controller 101 in order to cover all possible variations of exactly where the determination of movement or motion takes place.
- Figure 1B shows a schematic view of an image presenting arrangement 105 of an OST device 100 as shown in figure 1A.
- the prisms 105-3 may be arranged in a matrix arrangement. It should be noted that the exact arrangement of the prisms 105-3 in the image presenting arrangement 105 depends on for example, the size of the prisms and the size of the image presenting arrangement 105.
- the arrangement can be a line of prisms 105-3 or a series of lines (i.e. a matrix arrangement) of prisms 105-3.
- the distance (horizontal and/or vertical) between prisms 105-3 will of course also depend on the design. The distance can also be zero with the prisms 105-3 abutting each other.
- Figure 1B also shows a schematic illustration of an example when a prism (referenced On in Figure 1B) is in the reflective state, i.e. when it is in a state that reflects projected light arriving through the incoupling area from the projector 105-1.
- Figure 1B also shows a schematic illustration of an example when a prism (referenced Off in Figure 1B) is in the transmissive state, i.e. when it is in a state that allows the ambient light to pass through the prism 105-3.
- Figure 1B also shows a schematic illustration of an alternative example when a prism (referenced Off (Alt) in Figure 1B) is in the transmissive state, i.e. when it is in a state that allows the ambient light to pass through the prism 105-3.
- Figure 1C shows a schematic view of an image presenting arrangement 105 of an OST device 100 as shown in figure 1A, wherein the shutter 105-5 is closed and the prisms 105-3 are set to a reflective state, reflecting imagery projected by a projector (105-1 in figure 1A).
- the image presenting arrangement 105 is thus set to operate in a reflection mode.
- Figure 1D shows a schematic view of an image presenting arrangement 105 of an OST device 100 as shown in figure 1A, wherein the shutter 105-5 is open and the prisms 105-3 are set to a transmissive state (Off), whereby the background can be seen.
- the image presenting arrangement 105 is thus set to operate in a transmission mode where the ambient light can pass through.
- Figure 2A shows a schematic view of an image presenting arrangement of an OST device 100 being an optical-see-through device, such as an OST Head-Mounted Device 100 (for example a pair of OST goggles 100) according to some embodiments of the present invention wherein the OST device 100 is arranged to be worn on a user's head.
- an OST Head-Mounted Device 100 for example a pair of OST goggles 100
- the OST device 100 is arranged to be worn on a user's head.
- the view of figure 1A shows the projector 105-1 and the prisms 105-3 being arranged in a vertical arrangement
- the view of figure 2A shows the projector 105-1 and the prisms 105-3 being arranged in a horizontal arrangement.
- the exact arrangement chosen will of course depend on the design of the OST device 100 and may thus vary.
- the image presenting arrangement 105 is arranged to alternate between a reflection mode and a transmission mode, which in turn can be expressed as a multiplexing in time of the real world and the virtual or augmented world.
- Figure 2A shows a view of the image presenting arrangement 105 such as that in figure 1, wherein it is shown how the light projected by the projector 105-1 enters the wave guide 105-2.
- the projector is shown as being arranged in a direction of the eye, which can allow for the projector 105-1 to be housed for example in temples of wearable AR glasses.
- the projector may be placed also at other positions and/or directions, and the arrangement of figures 1, 2A and 2B is only one of many possibilities chosen for illustrative purposes.
- the light is incoupled through an incoupling area, where in this example, the light is reflected by the guiding element 105-4 such as a reflector 105-4.
- the reflector 105-4 is arranged at an angle with regards to the light coming from the projector 105-1, so that the reflected light will in turn be reflected against the walls of the wave guide with Total Internal Reflection, TIR, or at least close thereto (within 98 or 99% thereof), as indicated through the reference TIR in figure 2A.
- TIR Total Internal Reflection
- the angle selected is thus dependent on a number of factors such as the material of the wave guide, the material outside the waveguide (such as the material of the housing of the OST device), the wavelengths of the light and the thickness d of the waveguide.
- In order to provide OST devices of low weight, which is important for user comfort when wearing the OST device 100, it is important that the glasses, i.e. the OST device 100, are not (too) thick, as this would increase the weight and lead to an aesthetically unpleasing look.
- the thickness d should thus be kept as low as possible, whereby the angle of the reflector 105-4 should be less than 45 degrees, with regards to a main or general direction to the prisms 105-3.
- the reflector 105-4 is a TIR in-coupler that can be made by numerous coupler technologies and can be refractive/reflective or diffractive/holographic.
- the main purpose of the reflector 105-4 is to couple light from the projector 105-1 into the wave guide 105-2 under the full TIR condition described by Snell's law.
- the angle of the reflector 105-4 depends on the TIR angle given by the refractive index of the wave guide 105-2.
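- To illustrate how the in-coupling geometry relates to the refractive index of the wave guide 105-2, the short sketch below computes the TIR critical angle from Snell's law; the index value 1.5 is an assumed, glass-like example and not a value specified herein.

```python
import math

def tir_critical_angle(n_waveguide: float, n_outside: float = 1.0) -> float:
    """Critical angle in degrees from Snell's law: rays meeting the waveguide
    walls at a larger angle of incidence stay inside by total internal reflection."""
    return math.degrees(math.asin(n_outside / n_waveguide))

# Assumed example: a glass-like waveguide (n = 1.5) surrounded by air (n = 1.0).
print(f"critical angle: {tir_critical_angle(1.5):.1f} degrees")  # about 41.8 degrees
```

Incoupled rays must then meet the waveguide walls at an incidence angle above roughly 42 degrees for the light to stay confined, which is one of the factors behind the choice of reflector angle mentioned above.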
- the shutter 105-5 is optional and may not be needed depending on the design and the placement of the prisms 105-3.
- Figure 2A shows the OST device in the reflection mode, where the prisms 105-3 have assumed a reflective state, and as the reflected light coming from the projector 105-1 and the reflector 105-4 reaches a prism 105-3, it is reflected towards the user's eye E, enabling the user to see the projected imagery.
- the shutter 105-5 is in a closed state, blocking incoming light (referenced IL) - in embodiments where a shutter is used. This enables the user to only see the projected imagery. In embodiments where a shutter is not used, the eye may not be able to adapt to the incoming light and the user will thus only perceive the projected imagery.
- Figure 2B shows the OST device in the transmission mode, where the shutter 105-5 is open allowing the incoming light to pass, and where the prisms 105-3 have assumed a transmissive state.
- the projector 105-1 is turned off in this mode. Only turning on the projector when the prisms are working as a mirror makes the system power efficient.
- the projector 105-1 is, in some alternative embodiments, active throughout, and as the reflected light coming from the projector 105-1 and the reflector 105-4 reaches a prism 105-3, the light passes through the prism 105-3 and continues down the wave guide 105-2, never reaching the user's eye E.
- the shutter 105-5 is in an open state, allowing incoming light (referenced IL) to pass through. This enables the user to only see the incoming light and thus the real world behind the OST device (from the perspective of the eye).
- the projector 105-1 is in some embodiments configured to project light having all color components (RGB or CMY) at the same time. And, in some embodiments, the projector 105-1 is configured to project light for the color components (R, G, B or C, M, Y) individually. In some such systems, the projector shows one color component between each alternate transmission mode (R-OPEN-G-OPEN-B-OPEN). In some alternative such systems, the projector shows one color component at a time but all three between each alternate transmission mode (R-G-B-OPEN-). These are only some examples of cycle duties for the projector. It should be noted that the cycle duties are not restricted to RGB or CMY, but other advanced coloring schemes may also be used. The two schemes referred to herein are only for providing a simple explanation based on the commonly known coloring schemes RGB and CMY.
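- To make the two cycle-duty examples above concrete, the sketch below generates the corresponding frame sequences; the scheme names and the use of "OPEN" to denote a transmission-mode frame are assumptions made only for illustration.

```python
from itertools import cycle, islice

def frame_sequence(scheme: str):
    """Yield an endless sequence of frame types for the projector cycle.

    'interleaved': one color component between each transmission frame (R-OPEN-G-OPEN-B-OPEN).
    'grouped':     all three components, then one transmission frame (R-G-B-OPEN).
    """
    patterns = {
        "interleaved": ["R", "OPEN", "G", "OPEN", "B", "OPEN"],
        "grouped": ["R", "G", "B", "OPEN"],
    }
    return cycle(patterns[scheme])

print(list(islice(frame_sequence("interleaved"), 12)))
print(list(islice(frame_sequence("grouped"), 12)))
```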
- the OST device (through the controller 101 and/or the projector 105-1) is configured to show all imagery each reflection mode cycle.
- the OST device (through the controller 101 and/or the projector 105-1) is configured to show some of the imagery each reflection mode cycle and alternate between the imagery each cycle.
- the controller is thus configured to control the projector 105-1 to project the first object in a first reflection mode, and to project the second object in a second reflection mode.
- multiple objects can be displayed in each reflection mode, and there can also be more than one or two reflection modes.
- FIG. 1 shows one image presenting arrangement 105 for one eye
- the arrangement can be adapted for use with two eyes as well.
- the image presenting arrangement 105 may be extended to allow for two eyes to see through the image presenting arrangement.
- the image presenting arrangement 105 may be duplicated to allow for two eyes to see through each an image presenting arrangement.
- the image presenting arrangement may be mirrored as regards the arrangement of components to allow for an easier mounting in glasses.
- there are one or more prisms 105-3.
- the number of prisms will - of course - depend on the size of the individual prisms and the size of the image presenting device 105.
- the reflector 105-4 and the projector 105-1 are arranged with regards to the waveguide 105-2 so that once the light has been incoupled in the waveguide 105-2 in the incoupling area (IA), the light is reflected internally in the wave guide with Total Internal Reflection, TIR. This enables the light from the projector 105-1 to travel through the waveguide without being seen by the user, and thus prevents any leakage of light that would otherwise disturb the user or reduce the quality of the images perceived by the user.
- the controller 101 is configured to control the shutter 105-5 to switch between the open state (as in figure 2B) and the closed or shut state (as in figure 2A).
- the controller 101 is also configured to control the prisms 105-3 to switch between the reflective state (as in figure 2A) and the transmissive state (as in figure 2B).
- the controller 101 is specifically configured to control the shutter 105-5 and the prisms 105-3 to alternate between their respective states in synchronicity, thereby providing a time multiplexing of the incoming light passing through the shutter and the light from the projector 105-1 carrying the virtual or augmented imagery.
- the multiplexing of the prisms 105-3 depends on the size of the object to be displayed. For example, if the object/image is of the size of one prism 105-3 or smaller, it is enough to multiplex between light from the projector 105-1 and the incoming light. If the object/image is larger, one option is to show the whole image simultaneously (by using multiple layers of waveguides 105-2), which would prevent a prism earlier in the chain from blocking the projected light from a prism 105-3 further down in the chain. Another option is to multiplex between the prisms 105-3 depending on their position in the chain of prisms 105-3.
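- A minimal sketch of the time multiplexing described above is given below, assuming hypothetical driver objects for the shutter, the prisms and the projector; it only illustrates that the three are switched in synchronicity, not any particular hardware interface, and uses an example frequency within the ranges mentioned further below.

```python
import time

def run_multiplexing(shutter, prisms, projector, frequency_hz: float = 240.0, cycles: int = 1000):
    """Alternate between reflection and transmission mode in synchronicity.

    `shutter`, `prisms` and `projector` are assumed driver objects exposing
    open()/close(), set_reflective()/set_transmissive() and show()/blank().
    Each full cycle spends equal time in the two modes (a 50/50 duty cycle);
    adapting the duty cycle is discussed further below.
    """
    half_period = 1.0 / (2.0 * frequency_hz)
    for _ in range(cycles):
        # Reflection mode: shutter shut, prisms reflective, projector drawing the imagery.
        shutter.close()
        prisms.set_reflective()
        projector.show()
        time.sleep(half_period)

        # Transmission mode: projector blanked, prisms transmissive, shutter open.
        projector.blank()
        prisms.set_transmissive()
        shutter.open()
        time.sleep(half_period)
```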
- the prisms 105-3 are arranged as electrowetting, EW, components and notably electrowetting prisms. In some embodiments, the prisms 105-3 are arranged as Liquid Crystal, LC, components. Common to all embodiments is that the prisms are arranged to be electrically controlled to change the optical properties of the prism. This enables the controller 101 to control the prisms as discussed above.
- Electrowetting is a phenomenon in which the wetting properties of a liquid can be controlled by applying an electric field.
- An electrowetting prism is an optical device that typically consists of a chamber containing two immiscible fluids: one conductive (usually water-based) and the other non-conductive (usually oil-based).
- the chamber is made of transparent materials, and transparent electrodes may be placed around the perimeter.
- the walls of the chamber may also have a hydrophobic insulating layer.
- Electrowetting prisms can be used in various optical applications, such as beam steering in optical communication systems, adaptive optics for imaging systems, tunable dispersion control for spectroscopy or wavelength division multiplexing.
- EW lenses and EW prisms both involve electrowetting effect. However, they apply the technique differently.
- EW lenses primarily modify the focal length of the system. They use a voltage-controlled electrowetting effect to change the radius of curvature of liquid-liquid interface, hence altering the focal length of the system.
- EW prisms primarily control the angular deflection of light, steering light beams in different directions without changing focus. In some embodiments the teachings herein use electrowetting prisms.
- a proposed beam steering device being an electrowetting controlled prism with an angle defined to support a TIR condition, for example with a large steering angle
- the structure of the prism consists of two immiscible, colorless, transparent liquids with the refractive indices n1 > n2.
- the first liquid is oil with the refractive index n1.
- the second liquid is a conductive liquid, e.g., saline or NaCl dissolved in water, with a typical refractive index n2 ≈ 1.33.
- the liquids are enclosed in a sealed transparent container with the sidewalls S made up of e.g., Indium Tin Oxide (ITO) glass coated with a dielectric insulator and a hydrophobic layer, e.g., a silicon-based transparent hydrophobic coating.
- the conductive layer comprises in some embodiments silver nanowires.
- the conductive layer comprises in some embodiments graphene.
- the angle of the liquid-liquid interface/meniscus can be changed with different voltages VR and VL applied to the opposite sidewalls.
- the controller 101 will activate the image presenting device 105 and close the shutter 105-5.
- the ambient light from the real-world will be blocked by the shutter 105-5 and the display image will be reflected and projected to the wearer's eyes by the prisms 105-3.
- the liquid-liquid interface of the electrowetting device in Figures 3A, 3B, 3C and 3D forms a slope of 45° which can be realized by for example applying 30V to the left sidewall and 80 V to the opposite sidewall.
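- The mapping from a desired interface slope to the two sidewall voltages depends on the liquids, the coatings and the cell geometry; the sketch below simply interpolates linearly between an assumed flat-interface voltage and the 30 V/80 V pair mentioned above for a 45° slope, and is an illustrative model only.

```python
def sidewall_voltages(tilt_deg: float,
                      flat_v: float = 55.0,       # assumed voltage giving a flat interface
                      v_left_45: float = 30.0,    # example value from the text for a 45 degree slope
                      v_right_45: float = 80.0):  # example value from the text for a 45 degree slope
    """Return assumed (V_L, V_R) sidewall voltages for a requested interface tilt.

    0 degrees  -> equal voltages, flat interface (transmissive state).
    45 degrees -> the example 30 V / 80 V pair (reflective state).
    Intermediate tilts are linearly interpolated, which is an assumption made
    purely for illustration.
    """
    f = max(0.0, min(tilt_deg, 45.0)) / 45.0
    return (flat_v + f * (v_left_45 - flat_v),
            flat_v + f * (v_right_45 - flat_v))

print(sidewall_voltages(45.0))  # (30.0, 80.0)
print(sidewall_voltages(0.0))   # (55.0, 55.0)
```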
- TIR total internal reflection
- TIR occurs when the AOR reaches 90° and the AOI is equal to or greater than a critical angle.
- AOI Angle of Incidence
- AOR Angle of Refraction
- the critical angle in the context of TIR is associated with AOI.
- it is the specific AOI at which light is refracted along the boundary, beyond which all the light is reflected back into the denser medium, and no refraction into the less dense medium occurs.
- High index oil is currently used in microscopy to minimize the amount of light lost due to reflection and refraction at the interface between the microscope lens and the specimen being examined. This helps to improve the contrast and resolution of the image, providing clearer and more detailed information.
- an immersion oil with refraction index 1.70 or more has been demonstrated in experiments. This refractive index can be improved with future technological advancements.
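- As a worked example of the critical angle at the liquid-liquid interface, using the typical saline index of about 1.33 mentioned above and the demonstrated high-index oil of 1.70:

```python
import math

n_oil = 1.70     # denser medium: high-index oil, value demonstrated in experiments
n_saline = 1.33  # conductive liquid, typical refractive index

theta_c = math.degrees(math.asin(n_saline / n_oil))
print(f"critical angle at the oil/saline interface: {theta_c:.1f} degrees")  # about 51.5 degrees
```

Image light travelling in the oil and meeting the interface at an angle of incidence larger than this is reflected back by TIR rather than refracted into the saline.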
- in transmission mode the projector 105-1 is switched off and the shutter 105-5 is in a transmissive state, such as being switched off - in embodiments using a shutter 105-5.
- the liquid-liquid interface becomes a straight line and the ambient light from the real-world environment will be transmitted through the shutter and the electrowetting device to the wearer's eyes.
- when in transmission mode, the system should let ambient light in undisturbed.
- the liquid-liquid interface can be oriented as shown in figure 1 and could also be a straight line perpendicular to the ambient light to prevent deflection of rays that enter at an angle.
- the prisms have been shown as being arranged in a row, however, other arrangements are also possible.
- the prisms 105-3 may thus be arranged in a liquid prisms array, or an arrangement of multiple prisms placed in a linear, two or three- dimensional configuration. By combining multiple prisms, more complex and precise manipulation of light can be achieved.
- Figure 3E shows how a liquid prism may be controlled to provide different reflective states, where the prism on the left shows a first reflective state, and the one in the middle shows a transmissive state and the one on the right shows a second reflective state.
- the prisms are thus each capable in some embodiments of assuming more than one reflective state.
- Figures 3F, 3G and 3H thus show that the prisms 105-3 are arranged so that one prism is in front of the other with regards to the user's eye, and wherein the controller (101) is further configured to control a first prism to reflect or refract the light in a first reflective state to enable the second prism to reflect or refract in a second reflective state to enable the light to reach the user's eye.
- Figures 3F, 3G and 3H show different situations where the prisms 105-3 are arranged in a matrix distribution, and also where they are controlled to follow the gaze of a user, in order to ensure that the light is reflected into the eye of the user.
- the prisms 105-3 are arranged in two rows, and the prisms 105-3 where the user is looking are in a reflective state. In some embodiments the other prisms are in a transmissive state. Which prisms are in a reflective state and which are in a transmissive state will, of course, depend on the direction in which the user is looking, and also on the specific arrangement of prisms.
- Figures 3G and 3H show how the prisms that are in a reflective state follow the user's eye. It should be noted that even if not explicitly shown in the figures, the prisms also comprise various components for changing the voltage applied to the prism.
- the controller is thus in some embodiments configured to adapt the display to a user's gaze. This can be done in at least two major manners, which are not necessarily alternatives to one another, but can be supplemental (i.e. can be combined).
- the first manner is to ensure that if a user is looking in a direction, such as where there will be objects, the objects there will be properly displayed.
- the prisms where there will be object(s) in the imagery to be displayed will be in a reflective state, and the others (at least some of them) will be in the transmissive state. This can also be used to extend the field of view as compared to a system operating only with a projector and no prisms.
- the second manner is to track the gaze of the user. This can be done in many different manners, and as gaze tracking is a known technology it will not be discussed in detail herein. Suffice to say that the OST device 100 in such cases is arranged with a gaze tracking sensor 104 as discussed briefly in connection with figure 1, for tracking the user's gaze, and then the controller 101 will control the prisms 105-3 to be reflective in directions of the user's gaze. This allows for only displaying content where the user is looking and thereby reducing the power needed by the system.
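- A minimal sketch of gaze-directed prism control, under assumed interfaces, is shown below: a hypothetical gaze sensor reports a horizontal gaze angle, which is mapped to the prism column that should be reflective while the others stay transmissive; the field of view, the mapping and the driver methods are all illustrative assumptions.

```python
def select_reflective_column(gaze_angle_deg: float, n_columns: int, fov_deg: float = 40.0) -> int:
    """Map a horizontal gaze angle to the prism column that should reflect.

    Assumes the prism columns evenly cover a field of view of `fov_deg`
    centred on 0 degrees; both the values and the mapping are illustrative.
    """
    half = fov_deg / 2.0
    clamped = max(-half, min(gaze_angle_deg, half))
    return round((clamped + half) / fov_deg * (n_columns - 1))

def update_prisms(prisms, gaze_angle_deg: float) -> None:
    """Set the column in the gaze direction reflective, all others transmissive."""
    target = select_reflective_column(gaze_angle_deg, len(prisms))
    for i, prism in enumerate(prisms):
        if i == target:
            prism.set_reflective()
        else:
            prism.set_transmissive()
```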
- the prisms 105-3 are controlled (by the controller 101) to assume any of two or more different reflective states (as discussed in relation to figure 3E above), in order to reflect the light to the user's eye.
- the first prism 105-3-1 has a different reflective plane than the second prism 105-3-2.
- An array 105-3A of prisms can thus be used as an alternative or as a replacement for a larger prism.
- each (or at least some) prism 105-3 in the array 105-3A can be individually controlled to provide its own prism angle between for example 0° and for example 45° to refract, reflect or transmit incoming light as shown in figure 3E.
- the individual control is achieved by applying different voltages VR and VL to opposite sidewalls of a prism, whereby the liquid-liquid interface will change the tilt angle due to electrowetting.
- in FIG. 3F all the array prisms, except 105-3-1 and 105-3-2, have the same individual tilt angle 0° and can transmit light from the real world and from the projector 105-1.
- the prism 105-3-1 in this example has a 45° tilt and reflects the display light with TIR towards the prism 105-3-2 in this example, which has a 24° tilt and refracts the light towards the eye with the gaze angle (for example +15°).
- the controller may thus be configured to determine a gaze angle (or direction) and control the prisms 105-3 accordingly as discussed herein to ensure reflection of the light towards the eye in the direction (angle) of gaze.
- the examples of angles given herein are only examples, and other variations are possible.
- Figure 3G shows the eye looking straight ahead, and prisms 105-3-3 and 105-3-4 both have a 45° tilt so the projected light is reflected with TIR to the eye.
- Figure 3H shows the eye looking to the other side compared to figure 3F and the prisms 105-3-4 both have 41° tilt needed to redirect projected light with TIR toward the eye, in this case rotated 15° downward (-15°).
- the teachings herein may be used together with eye-tracking as discussed above, whereby the OST device 100 may dynamically adapt the prisms' tilt to project a virtual image of the projector towards the user's gaze direction.
- the cells can be arranged in a matrix or line fashion as in figures 3F, 3G and 3H. Such an arrangement allows for a thinner design of the OST device 100.
- the thickness d is reduced since the small cells or prisms can be placed closer next to each other in the same plane instead of being placed in an incremental height.
- the total cell volume is reduced requiring less liquid or in the case of a LC solution less liquid crystal.
- the shape of the volume holding the liquid(s) is not limited to a specific shape like a triangle, wedge or rectangle. The shape is designed to allow for a surface or an interface between materials to be created.
- the prisms 105-3 are arranged tilted with regards to the extension of the wave guide to support the best optimization of optical parameters between the two states, reflective and transmissive. In some embodiments the prisms 105-3 are tilted at a same angle, and in some embodiments the prisms 105-3 are tilted at different angles - at least some of them.
- some of the prisms are tilted at one angle and some prisms are tilted at a second angle.
- the tilting angle depends on the design of the OST device, but as an example it can be mentioned that the tilting angle is between 10 and 30 degrees.
- the line or matrix of prisms is thus positioned at a slight angle compared to a vertical design.
- the prisms may also, in some embodiments, be Liquid Crystal components. It should be noted that many of the features discussed with regards to electrowetting prisms also apply to LC prisms, such as - for example - gaze tracking and controlling prisms to show imagery in the direction of a user's gaze.
- the LC prisms are used to create an interface with a delta in refractive indices (Δn). Birefringent liquid crystals can have Δn > 0.3 or more. The principle used to control the light is similar to that for EW prisms.
- LC prisms are also electrically controlled. By applying a voltage over the prism, the liquid inside will change its optical characteristics. In a first state where no voltage is applied, the liquid crystal molecules are arranged so that the refractive index of the LC is the same as that of the bulk material (i.e. the material of the wave guide), which is the transmissive state of the LC prism. In a second state where a voltage is applied, the molecules are arranged so that the refractive index is different from the bulk material and the prism will be in the reflective state. The amount or angle of reflection will depend on the LC used, for example on the design of the cavity, the bulk material used and the voltage applied.
- the LC prisms can thus also be controlled in a number of different states (not just one reflective), as for the EW prisms.
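- A compact sketch of the two-state LC control described above follows; the drive voltage value and the apply_voltage() interface are assumed for illustration and mirror the electrowetting sketch given earlier.

```python
def set_lc_prism_state(prism, reflective: bool, drive_voltage: float = 5.0) -> None:
    """Drive a liquid crystal prism between its two states.

    With no voltage the LC refractive index is assumed to match the bulk
    waveguide material (transmissive state); with `drive_voltage` applied it
    differs, putting the prism in the reflective state. The voltage value and
    the apply_voltage() driver interface are illustrative assumptions.
    """
    prism.apply_voltage(drive_voltage if reflective else 0.0)
```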
- the controller 101 is configured to switch or alternate between a transmission mode and a reflection mode.
- the controller 101 is configured to alternate with a frequency of more than 60, 80 or 100 Hz.
- the controller 101 is configured to alternate with a frequency of more than 200 Hz.
- the controller 101 is configured to alternate with a frequency in the interval of 200 to 500 Hz or 300 to 400Hz.
- a high frequency provides for a smooth viewing experience, but of course, the frequency cannot be too high, or it will require too much of the prisms 105-3, the shutter 105-5 and the projector 105-1.
- the frequency used can thus depend on the components used and can vary from implementation to implementation.
- the controller is configured to have a duty cycle for alternating between transmission and reflection mode, meaning that in some embodiments, the controller 101 alternates so that each mode gets an equal amount of time, and in some embodiments the controller alternates so that one mode is used for longer times than the other mode.
- the controller is configured to adapt the so-called duty cycle (i.e. the portions of time that the first and the second modes are used).
- the duty-cycle is adapted based on the imagery to be displayed. For example, if the imagery to be displayed comprises many objects, the duty-cycle can be in favor of the reflection mode (spending more time in reflection mode) allowing the user a better view of the objects.
- the OST device 100 also comprises a light sensor 106.
- the light sensor 106 that is configured for detecting presence or absence of light and can be used to determine the light condition of the scene.
- There are many types of different light sensors, such as photoresistors/LDRs (Light Dependent Resistors), photodiodes, phototransistors, CCD (Charge-Coupled Device) sensors, CMOS (Complementary Metal-Oxide-Semiconductor) sensors, or other image sensors.
- the type of light sensor could vary from single pixel (single light sensitive element), few pixels with and without filters and many pixels (camera) with and without filters.
- the controller 101 is further configured to adapt the duty cycle based on the surrounding (or background) light detected by the light sensor 106.
- the controller 101 is configured to detect that the light detected falls below a dark surrounding threshold level indicating a dark surrounding, and in response thereto adapt the duty-cycle to favor the transmission mode to allow the background to be seen or perceived by the user.
- the controller 101 is configured to detect that the light detected falls above a light surrounding threshold level indicating a light surrounding, and in response thereto adapt the duty-cycle to favor the reflection mode to allow the imagery to be seen or perceived by the user.
- the duty-cycle is set to 33/66, 30/70, or 25/75 for light surroundings and 66/33, 70/30, or 75/25 for dark surroundings, where the notation T/R indicates the portion of the duty cycle for T transmission mode and R reflection mode.
- the controller 101 is further configured to adapt the duty cycle based on the imagery to be projected.
- the controller 101 is configured to determine that the imagery falls below a dark content threshold level indicating dark content to be displayed, and in response thereto adapt the duty-cycle to favor the reflection mode to allow the dark content to be seen or perceived by the user.
- the controller 101 is configured to determine that the imagery falls above a light content threshold level indicating light content to be displayed, and in response thereto adapt the duty-cycle to favor the transmission mode to allow the background to be seen or perceived by the user.
- the duty-cycle is set to 33/66, 30/70, or 25/75 for dark content and 66/33, 70/30, or 75/25 for light content, where the notation T/R indicates the portion of the duty cycle for T transmission mode and R reflection mode.
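- The duty-cycle adaptation described above can be sketched as follows; the threshold values and the way ambient light and content brightness are combined are assumptions chosen only to illustrate the example ratios given in the text.

```python
def choose_duty_cycle(ambient_lux: float,
                      content_brightness: float,
                      dark_surrounding_lux: float = 50.0,      # assumed threshold
                      light_surrounding_lux: float = 10000.0,  # assumed threshold
                      dark_content_level: float = 0.3,         # assumed threshold (0..1)
                      light_content_level: float = 0.7):       # assumed threshold (0..1)
    """Return (transmission_share, reflection_share) of the duty cycle.

    Uses the 30/70 and 70/30 splits as examples; the text above also mentions
    33/66 and 25/75 as alternatives. All thresholds are illustrative.
    """
    if ambient_lux < dark_surrounding_lux:
        return (0.70, 0.30)   # dark surroundings: favor the transmission mode
    if ambient_lux > light_surrounding_lux:
        return (0.30, 0.70)   # bright surroundings: favor the reflection mode
    if content_brightness < dark_content_level:
        return (0.30, 0.70)   # dark content: favor the reflection mode
    if content_brightness > light_content_level:
        return (0.70, 0.30)   # light content: favor the transmission mode
    return (0.50, 0.50)       # otherwise split the cycle evenly
```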
- figure 4A shows a schematic view of the gaze being determined to be in one direction, where it is illustrated how the prisms 105-3 in that direction are set to the reflective state in the reflection mode.
- figure 4B shows a schematic view of the gaze being determined to be in a second, different direction, and where it is shown how the prisms 105-3 in that direction are set to the reflective state in the reflection mode.
- the OST device can be run as a virtual reality device.
- Figure 5 shows a general flowchart for a method according to the teachings herein.
- the method corresponds to the operation of the OST device 100 as discussed in the above, wherein said method comprises operating 510 the image presenting arrangement 105 in a transmission mode, wherein the shutter is in the open state and the prisms 105-3 are in the transmissive state, whereby a user's eye can perceive incoming light from behind - in the line of sight from the eye - the image presenting arrangement 105, operating 520 the image presenting arrangement 105 in a reflection mode, wherein the shutter is in the closed state and at least some of the prisms 105-3 are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and wherein incoming light from behind - in the line of sight from the eye - is blocked by the shutter 105-5, and repeatedly switching 530 between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
- Figure 6 shows a component view for a software component or module arrangement 600 according to some embodiments of the teachings herein.
- the software component arrangement 600 is adapted to be used in an OST device 100 as taught herein and corresponds to the operation of the OST device 100 in the above.
- the software component arrangement 600 comprises a software component 610 for operating the image presenting arrangement 105 in a transmission mode, wherein the shutter is in the open state and the prisms 105-3 are in the transmissive state, whereby a user's eye can perceive incoming light from behind - in the line of sight from the eye - the image presenting arrangement 105, a software component 620 for operating the image presenting arrangement 105 in a reflection mode, wherein the shutter 105-5 is in the closed state and at least some of the prisms 105-3 are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and wherein incoming light from behind - in the line of sight from the eye - is blocked by the shutter 105-5, and a software component 630 for repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
- the software component arrangement 600 also comprises software component(s) 640 for further functionalities as discussed in the teachings herein.
- a software code module may be replaced or supplemented by a software component.
- a software code module may be replaced or supplemented by a circuit configured for performing a corresponding function.
- Figure 7 shows a schematic view of a computer-readable medium 102 carrying computer instructions 121 that when loaded into and executed by a controller of an OST device 100 enables the OST device to implement the teachings herein.
- the computer-readable medium 102 may be tangible such as a hard drive or a flash memory, for example a USB memory stick or a cloud server.
- the computer-readable medium 102 may be intangible such as a signal carrying the computer instructions enabling the computer instructions to be downloaded through a network connection, such as an internet connection.
- a computer-readable medium 700 is shown as being a computer disc 700 carrying computer-readable computer instructions 710, being inserted in a computer disc reader 720.
- the computer disc reader 720 may be part of a cloud server 730 - or other server - or the computer disc reader may be connected to a cloud server 730 - or other server.
- the cloud server 730 may be part of the internet or at least connected to the internet.
- the cloud server 730 may alternatively be connected through a proprietary or dedicated connection.
- the computer instructions may be stored at a remote server 730 and be downloaded to the memory 102 of the OST device 100 for being executed by the controller 101.
- the computer disc reader 720 may also or alternatively be connected to (or possibly inserted into) an OST device 100 for transferring the computer-readable computer instructions 710 to a controller of the OST device via a memory of the OST viewing device 100.
- Figure 7 shows both the situation when an OST device 100 receives the computer-readable computer instructions 710 via a server connection and the situation when another OST device 100 receives the computer-readable computer instructions 710 through a wired interface. This enables computer-readable computer instructions 710 to be downloaded into an OST viewing device 100, thereby enabling the OST device 100 to operate according to and implement the invention as disclosed herein.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
Abstract
An Optical-See-Through, OST, device (100) comprising an image presenting arrangement (105) and a controller (101), wherein the image presenting arrangement (105) comprises a plurality of electro-controlled liquid prisms (105-3), which are arranged to be in a reflective state when a first voltage is applied to the prism (105-3) and a transmissive state when a second voltage is applied to the prism (105-3), and a projector configured to project light for imagery onto at least some of the prisms (105-3), wherein the controller (101) is configured to operate the image presenting arrangement (105) in a transmission mode, wherein the prisms (105-3) are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement (105), operate the image presenting arrangement (105) in a reflection mode, wherein at least some of the prisms (105-3) are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and to repeatedly switch between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
Description
AN OPTICAL-SEE-THROUGH DEVICE AND A METHOD FOR PROVIDING AN IMPROVED EXTENDED
REALITY INTERFACE
TECHNICAL FIELD
The present invention relates to an arrangement comprising computer software modules, an optical-see-through, OST, device and a method for providing an improved optical-see-through interface, and in particular to an arrangement comprising computer software modules, a device and a method for providing an improved optical-see-through interface utilizing multiplexing of a reflective surface and a shutter, and, in some embodiments, which adapts to a direction of gaze of a user.
BACKGROUND
Augmented reality (AR) is a technology that overlays digital information or computer-generated graphics onto the real world. It involves adding virtual elements to the user's actual environment, typically viewed through a camera (video see-through, VST) or a transparent screen (optical see-through, OST), such as in a headset, a full-body suit or through a smartphone. AR can enhance the perception of reality by adding layers of digital information that can be informative, entertaining, or both, or neither.
Virtual reality (VR), on the other hand, creates a completely immersive, computer-generated environment that can be experienced through a headset or a full-body suit. In VR, users are completely isolated from the real world and transported to a simulated environment that can be interactive and often responsive to user input.
The key difference between AR and VR is that AR enhances the real world by overlaying digital elements onto it, whereas VR replaces the real world with a computer-generated environment. AR is therefore a more natural and intuitive experience that blends the virtual with the real world, while VR is a more immersive but completely synthetic experience.
Both AR and VR have a wide range of applications, including gaming, education, advertising, training, and simulation, among others. However, they offer distinct experiences that are best suited to different types of applications.
Mixed reality (MR) is a term used to describe the merging of a real-world environment and a computer-generated one. Physical and virtual objects may co-exist in mixed reality environments and interact in real time.
Extended reality (XR) is a catch-all term to refer to augmented reality (AR), virtual reality (VR), and mixed reality (MR).
Head-worn devices, such as optical-see-through, OST, devices, for example OST glasses, are used more and more in AR and XR systems, and are thus becoming more and more commonplace. Therefore, the public's demands are ever-increasing, both regarding the quality of the displayed image and regarding wearability.
When using optical see-through technology in augmented reality (AR), there are several optical problems with the image and the background that a user may experience. Here are a few examples: chromatic aberration, ghosting, glare and reflection, distortion and warping, contrast and image quality problems caused by background illumination (today addressed with a dark lens, like sunglasses, sacrificing the transmittance through the system in order to achieve decent image quality), and loss in light-efficiency due to polarization, reflection and absorption. These are just a few examples of the optical problems that can occur when using optical see-through technology in AR. Addressing these issues is an important area of research and development in the field of AR, as it can significantly impact the user experience and the effectiveness of AR applications in various domains.
Furthermore, the efficiency of the combiner display system in an AR system today is very low, as it is a compromise between transmitting light from real world objects outside of the device and reflecting light from the device light engine/display that will create the augmented content. The light from the light engine is either reflected in a half transparent mirror or passes through a waveguide using e.g., a holographic optical element (HOE). Free-space combiners can reach over 10-15%, but waveguide combiners send < 1% of the originally emitted display light to the eye. This loss of energy will in most implementations need to be compensated with a larger battery, which in a weight-sensitive head-worn device motivates a solution that improves efficiency even if costly or complicated.
There is thus a strong need for a solution to the problems discussed above, provided without significantly increasing the weight.
SUMMARY
According to one aspect there is provided an Optical-See-Through, OST, device comprising an image presenting arrangement and a controller, wherein the image presenting arrangement comprises a plurality of electro-controlled liquid prisms, which are arranged to be in a reflective state when a first voltage is applied to the prism and a transmissive state when a second voltage is applied to the prism, and a projector configured to project light for imagery onto at least some of the prisms, wherein the controller is configured to operate the image presenting arrangement in a transmission mode, wherein the prisms are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement, operate the image presenting arrangement in a reflection mode, wherein at least some of the prisms are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and to repeatedly switch between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
Other embodiments are discussed below and are also as per the appended claims.
According to another aspect there is provided a method for use in an optical-see-through, OST, device as herein, wherein said method comprises operating the image presenting arrangement in a transmission mode, wherein the prisms are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement, operating the image presenting arrangement in a reflection mode, wherein at least some of the prisms are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state,
and repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
According to another aspect there is provided a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an OST device enables the OST device to execute a method according to herein.
According to another aspect there is provided an optical-see-through, OST, device as herein comprising a software code module arrangement, wherein the software code module arrangement comprises a software component for operating the image presenting arrangement in a transmission mode, wherein the prisms are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement, a software component for operating the image presenting arrangement in a reflection mode, wherein at least some of the prisms are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and a software component for repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
For the context of the teachings herein a software code module may be replaced or supplemented by a software module. For the context of the teachings herein a software code module may alternatively be replaced or supplemented by a circuit for performing a corresponding function.
Further embodiments and advantages of the present invention will be given in the detailed description. It should be noted that the teachings herein find use in see-through devices, such as in OST devices such as OST HMDs (Head-Mounted Display), media devices, and Head-Up-Displays for example in vehicular displays.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the invention will be described in the following, reference being made to the appended drawings which illustrate non-limiting examples of how the inventive concept can be reduced into practice.
Figure 1A shows a schematic view of an optical-see-through, OST, device according to some embodiments of the present invention,
Figures 1B, 1C and 1D each show a schematic view of an image presenting arrangement of an optical-see-through, OST, device according to some embodiments of the present invention,
Figures 2A and 2B each show a schematic view of an image presenting arrangement of an OST device according to some embodiments of the present invention,
Figures 3A to 3H each show a schematic view of different arrangements that will optically act as prisms and how they can be arranged in an OST device according to some embodiments of the teachings herein,
Figures 4A and 4B each show a schematic view of an image presenting arrangement of an OST device according to some embodiments of the teachings herein,
Figure 5 shows a flowchart of a general method according to some embodiments of the present invention,
Figure 6 shows a component view for a software component arrangement according to some embodiments of the teachings herein, and
Figure 7 shows a schematic view of a computer-readable medium carrying computer instructions that when loaded into and executed by a controller of an arrangement enables the arrangement to implement some embodiments of the present invention.
DETAILED DESCRIPTION
Figure 1A shows a schematic view of an optical-see-through, OST, device 100 according to some embodiments of the present invention. The OST device 100 comprises a controller 101, a memory 102 and an image presenting arrangement 105. In some embodiments the OST device 100 also comprises a communication interface 103. In some embodiments the OST
device 100 also comprises a gaze tracking sensor 104. And, in some embodiments the OST device 100 also comprises a light sensor 106.
In some embodiments the OST device is an HMD 100. In some such embodiments the OST device 100 is a pair of (VR/AR) glasses.
The controller 101 is configured to control the overall operation of the OST device 100. In some embodiments, the controller 101 may be a general-purpose controller, wherein general purpose refers to hardware (and/or software) that performs a variety of tasks. As a skilled person would understand, there are many alternatives for how to implement a controller, such as using Field-Programmable Gate Array circuits, FPGAs, ASICs, GPUs, etc., in addition to or as an alternative. For the purpose of this application, all such possibilities and alternatives will be referred to simply as the controller 101.
The memory 102 is configured to store instruction data, graphics data, User Interface (Ul) settings, and/or communication data as well as computer-readable instructions that when loaded into the controller 101 indicate how the OST device 100 is to be controlled. The memory 102 may comprise several memory units or devices, but they will be perceived as being part of the same overall memory 102. There may be one memory unit for the image presenting device storing graphics data, one memory unit for a gaze tracking sensor (more on such below) for storing settings if present, one memory for a communications interface (if such is present) for storing settings, and so on. As a skilled person would understand, there are many possibilities of how to select where data should be stored. In one embodiment, the OST device may comprise a general memory 102 in turn comprising any and all such memory units for the purpose of this application. As a skilled person would understand there are many alternatives of how to implement a memory, for example using non-volatile memory circuits, such as EEPROM memory circuits, or using volatile memory circuits, such as RAM memory circuits. For the purpose of this application all such alternatives will be referred to simply as the memory 102.
The image presenting arrangement 105 is an example of an optical combiner, which enables two sources of light to be combined. As is known, an optical see-through device, such as an AR device, requires an optical combiner. The combiner makes it possible for light from
two different origins to enter the eye, light from the environment and light from the projector, i.e. the display engine, also referred to as the imager. The various AR/VR devices on the market use a variety of display technologies to generate the VR content and this leads to the use of a wide terminology. In some cases, a projector can be referred to as a subcomponent in such a system. In this document we use the term projector, but it does not exclude that it refers to a light engine, a display, a laser beam scanning system or another technology used to create the light that is coupled into the optical combiner. Optical combiners can be arranged in a myriad of ways. A common alternative is waveguide combiners. The solution presented herein is mainly directed towards waveguide combiners, but other solutions are also available and possible. The image presenting arrangement 105 is thus the component that produces the visual image for the system. Augmented reality (AR) devices having a system allowing digital images to be superimposed onto the user's actual view of the real world can be called optical see-through displays (OSTDs). OSTDs use an optical combiner to merge the light from the display or projection system with the light from the real world. The type of combiner (i.e. image presenting device 105) used will by and large define the type of OSTD. Optical combiners could be based on free-space expansion and semi-transparent mirrors or, more commonly if a slim design is the target, a thin optical waveguide. Waveguides for AR devices come in many forms. Different display technologies (an imager as the projector) could be used depending on the system requirements and mechanical structure. Some examples are LCD, OLED and projector types using LCoS, micro OLED or laser projection.
The image presenting arrangement 105 thus comprises a projector 105-1 arranged to receive (or store) graphics data indicating an image or other graphical object to be displayed, and to project such imagery. The imagery to be displayed may be an image or a collection (one or more) of graphical objects (schematically shown and referenced OBJ) to be displayed to provide an augmented reality to the user.
The image presenting arrangement 105 further comprises a waveguide 105-2 and optionally a shutter 105-5. The waveguide 105-2 is arranged to house a plurality of prisms 105-3 and an incoupling area. It should be noted that the prisms 105-3 may not be actual prisms but arrangements that will optically act as prisms, and hence they are referred to as prisms herein. The incoupling area is the area where light from the imager enters the waveguide and is directed, perhaps based on reflection by a guiding element 105-4 such as an HOE (Holographic Optical Element) or a reflector 105-4, towards the prisms 105-3. Other examples of guiding elements 105-4 are simple prisms, micro-prism arrays, embedded mirror arrays, surface relief gratings, thin or thick analog holographic gratings, metasurfaces, or resonant waveguide gratings. The exact operation of the waveguide, the prisms and the shutter will be discussed in more detail below. However, a simple explanation is that an image is projected by the projector 105-1, projecting light (hereafter referred to as the image light) onto the reflector 105-4, whereby the image light is in turn reflected through the waveguide 105-2 where the image light will reach one or more of the prisms 105-3, where again it will be reflected towards an eye of the user. The OST device 100 is configured (through the controller 101) to alternate between a reflection mode and a transmission mode. In the reflection mode, the prisms 105-3 assume a reflective state and the shutter 105-5 is shut, allowing the user to only see what the prisms 105-3 reflect. In the transmission mode, the prisms 105-3 assume a non-reflective state and the shutter 105-5 is opened, allowing the user to only see what is behind the OST device 100 (in the direction of the eye). This will be discussed in more detail below.
In some embodiments the shutter 105-5 is an optical shutter. The teachings herein allow for many variations of the exact design of the shutter 105-5. An optical shutter is a device that controls the amount of light that passes through an optical system, i.e. it has a controllable transmission of light. In some embodiments the optical shutter is an optical dimmer. In the context of augmented reality (AR) glasses, an optical shutter is used to selectively block light from the user's surroundings, creating a dark environment in which to display digital content. The purpose of an optical shutter in AR glasses is to provide a high-contrast display by blocking external light sources that would interfere with the user's view of the digital content. This is especially important in outdoor environments, where ambient light can make it difficult to see the digital images. Optical shutters for AR glasses can take various forms, but some of the most common ones are liquid crystal displays (LCDs) and electrochromic materials. In most LCD-based shutters, a layer of liquid crystal material is sandwiched between two polarizing filters. When an electric field is applied to the liquid crystal layer, its optical properties change,
allowing light to pass through or blocking it. This creates a "switchable" filter that can be turned on and off to selectively block light. Electrochromic materials, on the other hand, change their optical properties when an electric field is applied. In an electrochromic-based shutter, a thin film of the material is sandwiched between two conductive layers. When a voltage is applied, the material changes its color or opacity, blocking light.
Regardless of the specific technology used, optical shutters for AR glasses provide a dark, controlled environment in which to display digital content for creating a high-quality AR experience.
The Electrowetting, EW, prisms and the Liquid Crystal, LC, prisms discussed herein are both examples of electro-controlled liquid prisms 105-3, which are arranged to be in a reflective state that reflects light coming from the projector 105-1 when a first voltage is applied to the prism 105-3 and a transmissive state that is transmissive to ambient light when a second voltage is applied to the prism 105-3, or possibly in a state in-between when a third voltage is applied. In some embodiments the third voltage is a voltage between the voltages for the transmissive and reflective modes.
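As an illustrative sketch only, the following Python snippet models the voltage-to-state mapping described above. The class name, the specific voltage levels and the three-state mapping are hypothetical examples chosen for illustration and are not taken from any specific embodiment.

```python
from enum import Enum

class PrismState(Enum):
    REFLECTIVE = "reflective"      # reflects light coming from the projector
    TRANSMISSIVE = "transmissive"  # lets ambient light pass through
    INTERMEDIATE = "intermediate"  # partially reflective in-between state

class LiquidPrism:
    """Hypothetical model of one electro-controlled liquid prism (105-3)."""

    # Example voltage levels only; real levels depend on the prism design.
    V_REFLECTIVE = 80.0    # "first voltage"  -> reflective state
    V_TRANSMISSIVE = 30.0  # "second voltage" -> transmissive state

    def __init__(self):
        self.state = PrismState.TRANSMISSIVE

    def apply_voltage(self, volts: float) -> PrismState:
        """Map an applied voltage to a prism state."""
        if volts >= self.V_REFLECTIVE:
            self.state = PrismState.REFLECTIVE
        elif volts <= self.V_TRANSMISSIVE:
            self.state = PrismState.TRANSMISSIVE
        else:
            # a "third voltage" between the two gives an in-between state
            self.state = PrismState.INTERMEDIATE
        return self.state

if __name__ == "__main__":
    prism = LiquidPrism()
    print(prism.apply_voltage(80.0))  # PrismState.REFLECTIVE
    print(prism.apply_voltage(30.0))  # PrismState.TRANSMISSIVE
    print(prism.apply_voltage(55.0))  # PrismState.INTERMEDIATE
```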
The OST 100 may also comprise or be connected to a communication interface 103. The communication interface 103 is arranged to enable communication with other devices, such as other devices 100 or a server (not shown) for receiving content, instructions and/or settings or other data. In some embodiments the communication interface 103 is arranged to receive data regarding the imagery that is to be displayed. The communication interface 103 may be wired and/or wireless. The communication interface may comprise several interfaces. In some embodiments, the communication interface 103 may comprise a radio frequency (RF) communications interface. In some such embodiments, the communication interface 103 may comprise a Bluetooth™ interface, a WiFi™ interface, a ZigBee™ interface, a RFID™ (Radio Frequency Identification) interface, Wireless Display (WiDi) interface, Miracast interface, and/or other RF interface commonly used for short range RF communication. In alternative or supplemental such embodiments, the communication interface 103 may comprise a cellular communications interface such as a fifth generation (5G) cellular communication interface, an LTE (Long Term Evolution) interface, a GSM (Global System for Mobile Communication)
interface and/or other interface commonly used for cellular communication. In some embodiments, the communication interface 103 may be configured to communicate using the UPnP (Universal Plug and Play) protocol. In some embodiments, the communication interface 103 may be configured to communicate using the DLNA (Digital Living Network Alliance) protocol. In some embodiments, the communication interface 103 may comprise a USB (Universal Serial Bus) interface. In some embodiments, the communication interface 103 may comprise an HDMI (High-Definition Multimedia Interface) interface. In some embodiments the communication interface 103 comprises a Display Port interface. In some embodiments the communication interface 103 may comprise an Ethernet interface. In some embodiments, the communication interface 103 may comprise a MIPI (Mobile Industry Processor Interface) interface. In some embodiments, the communication interface may comprise an analog interface, a CAN (Controller Area Network) bus interface, an I3C (Improved Inter-Integrated Circuit) interface, or other communication interface.
The OST device 100 may also comprise or be connected to a gaze tracking sensor 104 that is arranged to track a gaze of the user and control at least some of the prisms 105-3 in order to provide the image to the user. The gaze tracking sensor 104 may in some embodiments be an image sensor, such as a camera or image sensor module, arranged to provide an image (or a stream of images) of the user when the user is utilizing the OST device 100, wherein the images of the user may be analyzed using image processing techniques known in the art in order to determine a gaze of the user, and more specifically of the user's eye(s) E. Optionally, the gaze tracking sensor 104 may in some embodiments be another type of electro-optical sensor. As a skilled person would understand, the OST device 100 may comprise one controller 101 and the sensor 104 may comprise another controller, but for the purpose of the teachings herein, they will be considered to be the same controller 101 in order to cover all possible variations of exactly where the determination of movement or motion takes place.
Figure 1B shows a schematic view of an image presenting arrangement 105 of an OST device 100 as shown in figure 1A. As is shown in figure 1B, the prisms 105-3 may be arranged in a matrix arrangement. It should be noted that the exact arrangement of the prisms 105-3 in the
image presenting arrangement 105 depends on for example, the size of the prisms and the size of the image presenting arrangement 105. The arrangement can be a line of prisms 105-3 or a series of lines (i.e. a matrix arrangement) of prisms 105-3. The distance (horizontal and/or vertical) between prisms 105-3 will of course also depend on the design. The distance can also be zero with the prisms 105-3 abutting each other.
Figure 1B also shows a schematic illustration of an example when a prism (referenced On in Figure 1B) is in the reflective state, i.e. when it is in a state that reflects projected light arriving through the incoupling area from the projector 105-1. Figure 1B also shows a schematic illustration of an example when a prism (referenced Off in Figure 1B) is in the transmissive state, i.e. when it is in a state that allows the ambient light to pass through the prism 105-3. Figure 1B also shows a schematic illustration of an alternative example when a prism (referenced Off (Alt) in Figure 1B) is in the transmissive state, i.e. when it is in a state that allows the ambient light to pass through the prism 105-3.
Figure 1C shows a schematic view of an image presenting arrangement 105 of an OST device 100 as shown in figure 1A, wherein the shutter 105-5 is closed and the prisms 105-3 are set to a reflective state, reflecting imagery projected by a projector (105-1 in figure 1A). The image presenting arrangement 105 is thus set to operate in a reflection mode.
Figure 1D shows a schematic view of an image presenting arrangement 105 of an OST device 100 as shown in figure 1A, wherein the shutter 105-5 is open and the prisms 105-3 are set to a transmissive state (Off), whereby the background can be seen. The image presenting arrangement 105 is thus set to operate in a transmission mode where the ambient light can pass through.
Figure 2A shows a schematic view of an image presenting arrangement of an OST device 100 being an optical-see-through device, such as an OST Head-Mounted Device 100 (for example a pair of OST goggles 100) according to some embodiments of the present invention, wherein the OST device 100 is arranged to be worn on a user's head. It should be noted that the arrangement of components shown in figure 1 is only for illustrative purposes, as is also the case for the other figures herein, and any exact requirement on a positional relationship between components will be as per the discussion herein, and not necessarily as illustrated in one of the
figures. For example, the view of figure 1A shows the projector 105-1 and the prisms 105-3 being arranged in a vertical arrangement, whereas the view of figure 2A shows the projector 105-1 and the prisms 105-3 being arranged in a horizontal arrangement. The exact arrangement chosen will of course depend on the design of the OST device 100 and may thus vary.
As discussed briefly in relation to figures 1A, 1B, 1C and 1D, the image presenting arrangement 105 is arranged to alternate between a reflection mode and a transmission mode, which in turn can be expressed as a multiplexing in time of the real world and the virtual or augmented world.
Figure 2A shows a view of the image presenting arrangement 105 such as that in figure 1, wherein it is shown how the light projected by the projector 105-1 enters the wave guide 105-2. The projector is shown as being arranged in a direction of the eye, which can allow for the projector 105-1 to be housed for example in temples of wearable AR glasses. However, the projector may be placed also at other positions and/or directions, and the arrangement of figures 1, 2A and 2B is only one of many possibilities chosen for illustrative purposes.
The light is incoupled through an incoupling area, where, in this example, the light is reflected by the guiding element 105-4, such as a reflector 105-4. The reflector 105-4 is arranged at an angle with regards to the light coming from the projector 105-1, so that the reflected light will in turn be reflected against the walls of the waveguide with Total Internal Reflection, TIR, or at least close thereto (within 98 or 99% thereof), as indicated through the reference TIR in figure 2A. The angle selected is thus dependent on a number of factors, such as the material of the waveguide, the material outside the waveguide (such as the material of the housing of the OST device), the wavelengths of the light and the thickness d of the waveguide. In order to provide OST devices of low weight, which is important for the user's comfort when wearing the OST device 100, it is important that the glasses, i.e. the OST device 100, are not (too) thick, as this will increase the weight and lead to an aesthetically unpleasing look. The thickness d should thus be kept as low as possible, whereby the angle of the reflector 105-4 should be less than 45 degrees with regards to a main or general direction towards the prisms 105-3. In some embodiments the reflector 105-4 is a TIR in-coupler that can be made with numerous coupler technologies and can be refractive/reflective or diffractive/holographic. The main purpose of the reflector 105-4 is to couple light from the projector 105-1 into the waveguide 105-2 under the full TIR condition described by Snell's law. The angle of the reflector 105-4 depends on the TIR angle given by the refractive index of the waveguide 105-2.
It should be noted that the shutter 105-5 is optional and that it may not be needed depending on the design and the placement of the prisms 105-3.
Figure 2A shows the OST device in the reflection mode, where the prisms 105-3 have assumed a reflective state and, as the reflected light coming from the projector 105-1 and the reflector 105-4 reaches a prism 105-3, it is reflected towards the user's eye E, enabling the user to see the projected imagery. At the same time, the shutter 105-5 is in a closed state, blocking incoming light (referenced IL) - in embodiments where a shutter is used. This enables the user to only see the projected imagery. In embodiments where a shutter is not used, the eye may not be able to adapt to the incoming light and the user will thus only perceive the projected imagery.
Figure 2B shows the OST device in the transmission mode, where the shutter 105-5 is open allowing the incoming light to pass, and where the prisms 105-3 have assumed a transmissive state.
Alternatively, the projector 105-1 is turned off in this mode. Only turning on the projector when the prisms are working as a mirror makes the system power efficient.
Alternatively, the projector 105-1 is active throughout, and as the reflected light coming from the projector 105-1 and the reflector 105-4 reaches a prism 105-3, the light passes through the prism 105-3 and continues down the waveguide 105-2, never reaching the user's eye E. At the same time, the shutter 105-5 is in an open state, allowing incoming light (referenced IL) to pass through. This enables the user to only see the incoming light and thus the real world behind the OST device (from the perspective of the eye).
The projector 105-1 is in some embodiments configured to project light having all color components (RGB or CMY) at the same time. And, in some embodiments, the projector 105-1 is configured to project the color components (R, G, B or C, M, Y) of the light individually. In some such systems, the projector shows one color component between each alternate transmission mode, (R-OPEN-G-OPEN-B-OPEN...). In some alternative such systems, the projector shows one color
component at a time but all three between each alternate transmission mode (R-G-B-OPEN-...). These are only some examples of cycle duties for the projector. It should be noted that the cycle duties are not restricted to RGB or CMY, but other advanced coloring schemes may also be used. The two schemes referred to herein are only for providing a simple explanation based on the commonly known coloring schemes RGB and CMY.
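Purely as an illustrative sketch of the two example cycle duties mentioned above, the following Python snippet generates the frame sequences R-OPEN-G-OPEN-B-OPEN and R-G-B-OPEN. The function names and the representation of a frame slot as a simple string are hypothetical and chosen only to make the two schemes concrete.

```python
from itertools import islice
from typing import Iterator

def per_color_interleaved(components=("R", "G", "B")) -> Iterator[str]:
    """One color component between each transmission ("OPEN") slot:
    R-OPEN-G-OPEN-B-OPEN-..."""
    while True:
        for c in components:
            yield c       # reflection mode, single color component projected
            yield "OPEN"  # transmission mode, prisms transmissive

def all_colors_then_open(components=("R", "G", "B")) -> Iterator[str]:
    """All color components before each transmission slot: R-G-B-OPEN-..."""
    while True:
        for c in components:
            yield c
        yield "OPEN"

if __name__ == "__main__":
    print(list(islice(per_color_interleaved(), 6)))  # ['R', 'OPEN', 'G', 'OPEN', 'B', 'OPEN']
    print(list(islice(all_colors_then_open(), 8)))   # ['R', 'G', 'B', 'OPEN', 'R', 'G', 'B', 'OPEN']
```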
In some embodiments the OST device (through the controller 101 and/or the projector 105-1) is configured to show all imagery each reflection mode cycle. However, in order to allow for saving the computing resources for the projector 105-1, in some embodiments, the OST device (through the controller 101 and/or the projector 105-1) is configured to show some of the imagery each reflection mode cycle and alternate between the imagery each cycle. The controller is thus configured to control the projector 105-1 to project the first object in a first reflection mode, and to project the second object in a second reflection mode. Of course, multiple objects can be displayed in each reflection mode, and there can also be more than one or two reflection modes.
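The following Python sketch illustrates, under assumed names, how a controller could alternate which objects are projected in successive reflection mode cycles instead of projecting all imagery every cycle; the round-robin grouping shown here is only one possible scheduling policy and is not taken from the embodiments above.

```python
def split_objects_over_cycles(objects, groups):
    """Distribute the objects to be displayed over a number of reflection
    mode cycles (round-robin), so that only a subset is projected per cycle."""
    cycles = [[] for _ in range(groups)]
    for index, obj in enumerate(objects):
        cycles[index % groups].append(obj)
    return cycles

if __name__ == "__main__":
    imagery = ["arrow", "label", "icon", "warning"]
    # Two reflection modes: one object set in the first reflection mode,
    # the other set in the second reflection mode, then repeat.
    for cycle, objs in enumerate(split_objects_over_cycles(imagery, 2)):
        print(f"reflection cycle {cycle}: project {objs}")
```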
It should also be noted that even though figures 1, 2A and 2B show one image presenting arrangement 105 for one eye, a skilled person would understand that the arrangement can be adapted for use with two eyes as well. For example, in some embodiments, the image presenting arrangement 105 may be extended to allow for two eyes to see through the image presenting arrangement. Alternatively, in some embodiments, the image presenting arrangement 105 may be duplicated to allow for each of the two eyes to see through its own image presenting arrangement. In such cases, the image presenting arrangement may be mirrored as regards the arrangement of components to allow for easier mounting in glasses.
As is indicated in the figures, there are one or more prisms 105-3. The number of prisms will - of course - depend on the size of the individual prisms and the size of the image presenting device 105.
The reflector 105-4 and the projector 105-1 are arranged with regards to the waveguide 105-2 so that once the light has been incoupled in the waveguide 105-2 in the incoupling area (IA), the light is reflected internally in the wave guide with Total Internal
Reflection, TIR. This enables the light from the projector 105-1 to travel through the waveguide without being seen by the user, and thus prevents any leakage of light that would otherwise disturb the user or reduce the quality of the images perceived by the user.
The controller 101 is configured to control the shutter 105-5 to switch between the open state (as in figure 2B) and the closed or shut state (as in figure 2A). The controller 101 is also configured to control the prisms 105-3 to switch between the reflective state (as in figure 2A) and the transmissive state (as in figure 2B). The controller 101 is specifically configured to control the shutter 105-5 and the prisms 105-3 to alternate between their respective states in synchronicity, thereby providing a time multiplexing of the incoming light passing through the shutter and the light from the projector 105-1 carrying the virtual or augmented imagery.
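A minimal sketch, in Python, of the synchronized time multiplexing described above: the shutter and the prisms are switched together between the two modes at a fixed alternation frequency. The class and method names, and the use of a simple sleep-based loop, are assumptions made for illustration only; an actual controller would drive the shutter, prism and projector hardware directly.

```python
import time

class ImagePresentingArrangement:
    """Hypothetical stand-in for the shutter, prism and projector drivers."""

    def reflection_mode(self):
        # shutter closed, at least some prisms reflective, projector on
        print("reflection mode: shutter closed, prisms reflective, projector on")

    def transmission_mode(self):
        # shutter open, prisms transmissive, projector may be turned off
        print("transmission mode: shutter open, prisms transmissive, projector off")

def run_time_multiplexing(arrangement, frequency_hz=200.0, cycles=3):
    """Alternate between reflection and transmission mode in synchronicity."""
    half_period = 1.0 / (2.0 * frequency_hz)
    for _ in range(cycles):
        arrangement.reflection_mode()
        time.sleep(half_period)
        arrangement.transmission_mode()
        time.sleep(half_period)

if __name__ == "__main__":
    run_time_multiplexing(ImagePresentingArrangement())
```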
In some embodiments the multiplexing of the prisms 105-3 depends on the size of the object to be displayed. For example, if the object/image is of the size of one prism 105-3 or smaller, it is enough to multiplex between light from the projector 105-1 and the incoming light. If the object/image is larger, one option is to show the whole image simultaneously (by using multiple layers of waveguides 105-2), which would prevent a prism earlier in the chain from blocking the projected light from a prism 105-3 further down in the chain. Another option is to multiplex between the prisms 105-3 depending on their position in the chain of prisms 105-3.
In some embodiments, the prisms 105-3 are arranged as electrowetting, EW, components and notably electrowetting prisms. In some embodiments, the prisms 105-3 are arranged as Liquid Crystal, LC, components. Common to all embodiments is that the prisms are arranged to be electrically controlled to change the optical properties of the prism. This enables the controller 101 to control the prisms as discussed above.
The operation of electrowetting prisms will now be discussed briefly with reference to figures 3A, 3B, 3C and 3D, which show a proposed beam steering device being an electrowetting controlled prism with a large steering angle φ of around 45°. It should be noted that the value 45° for the steering angle is only an example and other angle values are possible, such as 40-50 or 30-60 degrees, to mention some examples.
Electrowetting is a phenomenon in which the wetting properties of a liquid can be controlled by applying an electric field. An electrowetting prism is an optical device that typically
consists of a chamber containing two immiscible fluids: one conductive (usually water-based) and the other non-conductive (usually oil-based). The chamber is made of transparent materials, and transparent electrodes may be placed around the perimeter. The walls of the chamber may also have a hydrophobic insulating layer. When a voltage is applied to the electrodes, the electric field formed across the insulating layer changes the contact angle between the conductive liquid and the insulating layer. This change in contact angle leads to a change in the shape of the liquid-liquid interface, creating a tunable prism. Electrowetting prisms can be used in various optical applications, such as beam steering in optical communication systems, adaptive optics for imaging systems, tunable dispersion control for spectroscopy or wavelength division multiplexing. EW lenses and EW prisms both involve the electrowetting effect. However, they apply the technique differently. EW lenses primarily modify the focal length of the system. They use a voltage-controlled electrowetting effect to change the radius of curvature of the liquid-liquid interface, hence altering the focal length of the system. EW prisms primarily control the angular deflection of light, steering light beams in different directions without changing focus. In some embodiments the teachings herein use electrowetting prisms.
Returning to figures 3A, 3B, 3C and 3D, showing a proposed beam steering device being an electrowetting controlled prism with an angle defined to support a TIR condition, for example with a large steering angle φ of 45°, the structure of the prism consists of two immiscible, colorless, transparent liquids with the refraction indexes n1 > n2. The first liquid is oil with the refraction index n1. The second liquid is a conductive liquid, e.g., saline or NaCl dissolved in water, with a typical refraction index n2 ≈ 1.33. The liquids are enclosed in a sealed transparent container with the sidewalls S made up of e.g., Indium Tin Oxide (ITO) glass coated with a dielectric insulator and a hydrophobic layer, e.g., a silicon based transparent hydrophobic coating. The conductive layer comprises in some embodiments silver nanowires. The conductive layer comprises in some embodiments graphene. The angle of the liquid-liquid interface/meniscus can be changed with different voltages VR and VL applied to the opposite sidewalls.
In the reflection mode, the controller 101 will activate the image presenting device 105 and close the shutter 105-5. In the reflection mode the ambient light from the real world will be blocked by the shutter 105-5 and the display image will be reflected and projected to the wearer's eyes by the prisms 105-3. The liquid-liquid interface of the electrowetting device in Figures 3A, 3B, 3C and 3D forms a slope of 45°, which can be realized by for example applying 30 V to the left sidewall and 80 V to the opposite sidewall. For light from the projector to be reflected to the wearer's eyes, it must undergo total internal reflection (TIR) after entering the first liquid medium (oil) with the refraction index n_oil.
According to Snell's law, if light travels from a medium with a higher refractive index (n_oil) to a medium with a lower refractive index (water, n_w), TIR occurs when the angle of refraction reaches 90° and the angle of incidence is equal to or greater than a critical angle. In the teachings herein we define θ_oil as the AOI (Angle of Incidence) and θ_w as the AOR (Angle of Refraction). The critical angle in the context of TIR is associated with the AOI. It is the specific AOI at which light is refracted along the boundary, beyond which all the light is reflected back into the denser medium and no refraction into the less dense medium occurs. Snell's law gives n_oil · sin θ_oil = n_w · sin θ_w, and setting θ_w = 90° yields the TIR condition θ_oil ≥ arcsin(n_w / n_oil).
From experiments it has been found that θ_oil is the critical angle θ_c = 45°, n_w = 1.33 (typical refraction index for H2O) and θ_w = 90°. TIR will then occur if the refraction index for oil n_oil > 1.88: n_oil > n_w / sin θ_c ≈ 1.88
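As a quick numeric check of the TIR condition above (a sketch only, using the example values from the text), the required minimum oil refractive index for a 45° critical angle against water with n_w = 1.33 can be computed as follows:

```python
import math

n_water = 1.33          # refractive index of the conductive (water-based) liquid
theta_c_deg = 45.0      # desired critical angle at the liquid-liquid interface

# TIR condition: n_oil * sin(theta_c) >= n_water * sin(90 deg)
n_oil_min = n_water / math.sin(math.radians(theta_c_deg))
print(f"minimum oil refractive index: {n_oil_min:.2f}")  # ~1.88
```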
High index oil is currently used in microscopy to minimize the amount of light lost due to reflection and refraction at the interface between the microscope lens and the specimen being examined. This helps to improve the contrast and resolution of the image, providing clearer and more detailed information. Currently, an immersion oil with refraction index 1.70 or more has been demonstrated in experiments. This refractive index can be improved with future technological advancements.
In transmission mode the projector 105-1 is switched off and the shutter 105-5 is in a transmissive state, such as being switched off - in embodiments using a shutter 105-5. The liquid-liquid interface becomes a straight line and the ambient light from the real-world environment will be transmitted through the shutter and the electrowetting device to the wearer's eyes. When in transmission mode the system should be letting ambient light in undisturbed. In this case the liquid-liquid interface can be oriented as shown in figure 1 and could also be a straight line perpendicular to the ambient light to prevent deflection of rays that enter at an angle.
Other geometries could be arranged if another incident angle or other liquids are used or by accepting a lower reflectance than full TIR gives. The principle for the teachings herein is not limited to any one specific geometry.
In the discussion above, the prisms have been shown as being arranged in a row; however, other arrangements are also possible. The prisms 105-3 may thus be arranged in a liquid prism array, or an arrangement of multiple prisms placed in a linear, two- or three-dimensional configuration. By combining multiple prisms, more complex and precise manipulation of light can be achieved.
Figure 3E shows how a liquid prism may be controlled to provide different reflective states, where the prism on the left shows a first reflective state, and the one in the middle shows a transmissive state and the one on the right shows a second reflective state. The prisms are thus each capable in some embodiments of assuming more than one reflective state.
Figures 3F, 3G and 3H thus show that the prisms 105-3 are arranged so that one prism is in front of the other with regards to the user's eye, and wherein the controller (101) is further configured to control a first prism to reflect or refract the light in a first reflective state so that a second prism, being in a second reflective state, can reflect or refract the light to enable it to reach the user's eye.
Figures 3F, 3G and 3H show different situations where the prisms 105-3 are arranged in a matrix distribution, and also where they are controlled to follow the gaze of a user, in order to ensure that the light is reflected into the eye of the user.
If one looks at Figure 3F, the prisms 105-3 are arranged in two rows, and the prisms 105-3 where the user is looking are in a reflective state. In some embodiments the other prisms are in a transmissive state. Which prisms that are in a reflective state and which are in a transmissive state will, of course, depend on the direction the user is looking at, and also the
specific arrangement of prisms. Figures 3G and 3H show how the prisms that are in a reflective state follow the user's eye. It should be noted that even if not explicitly shown in the figures, the prisms also comprise various components for changing the voltage applied to the prism.
The controller is thus in some embodiments configured to adapt the display to a user's gaze. This can be done in at least two major manners, which are not necessarily alternatives to one another, but can be supplemental (i.e. can be combined).
The first manner is to ensure that if a user is looking in a direction, such as where there will be objects, the objects there will be properly displayed. In such embodiments, the prisms where there will be object(s) in the imagery to be displayed will be in a reflective state, and the others (at least some of them) will be in the transmissive state. This can also be used to extend the field of view as compared to a system operating only with a projector and no prisms.
The second manner is to track the gaze of the user. This can be done in many different manners, and as gaze tracking is a known technology it will not be discussed in detail herein. Suffice to say that the OST device 100 in such cases is arranged with a gaze tracking sensor 104 as discussed briefly in connection with figure 1, for tracking the user's gaze, and then the controller 101 will control the prisms 105-3 to be reflective in directions of the user's gaze. This allows for only displaying content where the user is looking and thereby reducing the power needed by the system.
Returning to figure 3F, it should be noted that in some embodiments, the prisms 105-3 are controlled (by the controller 101) to assume any of two or more different reflective states (as discussed in relation to figure 3E above), in order to reflect the light to the user's eye. As can be seen in figure 3E the first prism 105-3-1 has a different reflective plane than the second prism 105-3-2.
An array 105-3A of prisms can thus be used as an alternative or as a replacement for a larger prism. And, as discussed above, each (or at least some) prism 105-3 in the array 105-3A can be individually controlled to provide its own prism angle, for example between 0° and 45°, to refract, reflect or transmit incoming light as shown in figure 3E.
The individual control is achieved by applying different voltages VR and VL to opposite sidewalls of a prism, whereby the liquid-liquid interface will change the tilt angle due to electrowetting.
In Figure 3F all the array prisms, except 105-3-1 and 105-3-2, have the same individual tilt angle 0° and can transmit light from the real world and from the projector 105-1. The prism 105-3-1 in this example has a 45° tilt and reflects the display light with TIR towards the prism 105-3-2, which has a 24° tilt and refracts the light towards the eye with the gaze angle (for example +15°). The controller may thus be configured to determine a gaze angle (or direction) and control the prisms 105-3 accordingly, as discussed herein, to ensure reflection of the light towards the eye in the direction (angle) of gaze. Of course, the angles given herein are only examples, and other variations are possible.
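The mapping from gaze direction to individual prism tilts can be sketched as below. The tilt values are loosely based on the example angles given for figures 3F to 3H, but the calibration-table lookup, the function name and the nearest-entry selection are hypothetical simplifications for illustration, not a derived optical model.

```python
# Hypothetical calibration table mapping a gaze angle (degrees) to the tilt
# angles (degrees) of the TIR prism and the out-coupling prism; real values
# depend on the optics of the specific OST device.
TILT_TABLE = {
    +15.0: (45.0, 24.0),  # gaze rotated upward   (example values as in figure 3F)
    0.0:   (45.0, 45.0),  # gaze straight ahead   (example values as in figure 3G)
    -15.0: (41.0, 41.0),  # gaze rotated downward (example values as in figure 3H)
}

def prism_tilts_for_gaze(gaze_deg: float):
    """Return (TIR prism tilt, out-coupling prism tilt) for the calibration
    entry closest to the measured gaze angle."""
    closest = min(TILT_TABLE, key=lambda angle: abs(angle - gaze_deg))
    return TILT_TABLE[closest]

if __name__ == "__main__":
    for gaze in (14.0, 0.5, -12.0):
        print(gaze, "->", prism_tilts_for_gaze(gaze))
```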
Figure 3G shows the eye looking straight ahead, and prisms 105-3-3 and 105-3-4 both have a 45° tilt, so the projected light is reflected with TIR to the eye.
Figure 3H shows the eye looking to the other side compared to figure 3F, and the prisms 105-3-4 both have a 41° tilt, needed to redirect the projected light with TIR toward the eye, in this case rotated 15° downward (-15°).
This allows for the extension of viewing zones and hence the FOV dynamically, facilitating more immersive and realistic AR experiences. The teachings herein may be used together with eye-tracking as discussed above, whereby the OST device 100 may dynamically adapt the prisms' tilt to project a virtual image of the projector towards the user's gaze direction.
In some embodiments, there is more than one liquid prism, splitting up the liquid-to-liquid interface into smaller cavities or cells. The cells can be arranged in a matrix or line fashion as in figures 3F, 3G and 3H. Such an arrangement allows for a thinner design of the OST device 100. The thickness d is reduced since the small cells or prisms can be placed closer next to each other in the same plane instead of being placed at incremental heights. Also, the total cell volume is reduced, requiring less liquid or, in the case of an LC solution, less liquid crystal. The shape of the volume holding the liquid(s) is not limited to a specific shape like a triangle, wedge or rectangle. The shape is designed to allow for a surface or an interface between materials to be created. This arrangement is equally valid for an LC design (as will be discussed below) as for one based on electrowetting. This interface is designed with sufficient difference in refractive index to direct the light as wanted. It should be noted that even though the figures (2A and 2B) predominantly show the prisms as straight with regards to the wave guide 105-2, this is only one option, used for illustration. In some embodiments, the prisms 105-3 are arranged tilted with regards to the extension of the wave guide to support the best optimization of optical parameters between the two states, reflective and transmissive. In some embodiments the prisms 105-3 are tilted at the same angle, and in some embodiments at least some of the prisms 105-3 are tilted at different angles. And, in some embodiments some of the prisms are tilted at one angle and some prisms are tilted at a second angle. The tilting angle depends on the design of the OST device, but as an example the tilting angle can be between 10 and 30 degrees. In some embodiments the line or matrix of prisms is thus positioned at a slight angle compared to a vertical design.
As mentioned above, the prisms may also, in some embodiments, be Liquid Crystal components. It should be noted that many of the features discussed with regards to electrowetting prisms also apply to LC prisms, such as - for example - gaze tracking and controlling prisms to show imagery in the direction of a user's gaze. The LC prisms are used to create an interface with a delta in refractive indices (Δn). Birefringent liquid crystals can have Δn > 0.3 or more. The principle used to control the light is similar to that for EW prisms.
Similarly to EW prisms, LC prisms are also electrically controlled. By applying a voltage over the prism, the liquid inside will change its optical characteristics. In a first state where no voltage is applied, the liquid crystal molecules are arranged so that the refractive index of the LC is the same as that of the bulk material (i.e. the material of the waveguide), which is the transmissive state of the LC prism. In a second state where a voltage is applied, the molecules are arranged so that the refractive index is different from that of the bulk material and the prism will be in the reflective state. The amount or angle of reflection will depend on the LC used, and for example on the design of the cavity, the bulk material used and the voltage applied. The LC prisms can thus also be controlled in a number of different states (not just one reflective state), as for the EW prisms.
As discussed above, the controller 101 is configured to switch or alternate between a transmission mode and a reflection mode. In some embodiments the controller 101 is configured to alternate with a frequency of more than 60, 80 or 100 Hz. In some embodiments the controller 101 is configured to alternate with a frequency of more than 200 Hz. In some embodiments the controller 101 is configured to alternate with a frequency in the interval of 200 to 500 Hz or 300 to 400 Hz. A high frequency provides for a smooth viewing experience, but of course, the frequency cannot be too high, or it will require too much of the prisms 105-3, the shutter 105-5 and the projector 105-1. The frequency used can thus depend on the components used and can vary from implementation to implementation.
In some embodiments the controller is configured to have a duty cycle for alternating between transmission and reflection mode, meaning that in some embodiments, the controller 101 alternates so that each mode gets an equal amount of time, and in some embodiments the controller alternates so that one mode is used for longer times than the other mode. In some embodiments, the controller is configured to adapt the so-called duty cycle (i.e. the portions of time during which the first and the second modes are used). In some such embodiments the duty-cycle is adapted based on the imagery to be displayed. For example, if the imagery to be displayed comprises many objects, the duty-cycle can be in favor of the reflection mode (spending more time in reflection mode), allowing the user a better view of the objects.
Referring back to figure 1, in some embodiments the OST device 100 also comprises a light sensor 106. The light sensor 106 is configured for detecting presence or absence of light and can be used to determine the light condition of the scene. There are many types of different light sensors, such as photoresistors (Light Dependent Resistors, LDRs), photodiodes, phototransistors, CCD (Charge-Coupled Device) sensors, CMOS (Complementary Metal-Oxide-Semiconductor) sensors, or other image sensors. The type of light sensor could vary from a single pixel (a single light sensitive element), a few pixels with or without filters, to many pixels (a camera) with or without filters.
In some such embodiments the controller 101 is further configured to adapt the duty cycle based on the surrounding (or background) light detected by the light sensor 106. In some such embodiments the controller 101 is configured to detect that the light detected falls below
a dark surrounding threshold level indicating a dark surrounding, and in response thereto adapt the duty-cycle to favor the transmission mode to allow the background to be seen or perceived by the user. In some embodiments the controller 101 is configured to detect that the detected light rises above a light surrounding threshold level indicating a light surrounding, and in response thereto adapt the duty-cycle to favor the reflection mode to allow the imagery to be seen or perceived by the user. In some such embodiments the duty-cycle is set to 33/66, 30/70, or 25/75 for light surroundings and 66/33, 70/30, or 75/25 for dark surroundings, where the notation T/R indicates the portion of the duty cycle for T transmission mode and R reflection mode.
In some alternative or additional such embodiments, the controller 101 is further configured to adapt the duty cycle based on the imagery to be projected. In some such embodiments the controller 101 is configured to determine that the imagery falls below a dark content threshold level indicating dark content to be displayed, and in response thereto adapt the duty-cycle to favor the reflection mode to allow the dark content to be seen or perceived by the user. In some embodiments the controller 101 is configured to determine that the imagery rises above a light content threshold level indicating light content to be displayed, and in response thereto adapt the duty-cycle to favor the transmission mode to allow the background to be seen or perceived by the user. In some such embodiments the duty-cycle is set to 33/66, 30/70, or 25/75 for dark content and 66/33, 70/30, or 75/25 for light content, where the notation T/R indicates the portion of the duty cycle for T transmission mode and R reflection mode.
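A minimal sketch of the duty-cycle adaptation described above, assuming hypothetical normalized inputs and threshold values and using example splits similar to those in the text; the returned pair uses the T/R notation, i.e. the share of time in transmission mode followed by the share in reflection mode.

```python
def choose_duty_cycle(ambient_light: float, content_brightness: float,
                      dark_surrounding_threshold: float = 0.2,
                      light_surrounding_threshold: float = 0.8):
    """Return a (transmission %, reflection %) duty cycle.

    ambient_light and content_brightness are assumed to be normalized to 0..1.
    The threshold values and the fallback policy are illustrative assumptions.
    """
    if ambient_light <= dark_surrounding_threshold:
        return (70, 30)   # dark surroundings: favor transmission mode
    if ambient_light >= light_surrounding_threshold:
        return (30, 70)   # light surroundings: favor reflection mode
    # Otherwise fall back to the imagery: dark content favors reflection mode.
    if content_brightness <= 0.5:
        return (30, 70)
    return (50, 50)       # default: equal time in both modes

if __name__ == "__main__":
    print(choose_duty_cycle(ambient_light=0.1, content_brightness=0.6))  # (70, 30)
    print(choose_duty_cycle(ambient_light=0.9, content_brightness=0.6))  # (30, 70)
    print(choose_duty_cycle(ambient_light=0.5, content_brightness=0.3))  # (30, 70)
```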
Returning to the embodiments where the gaze of the user is tracked, by determining which objects are to be displayed, by actually tracking the gaze, or a combination of the two (or other manners), figure 4A shows a schematic view of the gaze being determined to be in one direction, where it is illustrated how the prisms 105-3 in that direction are set to the reflective state in the reflection mode. Similarly, figure 4B shows a schematic view of the gaze being determined to be in a second, different direction, where it is shown how the prisms 105-3 in that direction are set to the reflective state in the reflection mode.
In the above, two different ways to control a prism are proposed. In one state light from the outside passes unhindered (transmissive state) and in another state (reflective state) the prism directs the light from the projector efficiently to the eye. By selecting the prisms so that, in the reflection mode (when light comes from the projector), some of the prisms are arranged to let the light pass unhindered with high transmission and some of the prisms are selected to reflect (or refract) the light, and by combining this with gaze tracking as discussed above, the field of view can be directed so that it follows the gaze of the viewer. This will effectively create an expanded field of view.
It should be noted that by closing the shutter and setting all (or most) prisms in a reflective mode, the OST device can be run as a virtual reality device.
Figure 5 shows a general flowchart for a method according to the teachings herein. The method corresponds to the operation of the OST device 100 as discussed in the above, wherein said method comprises operating 510 the image presenting arrangement 105 in a transmission mode, wherein the shutter is in the open state and the prisms 105-3 are in the transmissive state, whereby a user's eye can perceive incoming light from behind - in the line of sight from the eye - the image presenting arrangement 105, operating 520 the image presenting arrangement 105 in a reflection mode, wherein the shutter is in the closed state and at least some of the prisms 105-3 are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and wherein incoming light from behind - in the line of sight from the eye - is blocked by the shutter 105-5, and repeatedly switching 530 between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
It should be noted that the other functionalities discussed herein may be included in some or all of the functionalities discussed in relation to figure 5, and the method of figure 5 is a general method that also allows for implementing other features disclosed above as sub-functionality of any part of the method as disclosed.
Figure 6 shows a component view for a software component or module arrangement 600 according to some embodiments of the teachings herein. The software component arrangement 600 is adapted to be used in an OST device 100 as taught herein and corresponds to the operation of the OST device 100 in the above. The software component arrangement
600 comprises a software component 610 for operating the image presenting arrangement 105 in a transmission mode, wherein the shutter is in the open state and the prisms 105-3 are in the transmissive state, whereby a user's eye can perceive incoming light from behind - in the line of sight from the eye - the image presenting arrangement 105, a software component 620 for operating the image presenting arrangement 105 in a reflection mode, wherein the shutter 105-5 is in the closed state and at least some of the prisms 105-3 are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and wherein incoming light from behind - in the line of sight from the eye - is blocked by the shutter 105-5, and a software component 630 for repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
In some embodiments the software component arrangement 600 also comprises software component(s) 640 for further functionalities as discussed in the teachings herein.
For the context of the teachings herein a software code module may be replaced or supplemented by a software component. Alternatively for the context of the teachings herein a software code module may be replaced or supplemented by a circuit configured for performing a corresponding function.
Figure 7 shows a schematic view of a computer-readable medium 102 carrying computer instructions 121 that, when loaded into and executed by a controller of an OST device 100, enable the OST device 100 to implement the teachings herein.
The computer-readable medium 102 may be tangible, such as a hard drive or a flash memory, for example a USB memory stick or a cloud server. Alternatively, the computer-readable medium 102 may be intangible, such as a signal carrying the computer instructions enabling the computer instructions to be downloaded through a network connection, such as an internet connection. In the example of figure 7, a computer-readable medium 700 is shown as being a computer disc 700 carrying computer-readable computer instructions 710, being inserted in a computer disc reader 720. The computer disc reader 720 may be part of a cloud server 730 - or other server - or the computer disc reader may be connected to a cloud server 730 - or other server. The cloud server 730 may be part of the internet or at least connected to the internet. The cloud server 730 may alternatively be connected through a proprietary or dedicated connection. In one example embodiment, the computer instructions are stored at a remote server 730 and are downloaded to the memory 102 of the OST device 100 for being executed by the controller 101. The computer disc reader 720 may also or alternatively be connected to (or possibly inserted into) an OST device 100 for transferring the computer-readable computer instructions 710 to a controller of the OST device via a memory of the OST device 100. Figure 7 shows both the situation when an OST device 100 receives the computer-readable computer instructions 710 via a server connection and the situation when another OST device 100 receives the computer-readable computer instructions 710 through a wired interface. This enables the computer-readable computer instructions 710 to be downloaded into an OST device 100, thereby enabling the OST device 100 to operate according to and implement the invention as disclosed herein.
Claims
1. An Optical-See-Through, OST, device (100) comprising an image presenting arrangement (105) and a controller (101), wherein the image presenting arrangement (105) comprises a plurality of electro-controlled liquid prisms (105-3), which are arranged to be in a reflective state when a first voltage is applied to the prism (105-3) and a transmissive state when a second voltage is applied to the prism (105-3), and a projector configured to project light for imagery onto at least some of the prisms (105-3), wherein the controller (101) is configured to operate the image presenting arrangement (105) in a transmission mode, wherein the prisms (105-3) are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement (105), operate the image presenting arrangement (105) in a reflection mode, wherein at least some of the prisms (105-3) are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and to repeatedly switch between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
2. The OST device (100) according to claim 1, wherein the image presenting arrangement (105) further comprises a shutter (105-5) configured to assume an open state where light can travel through it and a closed state where light is blocked and wherein the controller (101) is further configured to operate the shutter (105-5) to be in the open state when the image presenting arrangement (105) is operated in the transmission mode and to operate the shutter (105-5) to be in the closed state and wherein incoming light from behind is blocked by the shutter (105-5) when the image presenting arrangement (105) is operated in the reflection mode.
3. The OST device (100) according to claim 1 or 2, wherein the image presenting arrangement (105) further comprises a wave guide (105-2) which is arranged to house the plurality of prisms (105-3), wherein the projector (105-1) is arranged to project the light into the waveguide (105-2) for transmitting the light to at least some of the prisms (105-3) at an angle so that the light is reflected by the sides of the wave guide at Total Internal Reflection, to remain inside the waveguide (105-2) while being transmitted to the at least some of the prisms (105-3).
4. The OST device (100) according to claim 3, wherein the image presenting arrangement (105) further comprises an incoupling area in the wave guide (105-2) which incoupling area (105-4) is arranged to guide the light projected into the wave guide (105-2) by the projector (105-1) for transmitting the light to at least some of the prisms (105-3) at an angle so that the light is reflected by the sides of the wave guide at Total Internal Reflection, so that the projected light remains inside the wave guide (105-2) while being transmitted to the at least some of the prisms (105-3).
5. The OST device (100) according to any preceding claim, wherein the prisms (105-3) are arranged in a matrix in the wave guide (105-2).
6. The OST device (100) according to claim 5, wherein the prisms (105-3) are arranged to abut one another.
7. The OST device (100) according to any preceding claim, wherein at least some of the prisms (105-3) are configured to assume a second reflective state, and wherein the prisms (105-3) are arranged so that one prism is in front of the other with regards to the user's eye, and wherein the controller (101) is further configured to control a first prism to reflect the light in a first reflective state to enable the second prism to reflect or refract in a second reflective state to enable the light to reach the user's eye.
8. The OST device (100) according to any preceding claim, wherein the controller (101) is further configured to determine a direction of gaze and to set the prisms in the direction of gaze to the reflective state in the reflection mode.
9. The OST device (100) according to claim 8, wherein the controller (101) is further configured to determine the direction of gaze based on a location of an object to be displayed comprised in the imagery to be projected, wherein the direction of gaze is set to be a direction from the user's eye to the location of the object.
10. The OST device (100) according to claim 8 or 9, wherein the OST device (100) further comprises a gaze-tracking sensor (103) and wherein the controller (101) is further configured to determine the direction of gaze based on the gaze-tracking sensor (103).
11. The OST device (100) according to any preceding claim, wherein the imagery comprises a first and a second object, and wherein the controller (101) is further configured to control the projector (105-1) to project the first object in a first reflection mode, and to project the second object in a second reflection mode.
12. The OST device (100) according to any preceding claim, wherein the controller (101) is further configured to repeatedly switch between the reflection mode and the transmission mode at a frequency of higher than 60 Hz.
13. The OST device (100) according to any preceding claim, wherein the controller (101) is further configured to repeatedly switch between the reflection mode and the transmission mode according to a duty cycle, wherein the controller (101) is further configured to operate the image presenting arrangement (105) in the transmission mode for a transmission time period, and to operate the image presenting arrangement (105) in the reflection mode for a reflection time period, and to set the lengths of the reflection time period and/or the transmission time period based on environmental data and/or the imagery to be projected.
14. The OST device (100) according to claim 13, wherein the controller (101) is further configured to set the lengths of the reflection time period and/or the transmission time period based on the imagery by determining that the imagery comprises objects falling below a darkness content threshold level, and, if so, setting the length of the reflection window to be longer than the length of the transmission window or determining that the imagery comprises objects falling above a light content threshold level, and, if so, setting the length of the reflection window to be shorter than the length of the transmission window.
15. The OST device (100) according to claim 13 or 14, wherein the OST device (100) further comprises a light sensor (106) and wherein the controller (101) is further configured to set the lengths of the reflection time period and/or the transmission time period based on environmental data by receiving light data from the light sensor (106), determine if the light data falls below a dark surrounding threshold level, and if so, set the length of the transmission window to be longer than the length of the reflection window or determine if the light data falls above a light surrounding threshold level, and, if so, set the length of the transmission window to be shorter than the length of the reflection window.
16. The OST device (100) according to any preceding claim, wherein the OST device is a head-mounted device arranged to be worn on a user's head.
17. A method for use in an Optical-See-Through, OST, device (100) comprising
a plurality of electro-controlled liquid prisms (105-3), which are arranged to be in a reflective state when a first voltage is applied to the prism (105-3) and a transmissive state when a second voltage is applied to the prism (105-3), and a projector configured to project light for imagery onto at least some of the prisms (105-3), wherein the method comprises operating the image presenting arrangement (105) in a transmission mode, wherein the prisms (105-3) are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement (105), operating the image presenting arrangement (105) in a reflection mode, wherein at least some of the prisms (105-3) are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
18. A computer-readable medium (720) carrying computer instructions (721) that when loaded into and executed by a controller (101) of an OST device (100) enables the OST device (100) to implement the method according to claim 17.
19. A software component arrangement for controlling projection of imagery in an OST device (100) comprising a plurality of electro-controlled liquid prisms (105-3), which are arranged to be in a reflective state when a first voltage is applied to the prism (105-3) and a transmissive state when a second voltage is applied to the prism (105-3), and a projector configured to project light for imagery onto at least some of the prisms (105-3), wherein the software component arrangement (600) comprises
a software component for operating the image presenting arrangement (105) in a transmission mode, wherein the prisms (105-3) are in the transmissive state, whereby a user's eye can perceive incoming light from behind the image presenting arrangement (105), a software component for operating the image presenting arrangement (105) in a reflection mode, wherein at least some of the prisms (105-3) are in the reflective state, whereby the user's eye can perceive the imagery being projected to at least some of the prisms and reflected by the prisms being in a reflective state, and a software component for repeatedly switching between the reflection mode and the transmission mode enabling the user's eye to perceive both the incoming light and the projected imagery apparently simultaneously.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/EP2024/054789 WO2025180593A1 (en) | 2024-02-26 | 2024-02-26 | An optical-see-through device and a method for providing an improved extended reality interface |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025180593A1 (en) | 2025-09-04 |
Family
ID=90105214
Family Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/EP2024/054789 Pending WO2025180593A1 (en) | 2024-02-26 | 2024-02-26 | An optical-see-through device and a method for providing an improved extended reality interface |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025180593A1 (en) |
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20100149073A1 (en) * | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
| US20150015816A1 (en) * | 2010-02-04 | 2015-01-15 | Samsung Electronics Co., Ltd. | 2d/3d switchable image display device |
| US9335548B1 (en) * | 2013-08-21 | 2016-05-10 | Google Inc. | Head-wearable display with collimated light source and beam steering mechanism |
| US11061239B2 (en) * | 2017-12-18 | 2021-07-13 | Facebook Technologies, Llc | Augmented reality head-mounted display with a pancake combiner and pupil steering |
| US20220321867A1 (en) * | 2019-07-01 | 2022-10-06 | Pcms Holdings, Inc | Method and system for continuous calibration of a 3d display based on beam steering |
| CN117083555A (en) * | 2021-05-21 | 2023-11-17 | Google LLC | Polarization multiplexing field of view and pupil expansion in planar waveguides |
Similar Documents
| Publication | Title |
|---|---|
| US10983355B2 (en) | Method and system for occlusion capable compact displays |
| KR101557758B1 (en) | Transparent component with switchable reflecting elements, and devices including such component |
| EP2930552B1 (en) | Display apparatus and optical apparatus |
| US10073264B2 (en) | Substrate-guide optical device |
| US9223139B2 (en) | Cascading optics in optical combiners of head mounted displays |
| US20200301239A1 (en) | Varifocal display with fixed-focus lens |
| US9500866B2 (en) | Near display and imaging |
| EP3090301B1 (en) | An apparatus or method for projecting light internally towards and away from an eye of a user |
| IL255049B (en) | Highly efficient compact head-mounted display system |
| US10197886B2 (en) | Display spectacles having microprism structures and driving method thereof |
| US20210311314A1 (en) | Wearable apparatus and unmanned aerial vehicle system |
| US11474357B2 (en) | Augmented reality display device |
| US12429651B2 (en) | Waveguide with tunable bulk reflectors |
| US12436394B2 (en) | Augmented reality (or mixed reality) eyewear with see-through optical elements having individually-adjustable opacity/reflectivity levels |
| US20240295737A1 (en) | Variable world blur for occlusion and contrast enhancement via tunable lens elements |
| US9519092B1 (en) | Display method |
| WO2025180593A1 (en) | An optical-see-through device and a method for providing an improved extended reality interface |
| US12140762B2 (en) | Vision-control system for near-eye display |
| EP4523030A1 (en) | Waveguide with tunable bulk reflectors |
| EP3629072B1 (en) | Optical switch and image system using same |
| JP2022170640A (en) | Display device having transmittance control unit |
| WO2022091398A1 (en) | Display device including transmittance control unit |
| KR100245332B1 (en) | Head mounted display |
| CN117492210A (en) | Glasses and how they work |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24708371; Country of ref document: EP; Kind code of ref document: A1 |