
WO2025049975A1 - System and methods for display of 3d multi-media - Google Patents

System and methods for display of 3d multi-media

Info

Publication number
WO2025049975A1
Authority
WO
WIPO (PCT)
Prior art keywords
hologram
optical
hologram pattern
quantization
depth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/044765
Other languages
French (fr)
Other versions
WO2025049975A4 (en)
Inventor
Edward Buckley
Andrzej KACZOROWSKI
Theodore Michel MARESCAUX
Richard Stahl
Gebirie Yizengaw BELAY
Joel Steven Kollin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Swave BV
Original Assignee
Swave BV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Swave BV filed Critical Swave BV
Priority to PCT/US2025/010721 (published as WO2025151505A1)
Publication of WO2025049975A1
Publication of WO2025049975A4

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/54Browsing; Visualisation therefor
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B27/0103Head-up displays characterised by optical features comprising holographic elements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/04Processes or apparatus for producing holograms
    • G03H1/08Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
    • G03H1/0808Methods of numerical synthesis, e.g. coherent ray tracing [CRT], diffraction specific
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/04Processes or apparatus for producing holograms
    • G03H1/08Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
    • G03H1/0841Encoding method mapping the synthesized field into a restricted set of values representative of the modulator parameters, e.g. detour phase coding
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • G02B2027/0174Head mounted characterised by optical features holographic
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/22Processes or apparatus for obtaining an optical image from holograms
    • G03H1/2294Addressing the hologram to an active spatial light modulator
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/04Processes or apparatus for producing holograms
    • G03H1/08Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
    • G03H1/0808Methods of numerical synthesis, e.g. coherent ray tracing [CRT], diffraction specific
    • G03H2001/0816Iterative algorithms
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H1/00Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
    • G03H1/04Processes or apparatus for producing holograms
    • G03H1/08Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
    • G03H1/0808Methods of numerical synthesis, e.g. coherent ray tracing [CRT], diffraction specific
    • G03H2001/0825Numerical processing in hologram space, e.g. combination of the CGH [computer generated hologram] with a numerical optical element
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03HHOLOGRAPHIC PROCESSES OR APPARATUS
    • G03H2210/00Object characteristics
    • G03H2210/40Synthetic representation, i.e. digital or optical object decomposition
    • G03H2210/45Representation of the decomposed object
    • G03H2210/454Representation of the decomposed object into planes

Definitions

  • the subject disclosure relates to optical systems and associated applications for displaying three-dimensional (3D) media for virtual reality (VR) and augmented/extended reality (AR).
  • FIG. 1 illustrates an example three-dimensional (3D) focus space for an augmented reality device, in accordance with various aspects described herein;
  • FIG. 2 is a logic diagram of an example method for generating a hologram pattern, in accordance with various aspects described herein;
  • FIG. 3 illustrates an example use of hologram replication in a holographic display system, in accordance with various aspects described herein;
  • FIG. 4 illustrates a three-dimensional optical lens assembly, in accordance with various aspects described herein;
  • FIG. 5A is a logic diagram of an example method for displaying multiple two dimensional image layers in corresponding selectable depth planes in an augmented reality device, in accordance with various aspects described herein;
  • FIG. 5B is a logic diagram of an example method for displaying an image in a selectable depth plane in an augmented reality device using replicated hologram patterns, in accordance with various aspects described herein;
  • FIG. 5C is a logic diagram of an example method for displaying multiple two dimensional image layers in corresponding selectable depth planes in an augmented reality device using replicated hologram patterns, in accordance with various aspects described herein;
  • FIGs. 6A & 6B are example logic diagrams of example methods for executing a visual search in an augmented reality system, in accordance with various aspects described herein;
  • FIG. 7 is an example schematic block diagram of another embodiment of a system for implementing augmented reality, in accordance with various aspects described herein;
  • FIG. 8 is an example schematic block diagram of another embodiment of a system for implementing augmented reality, in accordance with various aspects described herein;
  • FIG. 9A illustrates example hologram dot pattern thresholds encoded based on a dither mask, in accordance with various aspects described herein;
  • FIG. 9B illustrates an example of using a dither mask designed for blue noise on an input image, in accordance with various aspects described herein;
  • FIG. 9C illustrates an example of using a dither mask designed with a signal window in the frequency domain on an input image, in accordance with various aspects described herein;
  • FIG. 9D illustrates an example of using a dither mask designed with a signal window in the Fourier space on an input hologram, in accordance with various aspects described herein;
  • FIG. 9E illustrates an example of using a dither mask and inverse dither mask, in accordance with various aspects described herein;
  • FIG. 9F illustrates an example use of mask-based dithering to quantize a hologram, in accordance with various aspects described herein;
  • FIG. 10 illustrates an example schematic block diagram of an embodiment of an ecosystem for implementing augmented reality, in accordance with various aspects described herein;
  • FIG. 11 illustrates an example optical module for generating and projecting augmented reality content in accordance with various aspects described herein;
  • FIG. 12 illustrates an example implementation of augmented reality glasses in accordance with various aspects described herein;
  • FIG. 13 is a schematic block diagram of an example embodiment of a system for implementing augmented reality in accordance with various aspects described herein;
  • FIG. 14A illustrates an example implementation of augmented reality glasses in accordance with various aspects described herein;
  • FIG. 14B illustrates another example implementation of augmented reality glasses in accordance with various aspects described herein.
  • FIG. 14C illustrates another example implementation of augmented reality glasses in accordance with various aspects described herein.
  • FIG. 1 illustrates an example three-dimensional (3D) focus space 188 for an augmented reality device 182 with a depth plane 184 configured to be selectable from a near eye distance 180 to infinite distance 186.
  • a second spatial light modulator can be adapted to display one or more holographic objects viewable by a user of an optical system, such as augmented reality device 182, that can be configured to display virtual media content at a selectable depth plane 184, where the depth of the depth plane can be selected based on the desired virtual media content for display.
  • a holographic lens function can be applied to a hologram pattern to program the depth at which an image is displayed in space.
  • the selectable depth plane 184 can be programmed to be represented within the focus space from as close as near eye distance 180 to as far as an essentially infinite distance 186.
  • a holographic lens function can be a mathematical model adapted to mimic a physical lens function to alter a focus plane for a to-be-displayed image.
  • a holographic processor can be configured to apply a holographic lens function to a hologram pattern for images to be displayed by augmented reality device 182 at a desired depth in space.
  • augmented reality device 182 can be configured to display two-dimensional (2D) content on a single depth plane in space, where the depth of the depth plane can be determined using a holographic lens function.
  • augmented reality device 182 can be configured to display two-dimensional (2D) content on multiple depth planes simultaneously, where the depth of each of the depth planes can be determined separately.
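  • As an illustration of how a holographic lens function can reprogram display depth, the following minimal sketch multiplies a hologram pattern by a quadratic (Fresnel) lens phase. This assumes the lens function takes the standard quadratic-phase form; the Python/NumPy implementation, pixel pitch, wavelength and depth values are illustrative and not taken from the disclosure.

      import numpy as np

      def lens_phase(shape, pitch, wavelength, focal_length):
          # Quadratic (Fresnel) phase profile of an ideal thin lens; multiplying a
          # hologram pattern by this term moves the displayed image from infinite
          # depth to a plane at distance focal_length.
          y = (np.arange(shape[0]) - shape[0] / 2) * pitch
          x = (np.arange(shape[1]) - shape[1] / 2) * pitch
          xx, yy = np.meshgrid(x, y)
          return np.exp(-1j * np.pi * (xx**2 + yy**2) / (wavelength * focal_length))

      # Example: move a hologram computed for infinite depth to a 2 m depth plane.
      hologram = np.exp(1j * 2 * np.pi * np.random.rand(1024, 1024))  # placeholder pattern
      hologram_at_2m = hologram * lens_phase(hologram.shape, 1e-6, 532e-9, 2.0)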
  • stereoscopic display technology can be used to provide content at a single depth plane fixed in space.
  • complex and/or bulky optical elements can be implemented to provide stereoscopic display technology capable of providing a single non-fixed depth plane, albeit with the otherwise undesirable drawbacks of weight and/or bulk.
  • augmented reality devices such as augmented reality glasses, heads-up displays, and others, can benefit from reducing and/or eliminating these additional complex and/or bulky elements, while allowing display and/or overlay of images, virtual objects and virtual scenes at substantially any desired depth in space.
  • holography-based techniques can be used to display images, virtual objects and virtual scenes at a desired depth in space with few or no added optical elements.
  • a holographic lens function can be used to change the depth for virtual media content from infinity to a desired depth in a focus space.
  • virtual media content can be computed for display at a relative depth of infinity, followed by application of a holographic lens function to change the display depth for virtual media content for display at a desired perception depth.
  • a holographic lens function can be used to provide a virtual image for overlay at substantially any depth in the focus space.
  • a holographic lens function can be adapted to provide virtual media content to overlay at multiple depth planes, including providing full 3D scenes.
  • two-dimensional content can be displayed on multiple depth planes simultaneously, where the relative distance between depth planes can be small enough that a hypothetical viewer can aggregate individual two-dimensional content on each depth of multiple depth planes for perceiving three dimensional (3D) virtual objects and/or three dimensional (3D) virtual scenes.
  • a holographic processor can be adapted for use in an augmented reality device, such as any of the augmented reality devices disclosed herein, to execute one or more holographic lens functions, as part of the computation process of hologram patterns that are to be rendered on the one or more spatial light modulators of the augmented reality device for displaying images, virtual objects and virtual scenes at a desired depth in space with the augmented reality device.
  • an optical combiner element can be any of a semi-transparent angle selective combiner, angle-selective semi-transparent mirror or a beam splitter.
  • an optical combiner element can be a holographic optical element or a metasurface.
  • a semi-transparent angle-selective combiner can be an optical component configured to allow light to pass at specific angles, while reflecting or blocking light at other angles.
  • the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays.
  • a semi-transparent angle-selective combiner can be integrated as part of an optical system of an augmented reality device, such as augmented reality device 182.
  • a semitransparent angle-selective combiner can be configured as a holographic optical element along with other optical elements, where the holographic optical element is configured as a diffractive optical element.
  • one or more optical combiner elements can be configured as an element on the surface of and/or in the lenses of the augmented reality glasses.
  • optical combiner elements can include one or more semi-transparent, reflective coatings adapted for implementation on the surface of and/or in the lenses of augmented reality glasses.
  • one or more spatial light modulators adapted for use in an augmented reality device can be configured to modulate phase, amplitude and/or polarization of an incident light beam.
  • one or more of the spatial light modulators can be adapted to modulate phase of an incident light beam.
  • one or more of the spatial light modulators can be adapted to modulate amplitude of an incident light beam.
  • one or more of the spatial light modulators can be adapted to provide both phase and amplitude modulation of an incident light beam.
  • one or more spatial light modulators can be adapted for use in an augmented reality device, such as the augmented reality devices disclosed herein, to function as a holographic display.
  • An example implementation of a holographic display system includes an input plane, corresponding to a plane at which the hologram pattern is displayed and an output plane, corresponding to a plane at which a hypothetical viewer’s eye box is located for viewing of images, virtual objects and/or virtual scenes.
  • a first lens group, with focal length f1, can be positioned between an input plane and an output plane, at a distance f1 from the input plane, where the first lens group can be adapted to convert a desired hologram pattern from the spatial domain to the spatial frequency domain, by converting spatial information at the input plane into frequency components at an intermediate plane, referred to as the Fourier plane.
  • a holographic display system can be configured to include a second lens group, with focal length f2, where the second lens group can be positioned between the input plane and the output plane at a distance f2 from the output plane.
  • the distance between the first lens group and the second lens group is equal to f1 + f2, whereby the intermediate plane, also referred to as the Fourier plane, is located between the first and second lens groups, at a distance f1 from the first lens group and a distance f2 from the second lens group.
  • the second lens group can be adapted to convert the frequency components of the hologram pattern at the intermediate plane back to spatial information at the output plane.
  • the first lens group and the second lens group can form a 4f optical system.
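  • A 4f system of this kind can be emulated numerically as two cascaded Fourier transforms with an optional mask applied at the intermediate (Fourier) plane. The sketch below is one possible Python/NumPy model; the centered square signal window is a hypothetical example, not a prescribed design.

      import numpy as np

      def relay_4f(field, fourier_filter=None):
          # First lens group (f1): spatial domain -> spatial frequency domain.
          spectrum = np.fft.fftshift(np.fft.fft2(field))
          # Optional filtering at the intermediate (Fourier) plane, e.g. a signal
          # window that blocks noise and the conjugate image.
          if fourier_filter is not None:
              spectrum = spectrum * fourier_filter
          # Second lens group (f2): back to the spatial domain at the output plane.
          return np.fft.ifft2(np.fft.ifftshift(spectrum))

      n = 512
      hologram = np.exp(1j * 2 * np.pi * np.random.rand(n, n))
      window = np.zeros((n, n))
      window[n // 4:3 * n // 4, n // 4:3 * n // 4] = 1.0  # keep only the signal region
      output = relay_4f(hologram, window)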
  • the one or more optical elements external to the optical module that are part of the second lens group of a 4f optical system can comprise one or more optical combiner elements, such as the optical combiner elements disclosed herein.
  • the one or more optical combiner elements as part of the second lens group of a 4f optical system can be configured as a holographic optical element or a meta-surface, adapted to the lenses of the augmented reality glasses.
  • one or more filter elements are adapted to the optical module for filtering out unwanted signal components, such as noise and/or the conjugate image.
  • one or more filter elements are placed at the intermediate plane of the 4f optical system, which is formed by the first lens group and the second lens group.
  • both the first and the second lens group may comprise one or a combination of optical elements, where the optical elements are not restricted to lenses only but can be any optical element including lenses and/or (free-form) mirrors.
  • an optical module, such as optical module 100 of FIG. 11, can be adapted to: 1) relay the image at the input plane, which corresponds to the plane where the hologram pattern is displayed, to another plane in space; 2) apply magnification or demagnification to the image at the input plane, in order for the image at the output plane to be magnified or demagnified respectively; and 3) filter out unwanted optical signals, including noise and/or the conjugate image.
  • an augmented reality device such as any of the augmented reality devices disclosed herein, can be adapted to provide reduction techniques for granular interference patterns (speckle) associated with the augmented reality device.
  • coherence associated with one or more illumination sources in an optical module can introduce granular interference patterns (speckle) that can degrade the relative quality of virtual media content for display.
  • an optical module of a given augmented reality device can be adapted to include one or more of a variety of techniques for mitigating the effects of speckle.
  • Example speckle reduction techniques include: modifying the statistical properties of laser illumination sources using depolarization techniques, such as polarization flipping, to introduce controlled amounts of spatial/temporal coherence, or employing multiple laser sources with different characteristics; modulating the frequency or wavelength of laser illumination sources (laser chirping) to decorrelate speckle sources at different wavelengths; providing random phase modulation to the optical system to disrupt the coherent nature of the laser illumination sources by using, for example, vibrating or rotating diffusers, or by employing spatial light modulators to introduce random phase variations; rapidly scanning one or more of the laser illumination sources or the holographic imager, so that different speckle patterns are sampled over a temporal range; adding diffusers in the optical path to scatter laser illumination sources in varying directions, reducing the coherence of the light and thus the visibility of speckle patterns; and providing a deformable mirror to introduce time-variable (temporal) random phase shifts in an illumination source output.
  • an array of color filters can be formed above a top surface of one or more spatial light modulators in an augmented reality device, such as augmented reality device 182, for example where the top surface of the one or more spatial light modulator comprises light modulating elements.
  • a color filter array comprises a set of subareas, with each subarea adapted to be one of: 1) transparent as to red light and blocking/absorptive as to green and blue light; 2) transparent as to green light and blocking/absorptive as to red and blue light; or 3) transparent as to blue light and blocking/absorptive as to red and green light.
  • a color filter array can be adapted for displaying multi-color virtual media content using a single spatial light modulator integrated circuit (IC).
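  • The subarea structure of such a color filter array can be modeled as three mutually exclusive pass masks over the modulator's light modulating elements. The sketch below assumes a simple column-stripe layout purely for illustration; the disclosure does not mandate any particular subarea arrangement.

      import numpy as np

      def rgb_stripe_masks(rows, cols):
          # Assign each modulator column to one color channel; a subarea passes its
          # own color and blocks/absorbs the other two.
          channel = np.arange(cols) % 3            # 0 = red, 1 = green, 2 = blue
          red = np.tile(channel == 0, (rows, 1))
          green = np.tile(channel == 1, (rows, 1))
          blue = np.tile(channel == 2, (rows, 1))
          return red, green, blue

      r, g, b = rgb_stripe_masks(8, 12)
      assert np.all(r ^ g ^ b)  # every element passes exactly one color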
  • the optical system associated with an augmented reality device can be configured to include a plurality of spatial light modulators, potentially implemented as integrated circuits, where each spatial light modulator can be associated with a separate color channel, such as the channels used in a red, green, blue (RGB) color model.
  • a first set of one or more spatial light modulators can be adapted to interact with red light only, a second set of one or more spatial light modulators can be adapted to interact with green light only and finally a third set of one or more spatial light modulator can be adapted to interact with blue light only, with a combined output from the first, second and third set of one or more spatial light modulators configured to generate multi-color virtual media content for display.
  • time-multiplexing is used to generate multi-color virtual media content for display, whereby one or more spatial light modulators are sequentially illuminated with, for example, red, green and blue light.
  • An example method for displaying virtual media content begins by receiving data representative of an image, a three dimensional (3D) object and/or a three dimensional (3D) scene, where one or more holographic processors, such as the holographic processors disclosed herein, are configured to compute hologram patterns based on received data and provide the computed hologram patterns to one or more spatial light modulators for rendering on the one or more spatial light modulators.
  • An example optical system such as an optical system for an augmented reality device, can be configured to apply techniques for minimizing and/or attenuating granular interference patterns (speckle) by at least one of modulating a wavelength of a laser illumination source, depolarizing the laser illumination source, randomly modulating a phase of the laser illumination source, or diffusing the laser illumination source.
  • FIG. 2 is a logic diagram of an example method for generating a hologram pattern.
  • the method begins at step 200, with an augmented reality device receiving data representing 2D media content for display.
  • the 2D media content can be intended for display at a predetermined depth plane in space.
  • data representing 2D media content for display can include information, such as metadata, indicating a desired display depth for to-be-displayed 2D media content.
  • a display depth for to-be-displayed 2D media content can be determined by an element associated with an augmented reality device.
  • a user can determine a desired display depth for to-be-displayed 2D media content.
  • the method continues at step 201, with random phase being applied to data representing 2D media content.
  • the method continues at step 202, by generating a hologram pattern for display at a substantially infinite depth.
  • the method then continues at step 204, where a mathematical lens function (holographic lens function) can be applied to the previously determined hologram pattern to change the relative display depth of a to-be-displayed image to a previously determined desired display depth.
  • the method then continues at step 206, with an aberrations correction function being applied to the hologram pattern, as changed at step 204, to correct for distortion(s) introduced by optical elements, such as optical elements associated with the optical module and/or optical elements located outside the optical module.
  • an aberrations correction function or another function can be used to correct for distortion(s) associated with a hypothetical user viewing the 2D media content, including but not limited to correcting distortion related to a refractive error (such as, for example, astigmatism) of a hypothetical viewer to accommodate vision correction.
  • the prescription correction of a user can be an input parameter to the method of FIG. 2 or can be an input parameter to a holographic processor in order to adapt the hologram pattern based on the prescription correction.
  • the mathematical lens function of step 204 and the aberrations correction function of step 206 can be combined in a single mathematical function and applied as a single step to the hologram pattern as provided by step 202.
  • the method then continues by applying a quantization method (e.g., error diffusion, mask-based dithering, or other) to the hologram pattern to produce a completed hologram pattern.
  • a completed hologram pattern can be rendered on one or more spatial light modulator devices, configured for displaying the 2D media content at the desired display depth in the focus space.
  • a holographic processor, such as any of the holographic processors disclosed herein, can be configured to execute the example method for generating a hologram pattern presented in FIG. 2, with the holographic processor adapted to an augmented reality device, such as augmented reality glasses.
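  • The FIG. 2 flow can be condensed into a short end-to-end sketch: random phase is applied to the 2D content, a hologram pattern is generated for infinite depth, a holographic lens function sets the display depth, and the result is quantized. The Python/NumPy version below assumes a Fourier-transform hologram, a quadratic lens phase, an identity aberration correction and binary phase quantization; these are illustrative choices, not requirements of the method.

      import numpy as np

      def quadratic_lens(shape, pitch, wavelength, focal_length):
          # Same quadratic lens phase as in the earlier sketch.
          y = (np.arange(shape[0]) - shape[0] / 2) * pitch
          x = (np.arange(shape[1]) - shape[1] / 2) * pitch
          xx, yy = np.meshgrid(x, y)
          return np.exp(-1j * np.pi * (xx**2 + yy**2) / (wavelength * focal_length))

      def generate_hologram(image, pitch, wavelength, depth):
          # Step 201: apply random phase to the 2D media content.
          field = image * np.exp(1j * 2 * np.pi * np.random.rand(*image.shape))
          # Step 202: hologram pattern for display at a substantially infinite depth.
          hologram = np.fft.ifft2(np.fft.ifftshift(field))
          # Step 204: holographic lens function moves the image to the desired depth.
          hologram = hologram * quadratic_lens(image.shape, pitch, wavelength, depth)
          # Step 206 (aberration correction) would multiply in a further phase term.
          # Quantization to binary phase levels {0, pi}, assumed for illustration.
          phase = np.angle(hologram)
          return np.where(np.abs(phase) < np.pi / 2, 0.0, np.pi)

      img = np.zeros((512, 512)); img[200:312, 200:312] = 1.0  # toy input image
      pattern = generate_hologram(img, pitch=1e-6, wavelength=532e-9, depth=1.5)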
  • An example method for displaying an image with a viewing device begins by receiving data representative of virtual media content and a predetermined focus depth for the to-be-displayed virtual media content. The method continues by generating a preliminary hologram pattern from the data for display at an infinite focus depth and applying a mathematical lens function (holographic lens function) to the preliminary hologram pattern, in order to move the depth of the to-be-displayed image from infinite depth to the predetermined focus depth.
  • the mathematical lens function can be adapted to also correct for distortion(s) introduced by optical elements and/or to also correct distortion related to a refractive error (such as, for example, astigmatism) of a hypothetical viewer to accommodate vision correction.
  • the hologram pattern can be quantized using a quantization technique, such as, for example, an error diffusion or mask-based dithering technique.
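  • Error diffusion, one of the quantization options named above, can be sketched as follows for a binary phase modulator. The Floyd-Steinberg weights and raster scan order used here are conventional choices and are assumptions, not specifics of the disclosure.

      import numpy as np

      def error_diffuse_binary_phase(phase):
          # Quantize a continuous phase map to the two levels {0, pi}, pushing the
          # residual quantization error onto unvisited neighbors (Floyd-Steinberg).
          work = np.array(phase, dtype=float)
          rows, cols = work.shape
          out = np.zeros_like(work)
          for r in range(rows):
              for c in range(cols):
                  v = work[r, c] % (2 * np.pi)
                  q = np.pi if np.pi / 2 <= v < 3 * np.pi / 2 else 0.0
                  err = (v - q + np.pi) % (2 * np.pi) - np.pi  # wrapped error
                  out[r, c] = q
                  if c + 1 < cols:
                      work[r, c + 1] += err * 7 / 16
                  if r + 1 < rows:
                      if c > 0:
                          work[r + 1, c - 1] += err * 3 / 16
                      work[r + 1, c] += err * 5 / 16
                      if c + 1 < cols:
                          work[r + 1, c + 1] += err * 1 / 16
          return out

      rng = np.random.default_rng(0)
      quantized = error_diffuse_binary_phase(rng.uniform(0, 2 * np.pi, (64, 64)))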
  • FIG. 3 illustrates the use of hologram replication in a holographic display system.
  • one or more spatial light modulator devices can be used as holographic display(s) in an optical module.
  • a spatial light modulator configured as holographic display can require a relatively large array of light modulating elements to provide acceptable and/or optimal resolution for viewing.
  • spatial light modulator devices used as holographic displays can require significantly more pixels (light modulating elements), as compared to a traditional two-dimensional (2D) display, because multiple pixels in the spatial light modulator (potentially configured as (part of) an integrated circuit) can be required for the creation of a single voxel, where a voxel refers to a volume element analogous to a pixel (picture element) in 2D images.
  • a holographic display system can require more than a 1:1 ratio of pixels (light modulating elements) to the relative resolution of to-be-displayed content, as compared to a traditional two-dimensional (2D) display system.
  • the pixel pitch for a spatial light modulator configured as holographic display can be adapted to be equal to or smaller than a wavelength of visible light, or equal to or smaller than half a wavelength of visible light.
  • an array of 16k × 16k light modulating elements can be implemented on a 4×4 mm² silicon area configured as a spatial light modulator.
  • adding a larger number of pixels (light modulating elements) in a spatial light modulator configured as a holographic display can significantly increase computation requirements for processing hologram patterns (interference patterns).
  • a holographic display system can be adapted to include a mode of operation wherein an array of light modulating elements of one or more spatial light modulator devices used as holographic displays can be divided to provide multiple subarrays of light modulating elements, enabling a same or similar hologram pattern to be rendered on each of the subarrays.
  • the use of subarrays can facilitate a reduction in computation requirements for the holographic display system, by computing a hologram pattern for a portion of the total number of pixels (light modulating elements), with the hologram pattern computed for that portion (a subarray) being replicated onto each of the multiple subarrays.
  • computing a hologram pattern for a subarray of the one or more spatial light modulators and replicating it to the other subarrays can reduce computational requirements for the larger array, while also increasing the contrast ratio of displayed virtual media content.
  • a viewing device, such as the augmented reality devices disclosed herein, can be adapted to enable a mode of operation where a hologram pattern for a single subarray of a multitude of subarrays of one or more spatial light modulators is computed by a holographic processor, with the hologram pattern, as calculated for the single subarray, then replicated over the remaining subarrays of the multitude of subarrays.
  • an array of light modulating elements for one or more spatial light modulators configured as holographic display(s) can be divided into four quarters or four subarrays.
  • a hologram pattern can be computed for only a quarter (one of four subarrays) of the array of light modulating elements, with the hologram pattern then displayed in each of the four quarters using the hologram pattern as computed for only a quarter.
  • replicating a hologram pattern, computed for a subarray of an array of light modulating element of one or more spatial light modulators configured as holographic display(s), over the entire array of light modulating elements of the one or more spatial light modulators configured as holographic display(s) can enable virtual media content with relatively lower resolution and relatively higher contrast ratio than computing a hologram pattern for a full array of light modulating elements, while facilitating a lower relative compute requirement.
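  • The four-subarray example above reduces to a simple tiling operation: the hologram pattern computed for one quarter of the light modulating elements is repeated over a 2 × 2 grid. A minimal Python/NumPy sketch, with illustrative array sizes:

      import numpy as np

      def replicate_quarter(sub_hologram):
          # Tile the pattern computed for one subarray over a 2 x 2 grid of
          # subarrays, filling the full modulator area; only a quarter of the
          # pattern ever has to be computed.
          return np.tile(sub_hologram, (2, 2))

      sub = np.exp(1j * 2 * np.pi * np.random.rand(256, 256))  # quarter-size pattern
      full = replicate_quarter(sub)
      assert full.shape == (512, 512)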
  • FIG. 4 illustrates a three-dimensional optical lens assembly that includes a first lens group (lens group 310) and a second lens group (lens group 302) in an optical relay system 300.
  • an optical relay system can use a set of optical components (such as, for example, lenses, free form mirrors, etc.) to relay an optical signal from one point to another with minimal distortion or loss of quality.
  • a 4f optical relay system includes two lens groups, a first with focal length f1 (lens group 310) and a second with focal length f2 (lens group 302), where the first and second lens groups are separated from each other by a distance of (approximately) f1 + f2, where f1 is the focal length of lens group 310 and f2 is the focal length of lens group 302.
  • the first lens group (lens group 310) and the second lens group (lens group 302) of a 4f optical relay system can comprise different optical components, including but not limited to, lenses and free form mirrors.
  • optical relay system 300 includes an input plane where a hologram pattern 308 is located.
  • lens group 310 is positioned at a distance of (approximately) f1 from the plane of hologram pattern 308.
  • lens group 310 “performs” a first Fourier transform, converting hologram pattern 308 from spatial domain to the spatial frequency domain.
  • the space between lens group 310 and lens group 302 includes an intermediate plane, such as a Fourier plane 304 where the spatial frequencies of the converted hologram pattern 308 are represented.
  • an intermediate plane can be located in between the first lens group (lens group 310) and the second lens group (lens group 302), at a distance f1 from the first lens group (lens group 310) and a distance f2 from the second lens group (lens group 302).
  • lens group 302, placed (approximately) a distance f2 from the intermediate plane, such as Fourier plane 304, “performs” a second Fourier transform, converting the signal from the spatial frequency domain back to the spatial domain.
  • the converted signal is formed at an output plane (for example, a plane where an eyebox, such as eyebox 306, can be located).
  • a spatial filter can be placed in a 4f optical relay system, such as optical lens assembly 300, at a plane different from the intermediate plane (Fourier plane 304).
  • a 4nf optical relay system includes the formation of an intermediate plane, such as Fourier plane 304, where noise, such as quantization noise, can be placed in a region outside a desired signal window and filtered out, and/or where a conjugate image can be filtered out.
  • the 4nf system is illustrated with n equal to 1.
  • FIG. 5A is a logic diagram of an example method for displaying three-dimensional (3D) objects and/or three-dimensional scenes with a plurality of selectable depth planes in a focus space.
  • the method begins at step 320, by receiving 3D object and/or 3D scene data, further referred to as 3D data, for display in a focus space and continues at step 322, by forming a set of two-dimensional (2D) layers from the 3D data.
  • a 3D object or a 3D scene can be decomposed into a set of 2D layers, with the 2D layers being parallel to each other, by slicing up the 3D object or 3D scene at regular or irregular intervals, each slice representing a cross section of the 3D object or 3D scene; the slices forming a set of 2D layers.
  • the method then continues at step 324, by applying random phase to each 2D layer of the set of 2D layers, representing the 3D data, and then at step 326, by generating a hologram pattern for each 2D layer of the set of 2D layers for display at infinite depth.
  • a hologram pattern can be generated for each 2D layer of the set of 2D layers using a Fourier Transform function on the data, representing the 2D layer.
  • the method continues at step 328 by applying a mathematical lens function (holographic lens function) to the hologram pattern of each 2D layer, as computed in step 326, to convert each 2D layer of the set of 2D layers from infinity to a desired depth in the focus space and then at step 330, by aggregating converted hologram patterns for the 2D layers of the set of 2D layers into a single hologram pattern.
  • at step 332, the method applies an optical aberrations correction function to the single hologram pattern to correct for distortion(s) introduced by optical elements configured as part of an optical module or that are located outside an optical module.
  • the optical aberrations correction function can be adapted to correct distortion related to a refractive error (such as, for example, astigmatism) of a hypothetical viewer to accommodate for vision correction.
  • the prescription correction of a user can be an input parameter to the method of FIG. 5A, or can be an input parameter to a holographic processor in order to adapt the hologram pattern based on the prescription correction.
  • the mathematical lens function applied to each of the 2D layers of the set of 2D layers can be adapted to also correct for optical distortion(s) introduced by optical components and to accommodate for vision correction, allowing step 332 to be eliminated.
  • the method continues at step 334, by applying a quantization method, such as, for example, error diffusion or mask-based quantization, to the single hologram pattern.
  • the method renders a completed hologram pattern on a spatial light modulator, where the spatial light modulator can be adapted to display the 3D object and/or the 3D scene, associated with the 3D data, in a visually perceptible form, based on the completed hologram pattern.
  • when data to be displayed comprises 2D images at predetermined depth planes in a focus space, the method of FIG. 5A can be used, where step 322 can be eliminated.
  • a holographic processor such as the holographic processors disclosed herein, can be configured to execute the method of FIG. 5A.
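  • Steps 324 through 330 of FIG. 5A can be condensed into a per-layer loop whose results are summed into a single hologram pattern. The sketch below assumes a Fourier-transform hologram per layer and a quadratic lens phase per depth, with quantization (step 334) handled as in the earlier error-diffusion sketch; all parameter values are illustrative.

      import numpy as np

      def quadratic_lens(shape, pitch, wavelength, focal_length):
          # Quadratic lens phase, as in the earlier sketches.
          y = (np.arange(shape[0]) - shape[0] / 2) * pitch
          x = (np.arange(shape[1]) - shape[1] / 2) * pitch
          xx, yy = np.meshgrid(x, y)
          return np.exp(-1j * np.pi * (xx**2 + yy**2) / (wavelength * focal_length))

      def multiplane_hologram(layers, depths, pitch=1e-6, wavelength=532e-9):
          # Steps 324-330: random phase per 2D layer, hologram at infinite depth,
          # holographic lens function per target depth, then aggregation.
          total = np.zeros(layers[0].shape, dtype=complex)
          for layer, depth in zip(layers, depths):
              field = layer * np.exp(1j * 2 * np.pi * np.random.rand(*layer.shape))
              h_inf = np.fft.ifft2(np.fft.ifftshift(field))  # step 326
              total += h_inf * quadratic_lens(layer.shape, pitch, wavelength, depth)
          return np.angle(total)  # phase pattern prior to quantization (step 334)

      layers = [np.random.rand(256, 256) for _ in range(3)]  # 2D slices of 3D data
      pattern = multiplane_hologram(layers, depths=[0.5, 1.0, 2.0])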
  • FIG. 5B is a logic diagram of an example method for displaying a two-dimensional (2D) image at a selectable depth plane in a focus space.
  • the method begins at step 340, by receiving data, such as data that includes color information for each pixel in an image and a desired display depth for the image (RGBD data), representative of a two-dimensional (2D) image for display at a predetermined depth in a focus space, and continues at step 341, by applying random phase to the data.
  • the method can generate a hologram pattern from the data as if the 2D image is displayed at infinite depth, with the size of the hologram pattern corresponding to a subarea of a spatial light modulator used to render hologram patterns.
  • the method continues at step 344 by replicating the hologram pattern, generated for a subarea of the spatial light modulator, onto the full area of the spatial light modulator, and continues at step 346 by applying a mathematical lens function (holographic lens function) to convert the image-to-be-displayed from infinite depth to a desired predetermined depth in the focus space.
  • at step 348, the method corrects the hologram pattern for distortion(s) introduced by optical elements, configured as part of an associated optical module or located outside the optical module, using an aberration correction function.
  • an aberration correction function can be adapted to accommodate for vision correction associated with a particular hypothetical user viewing the two-dimensional (2D) image.
  • the prescription correction of a user can be an input parameter to the method of FIG. 5B, or can be an input parameter to a holographic processor in order to adapt the hologram pattern based on the prescription correction.
  • the mathematical lens function and the aberrations correction function are implemented as one function executed in a single step.
  • the method applies a quantization method, such as, for example, error diffusion or mask-based quantization, to the hologram pattern of step 348.
  • the method renders the completed hologram pattern on a spatial light modulator, where the spatial light modulator can be adapted to display the two-dimensional (2D) image at a predetermined depth in the focus space in a visually perceptible form, based on the completed hologram pattern.
  • a holographic processor such as the holographic processors disclosed herein, can be configured to execute the method of FIG. 5B.
  • FIG. 5C is a logic diagram of an example method for displaying three-dimensional (3D) objects and/or three-dimensional (3D) scenes with a plurality of selectable depth planes in a focus space.
  • the method begins at step 360, by receiving 3D object and/or 3D scene data, further referred to as 3D data, for display in the focus space and continues at step 362, by forming a set of two-dimensional (2D) layers from the 3D data.
  • a 3D object or a 3D scene can be decomposed into a set of 2D parallel layers, by slicing up the 3D object or 3D scene at regular or irregular intervals, each slice representing a cross section of the 3D object or 3D scene, the slices forming the set of 2D layers.
  • the method then continues at step 364, by applying random phase to each 2D layer of the set of 2D layers, representing the 3D data, and then at step 366, by generating a hologram pattern for each 2D layer of the set of 2D layers for display at infinite depth, with the size of each hologram pattern corresponding to a subarea of a spatial light modulator.
  • the method continues at step 368 by replicating the hologram pattern for each 2D layer of the set of 2D layers, with a size equal to a subarea of a spatial light modulator, into the full area of the spatial light modulator to provide a set of second hologram patterns and then at step 370 by applying a mathematical lens function (holographic lens function) to each second hologram pattern of the set of second hologram patterns to generate a set of third hologram patterns with each third hologram pattern layer associated with a different desired depth.
  • the method continues at step 372 by aggregating the set of third hologram patterns into a single aggregated hologram pattern and at step 374, where the method corrects the aggregated hologram pattern for distortion(s) introduced by optical elements configured as part of an optical module, or by optical elements located outside an optical module, using an optical aberration correction function.
  • the optical aberration correction function can be adapted to correct for a refractive error (such as, for example, astigmatism) to accommodate vision correction of the viewer.
  • the prescription correction of a user can be an input parameter to the method of FIG. 5C, or can be an input parameter to a holographic processor in order to adapt a hologram pattern based on the prescription correction.
  • the mathematical lens function applied to each of the 2D layers of the set of 2D layers can be adapted to correct for optical distortions introduced by optical components and to accommodate for vision correction, eliminating step 374.
  • the method completes the hologram pattern using a quantization method, such as, for example, error diffusion or mask-based quantization, and at step 378, the method renders the completed hologram pattern on a spatial light modulator, where the spatial light modulator can be adapted to display virtual media content in a visually perceptible form, based on the completed hologram pattern.
  • a holographic processor such as the holographic processors disclosed herein, can be configured to execute the method of FIG. 5C.
  • FIG. 6A is a logic diagram of an example method for executing a visual search in an augmented reality system, such as the augmented reality devices disclosed herein.
  • the method begins at step 400, with the augmented reality system receiving a visual search request.
  • the visual search request can be received from a user.
  • the visual search request can be received from one or more of a third party, from metadata associated with the augmented reality system or from another source.
  • the method continues with the augmented reality system capturing a scene/environment.
  • the augmented reality system can include one or more front-facing cameras (i.e. the cameras capture a user’s view of the scene) to capture the scene.
  • the augmented reality system segments the scene into distinct elements and/or objects.
  • the augmented reality system includes one or more computer vision algorithms adapted to segment the captured scene image into elements and/or objects.
  • one or more scene images captured by the one or more front-facing cameras can be transmitted over a wireless link, such as, for example, over Bluetooth or Wi-Fi, to another (mobile) electronic device such as a smartphone, smart watch or laptop, enabling an external computer vision algorithm to be used to segment scene images into distinct elements and/or objects.
  • segmented scene images can be transmitted back for use in the augmented reality system.
  • the method continues at step 406, where the augmented reality system determines a depth for one or more objects of interest in the scene.
  • the augmented reality system can be implemented with one or more depth sensors adapted to capture a depth or a distance of the object of interest relative to a user of the augmented reality system.
  • an augmented reality system includes one or more gaze or eye tracking sensors adapted to determine a direction of a hypothetical user’s gaze, in order to define the object of interest.
  • an augmented reality system can be implemented without gaze or eye tracking.
  • an object in the center of the field of view of a user will be determined to be the object of interest.
  • the augmented reality system outlines the one or more objects of interest with an augmented reality overlay.
  • the augmented reality overlay can be adapted for display at the same depth as the one or more objects of interest.
  • the user, or the augmented reality system, determines whether the object of interest outlined in step 408 is the desired object of interest, and when the desired object of interest has been identified, the method continues at step 412, with object related sensor data being transmitted to a search engine.
  • the user can indicate by one or more of a gesture, an audible sound, toggling a button, a touch or swipe on a touch screen, and/or a keystroke on a keypad that the desired object of interest has been identified.
  • the search engine can be located in another location, with the sensor data transmitted using a wireless link, such as, for example, Bluetooth or Wi-Fi.
  • the search engine can be in a mobile device located relatively close to the augmented reality system.
  • the search engine can comprise an artificial intelligence engine implemented as part of the augmented reality system, implemented in a mobile device or implemented at a third party.
  • the augmented reality system can be connected to another (mobile) electronic device, such as a smartphone, smart watch, mobile computer or desktop computer, with a cable, over which data can be transmitted between the augmented reality system and the (mobile) electronic device.
  • reverse image search results are received.
  • the reverse image search results are received at the augmented reality system.
  • the reverse image search results are received at a mobile device located in close proximity to the augmented reality system.
  • reverse image search results can include one or more of contextual information for use by the user, metadata or a Uniform Resource Locator (URL) relating to the desired object of interest.
  • the augmented reality system displays the results using an augmented reality overlay at the sensed depth of the desired object of interest.
  • contextual information can be provided with the augmented reality overlay at the sensed depth of the desired object of interest, so that both the desired object of interest and the contextual information are in focus.
  • Another example method begins by receiving a search request, capturing a scene to provide a captured scene and segmenting the captured scene into a plurality of elements and/or objects. The method continues by determining a depth for an object in the scene and outlining the object with an overlay. The method then continues by determining if the object is an object of interest and in response to a determination that the object is the object of interest, transmitting sensor data associated with the scene to a third party for a reverse image search. Finally, the method continues by receiving, from the third party, search results and displaying information representative of the results using an overlay at the depth of the object in the focus space.
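  • The control flow of the FIG. 6A method can be sketched as a single function whose collaborators are injected. Every helper below (capture, segment, sense_depth and so on) is a hypothetical placeholder standing in for the cameras, computer vision algorithms, depth sensors and search engine described above; none of these names come from the disclosure.

      def visual_search(capture, segment, sense_depth, confirm, reverse_search, overlay):
          # Following FIG. 6A: capture and segment the scene, pick a candidate
          # object, sense its depth (step 406), outline it at that depth (step 408)
          # and, on confirmation, transmit it for a reverse image search (step 412).
          scene = capture()
          objects = segment(scene)
          candidate = objects[len(objects) // 2]  # center-of-view heuristic
          depth = sense_depth(candidate)
          overlay("outline:" + candidate, depth)
          if confirm(candidate):
              results = reverse_search(candidate)
              overlay(results, depth)  # results stay in focus with the object

      # Toy stand-ins so the flow can be exercised end to end:
      visual_search(
          capture=lambda: "scene",
          segment=lambda s: ["tree", "car", "sign"],
          sense_depth=lambda o: 3.0,
          confirm=lambda o: True,
          reverse_search=lambda o: "results for " + o,
          overlay=lambda info, depth: print(info, "@", depth, "m"),
      )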
  • FIG. 6B is a logic diagram of an example method for executing a visual search in an augmented reality system, such as the augmented reality devices disclosed herein.
  • the method begins at step 500, with the augmented reality system receiving a visual search request.
  • the visual search request can be received from a user.
  • the visual search request can be received from one or more of a third party, from metadata associated with the augmented reality system or from another source.
  • the method continues with the augmented reality system capturing a scene/environment.
  • the augmented reality system can include one or more front-facing cameras (i.e. the cameras capture a user’s view of the scene) to capture the scene.
  • the augmented reality system segments the scene into distinct elements and/or objects.
  • the augmented reality system includes one or more computer vision algorithms adapted to segment the captured scene image into elements and/or objects.
  • one or more scene images captured by the one or more front-facing cameras can be transmitted over a wireless link, such as, for example, over Bluetooth or Wi-Fi, to another (mobile) device such as a smartphone, smart watch or laptop, enabling an external computer vision algorithm to be used to segment scene images into distinct elements and/or objects.
  • segmented scene images can be transmitted back for use in the augmented reality system.
  • the augmented reality system determines what a user is looking at.
  • the augmented reality system includes one or more eye or gaze tracking sensors, along with eye or gaze tracking algorithms adapted to determine where a user is looking and/or what the user is looking at.
  • the method continues at step 508, where the augmented reality system determines a depth for one or more objects of interest.
  • the augmented reality system can be implemented with one or more depth sensors adapted to capture a depth or a distance of the object of interest relative to a user.
  • the augmented reality system outlines the one or more objects of interest with an augmented reality overlay.
  • the augmented reality overlay can be adapted for display at the same depth as the one or more objects of interest.
  • an augmented reality system can be configured to include one or more gaze or eye tracking sensors adapted for determining a direction of a user’s gaze, in order to define the object of interest.
  • the augmented reality system can be implemented without any gaze or eye tracking.
  • an object in the center of the field of view of a user will be determined to be the object of interest.
  • the user determines whether the object of interest outlined in step 518 is the desired object of interest and when the desired object of interest has been identified, the method continues at step 512, with object related sensor data being transmitted to a search engine.
  • the user can indicate by one or more of a gesture, an audible sound, toggling a button, a touch or swipe on a touch screen, and/or a keystroke on a keypad that the desired object of interest has been identified.
  • the search engine can be located in another location, with the sensor data transmitted using a wireless link, such as, for example, Bluetooth or Wi-Fi.
  • the search engine can be in a mobile device located relatively close to the augmented reality system.
  • the search engine can comprise an artificial intelligence engine implemented as part of the augmented reality system, implemented in a mobile device or implemented at a third party.
  • the augmented reality system can be connected to another (mobile) electronic device, such as a smartphone, smart watch, mobile computer or desktop computer, with a cable, over which data can be transmitted between the augmented reality system and the (mobile) electronic device.
  • reverse image search results are received.
  • the reverse image search results are received at the augmented reality system.
  • the reverse image search results are received at a mobile device located in close proximity to the augmented reality system.
  • reverse image search results can include one or more of contextual information for use by the user, metadata or a Uniform Resource Locator (URL) relating to the desired object of interest.
  • the augmented reality system displays the results using an augmented reality overlay at the sensed depth of the desired object of interest.
  • contextual information can be provided with the augmented reality overlay at the sensed depth of the desired object of interest, so that both the desired object of interest and the contextual information are in focus.
  • Another example method begins by receiving a search request, capturing a scene to provide a captured scene and segmenting the captured scene into a plurality of elements and/or objects. The method continues by identifying an object, wherein the identifying can be based on information representative of tracking a user’s gaze and/or eyes. The method then continues by determining if the object is an object of interest and in response to a determination that the object is the object of interest, transmitting sensor data associated with the scene to a third party for a reverse image search. Finally, the method continues by receiving, from the third party, search results and displaying information representative of the results using an overlay at the depth for the object in the focus space.
  • FIG. 7 is a schematic block diagram of an embodiment of a system for implementing augmented reality (augmented reality system 440) that includes an optical module 424, a holographic processor 438 with associated memory, an application processor 436, a radio transceiver 428, a camera input 430 and a depth sensor 432, along with a power management unit 422 and one or more batteries 426 that together comprise power unit 420.
  • holographic processor 438 and one or more spatial light modulator devices 442 can be adapted to compute and/or generate hologram patterns for projection and/or display on an associated augmented reality device, such as augmented reality system 440.
  • Optical module 424 is described in greater detail above. (See, for example, optical module 100 with reference to FIG. 11.)
  • sensors in addition to the sensors provided above can be adapted for, and/or integrated in, an augmented reality device, such as augmented reality system 440.
  • an augmented reality device such as augmented reality system 440, can be adapted to provide inputs for an additional sensor.
  • additional sensors include, but are not limited to, one or more of: a camera for capturing the environment around the user;
  • one or more haptic sensors are provided.
  • one or more Global Positioning System (GPS) receivers;
  • one or more magnetometers to provide compass functionality for navigation; one or more pedometers;
  • one or more ambient light sensors are provided.
  • one or more thermometers and/or temperature sensors are provided.
  • one or more barometer sensors and/or altimeter sensors are provided.
  • sensor data collected by an associated augmented reality device can be used for various functions including, but not limited to, image sensing, depth sensing, audio capture, user attention tracking (such as eye tracking), determining geographic location and assessing local environmental factors (as determinable based on any of the sensors included above).
  • simultaneous localization and mapping can be used to construct and update a map of a user’s environment, while simultaneously tracking a user’s location within the environment, thereby providing yet another source of sensor information.
  • optical module 424 can be implemented with light source 444, where light source 444 can include one or more individual light sources adapted to provide illumination for the spatial light modulator(s) 442.
  • light source 444 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model.
  • light source 444 can be implemented to cover a substantially full spectrum of wavelengths as, for example, a single white light emitter.
  • holographic processor 438 can be configured as one or more compute elements adapted to execute processor functions for computing diffraction (hologram) patterns for display on one or more spatial light modulators embedded in an associated augmented reality device, such as augmented reality system 440.
  • diffraction patterns can include digital media content adapted for observation by a user of augmented reality system 440.
  • hologram patterns for display/projection can be generated based on generally understood functions, such as Computer Generated Hologram algorithms based on one or more of a Fresnel Transform-Based model, an Angular Spectrum model, a Point Source method, a Gerchberg-Saxton Algorithm, an Iterative Fourier Transform Algorithm (IFTA) or a Wavefront Propagation algorithm (such as, for example, wavelet transform, wavefront recording and reconstruction and kinoform). Additional methods can include using random-phase encoding/decoding techniques to represent complex scenes and/or deep learning techniques, such as convolutional neural networks.
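Of the algorithms listed above, the Gerchberg-Saxton iteration is compact enough to sketch. The following is a minimal NumPy illustration, assuming a phase-only modulator and far-field (single FFT) propagation; the target image, resolution and iteration count are arbitrary choices, not values from this disclosure.

    import numpy as np

    def gerchberg_saxton(target_amplitude, iterations=50):
        """Estimate a phase-only hologram whose far-field reconstruction
        approximates target_amplitude (Gerchberg-Saxton iteration)."""
        rng = np.random.default_rng(0)
        # Random initial phase in the hologram plane, unit amplitude.
        field = np.exp(1j * 2 * np.pi * rng.random(target_amplitude.shape))
        for _ in range(iterations):
            # Hologram plane -> image plane (far field modeled by an FFT).
            image = np.fft.fft2(field)
            # Keep the propagated phase, impose the desired amplitude.
            image = target_amplitude * np.exp(1j * np.angle(image))
            # Image plane -> hologram plane; re-impose the phase-only constraint.
            field = np.exp(1j * np.angle(np.fft.ifft2(image)))
        return np.angle(field)  # phase pattern to render on the SLM

    # Illustrative usage: a 256 x 256 target with a bright square.
    target = np.zeros((256, 256))
    target[96:160, 96:160] = 1.0
    phase_hologram = gerchberg_saxton(target)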
  • Radio transceiver 428 can include one or more receiver units and/or transmitter units configured to enable an exchange of information between an augmented reality system, such as augmented reality system 440, and a wide area network, a local area network and/or a mobile electronic device, such as a smartphone, smart watch, mobile computer or desktop computer.
  • power unit 420 can be provided with a power management unit 422 comprising one or more power management integrated circuits (PMICs), to manage and control power use by the separate elements of augmented reality system 440 to provide efficient power consumption and performance.
  • augmented reality system 440 can include one or more memory devices, with each memory device being comprised of, for example one or more of a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner.
  • augmented reality system 440 can include one or more control modules or control units configured for driving and/or synchronizing various functions associated with augmented reality system 440.
  • the one or more control units can be implemented using one or more application processing devices.
  • optical module 424 can include one or more optical pre-path(s) and/or one or more optical post-path(s).
  • optical pre-path(s) and/or optical post-path(s), together with light source 444 and spatial light modulator(s) 442 comprise optical module 424.
  • various components included in augmented reality system 440 can be implemented as individual components on one or more printed circuit boards.
  • the power management unit 422, the control unit, holographic processor 438 and radio transceiver 428 can be implemented on a single integrated circuit, or consolidated on a plurality of integrated circuits.
  • the various components of augmented reality system 440 can be adapted for implementation in the frame of an augmented reality device, such as in the temple (or temples) of the augmented reality device.
  • An example optical module for displaying images includes a first interface for interfacing with a network, a depth sensor, a second interface for interfacing with a camera, a light source, a spatial light modulator, a holographic processing module, a combiner mirror, memory; and a processing module operably coupled to the interface and to the memory.
  • the processing module can be operable to receive an image of a scene via the second interface, determine a focus depth for an object using the depth sensor and transmit, via the first interface, information representative of the scene and the focus depth to a third party.
  • the processing module can be further operable to receive information representative of the object from the third party and provide the information representative of the object to the holographic processing module, where the holographic processing module can be operable to generate a hologram pattern based on the information representative of the object and display the hologram pattern on the spatial light modulator, where the spatial light modulator can be configured to display digital media content based on the hologram pattern using the combiner mirror.
  • the combiner mirror can be replaced by another semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles.
  • the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays.
  • final optical elements can include a semi-transparent, reflective coating adapted as part of the lenses of augmented reality glasses.
  • FIG. 8 is a schematic block diagram of an embodiment of a system for implementing augmented reality.
  • augmented reality system 520 includes an optical module 526, a holographic processor 548, with associated memory, application processor 530, an artificial intelligence engine 534, a radio transceiver 532, camera input 550 and depth sensor 552, along with a power management unit 524 and one or more batteries 528 that together comprise power unit 522.
  • optical module 526 can include one or more spatial light modulator devices 542 that together are configured to project and/or display digital media content in an associated augmented reality device, such as augmented reality system 520, from hologram patterns displayed and/or rendered using one or more spatial light modulation elements.
  • Optical module 526 is described in greater detail herein. (See, for example, optical module 100 with reference to FIG. 11.)
  • an associated augmented reality device such as augmented reality system 520, can be configured to include additional sensors and/or inputs for additional sensors.
  • additional sensors include, but are not limited to one or more of:
  • a camera for capturing the environment around the user
  • one or more haptic sensors are provided.
  • one or more Global Positioning System (GPS) receivers;
  • one or more magnetometers to provide compass functionality for navigation; one or more pedometers;
  • one or more thermometers and/or temperature sensors are provided.
  • sensors associated with an associated augmented reality device can be adapted to capture data for subsequent processing by an artificial intelligence engine, such as a neural network processor and/or an inference engine.
  • an artificial intelligence engine can be trained for processing sensor data collected by an augmented reality device, such as augmented reality system 520, using the one or more of the sensors listed above.
  • an artificial intelligence engine can be embedded in an associated augmented reality device, such as augmented reality system 520.
  • all or part of the sensor data collected by an augmented reality device can be transmitted over a wireless link via the World Wide Web for processing by a remote artificial intelligence engine.
  • classification results from remote artificial intelligence processing can be transmitted back to an augmented reality device and used, for example, for the display of contextual information to the user.
  • contextual information can include an augmented reality overlay presenting one or more relevant objects to a user.
  • processed sensor data associated with an augmented reality device such as augmented reality system 520, can be used to provide additional contextual information for an image or object search.
  • sensor data from a plurality of augmented reality devices can be used to train an artificial intelligence engine to generate a trained (neural network) model.
  • the trained model can be packaged for transfer back to an augmented reality device, such as augmented reality system 520 using, for example, a standardized format, such as a TensorFlow SavedModel or ONNX (Open Neural Network Exchange).
  • trained model parameters can be serialized into a file or a set of files, using a format such as HDF5, JSON, or a custom binary format, with the artificial intelligence engine in an augmented reality device, such as augmented reality system 520, then used to deserialize and load the model for inference use by the augmented reality device.
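As a toy illustration of the serialize/deserialize round trip described above, the sketch below stores parameters in NumPy's .npz container with a small JSON metadata file; the layer names and file paths are invented for the example, and a production system would more likely use the SavedModel, ONNX or HDF5 tooling named above.

    import json
    import numpy as np

    def serialize_model(params, weight_path, meta_path):
        """Store trained parameters (name -> ndarray) plus JSON metadata."""
        np.savez(weight_path, **params)
        with open(meta_path, "w") as f:
            json.dump({"format": "npz", "tensors": sorted(params)}, f)

    def deserialize_model(weight_path):
        """Load parameters back for inference on the device."""
        with np.load(weight_path) as data:
            return {name: data[name] for name in data.files}

    # Illustrative round trip with made-up layer names.
    params = {"conv1_w": np.zeros((3, 3, 8)), "conv1_b": np.zeros(8)}
    serialize_model(params, "model.npz", "model.json")
    restored = deserialize_model("model.npz")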
  • a trained (neural network) model can include training from a variety of sources (in addition to augmented reality devices). Examples of other sources include data sets from almost any relevant resource.
  • a plurality of augmented reality systems can together be used as a crowd source trained model.
  • the system of FIG. 8 can be used to provide additional functions, such as segmenting elements associated with a particular environment and encoding these elements as descriptors for parameterization with a goal of reducing the relative size of sensor data stores.
  • optical module 526 can be implemented with light source (or light sources) 544, where light source 544 can include one or more individual light sources adapted to provide illumination for the spatial light modulator(s) 542.
  • light source 544 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model.
  • light source 544 can be implemented to cover a substantially full spectrum of wavelengths as, for example, a single white light emitter.
  • holographic processor 548 can be configured as one or more compute elements adapted to execute processor functions for computing diffraction patterns for display and/or projection by one or more spatial light modulators 542 embedded in an augmented reality device, such as augmented reality system 520.
  • diffraction patterns can include hologram patterns adapted to display and/or project digital media content for observation by a user of augmented reality system 520.
  • hologram patterns for display/projection can be generated based on generally understood functions, such as Computer Generated Hologram algorithms based on one or more of a Fresnel Transform-Based model, an Angular Spectrum model, a Point Source method, a Gerchberg-Saxton Algorithm, an Iterative Fourier Transform Algorithm (IFTA) or a Wavefront Propagation algorithm (such as, for example, wavelet transform, wavefront recording and reconstruction and kinoform). Additional methods can include using random-phase encoding/decoding techniques to represent complex scenes and/or deep learning techniques, such as convolutional neural networks.
  • Radio transceiver 532 can include one or more receiver units and/or transmitter units configured to enable an exchange of information between an augmented reality system, such as augmented reality system 520 and a wide area network, a local area network and/or a mobile electronic device, such as a smartphone, smart watch, mobile computer or desktop computer.
  • power unit 522 can be provided with a power management unit 524 comprising one or more power management integrated circuits (PMICs), to manage and control power use by the separate elements of augmented reality system 520, in order to provide efficient power consumption and performance.
  • augmented reality system 520 can include one or more memory devices, with each memory device being comprised of, for example one or more of a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner.
  • augmented reality system 520 can include one or more control modules or control units configured for driving and/or synchronizing various functions associated with augmented reality system 520.
  • the one or more control units can be implemented using one or more application processing devices.
  • optical module 526 can include one or more optical pre-path(s) and/or one or more optical post-path(s).
  • optical pre-path(s) and/or optical post-path(s), together with light source(s) 544 and spatial light modulator(s) 542 comprise optical module 526.
  • various components included in augmented reality system 520 can be implemented as individual components on one or more printed circuit boards.
  • the power unit 522, the control unit, holographic processor 548, memory and radio transceiver 532 can be implemented on a single integrated circuit or consolidated on a plurality of integrated circuits.
  • the various components of augmented reality system 520 can be adapted for implementation in the frame of an augmented reality device, such as in the temple (or temples) of the augmented reality device.
  • integrating some or all of the integrated circuits/components of augmented reality system 520 on a single integrated circuit can reduce the footprint of augmented reality system 520, enabling compact implementations, while increasing performance and/or power efficiency.
  • integrating some or all of the components of augmented reality system 520 on a single integrated circuit can enable reduced manufacturing costs for augmented reality system 520, while enabling lower cost for an associated augmented reality viewing device.
  • An example optical module for displaying images includes a first interface for interfacing with a network, one or more sensors, a second interface for interfacing with a camera, a light source, a spatial light modulator, a holographic processing module, an artificial intelligence engine, a combiner mirror, memory; and a processing module operably coupled to the interface and to the memory.
  • the processing module can be operable to receive an image of a scene via the second interface, receive sensor information from the one or more sensors and classify, by the artificial intelligence engine the scene based on the image of the scene and the sensor information to provide a classified result.
  • the artificial intelligence engine can be operable to provide the classified result to the holographic processing module, where the holographic processing module can be operable to generate a hologram pattern based on the information representative of the object and display the hologram pattern on the spatial light modulator.
  • the spatial light modulator can be configured to display digital media content based on the hologram pattern using the combiner mirror.
  • the combiner mirror can be replaced by another semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles.
  • the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays.
  • final optical elements can include a semitransparent, reflective coating adapted as part of the lenses of augmented reality glasses.
  • hologram patterns can be represented in a binary format using a binarization process.
  • hologram patterns can be computed using computer algorithms, such as Computer-Generated Holography (CGH) algorithms.
  • hologram patterns can be adapted to align with a number of available optical states provided by light modulating elements of a spatial light modulator using a quantization method, enabling mapping of hologram patterns to a given spatial light modulator.
  • the number of available optical states provided by light modulating elements of a spatial light modulator can be finite.
  • light modulating elements of a spatial light modulator can adopt two or more optical states, each state interacting differently with incident light.
  • error diffusion can be used as a quantization method that distributes the quantization error for each pixel of a holographic display to neighboring pixels, to minimize the impact of the quantization error on the visual quality of the display.
  • the holographic display can be implemented using one or more spatial light modulators. Error diffusion algorithm types include, but are not limited to, the Floyd-Steinberg algorithm, the Jarvis-Judice-Ninke algorithm and the Stucki algorithm, each of which defines a specific pattern for distributing the error to neighboring pixels of a holographic display.
  • a holographic processor such as the holographic processors disclosed herein, can be adapted to execute an error diffusion algorithm on hologram patterns.
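A minimal sketch of Floyd-Steinberg error diffusion, one of the algorithms named above, applied to a hologram pattern normalized to [-1, 1] and quantized to the two states -1 and +1; the scan order and target states are assumptions for illustration.

    import numpy as np

    def floyd_steinberg_binarize(pattern):
        """Quantize values in [-1, 1] to {-1, +1}, diffusing each pixel's
        quantization error to not-yet-visited neighbors (Floyd-Steinberg)."""
        p = pattern.astype(np.float64).copy()
        h, w = p.shape
        out = np.empty_like(p)
        for y in range(h):
            for x in range(w):
                old = p[y, x]
                new = 1.0 if old >= 0.0 else -1.0
                out[y, x] = new
                err = old - new
                # Classic weights: 7/16 right; 3/16, 5/16, 1/16 on the next row.
                if x + 1 < w:
                    p[y, x + 1] += err * 7 / 16
                if y + 1 < h:
                    if x > 0:
                        p[y + 1, x - 1] += err * 3 / 16
                    p[y + 1, x] += err * 5 / 16
                    if x + 1 < w:
                        p[y + 1, x + 1] += err * 1 / 16
        return out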
  • FIG. 9A illustrates an example dither mask derived from a set of dot patterns, each dot pattern representing a different gray scale.
  • Dither masks can be used in image processing to create an illusion of greater tonal depth in images with a limited number of pixel states.
  • a dither mask, such as dither mask 602, can be used in a dithering process to introduce a controlled form of noise into an image so that quantization errors appear relatively random, rather than structured.
  • an example dither mask can be used to quantize pixel values in a traditional 2D image.
  • a small static dither mask (for example, 128 x 128 pixels) can be used to provide a set of threshold values for dithering (quantizing) a traditional 2D image, where the traditional 2D image has a size larger than 128 x 128 pixels.
  • a dither mask for quantizing traditional 2D images can be derived from a set of dot patterns, each dot pattern representing a different gray scale, where the dot patterns for each of the desired gray scales are designed using simulated annealing to have blue noise properties (blue noise contains more energy at higher frequencies and less energy at lower frequencies, making it less noticeable for human vision).
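As a toy illustration of designing a mask by simulated annealing, the sketch below swaps threshold values in a small mask to push spectral energy out of a low-frequency box, a crude stand-in for the blue-noise criterion described above; the mask size, energy metric and annealing schedule are all assumptions, not parameters from this disclosure.

    import numpy as np

    def lowfreq_energy(mask, cutoff=4):
        """Spectral energy inside a small low-frequency box; a blue-noise
        mask wants this small (energy pushed toward high frequencies)."""
        spec = np.abs(np.fft.fftshift(np.fft.fft2(mask - mask.mean())))
        c = mask.shape[0] // 2
        return spec[c - cutoff:c + cutoff, c - cutoff:c + cutoff].sum()

    def anneal_mask(n=32, steps=2000, temp=1.0, cooling=0.998, seed=0):
        """Permute n*n threshold levels, swapping pixel pairs and keeping
        swaps that reduce low-frequency energy (occasionally accepting a
        worse swap, per simulated annealing)."""
        rng = np.random.default_rng(seed)
        mask = rng.permutation(n * n).reshape(n, n) / (n * n - 1.0)
        e = lowfreq_energy(mask)
        for _ in range(steps):
            (y1, x1), (y2, x2) = rng.integers(0, n, (2, 2))
            mask[y1, x1], mask[y2, x2] = mask[y2, x2], mask[y1, x1]
            e_new = lowfreq_energy(mask)
            if e_new < e or rng.random() < np.exp((e - e_new) / temp):
                e = e_new                       # keep the swap
            else:                               # revert the swap
                mask[y1, x1], mask[y2, x2] = mask[y2, x2], mask[y1, x1]
            temp *= cooling
        return mask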
  • a dithering process can be configured for use in hardware by comparing pixel values of, for example, a traditional 2D image, to threshold values in a dither mask, enabling relatively low computational requirements.
  • dither can be enabled for two available states, an upper state and a lower state, when a pixel value of, for example, a traditional 2D image, is larger than a corresponding threshold value of the dither mask, such that the pixel is quantized to the upper state.
  • when the pixel value of, for example, a traditional 2D image is smaller than the corresponding threshold value in the dither mask, the pixel is quantized to the lower state.
  • a smaller dither mask (for example, 128 x 128 pixels) can be tiled across an image having a size larger than the dither mask, enabling the dithering of a relatively large image with a smaller dither mask, with the advantage of a small memory size requirement for hardware implementation.
  • because dithering can be achieved by a comparison of pixel values to threshold values in a dither mask, each pixel can be processed in any order, enabling a more relaxed hardware design.
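The compare-against-a-tiled-mask operation described above can be sketched in a few lines of NumPy; the mask here is a random stand-in for a properly designed dither mask, and the two output states are arbitrary.

    import numpy as np

    def tile_dither(image, mask, lower=0.0, upper=1.0):
        """Quantize an image to two states by comparing each pixel against
        a threshold mask tiled across the image by index wrap-around."""
        yy, xx = np.indices(image.shape)
        thresholds = mask[yy % mask.shape[0], xx % mask.shape[1]]
        return np.where(image > thresholds, upper, lower)

    # Illustrative: a 128 x 128 stand-in mask dithering a 512 x 512 image.
    rng = np.random.default_rng(1)
    mask = rng.random((128, 128))
    image = rng.random((512, 512))
    binary = tile_dither(image, mask)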
  • FIG. 9B illustrates the use of a dither mask on a traditional 2D input image, where the dither mask can be designed to have blue noise properties.
  • dithering using a dither mask with blue noise properties can be suitable for traditional 2D images because the quantization noise can be moved into high spatial frequencies where it is easier for the human visual system to integrate.
  • a given dither mask can be configured so that quantization noise for a 2D image can be moved into relatively high spatial frequencies, where it can be easier for a human visual system to integrate for visual interpretation.
  • a dither mask can be adapted for use to quantize holograms.
  • while a dither mask used to quantize traditional 2D images incorporates blue-noise properties, a dither mask used to quantize hologram patterns can be designed to incorporate appropriate properties in the frequency domain, where quantization noise can be moved to a region outside a desired signal window in the frequency domain. Quantizing hologram patterns using a dither mask enables relatively low computational requirements in hardware, as the process can be a relatively simple comparison operation.
  • FIG. 9C illustrates the use of a dither mask designed with a desired signal window in frequency domain.
  • the dither mask of FIG. 9C can be designed for quantizing hologram patterns, where quantization noise can be moved outside a desired signal window in frequency domain.
  • a dither mask optimized to quantize hologram patterns can be nonoptimal for quantizing traditional 2D images, because a dither mask optimized to quantize hologram patterns does not incorporate blue noise properties.
  • FIG. 9D illustrates the use of a dither mask on a hologram pattern, where the dither mask can be designed with a desired signal window in frequency domain.
  • a Fast Fourier Transform of a dithered hologram pattern includes a relatively well-defined signal window in frequency domain, where quantization noise can be largely moved outside the well-defined signal window.
  • a dither mask designed to quantize hologram patterns can have a size substantially the same as the hologram pattern to be quantized.
  • a real-valued hologram pattern can be normalized so that each pixel of the real-valued hologram pattern maintains a value between -1 and 1 (with -1 and 1 included).
  • a dither mask can be provided to quantize the pixel values of a normalized real-valued hologram pattern, where each pixel of the normalized real-valued hologram pattern maintains a value between -1 and 1 (with -1 and 1 included), to either -1 or to 1, where the dither mask can be designed to provide a well-defined signal window in frequency domain with quantization noise largely moved outside the well-defined signal window.
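A minimal sketch of the normalize-then-quantize step described above; the random mask stands in for a mask designed with a frequency-domain signal window, and the array sizes are arbitrary.

    import numpy as np

    def normalize_hologram(h):
        """Scale a real-valued hologram so every pixel lies in [-1, 1]."""
        peak = np.max(np.abs(h))
        return h / peak if peak > 0 else h

    def quantize_pm1(h_norm, mask):
        """Quantize normalized pixels to -1 or +1 against a same-size mask
        whose threshold values also lie in [-1, 1]."""
        return np.where(h_norm > mask, 1.0, -1.0)

    # Illustrative usage with a random stand-in mask.
    rng = np.random.default_rng(2)
    hologram = rng.normal(size=(64, 64))
    mask = rng.uniform(-1.0, 1.0, size=(64, 64))
    binary = quantize_pm1(normalize_hologram(hologram), mask)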
  • a dither mask optimized for quantizing hologram patterns has the same size as the hologram patterns to be quantized.
  • an augmented reality device, such as the augmented reality devices disclosed herein, can be adapted with one or more mask-based dithering methods for the dithering of hologram patterns that are to be rendered on one or more spatial light modulators integrated as part of the augmented reality device.
  • a dither mask can be precomputed and stored on one or more memory devices of the augmented reality device, where the dither mask can be designed to provide a well-defined signal window with quantization noise largely moved outside the well-defined signal window.
  • FIG. 9E illustrates an example use of a dither mask optimized for quantizing hologram patterns, where the size of the dither mask can be smaller than the size of the hologram patterns to be quantized.
  • the pixels of a hologram pattern are quantized to either -1 or 1.
  • a dither mask can be adapted to quantize the pixels of a hologram pattern to either -1 or 1, where the dither mask can be smaller than the hologram pattern to be quantized and where the dither mask can be designed to provide a relatively well-defined signal window in frequency domain with quantization noise largely moved outside the well-defined signal window in frequency domain.
  • the threshold values in the dither mask maintain a value between -1 and 1 (with -1 and 1 included).
  • a second dither mask can be provided by taking for each location in the second dither mask the threshold value at the corresponding location in the first dither mask and reversing the sign.
  • the second dither mask can be obtained from the first dither mask by applying a sign flip to each value in the first dither mask.
  • both the first and the second dither mask have an equal size smaller than the size of the to-be-quantized hologram pattern.
  • a hologram pattern can be divided into four quadrants.
  • Top left and bottom right quadrants are quantized by tiling the first dither mask across each of these quadrants.
  • a sign-flip is applied to each value in the top right and bottom left quadrants, after which the second dither mask is tiled across the sign-flipped quadrants for quantization, resulting in an unfinished quantized hologram pattern, where a sign flip is applied to the unfinished hologram pattern to arrive at a finished quantized hologram pattern for the top right and bottom left quadrants of the hologram pattern.
  • a dithering operation implemented in quadrants, in which a pixel value of a continuous real hologram Hr is quantized to a binary hologram Hb pixel value as illustrated in FIG. 9E, can produce acceptable quantization results on a hologram pattern.
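The four-quadrant scheme of FIG. 9E, as described in the preceding bullets, can be sketched as follows, assuming an even-sized hologram and a mask smaller than each quadrant; the helper names are illustrative.

    import numpy as np

    def tile_thresholds(shape, mask):
        """Tile a small threshold mask over a quadrant of the given shape."""
        yy, xx = np.indices(shape)
        return mask[yy % mask.shape[0], xx % mask.shape[1]]

    def quantize_quadrants(h_norm, mask):
        """Quantize a normalized hologram to {-1, +1} in four quadrants:
        the first mask tiles the top-left and bottom-right quadrants; the
        other two quadrants are sign-flipped, quantized against the
        sign-reversed mask, then sign-flipped back."""
        mask2 = -mask                         # second mask: reversed signs
        h, w = h_norm.shape
        cy, cx = h // 2, w // 2
        out = np.empty_like(h_norm)
        quads = [
            ((slice(None, cy), slice(None, cx)), False),  # top left
            ((slice(cy, None), slice(cx, None)), False),  # bottom right
            ((slice(None, cy), slice(cx, None)), True),   # top right
            ((slice(cy, None), slice(None, cx)), True),   # bottom left
        ]
        for sl, flip in quads:
            q = h_norm[sl]
            if flip:
                t = tile_thresholds(q.shape, mask2)
                # Unfinished result on sign-flipped values, flipped back.
                out[sl] = -np.where(-q > t, 1.0, -1.0)
            else:
                out[sl] = np.where(q > tile_thresholds(q.shape, mask), 1.0, -1.0)
        return out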
  • FIG. 9F illustrates an example use of mask-based dithering to quantize a hologram pattern.
  • a 4096 x 4096 hologram pattern can be quantized using a 256 x 256 dither mask, providing a well-defined signal window in frequency domain with quantization noise largely moved outside the well-defined signal window.
  • a memory device can be required for the dither mask (256 x 256 x 8 bits in this example); however, the dithering operation requires almost no additional computation.
  • a comparison of values of a hologram pattern to threshold values in a dither mask can be processed in an arbitrary order.
  • An example method comprises receiving a hologram pattern representing one of an image, a three-dimensional (3D) object, or a three-dimensional (3D) scene for display.
  • the hologram pattern can be spatially divided into four subspace quadrants.
  • the hologram pattern can be quantized using a first and a second dither mask, where the values of the second dither mask are those of the first dither mask, but with the associated signs reversed.
  • the first dither mask can be applied to two subspace quadrants of the four subspace quadrants, such as, for example, the top left and bottom right subspace quadrants, while the second dither mask can be adapted for use with the remaining two subspace quadrants, such as, for example, the top right and bottom left subspace quadrants, where the values in these two subspace quadrants are sign flipped before tiling the second dither mask across them, resulting in an unfinished quantized hologram pattern for each of these two subspace quadrants.
  • a sign flip can then be applied to each value of the unfinished quantized hologram of each of these two subspace quadrants to obtain a finished quantized hologram for all four quadrants.
  • each of the four subspace quadrants can be aligned to a common first axis and a common second axis, wherein the first axis and the second axis cross at a quadripoint of the four subspace quadrants.
  • FIG. 10 illustrates an example schematic block diagram of an embodiment of an ecosystem for implementing augmented reality.
  • augmented reality module 104 includes an optical module 104-3 configured for displaying images, virtual objects and/or virtual scenes with an associated augmented reality device, a processor 104-2, such as a holographic processor, for processing images, three dimensional (3D) objects and three dimensional (3D) scenes for display using optical module 104-3 and a wireless transceiver 104-1 for enabling communication with wireless networks, such as Wide Area Network (WAN) 108.
  • augmented reality module 104 can be adapted for use on an augmented reality device, such as augmented reality glasses.
  • augmented reality module 104 can be adapted to receive media, such as but not limited to one or more of images, partial images or audiovisual content for use with augmented reality glasses.
  • augmented reality module 104 can be adapted to receive three dimensional (3D) data in the form of one or more point clouds, and/or red, green, blue & depth (RGBZ) datasets, and/or one or more 2D images, with each image having an associated depth value (RGBD).
  • media, such as, for example, two dimensional (2D) images, three dimensional (3D) objects and/or three dimensional (3D) scenes, can be provided to the augmented reality module 104 from one or more third party resources, such as third party media resource 106, over a wireless network.
  • the wireless network can be one or more of a Wide Area Network, such as WAN 108, a Wireless Local Area Network (LAN), the World Wide Web or a cellular network.
  • augmented reality module 104 can be wirelessly coupled to a mobile device, such as mobile device 102.
  • mobile device 102 can be adapted to provide media, such as, for example, two dimensional (2D) images, three dimensional (3D) objects and/or three dimensional (3D) scenes, for augmented reality module 104 and/or to provide processing functionality for use with augmented reality module 104.
  • FIG. 11 illustrates an optical module for generating and displaying augmented reality content in a viewing device, such as augmented reality glasses.
  • optical module 100 provides a structural assembly for positioning various optical elements.
  • an optical module can include one or more illumination sources, such as illuminator 110, configured to provide illumination for a spatial light modulator device, such as spatial light modulator 112.
  • illuminator 110 can be a light source of a single predetermined wavelength.
  • illuminator 110 can be a light source configured to provide a limited range of wavelengths, where the limited range can include a plurality of wavelengths in a predetermined wavelength range.
  • illuminator 110 can be adapted to provide a white light source, where the white light source can be a combination of visible wavelengths.
  • illuminator 110 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model.
  • Other example color models include, but are not limited to 1) Cyan, Magenta, Yellow, (Key/Black) (CMY(K)); 2) Hue, Saturation, Value (or Brightness) (HSV); 3) Hue, Saturation, Lightness (HSL); 4) YCbCr which separates luminance (brightness) information (Y) from chrominance (color) information (Cb and Cr); 5) LAB (and variants), which can be described as a three component model with (L* (lightness), a* (green to red), and b* (blue to yellow)); and 6) the XYZ color model.
  • illuminator 110 can be adapted to provide light wavelengths according to color models incorporating 4 or more separate color channels.
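As an illustration of one of the color models listed above, the sketch below converts RGB to YCbCr using the BT.601 full-range coefficients; the choice of BT.601 is an assumption, since the disclosure does not name a specific variant.

    import numpy as np

    def rgb_to_ycbcr(rgb):
        """Convert an (..., 3) RGB array in [0, 1] to YCbCr (BT.601, full
        range): Y carries luminance, Cb and Cr carry chrominance."""
        m = np.array([[ 0.299,     0.587,     0.114   ],
                      [-0.168736, -0.331264,  0.5     ],
                      [ 0.5,      -0.418688, -0.081312]])
        ycbcr = rgb @ m.T
        ycbcr[..., 1:] += 0.5   # center the chroma channels
        return ycbcr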
  • spatial light modulator 112 can be one or more spatial modulator devices implemented using one or more integrated circuits.
  • one or more holographic processors can be configured to compute and/or generate hologram patterns (such as interference patterns) based on the execution of algorithmic models.
  • hologram patterns can be rendered on one or more spatial light modulators of spatial light modulator 112 for displaying, among others, images, virtual objects and/or virtual scenes with a viewing device, such as augmented reality glasses.
  • spatial light modulator 112 can be configured to implement a pixel pitch close to or smaller than the wavelength of the light to be used with the spatial light modulator 112.
  • light to be used with the spatial light modulator 112 can be visible light.
  • the one or more spatial light modulator chips can be configured to implement a pixel pitch in the range of half a wavelength or less than half a wavelength of the light to be used with the spatial light modulator, which can be visible light in the example of augmented reality glasses.
  • a spatial light modulator with a pixel pitch close to or smaller than a wavelength of visible light can enable a relatively large field of view (FoV), whereas a pixel pitch larger than a wavelength of visible light can result in a reduced field of view (FoV).
  • a smaller field of view (FoV) can result in a less than immersive experience for a viewing device user.
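The pixel pitch/field of view relationship noted above follows from the grating equation: the maximum diffraction half-angle is asin(lambda / (2 p)) for pixel pitch p, so a pitch at or below half the wavelength steers over the full half-space. A small sketch, with illustrative wavelengths and pitches:

    import math

    def max_fov_degrees(pixel_pitch_m, wavelength_m):
        """Diffraction-limited field of view of an SLM: the grating equation
        gives a maximum half-angle of asin(wavelength / (2 * pitch))."""
        s = wavelength_m / (2.0 * pixel_pitch_m)
        if s >= 1.0:
            return 180.0  # pitch at or below half the wavelength
        return 2.0 * math.degrees(math.asin(s))

    # Illustrative: green light (520 nm) at a coarse and a sub-wavelength pitch.
    print(max_fov_degrees(3.0e-6, 520e-9))   # ~10 degrees
    print(max_fov_degrees(260e-9, 520e-9))   # 180 degrees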
  • optical module 100 can be configured to direct light emitted from illuminator 110 toward spatial light modulator 112.
  • Example implementations can include one or more optical elements in the optical path between illuminator 110 and spatial light modulator 112, such as, for example, a collimator lens to provide collimated light to spatial light modulator 112.
  • optical module 100 can be configured to provide alignment for illumination provided by illuminator 110 relative to the spatial light modulator based on the physical position of illuminator 110 relative to the physical position of spatial light modulator 112.
  • optical module 100 can be configured to guide a wavefront generated by spatial light modulator 112 toward one or more optical elements located outside optical module 100, by the use of one or more optical elements configured as part of optical module 100, such as mirror 114. The one or more optical elements located outside optical module 100 can be used to direct wavefronts, generated by spatial light modulator 112 and transmitted via an optical path of optical module 100, to a user's eyes in a perceptible form.
  • the one or more optical elements located outside optical module 100 can include, but are not limited to, reflective and partially reflective optical elements, projection lenses and polarizing elements.
  • the one or more optical elements located outside optical module 100 can be a single optical combiner enabling images, virtual objects and virtual scenes generated by optical module 100 to be overlaid onto a real world environment.
  • one or more optical elements are located outside optical module 100, with the optical elements configured to steer a wavefront delivered by optical module 100 to a hypothetical user's eye, where the optical elements are implemented as part of the augmented reality glasses lenses.
  • FIG. 12 provides an illustration of smart/augmented reality (AR) glasses adapted for overlaying holographic content on to a real-world environment.
  • the smart glasses can be configured to render dynamic and/or full color holographic content in front of a user's eyes overlaid onto a real-world environment.
  • smart glasses can be configured with various example components, such as one or more illumination sources, one or more spatial light modulators, one or more optical subsystems and one or more compute elements/compute chips (such as holographic processor devices).
  • smart glasses can be configured with one illumination source, one spatial light modulator, one optical subsystem and one compute element/compute chip (such as a holographic processor device) per eye.
  • smart glasses can be configured to include two illumination sources, two spatial light modulators, two optical subsystems and two compute elements (holographic processor devices).
  • an optical module 122 can be configured to combine many of the elements described above in a single unit.
  • Example smart glasses can be adapted to include additional components, such as a control management subsystem, memory, a power management subsystem, one or more embedded batteries, and/or one or more connectors adapted for connection to an external battery.
  • various components of the smart glasses of FIG. 12 can be incorporated in the frame of the smart glasses.
  • a smart glasses configuration can include an illumination source, a spatial light modulator, an optical subsystem and a compute element (such as a holographic processor device) for each of a user’s eyes.
  • an illumination source can be, for example, a miniaturized laser source providing coherent light.
  • a spatial light modulator comprises an array of individually programmable optical pixels adapted to generate dynamic wavefronts for displaying/projecting holographic content.
  • a pixel can be sized to be half the wavelength of visible light or shorter.
  • an optical pixel of the array can be adapted to interact with a portion of an incident light beam provided by an illumination source, where the amplitude and phase of a resultant light wave generated by an optical pixel can be dependent on a programming state of the optical pixel.
  • an optical pixel can be adapted to modulate the amplitude of the light wave it generates as a function of the amplitude of the incident light beam. In an example, an optical pixel can be adapted to modulate the phase of the light wave it generates as a function of the phase of the incident light beam. In an example, an optical pixel can be adapted to modulate the amplitude and the phase of the light wave it generates as a function of the amplitude and phase of the incident light beam.
  • individual light waves generated by each optical pixel can form wavefronts to display or project holographic content.
  • wavefronts resulting from a spatial light modulator are provided to the optical subsystem.
  • Example optical systems can be configured with a variety of elements, including combinations of lenses (e.g. pancake lenses, meta lenses, etc.), mirrors (e.g. freeform mirrors) and diffractive optical elements for (re)directing, filtering and/or magnifying/demagnifying static and/or dynamic wavefronts provided by a spatial light modulator.
  • a wavefront provided by an optical subsystem can be directed to a partially reflective mirror (or partially reflective coating) that can be configured for location in front of the user's eye and configured for use with the lenses (such as optical correction lenses) of a pair of smart glasses.
  • a partially reflective mirror / coating can be adapted to reflect static and/or dynamic wavefronts from an optical subsystem toward a user’s eye, so that the user’s eye can capture the wavefronts as holographic content.
  • a reflective mirror / coating (lens treatment 124) in front of a hypothetical user’s eye can be adapted to be partially reflective, so that incident light rays from a real-world environment are transmitted through the optical lenses for perception at the user’s eye. Accordingly, a user can thereby be enabled to see holographic content overlaid onto the real-world environment.
  • the reflective mirror can be adapted to be substantially reflective, so that substantially no light rays incident from the real-world environment are received at the user's eye. In this example, a user will receive only the holographic content, without it being overlaid on the real-world environment.
  • a mirror can be adapted to switch between a partially reflective state and completely reflective state.
  • optical lenses of smart glasses can be configured as corrective lenses, for example where a user would normally wear prescription glasses.
  • compute chip(s) and/or holographic processor chip(s) can be adapted to execute Computer Generated Holography (CGH) algorithms.
  • holographic interference patterns calculated by one or more compute chip(s) (such as holographic processor chips) executing Computer Generated Holography (CGH) algorithms, can be used to determine the programming of the optical pixels associated with a spatial light modulator.
  • CGH algorithms can be used to compute the digital holograms for desired holographic content.
  • a control management subsystem can be adapted to receive and transmit digital data and/or to control subsystems and components for optimal interaction with each other.
  • Example smart glasses can be configured to include a control management subsystem, memory, a power management subsystem, one or more embedded batteries, and/or one or more connectors for connection to external batteries.
  • the control management subsystem receives and emits digital data and/or controls various smart glasses subsystems and components so that they interact with each other in a desired manner.
  • memory included in example smart glasses can be implemented using virtually any electrical storage technology.
  • pre-determined/pre-computed holographic information can be adapted for storage in the memory so that the information can be directly loaded onto a spatial light modulator without additional/excessive computation.
  • Example uses for smart glasses include viewing photos and videos and/or reading text messages rendered in front of a user's eyes.
  • Other example uses include enabling navigation with smart glasses, where information can be provided in front of a user's eyes to assist the user with navigating (e.g. rendering street names or rendering arrows indicating the direction the user should follow to reach their destination).
  • Another example use includes a visual search that can be used if a user desires information pertaining to a physical thing in a real-world environment.
  • smart glasses can be adapted to indicate that a given user would like to receive information about a physical thing by, for example, looking at it.
  • image recognition can be implemented to identify a physical thing the user can be looking at, with the result then input for use in a search engine.
  • information received from a search engine can be displayed in front of the user's eyes.
  • smart glasses could be configured with one or more cameras to enable capturing the real-world environment the user is looking at and eye-tracking to determine a direction the user is looking.
  • augmented reality glasses 120 are configured with an optical module 122 (such as optical module 100 with reference to FIG. 11) coupled to one of the temples of augmented reality glasses 120.
  • optical module 122 can be configured to direct a wavefront at a lens (or lenses) of augmented reality glasses 120.
  • the lens (or lenses) of augmented reality glasses 120 can be adapted with a “see through” combiner element with reflective and/or partially reflective properties, such as lens treatment 124, to combine “virtual media content” with a “real-world” scene, allowing a user to see both the virtual media content and the real-world scene simultaneously.
  • virtual media content can be defined as media, including but not limited to two dimensional (2D) images, three dimensional (3D) objects and three dimensional (3D) scenes, delivered by one or more spatial light modulators for visual perception.
  • virtual media content can require a display surface in order to be perceived by a hypothetical user of an augmented reality device; however, “virtual media content” is intended to encompass both the output of a spatial light modulator, such as the spatial light modulators illustrated in multiple FIGs. described herein, and the output of a spatial light modulator on a display surface.
  • lens treatment 124 can be adapted to provide a partially reflective surface balancing the transmission and reflection of light allowing a user to view virtual media content, through reflection by lens treatment 124, overlaid onto a real-world scene transmitted through lens treatment 124.
  • Example lens treatments include holographic optical element coatings or meta surface coatings; the coatings adapted to steer a wavefront delivered by an optical module toward one or both eyes of a hypothetical viewer for the viewer to perceive virtual media content, while still being see-through for perception of the real-world environment.
  • Additional example lens treatments include: 1) dielectric coatings designed to enhance reflectivity at specific wavelengths; 2) dichroic coatings to selectively reflect or transmit light based on color; 3) beam splitter coatings designed to split incident light into two components (reflecting one part while transmitting the other); 4) anti-reflective coatings to minimize unwanted glare or ghosting effects in the digital media content; 5) polarizing coatings used in combination with polarized light sources to reproduce digital media content; and 6) hybrid coatings combining a plurality of coating types.
  • FIG. 13 is an example schematic block diagram of an embodiment of a system for implementing augmented reality that includes an optical module 136, a holographic processor 140, memory, a control unit, a data transceiver 138, as well as a power unit 132 comprising power management unit 134 and one or more batteries 130.
  • holographic processor 140 can be configured to compute and/or generate hologram patterns to be rendered on the one or more spatial light modulators 142 to display virtual media content on an associated augmented reality device, such as augmented reality system 146.
  • Optical module 136 is described in greater detail above and elsewhere herein. (See, for example, optical module 100 with reference to FIG. 11.)
  • optical module 136 can be implemented with illumination source 144, where illumination source 144 can include one or more individual light sources adapted to provide illumination for the spatial light modulator(s) 142.
  • illumination source 144 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model.
  • illumination source 144 can be implemented to cover a substantially full spectrum of wavelengths as, for example, a single white light emitter.
  • holographic processor 140 can be configured as one or more compute elements, where the compute elements can be configured as integrated circuits, adapted to execute processing functions for computing hologram patterns for rendering on the one or more spatial light modulators 142 in order for the spatial light modulators to deliver virtual media content for display with an associated augmented reality device, such as augmented reality system 146.
  • hologram patterns for rendering on one or more spatial light modulators 142 can be generated based on generally understood functions, such as Computer Generated Hologram algorithms based on one or more of a Fresnel Transform-Based model, an Angular Spectrum model, a Point Source method, a Gerchberg-Saxton Algorithm, an Iterative Fourier Transform Algorithm (IFTA) and/or a Wavefront Propagation algorithm (such as, for example, wavelet transform, wavefront recording and reconstruction and kinoform). Additional methods can include using random-phase encoding/decoding techniques to represent complex scenes and/or deep learning techniques, such as convolutional neural networks.
  • virtual media content can be configured for display at a single depth plane selectable in space between a near eye position (relative to a user) and infinity.
  • a holographic processor can be configured to compute hologram patterns from input media, with the input media comprising two dimensional (2D) and/or three dimensional (3D) data, where the holographic processor is adapted to execute one or more methods to correct for optical aberrations, such as optical aberrations introduced by the one or more optical elements configured as part of the optical module and/or located outside the optical module, one or more methods to accommodate for vision correction, and/or one or more methods for minimizing quantization noise in displayed virtual media content.
  • Data transceiver 138 can include one or more receiver units and/or transmitter units, potentially implemented as one or more integrated circuits configured to enable an exchange of information between an augmented reality system, such as augmented reality system 146, and a wireless network, such as a wide area network (WAN) and/or a local area network (LAN), and/or between an augmented reality system, such as augmented reality system 146, and a (mobile) electronic device, such as a smartphone, smart watch, mobile computer or desktop computer.
  • power unit 132 can be provided with a power management unit 134 comprising one or more power management integrated circuits (PMICs), to manage and control power use by augmented reality system 146, in order to provide relatively efficient power consumption and performance of the system.
  • augmented reality system 146 can include one or more memory devices, with the memory devices being comprised of, for example one or more of a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner.
  • one or more pre-computed hologram patterns are stored on the one or more memory devices.
  • one or more hologram patterns are computed external to augmented reality system 146 and transmitted to augmented reality system 146 over data transceiver 138 for storage on one or more memory devices of augmented reality system 146.
  • augmented reality system 146 can include one or more control units or control modules, potentially implemented as one or more integrated circuits, configured for driving and/or synchronizing various augmented reality system 146 functions.
  • the one or more control units can be implemented using one or more application processing devices.
  • optical module 136 can include one or more optical pre-path(s) and/or one or more optical post-path(s).
  • optical pre-path(s) and/or optical post-path(s), together with illumination source 144 and spatial light modulator(s) 142 comprise optical module 136.
  • various components included in augmented reality system 146 can be implemented as individual components on one or more printed circuit boards.
  • various components of augmented reality system 146 can be adapted for implementation in the frame of an augmented reality device, such as in the temple (or temples) of augmented reality glasses.
  • some or all of the power management unit 134, the control unit, holographic processor 140, memory and data transceiver 138 can be implemented on a single electronic chip (such as a System on Chip or SoC), or consolidated on a plurality of electronic chips.
  • integration of some or all of the components of augmented reality system 146, including the power management unit 134, the control unit, holographic processor 140, memory and radio transceiver 138, on a single electronic chip (such as a System on Chip or SoC) can be used to, for example, reduce the overall footprint of augmented reality system 146 and/or to provide a potentially more compact implementation, all while potentially increasing performance and/or power efficiency.
  • integrating some or all of the components of augmented reality system 146 on a SoC can enable reduced manufacturing costs for augmented reality system 146, enabling overall lower cost for an associated augmented reality viewing device.
  • An example augmented reality device includes an interface for interfacing with a network, an illumination source, a spatial light modulator, a holographic processor, an optical combiner element, memory and a processing module operably coupled to the interface and to the memory.
  • the processing module can be operable to receive, via the interface, data, which can represent two dimensional (2D) images, three dimensional (3D) objects and/or three dimensional (3D) scenes, provide the data to the holographic processor, receive, from the holographic processor, a hologram pattern, provide the hologram pattern to the spatial light modulator for the hologram pattern to be rendered on the spatial light modulator, where data received via the interface can be perceived by a hypothetical user using the optical combiner element.
  • the optical combiner element can be a semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles.
  • the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays.
  • the optical combiner element can include a semi-transparent, reflective coating adapted as part of the lenses of augmented reality glasses.
  • an illumination source, a spatial light modulator and a holographic processor can be implemented in an optical module.
  • the augmented reality device can be implemented as augmented reality glasses, with an optical module coupled to a temple of the augmented reality glasses.
  • a spatial light modulator has a respective top and a respective bottom surface, where the illumination source can be configured to direct light at the top surface of the spatial light modulator and where the top surface of the spatial light modulator can be overlaid with a spatially-varying pattern of color filters formed on the top surface.
  • the spatially-varying pattern of color filters comprises red color filters, green color filters and blue color filters; wherein the red color filters are transparent to red light, but blocking/absorptive for green and blue light, wherein the green color filters are transparent to green light, but blocking/absorptive for red and blue light, and wherein the blue color filters are transparent to blue light, but blocking/absorptive for red and green light.
  • the proportion of red, green and blue color filters in the spatially-varying pattern of color filters can be equal.
  • the proportion of red, green and blue color filters in the spatially-varying pattern of color filters can be unequal.
  • the proportion of red color filters in the spatially-varying pattern of color filters can be 1/4, the proportion of green color filters can be 2/4 and the proportion of blue color filters can be 1/4.
  • a higher proportion can be given to the green color filters in the spatially-varying pattern of color filters as the human visual system can be more sensitive to green.
  • the proportion of red color filters in the spatially-varying pattern of color filters can be 2/6, the proportion of green color filters can be 3/6 and the proportion of blue color filters can be 1/6.
  • the spatially-varying pattern of color filters comprises red color filters, green color filters transparent for a first wavelength of green light, green color filters transparent for a second wavelength of green light different from the first wavelength of green light, and blue color filters, where the illumination source of the system can deliver both the first and second wavelengths of green light.
  • the spatially -varying pattern of color filters can be adapted with color filters, wherein a first set of color filters can be transparent for a first wavelength of red light and a second set of color filters can be transparent for a second wavelength of red light different from the first wavelength of red light; and/or with color filters, wherein a first set of color filters can be transparent for a first wavelength of blue light and a second set of color filters can be transparent for a second wavelength of blue light different from the first wavelength of blue light.
  • the proportion of the different sets of color filters in the spatially -varying pattern of color filters can be equal or unequal.
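  • As a minimal illustration of the 1/4, 2/4, 1/4 proportions above, the following Python sketch tiles a 2x2 red/green/green/blue cell across a filter array; the function name and the 0/1/2 channel encoding are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def rggb_mosaic(rows: int, cols: int) -> np.ndarray:
    """Tile a 2x2 red/green/green/blue cell across a color filter array.

    Encoding: 0 = red, 1 = green, 2 = blue, one entry per color filter.
    """
    tile = np.array([[0, 1],
                     [1, 2]], dtype=np.uint8)
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(tile, reps)[:rows, :cols]

mosaic = rggb_mosaic(8, 8)
# Fractions come out as 1/4 red, 2/4 green, 1/4 blue:
print([float(np.mean(mosaic == c)) for c in (0, 1, 2)])  # [0.25, 0.5, 0.25]
```

  • For the 2/6, 3/6, 1/6 example, a 2x3 base tile containing two red, three green and one blue entry can be substituted in the same manner.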
  • an optical module for displaying virtual media content on a viewing device comprises an interface, an illumination source, and a wireless transceiver configured to receive data representing two dimensional (2D) images, three dimensional (3D) objects and/or three dimensional (3D) scenes, and/or hologram patterns for rendering on one or more spatial light modulators for display of virtual media content by the optical module.
  • the optical module includes a spatial light modulator and one or more holographic processors, where the illumination source can be configured to direct illumination to the spatial light modulator and where the one or more holographic processors are adapted to compute hologram patterns for rendering on the spatial light modulator.
  • the optical module can be adapted to generate virtual media content for display using an optical combiner element, where the optical combiner element, potentially implemented as a combiner mirror, can be configured to combine virtual media content with a real-world environment so that the virtual media content and the real-world environment are viewable at the same time when looking at the optical combiner element.
  • the optical combiner element can take the form of any of a combiner mirror, a holographic optical element or a meta-surface.
  • the optical combiner element can be a semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles.
  • the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays.
  • the optical combiner element can include a semi-transparent, reflective coating adapted as part of the lenses of augmented reality glasses.
  • FIG. 14A illustrates an example implementation of augmented reality headset/glasses 172 that includes spatial light modulator 152, data transceiver 158, processor 156, main board 162 and battery 164 integrated on a temple of augmented reality glasses.
  • some or all of the electronic components comprising augmented reality headset/glasses 172 can be implemented as integrated circuits and, along with additional electronic components also implemented as integrated circuits, can be combined on a common System on Chip (SoC) or as two or more electronic chips integrated using advanced packaging techniques.
  • Example packaging includes multi-chip modules, three-dimensional integrated circuit (3D IC) packages using multi-chip stacking and through silicon vias (TSVs), as well as system-in-package (SiP) and package-on-package (PoP).
  • Any of the foregoing can be used to reduce the footprint of the various hardware elements, while potentially increasing efficiency and improving thermal management performance for an augmented reality system.
  • advanced packaging techniques can enable one or more of lower cost, compact design and/or comfort of wear, while increasing aesthetic appeal of augmented reality glasses.
  • lens treatment 160 can be provided to enable a user to perceive virtual media content overlaid onto the real-life environment.
  • lens treatment 160 can include one of a holographic optical element or a meta-surface adapted to the lenses of the augmented reality glasses.
  • lens treatment 160, implemented as a holographic optical element or a meta-surface, can be configured as a coating for application onto the lenses of augmented reality glasses.
  • FIG. 14B illustrates an example implementation of an augmented reality headset/glasses 170 showing an optical module (such as optical module 100 of FIG. 11), comprising spatial light modulator 152 coupled to a secondary board 154 (daughter printed circuit board), all implemented in the temple of augmented reality headset/glasses 170.
  • FIG. 14C provides an expanded view of the optical module of augmented reality headset/glasses 170 of FIG. 14B, with the optical module (such as optical module 100 of FIG. 11) comprising spatial light modulator 152 coupled to a secondary board 154 (daughter printed circuit board), all implemented in the temple of augmented reality headset/glasses 170.
  • the terms “substantially” and “approximately” provide an industry-accepted tolerance for its corresponding term and/or relativity between items. For some industries, an industry-accepted tolerance is less than one percent and, for other industries, the industry-accepted tolerance is 10 percent or more. Other examples of industry-accepted tolerance range from less than one percent to fifty percent. Industry-accepted tolerances correspond to, but are not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, thermal noise, dimensions, signaling errors, dropped packets, temperatures, pressures, material compositions, and/or performance metrics.
  • tolerance variances of accepted tolerances may be more or less than a percentage level (e.g., dimension tolerance of less than +/- 1%). Some relativity between items may range from a difference of less than a percentage level to a few percent. Other relativity between items may range from a difference of a few percent to magnitude of differences.
  • the term(s) “configured to”, “operably coupled to”, “coupled to”, and/or “coupling” includes direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for an example of indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level.
  • inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as “coupled to”.
  • the term “configured to”, “operable to”, “coupled to”, or “operably coupled to” indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform, when activated, one or more of its corresponding functions and may further include inferred coupling to one or more other items.
  • the term “associated with”, includes direct and/or indirect coupling of separate items and/or one item being embedded within another item.
  • the term “compares favorably”, indicates that a comparison between two or more items, signals, etc., indicates an advantageous relationship that would be evident to one skilled in the art in light of the present disclosure, and based, for example, on the nature of the signals/items that are being compared.
  • the term “compares unfavorably”, indicates that a comparison between two or more items, signals, etc., fails to provide such an advantageous relationship and/or that provides a disadvantageous relationship.
  • Such an item/signal can correspond to one or more numeric values, one or more measurements, one or more counts and/or proportions, one or more types of data, and/or other information with attributes that can be compared to a threshold, to each other and/or to attributes of other information to determine whether a favorable or unfavorable comparison exists.
  • Examples of such an advantageous relationship can include: one item/signal being greater than (or greater than or equal to) a threshold value, one item/signal being less than (or less than or equal to) a threshold value, one item/signal being greater than (or greater than or equal to) another item/signal, one item/signal being less than (or less than or equal to) another item/signal, one item/signal matching another item/signal, one item/signal substantially matching another item/signal within a predefined or industry accepted tolerance such as 1%, 5%, 10% or some other margin, etc.
  • a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1.
  • the comparison of the inverse or opposite of items/signals and/or other forms of mathematical or logical equivalence can likewise be used in an equivalent fashion.
  • the comparison to determine if a signal X > 5 is equivalent to determining if -X < -5.
  • the comparison to determine if signal A matches signal B can likewise be performed by determining -A matches -B or not(A) matches not(B).
  • the determination that a particular relationship is present can be utilized to automatically trigger a particular action. Unless expressly stated to the contrary, the absence of that particular condition may be assumed to imply that the particular action will not automatically be triggered.
  • the determination that a particular relationship is present can be utilized as a basis or consideration to determine whether to perform one or more actions. Note that such a basis or consideration can be considered alone or in combination with one or more other bases or considerations to determine whether to perform the one or more actions. In one example where multiple bases or considerations are used to determine whether to perform one or more actions, the respective bases or considerations are given equal weight in such determination. In another example where multiple bases or considerations are used to determine whether to perform one or more actions, the respective bases or considerations are given unequal weight in such determination.
  • one or more claims may include, in a specific form of this generic form, the phrase “at least one of a, b, and c” or of this generic form “at least one of a, b, or c”, with more or less elements than “a”, “b”, and “c”.
  • the phrases are to be interpreted identically.
  • “at least one of a, b, and c” is equivalent to “at least one of a, b, or c” and shall mean a, b, and/or c.
  • it means: “a” only, “b” only, “c” only, “a” and “b”, “a” and “c”, “b” and “c”, and/or “a”, “b”, and “c”.
  • a hologram pattern or hologram refers to a light interference pattern; when rendered, a hologram pattern acts as a diffraction pattern that diffracts incident light.
  • a holographic image refers to the visual result perceivable by a viewer when a hologram pattern is properly illuminated.
  • the visual result perceivable by a viewer includes two-dimensional (2D) images, two-dimensional (2D) representations, two-dimensional (2D) information, three-dimensional (3D) objects and three-dimensional (3D) scenes.
  • a processing module may be a single processing device or a plurality of processing devices.
  • a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
  • the processing module, module, processing circuit, processing circuitry, and/or processing unit may be, or further include, memory and/or an integrated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of another processing module, module, processing circuit, processing circuitry, and/or processing unit.
  • a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • where a processing module, module, processing circuit, processing circuitry, and/or processing unit includes more than one processing device, the processing devices may be centrally located (e.g., directly coupled together via a wired and/or wireless bus structure) or may be distributedly located (e.g., cloud computing via indirect coupling via a local area network and/or a wide area network).
  • if the processing module, module, processing circuit, processing circuitry and/or processing unit implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • the memory element may store, and the processing module, module, processing circuit, processing circuitry and/or processing unit executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in one or more of the Figures.
  • Such a memory device or memory element can be included in an article of manufacture.
  • a flow diagram may include a “start” and/or “continue” indication.
  • the “start” and “continue” indications reflect that the steps presented can optionally be incorporated in or otherwise used in conjunction with one or more other routines.
  • a flow diagram may include an “end” and/or “continue” indication.
  • the “end” and/or “continue” indications reflect that the steps presented can end as described and shown or optionally be incorporated in or otherwise used in conjunction with one or more other routines.
  • “start” indicates the beginning of the first step presented and may be preceded by other activities not specifically shown.
  • the “continue” indication reflects that the steps presented may be performed multiple times and/or may be succeeded by other activities not specifically shown.
  • a flow diagram indicates a particular ordering of steps, other orderings are likewise possible provided that the principles of causality are maintained.
  • the one or more embodiments are used herein to illustrate one or more aspects, one or more features, one or more concepts, and/or one or more examples.
  • a physical embodiment of an apparatus, an article of manufacture, a machine, and/or of a process may include one or more of the aspects, features, concepts, examples, etc. described with reference to one or more of the embodiments discussed herein.
  • the embodiments may incorporate the same or similarly named functions, steps, modules, etc. that may use the same or different reference numbers and, as such, the functions, steps, modules, etc. may be the same or similar functions, steps, modules, etc. or different ones.
  • signals to, from, and/or between elements in a figure of any of the figures presented herein may be analog or digital, continuous time or discrete time, and single-ended or differential.
  • a signal path is shown as a single-ended path, it also represents a differential signal path.
  • a signal path is shown as a differential path, it also represents a single-ended signal path.
  • the term “module” is used in the description of one or more of the embodiments.
  • a module implements one or more functions via a device such as a processor or other processing device or other hardware that may include or operate in association with a memory that stores operational instructions.
  • a module may operate independently and/or in conjunction with software and/or firmware.
  • a module may contain one or more submodules, each of which may be one or more modules.
  • a computer readable memory includes one or more memory elements.
  • a memory element may be a separate memory device, multiple memory devices, or a set of memory locations within a memory device.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, nonvolatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner.
  • the memory device may be in a form of a solid-state memory, a hard drive memory or other disk storage, cloud memory, thumb drive, server memory, computing device memory, and/or other non-transitory medium for storing data.
  • the storage of data includes temporary storage (i.e., data is lost when power is removed from the memory element) and/or persistent storage (i.e., data is retained when power is removed from the memory element).
  • a transitory medium shall mean one or more of: (a) a wired or wireless medium for the transportation of data as a signal from one computing device to another computing device for temporary storage or persistent storage; (b) a wired or wireless medium for the transportation of data as a signal within a computing device from one element of the computing device to another element of the computing device for temporary storage or persistent storage; (c) a wired or wireless medium for the transportation of data as a signal from one computing device to another computing device for processing the data by the other computing device; and (d) a wired or wireless medium for the transportation of data as a signal within a computing device from one element of the computing device to another element of the computing device for processing the data by the other element of the computing device.
  • a non-transitory computer readable memory is substantially equivalent to a computer readable memory.
  • One or more functions associated with the methods and/or processes described herein can be implemented via a processing module that operates via the non-human “artificial” intelligence (AI) of a machine.
  • Examples of such AI include machines that operate via anomaly detection techniques, decision trees, association rules, expert systems and other knowledge-based systems, computer vision models, artificial neural networks, convolutional neural networks, support vector machines (SVMs), Bayesian networks, genetic algorithms, feature learning, sparse dictionary learning, preference learning, deep learning and other machine learning techniques that are trained using training data via unsupervised, semi-supervised, supervised and/or reinforcement learning, and/or other AI.
  • the human mind is not equipped to perform such AI techniques, not only due to the complexity of these techniques, but also due to the speed and scale at which they must be performed.
  • One or more functions associated with the methods and/or processes described herein can be implemented as a large-scale system that is operable to receive, transmit and/or process data on a large-scale.
  • a large-scale refers to a large amount of data, such as one or more kilobytes, megabytes, gigabytes, terabytes or more of data that are received, transmitted and/or processed.
  • Such receiving, transmitting and/or processing of data cannot practically be performed by the human mind on a large-scale within a reasonable period of time, such as within a second, a millisecond, microsecond, a real-time basis or other high speed required by the machines that generate the data, receive the data, convey the data, store the data and/or use the data.
  • One or more functions associated with the methods and/or processes described herein can require data to be manipulated in different ways within overlapping time spans. The human mind is not equipped to perform such different data manipulations independently, contemporaneously, in parallel, and/or on a coordinated basis within a reasonable period of time, such as within a second, a millisecond, microsecond, a real-time basis or other high speed required by the machines that generate the data, receive the data, convey the data, store the data and/or use the data.
  • One or more functions associated with the methods and/or processes described herein can be implemented in a system that is operable to electronically receive digital data via a wired or wireless communication network and/or to electronically transmit digital data via a wired or wireless communication network.
  • One or more functions associated with the methods and/or processes described herein can be implemented in a system that is operable to electronically store digital data in a memory device. Such storage cannot practically be performed by the human mind because the human mind is not equipped to electronically store digital data.
  • One or more functions associated with the methods and/or processes described herein may operate to cause an action by a processing module directly in response to a triggering event - without any intervening human interaction between the triggering event and the action. Any such actions may be identified as being performed “automatically”, “automatically based on” and/or “automatically in response to” such a triggering event. Furthermore, any such actions identified in such a fashion specifically preclude the operation of human activity with respect to these actions - even if the triggering event itself may be causally connected to a human activity of some kind.


Abstract

A method for a display device begins by receiving data representative of a set of two-dimensional (2D) scene layers, with each 2D scene layer having a corresponding predetermined display depth in a focus space, and generating a first hologram pattern for each 2D scene layer of the set of 2D scene layers to create a set of first hologram patterns, where each first hologram pattern of the set of first hologram patterns is adapted to place an associated 2D scene layer at infinite depth. The method continues by using a mathematical lens function to convert each first hologram pattern of the set of first hologram patterns to a second hologram pattern to create a set of second hologram patterns, where each second hologram pattern is adapted to place an associated 2D scene layer at a corresponding predetermined display depth in the focus space. Finally, the method continues by aggregating the set of second hologram patterns to provide an aggregated hologram pattern.

Description

SYSTEM AND METHODS FOR DISPLAY OF 3D MULTI-MEDIA
Inventors:
Edward Buckley, Andrzej Kaczorowski, Theodore Michel Marescaux, Richard Stahl, Gebirie Yizengaw Belay and Joel Steven Kollin
FIELD OF THE DISCLOSURE
[0001] The subject disclosure relates to optical systems and associated applications for displaying three-dimensional (3D) media for virtual reality (VR) and augmented/extended reality (AR).
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[0003] FIG. 1 illustrates an example three-dimensional (3D) focus space for an augmented reality device, in accordance with various aspects described herein;
[0004] FIG. 2 is a logic diagram of an example method for generating a hologram pattern, in accordance with various aspects described herein;
[0005] FIG. 3 illustrates an example use of hologram replication in a holographic display system, in accordance with various aspects described herein;
[0006] FIG. 4 illustrates a three-dimensional optical lens assembly, in accordance with various aspects described herein;
[0007] FIG. 5A is a logic diagram of an example method for displaying multiple two dimensional image layers in corresponding selectable depth planes in an augmented reality device, in accordance with various aspects described herein;
[0008] FIG. 5B is a logic diagram of an example method for displaying an image in a selectable depth plane in an augmented reality device using replicated hologram patterns, in accordance with various aspects described herein;
[0009] FIG. 5C is a logic diagram of an example method for displaying multiple two dimensional image layers in corresponding selectable depth planes in an augmented reality device using replicated hologram patterns, in accordance with various aspects described herein;
[0010] FIGs. 6A & 6B are example logic diagrams of example methods for executing a visual search in an augmented reality system, in accordance with various aspects described herein;
[0011] FIG. 7 is an example schematic block diagram of another embodiment of a system for implementing augmented reality, in accordance with various aspects described herein;
[0012] FIG. 8 is an example schematic block diagram of another embodiment of a system for implementing augmented reality, in accordance with various aspects described herein;
[0013] FIG. 9A illustrates example hologram dot pattern thresholds encoded based on a dither mask, in accordance with various aspects described herein;
[0014] FIG. 9B illustrates an example of using a dither mask designed for blue noise on an input image, in accordance with various aspects described herein;
[0015] FIG. 9C illustrates an example of using a dither mask designed with a signal window in the frequency domain on an input image, in accordance with various aspects described herein;
[0016] FIG. 9D illustrates an example of using a dither mask designed with a signal window in the Fourier space on an input hologram, in accordance with various aspects described herein;
[0017] FIG. 9E illustrates an example of using a dither mask and inverse dither mask, in accordance with various aspects described herein;
[0018] FIG. 9F illustrates an example use of mask-based dithering to quantize a hologram, in accordance with various aspects described herein;
[0019] FIG. 10 illustrates an example schematic block diagram of an embodiment of an ecosystem for implementing augmented reality, in accordance with various aspects described herein;
[0020] FIG. 11 illustrates an example optical module for generating and projecting augmented reality content in accordance with various aspects described herein;
[0021] FIG. 12 illustrates an example implementation of augmented reality glasses in accordance with various aspects described herein;
[0022] FIG. 13 is a schematic block diagram of an example embodiment of a system for implementing augmented reality in accordance with various aspects described herein;
[0023] FIG. 14A illustrates an example implementation of augmented reality glasses in accordance with various aspects described herein;
[0024] FIG. 14B illustrates another example implementation of augmented reality glasses in accordance with various aspects described herein; and
[0025] FIG. 14C illustrates another example implementation of augmented reality glasses in accordance with various aspects described herein.
DETAILED DESCRIPTION
[0026] FIG. 1 illustrates an example three-dimensional (3D) focus space 188 for an augmented reality device 182 with a depth plane 184 configured to be selectable from a near eye distance 180 to infinite distance 186. In an example, a spatial light modulator can be adapted to display one or more holographic objects viewable by a user of an optical system, such as augmented reality device 182, that can be configured to display virtual media content at a selectable depth plane 184, where the depth of the depth plane can be selected based on desired virtual media content for display. In an example of implementation, a holographic lens function can be applied to a hologram pattern to program the depth at which an image is displayed in space. In a specific example, the selectable depth plane 184 can be programmed to be represented within the focus space from as close as near eye distance 180 to as far as an essentially infinite distance 186. In an example, a holographic lens function can be a mathematical model adapted to mimic a physical lens function to alter a focus plane for a to-be-displayed image. In a related example, a holographic processor can be configured to apply a holographic lens function to a hologram pattern for images to be displayed by augmented reality device 182 at a desired depth in space. In an example of implementation, augmented reality device 182 can be configured to display two-dimensional (2D) content on a single depth plane in space, where the depth of the depth plane can be determined using a holographic lens function. In yet another example implementation, augmented reality device 182 can be configured to display two-dimensional (2D) content on multiple depth planes simultaneously, where the depth of each of the depth planes can be determined separately.
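As a rough numerical sketch of how a holographic lens function can reprogram display depth, the following Python fragment multiplies a complex-valued hologram pattern, assumed to be sampled at a uniform pixel pitch, by the quadratic phase of a thin lens; the function name and parameters are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

def apply_lens_function(hologram: np.ndarray, wavelength: float,
                        pitch: float, focal_length: float) -> np.ndarray:
    """Multiply a complex hologram pattern by a thin-lens quadratic phase.

    A thin lens of focal length f contributes exp(-i*pi*(x^2 + y^2)/(lambda*f)),
    which moves content computed for infinite depth to a finite focus depth.
    """
    rows, cols = hologram.shape
    y = (np.arange(rows) - rows / 2) * pitch
    x = (np.arange(cols) - cols / 2) * pitch
    xx, yy = np.meshgrid(x, y)
    lens_phase = np.exp(-1j * np.pi * (xx**2 + yy**2) / (wavelength * focal_length))
    return hologram * lens_phase
```

Selecting the focal length sets where in the focus space the reconstructed image appears; an aberration-correction phase can be folded into the same multiplication.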
[0027] In various examples, stereoscopic display technology can be used to provide content at a single depth plane fixed in space. In an example, complex and/or bulky optical elements can be implemented to provide stereoscopic display technology capable of providing a single non-fixed depth plane, albeit with the otherwise undesirable drawbacks of weight and/or bulk. Accordingly, augmented reality devices, such as augmented reality glasses, heads-up displays, and others, can benefit from reducing and/or eliminating these additional complex and/or bulky elements, while allowing display and/or overlay of images, virtual objects and virtual scenes at substantially any desired depth in space. In an alternative example, holography-based techniques can be used to display images, virtual objects and virtual scenes at a desired depth in space with few or no added optical elements.
[0028] In an alternative example of implementation and operation, a holographic lens function can be used to change the depth for virtual media content from infinity to a desired depth in a focus space. In a related example, virtual media content can be computed for display at a relative depth of infinity, followed by application of a holographic lens function to change the display depth for virtual media content for display at a desired perception depth. In an example, a holographic lens function can be used to provide a virtual image for overlay at substantially any depth in the focus space. In an additional example, a holographic lens function can be adapted to provide virtual media content to overlay at multiple depth planes, including providing full 3D scenes. In an example, two-dimensional content can be displayed on multiple depth planes simultaneously, where the relative distance between depth planes can be small enough that a hypothetical viewer can aggregate individual two-dimensional content on each depth of multiple depth planes for perceiving three dimensional (3D) virtual objects and/or three dimensional (3D) virtual scenes. In an example of implementation, a holographic processor can be adapted for use in an augmented reality device, such as any of the augmented reality devices disclosed herein, to execute one or more holographic lens functions, as part of the computation process of hologram patterns that are to be rendered on the one or more spatial light modulators of the augmented reality device for displaying images, virtual objects and virtual scenes at a desired depth in space with the augmented reality device.
[0029] In an example of implementation and operation applicable to one or more examples illustrated herein, an augmented reality device, such as augmented reality device 182, can be configured to provide a free space beam path along with an optical combiner element. In an example, an optical combiner element can be any of a semi-transparent angle-selective combiner, angle-selective semi-transparent mirror or a beam splitter. In an example, an optical combiner element can be a holographic optical element or a meta-surface. In an example, a semi-transparent angle-selective combiner can be an optical component configured to allow light to pass at specific angles, while reflecting or blocking light at other angles. In another specific example, the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays. In yet another specific example of implementation, a semi-transparent angle-selective combiner can be integrated as part of an optical system of an augmented reality device, such as augmented reality device 182. In a related example, a semi-transparent angle-selective combiner can be configured as a holographic optical element along with other optical elements, where the holographic optical element is configured as a diffractive optical element.
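As a purely illustrative toy model of the angle-selective behavior described above (the pass band, names and threshold values are assumptions, not disclosed parameters), the combiner can be thought of as a function of incidence angle:

```python
def combiner_response(angle_deg: float,
                      pass_band: tuple = (0.0, 30.0)) -> str:
    """Toy model of a semi-transparent angle-selective combiner: light
    arriving within the pass band of incidence angles is transmitted,
    while light at other angles is reflected (or blocked)."""
    lo, hi = pass_band
    return "transmit" if lo <= abs(angle_deg) <= hi else "reflect"

# Real-world light near normal incidence passes through to the eye, while
# steeply incident projected light is redirected toward the eye box.
print(combiner_response(5.0), combiner_response(45.0))  # transmit reflect
```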
[0030] In another example of implementation relating to augmented reality devices, such as augmented reality glasses, one or more optical combiner elements can be configured as an element on the surface of and/or in the lenses of the augmented reality glasses. In yet another example of implementation, optical combiner elements can include one or more semi-transparent, reflective coatings adapted for implementation on the surface of and/or in the lenses of augmented reality glasses.
[0031] In an example, one or more spatial light modulators adapted for use in an augmented reality device, such as any of the augmented reality devices disclosed herein, can be configured to modulate phase, amplitude and/or polarization of an incident light beam. In an example, one or more of the spatial light modulators can be adapted to modulate phase of an incident light beam. In another example, one or more of the spatial light modulators can be adapted to modulate amplitude of an incident light beam. In yet another example, one or more of the spatial light modulators can be adapted to provide both phase and amplitude modulation of an incident light beam. In yet another related example, one or more spatial light modulators can be adapted for use in an augmented reality device, such as the augmented reality devices disclosed herein, to function as a holographic display. An example implementation of a holographic display system includes an input plane, corresponding to a plane at which the hologram pattern is displayed, and an output plane, corresponding to a plane at which a hypothetical viewer’s eye box is located for viewing of images, virtual objects and/or virtual scenes. In an example of implementation, a first lens group, with focal length f1, can be positioned between an input plane and an output plane, at a distance f1 from the input plane, where the first lens group can be adapted to convert a desired hologram pattern from the spatial domain to the spatial frequency domain, by converting spatial information at the input plane into frequency components at an intermediate plane, referred to as the Fourier plane. In a related example, a holographic display system can be configured to include a second lens group, with focal length f2, where the second lens group can be positioned between the input plane and the output plane at a distance f2 from the output plane. In an example, the distance between the first lens group and the second lens group is equal to f1 + f2, whereby the intermediate plane, also referred to as the Fourier plane, is located between the first and second lens group, at a distance f1 from the first lens group and a distance f2 from the second lens group. In a related example, the second lens group can be adapted to convert the frequency components of the hologram pattern at the intermediate plane back to spatial information at the output plane. In yet another related example, the first lens group and the second lens group can form a 4f optical system. In an example of implementation and operation, an optical module, such as optical module 100 in FIG. 11, comprises one or more optical elements forming a first lens group of a 4f optical system and one or more optical elements that, together with one or more optical elements external to the optical module, form the second lens group of a 4f optical system. The one or more optical elements external to the optical module and being part of the second lens group of a 4f optical system can comprise one or more optical combiner elements, such as the optical combiner elements disclosed herein. In example augmented reality glasses, the one or more optical combiner elements as part of the second lens group of a 4f optical system can be configured as any of a holographic optical element or a meta-surface and can be adapted to the lenses of the augmented reality glasses.
In an example of implementation, one or more filter elements are adapted to the optical module for filtering out unwanted signal components, such as noise and/or the conjugate image. In a related example, one or more filter elements are placed at the intermediate plane of the 4f optical system, which is formed by the first lens group and the second lens group. In an example, both the first and the second lens group may comprise one or a combination of optical elements, where the optical elements are not restricted to lenses only but can be any optical element including lenses and/or (free-form) mirrors. In an example, an optical module, such as optical module 100 of FIG. 11, may have any of the following functions: 1) relay the image at the input plane, which corresponds to the plane where the hologram pattern is displayed, to another plane in space; 2) apply magnification or demagnification to the image at the input plane, in order for the image at the output plane to be magnified or demagnified respectively; 3) filter out unwanted optical signals, including noise and/or the conjugate image.
[0032] In a specific example of implementation and operation, an augmented reality device, such as any of the augmented reality devices disclosed herein, can be adapted to provide reduction techniques for granular interference patterns (speckle) associated with the augmented reality device. In an example, coherence associated with one or more illumination sources in an optical module can introduce granular interference patterns (speckle) that can degrade the relative quality of virtual media content for display. In an example of implementation, an optical module of a given augmented reality device can be adapted to include one or more of a variety of techniques for mitigating the effects of speckle. Example speckle reduction techniques include: modifying the statistical properties of laser illumination sources using depolarization techniques, such as polarization flipping, to introduce controlled amounts of spatial/temporal coherence, or employing multiple laser sources with different characteristics; modulating the frequency or wavelength of laser illumination sources (laser chirping) to decorrelate speckle sources at different wavelengths; providing random phase modulation to the optical system to disrupt the coherent nature of the laser illumination sources by using, for example, vibrating or rotating diffusers, or by employing spatial light modulators to introduce random phase variations; rapidly scanning one or more of the laser illumination sources or the holographic imager, so that different speckle patterns are sampled over a temporal range; adding diffusers in the optical path to scatter laser illumination sources in varying directions to reduce the coherence of the light and thereby the visibility of speckle patterns; and providing a deformable mirror to introduce time-variable (temporal) random phase shifts in an illumination source output.
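The benefit of sampling many decorrelated speckle patterns can be illustrated numerically. The following sketch (an assumption-laden simulation, not the disclosed method) averages N independent random-phase speckle intensity patterns and reports the speckle contrast, which falls roughly as 1/sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def speckle_contrast(n_patterns: int, size: int = 256) -> float:
    """Average n decorrelated speckle intensity patterns and return the
    speckle contrast (std/mean); fully developed speckle has contrast ~1."""
    total = np.zeros((size, size))
    for _ in range(n_patterns):
        field = np.exp(2j * np.pi * rng.random((size, size)))  # random phase
        intensity = np.abs(np.fft.fft2(field)) ** 2            # far-field speckle
        total += intensity / intensity.mean()
    avg = total / n_patterns
    return float(avg.std() / avg.mean())

print(speckle_contrast(1), speckle_contrast(16))  # ~1.0 versus ~0.25
```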
[0033] In another example of implementation and operation, applicable to one or more examples provided herein, an augmented reality device, such as augmented reality device 182, can be configured with a spatially-varying pattern of color filters to provide multi-color virtual media content. In an example, an array of color filters can be formed above a top surface of one or more spatial light modulators in an augmented reality device, such as augmented reality device 182, for example where the top surface of the one or more spatial light modulators comprises light modulating elements. In a particular example, a color filter array comprises a set of subareas, with each subarea adapted to be one of: 1) transparent as to red light and blocking/absorptive as to green and blue light; 2) transparent as to green light and blocking/absorptive as to red and blue light; or 3) transparent as to blue light and blocking/absorptive as to red and green light. In an example, a color filter array can be adapted for displaying multi-color virtual media content using a single spatial light modulator integrated circuit (IC).
[0034] In another related implementation, the optical system associated with an augmented reality device, such as augmented reality device 182, can be configured to include a plurality of spatial light modulators, potentially implemented as integrated circuits, where each spatial light modulator can be associated with a separate color channel, such as the channels used in a red, green, blue (RGB) color model. In an example related to an RGB color model, a first set of one or more spatial light modulators can be adapted to interact with red light only, a second set of one or more spatial light modulators can be adapted to interact with green light only and finally a third set of one or more spatial light modulators can be adapted to interact with blue light only, with a combined output from the first, second and third set of one or more spatial light modulators configured to generate multi-color virtual media content for display. In another related example, time-multiplexing is used to generate multi-color virtual media content for display, whereby one or more spatial light modulators are sequentially illuminated with, for example, red, green and blue light.
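A minimal sketch of the time-multiplexed (field-sequential) alternative is shown below; `slm.render` and `illumination.pulse` are hypothetical driver calls introduced only for illustration, not a real or disclosed API:

```python
from dataclasses import dataclass
from typing import Dict
import numpy as np

@dataclass
class Frame:
    holograms: Dict[str, np.ndarray]  # one hologram pattern per color channel

def show_frame(frame: Frame, slm, illumination) -> None:
    """Field-sequential color: render each channel's hologram while only
    the matching illumination wavelength is active."""
    for channel in ("red", "green", "blue"):
        slm.render(frame.holograms[channel])   # hypothetical driver call
        illumination.pulse(channel)            # hypothetical driver call
```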
[0035] An example method for displaying virtual media content begins by receiving data representative of an image, a three dimensional (3D) object and/or a three dimensional (3D) scene, where one or more holographic processors, such as the holographic processors disclosed herein, are configured to compute hologram patterns based on received data and provide the computed hologram patterns to one or more spatial light modulators for rendering on the one or more spatial light modulators. An example optical system, such as an optical system for an augmented reality device, can be configured to apply techniques for minimizing and/or attenuating granular interference patterns (speckle) by at least one of modulating a wavelength of a laser illumination source, depolarizing the laser illumination source, randomly modulating a phase of the laser illumination source, or diffusing the laser illumination source.
[0036] FIG. 2 is a logic diagram of an example method for generating a hologram pattern. The method begins at step 200, with an augmented reality device receiving data representing 2D media content for display. In a related example, the 2D media content can be intended for display at a predetermined depth plane in space. In a specific example, data representing 2D media content for display can include information, such as metadata, indicating a desired display depth for to-be-displayed 2D media content. In an alternative example, a display depth for to-be-displayed 2D media content can be determined by an element associated with an augmented reality device. In yet another specific example, a user can determine a desired display depth for to-be-displayed 2D media content. The method continues at step 201, with random phase being applied to data representing 2D media content. At step 202, the method continues by generating a hologram pattern for display at a substantially infinite depth. The method then continues at step 204, where a mathematical lens function (holographic lens function) can be applied to the previously determined hologram pattern to change the relative display depth of a to-be-displayed image to a previously determined desired display depth. The method then continues at step 206, with an aberrations correction function being applied to the hologram pattern, as changed at step 204, to correct for distortion(s) introduced by optical elements, such as optical elements associated with the optical module and/or optical elements located outside the optical module. In a related example, an aberrations correction function or another function can be used to correct for distortion(s) associated with a hypothetical user viewing the 2D media content, including but not limited to correcting distortion related to a refractive error (such as, for example, astigmatism) of a hypothetical viewer to accommodate vision correction. In a related example, the prescription correction of a user can be an input parameter to the method of FIG. 2 or can be an input parameter to a holographic processor in order to adapt the hologram pattern based on the prescription correction. In an example of implementation and operation, the mathematical lens function of step 204 and the aberrations correction function of step 206 can be combined in a single mathematical function and applied as a single step to the hologram pattern as provided by step 202. The method then continues at step 208, by using a quantization method (e.g., error diffusion, mask-based dithering, or other) to quantize the hologram pattern in accordance with different optical states for light modulating elements of one or more spatial light modulator devices and finally, at step 210, a completed hologram pattern can be rendered on one or more spatial light modulator devices configured for displaying the 2D media content at the desired display depth in the focus space. In an example of implementation and operation, a holographic processor, such as any of the holographic processors disclosed herein, can be configured to execute the example method for generating a hologram pattern presented in FIG. 2, with the holographic processor adapted to an augmented reality device, such as augmented reality glasses.
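As one concrete possibility for the quantization of step 208, the following sketch applies Floyd-Steinberg-style error diffusion to a phase hologram; the number of levels, scan order and function name are illustrative assumptions, not the disclosed implementation:

```python
import numpy as np

def error_diffuse_phase(phase: np.ndarray, levels: int = 4) -> np.ndarray:
    """Quantize a phase hologram to `levels` discrete optical states with
    Floyd-Steinberg-style error diffusion, spreading each pixel's residual
    quantization error onto not-yet-visited neighbors."""
    h = (phase % (2 * np.pi)).astype(float)
    step = 2 * np.pi / levels
    out = np.zeros_like(h)
    rows, cols = h.shape
    for r in range(rows):
        for c in range(cols):
            q = np.round(h[r, c] / step) * step
            err = h[r, c] - q
            out[r, c] = q % (2 * np.pi)
            if c + 1 < cols:
                h[r, c + 1] += err * 7 / 16
            if r + 1 < rows:
                if c > 0:
                    h[r + 1, c - 1] += err * 3 / 16
                h[r + 1, c] += err * 5 / 16
                if c + 1 < cols:
                    h[r + 1, c + 1] += err * 1 / 16
    return out
```

Error diffusion pushes the quantization noise toward high spatial frequencies where, as discussed for the 4f relay below, it can be filtered out at the Fourier plane.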
[0037] An example method for displaying an image with a viewing device, such as an augmented reality device, begins by receiving data representative of virtual media content and a predetermined focus depth for the to-be-displayed virtual media content. The method continues by generating a preliminary hologram pattern from the data for display at an infinite focus depth and applying a mathematical lens function (holographic lens function) to the preliminary hologram pattern, in order to move the depth of the to-be-displayed image from infinite depth to the predetermined focus depth. In an example, the mathematical lens function can be adapted to also correct for distortion(s) introduced by optical elements and/or to also correct distortion related to a refractive error (such as, for example, astigmatism) of a hypothetical viewer to accommodate vision correction. In an example, the hologram pattern can be quantized using a quantization technique, such as, for example, an error diffusion or mask-based dithering technique. Finally, the method continues by rendering the quantized hologram pattern on one or more spatial light modulators configured for displaying the virtual media content.
[0038] FIG. 3 illustrates the use of hologram replication in a holographic display system. In an example, one or more spatial light modulator devices can be used as holographic display(s) in an optical module. In an example, a spatial light modulator configured as a holographic display can require a relatively large array of light modulating elements to provide acceptable and/or optimal resolution for viewing. In the example, spatial light modulator devices used as holographic displays can require significantly more pixels (light modulating elements), as compared to a traditional two-dimensional (2D) display, because multiple pixels in the spatial light modulator (potentially configured as (part of) an integrated circuit) can be required for the creation of a single voxel, where a voxel refers to a volume element analogous to a pixel (picture element) in 2D images. In an example, a holographic display system can require more than a 1:1 ratio of pixels (light modulating elements) to relative resolution of to-be-displayed content as compared to a traditional two-dimensional (2D) display system. In an example of implementation, the pixel pitch for a spatial light modulator configured as a holographic display can be adapted to be equal to or smaller than a wavelength of visible light, or equal to or smaller than half a wavelength of visible light. In a specific related example, an array of 16k x 16k light modulating elements can be implemented on a 4x4 mm² silicon area configured as a spatial light modulator. In yet another specific example, adding a larger number of pixels (light modulating elements) in a spatial light modulator configured as a holographic display can significantly increase computation requirements for processing hologram patterns (interference patterns).
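The 16k x 16k example above implies a sub-wavelength pitch, which can be checked with a line of arithmetic (illustrative only):

```python
# Worked check of the 16k x 16k example (illustrative arithmetic only):
side_um = 4_000.0            # 4 mm of silicon per side
n_elements = 16 * 1024       # "16k" light modulating elements per side
pitch_um = side_um / n_elements
print(round(pitch_um, 3))    # ~0.244 um, on the order of half a visible
                             # wavelength (half-wavelengths span ~0.2-0.35 um)
```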
[0039] In an example of implementation and operation, a holographic display system can be adapted to include a mode of operation wherein an array of light modulating elements of one or more spatial light modulator devices used as holographic displays can be divided to provide multiple subarrays of light modulating elements, enabling a same or similar hologram pattern to be rendered on each of the subarrays. In an example, the use of subarrays can facilitate a reduction in computation requirements for the holographic display system, by computing a hologram pattern for a portion of the total number of pixels (light modulating elements), with the hologram pattern computed for the portion (a subarray) being replicated onto each of the multiple subarrays. In an example, computing hologram patterns using a subarray of the one or more spatial light modulators and replicating them to the other subarrays can reduce computational requirements for the larger array, while also increasing the contrast ratio for displayed virtual media content. In an example of implementation and operation, a viewing device, such as the augmented reality devices disclosed herein, can be adapted to enable a mode of operation where a hologram pattern for a single subarray of a multitude of subarrays of one or more spatial light modulators can be computed by a holographic processor, with the hologram pattern, as calculated for the single subarray, replicated over the remaining subarrays of the multitude of subarrays.
[0040] In a specific related example, an array of light modulating elements for one or more spatial light modulators configured as holographic display(s) can be divided into four quarters or four subarrays. In the example, a hologram pattern can be computed for only a quarter (one of four subarrays) of the array of light modulating elements, with the hologram pattern then displayed in each of the four quarters using the hologram pattern as computed for only a quarter. In an example, replicating a hologram pattern, computed for a subarray of an array of light modulating elements of one or more spatial light modulators configured as holographic display(s), over the entire array of light modulating elements of the one or more spatial light modulators configured as holographic display(s) can enable virtual media content with relatively lower resolution and relatively higher contrast ratio than computing a hologram pattern for a full array of light modulating elements, while facilitating a lower relative compute requirement.
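A sketch of the quarter-array replication is shown below (names and array sizes are illustrative assumptions):

```python
import numpy as np

def replicate_quarter(sub_hologram: np.ndarray) -> np.ndarray:
    """Render the hologram pattern computed for one quarter subarray onto
    all four quarters of the full array of light modulating elements."""
    return np.tile(sub_hologram, (2, 2))

quarter = np.random.default_rng(0).random((64, 64))  # illustrative pattern
full = replicate_quarter(quarter)                    # 128 x 128 pattern
assert full.shape == (128, 128)
```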
[0041] FIG. 4 illustrates a three-dimensional optical lens assembly that includes a first lens group (lens group 310) and a second lens group (lens group 302) in an optical relay system 300. In an example, an optical relay system can use a set of optical components (such as, for example, lenses, free form mirrors, etc.) to relay an optical signal from one point to another with minimal distortion or loss of quality. In an example, a 4f optical relay system includes two lens groups, a first with focal length f1 (lens group 310) and a second with focal length f2 (lens group 302), where the first and second lens group are separated from each other by a distance of (approximately) f1 + f2, where f1 is the focal length of lens group 310 and f2 is the focal length of lens group 302. The first lens group (lens group 310) and the second lens group (lens group 302) of a 4f optical relay system can comprise different optical components, including but not limited to, lenses and free form mirrors. In an example, optical relay system 300 includes an input plane where a hologram pattern 308 is located. In an example, lens group 310 is positioned at a distance of (approximately) f1 from the plane of hologram pattern 308. In an example, lens group 310 “performs” a first Fourier transform, converting hologram pattern 308 from the spatial domain to the spatial frequency domain. In an example, the space between lens group 310 and lens group 302 includes an intermediate plane, such as a Fourier plane 304, where the spatial frequencies of the converted hologram pattern 308 are represented. In an example, an intermediate plane, such as Fourier plane 304, can be located in between the first lens group (lens group 310) and the second lens group (lens group 302), at a distance f1 from the first lens group (lens group 310) and a distance f2 from the second lens group (lens group 302). In an example, lens group 302, placed (approximately) a distance f2 from the intermediate plane, such as Fourier plane 304, “performs” a second Fourier transform, converting the signal from the spatial frequency domain back to the spatial domain. Finally, a processed or transformed signal can be observed at an output plane (for example a plane where an eyebox, such as eyebox 306, can be located).
[0042] In an example of implementation and operation, an optical lens assembly, such as optical lens assembly 300, includes a spatial filter, used at an intermediate plane, such as Fourier plane 304, formed by a 4f optical relay system, to effectively perform at least one of filtering out noise, such as quantization noise introduced by the discrete number of optical states of the light modulating elements of the spatial light modulator(s), and/or “hiding” a conjugate image. In another related example, a spatial filter can be placed in a 4f optical relay system, such as optical lens assembly 300, at a plane different from the intermediate plane (Fourier plane 304). In an example, an optical module of a viewing device can be configured to provide a 4nf optical relay system, with n = 1, 2, 3, etc., configured with optical components selected from a group of optical components including but not limited to lenses and/or (free form) mirrors, etc. In a clarifying example, a 4nf optical relay system includes the formation of an intermediate plane, such as Fourier plane 304, where noise, such as quantization noise, can be placed in a region outside a desired signal window and can be filtered out and/or where a conjugate image can be filtered out. In the example of FIG. 4, the 4nf system is illustrated with n equal to 1.
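The filtering role of the Fourier plane can be sketched numerically. The fragment below is an idealized, unit-magnification 4f relay (lens scaling and aberrations are ignored, and the window size is an arbitrary assumption):

```python
import numpy as np

def relay_4f(hologram: np.ndarray, keep_fraction: float = 0.5) -> np.ndarray:
    """Idealized 4f relay: transform to the Fourier plane, apply a centered
    square aperture (spatial filter) passing only the desired signal window,
    then transform back to the output plane."""
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    rows, cols = spectrum.shape
    mask = np.zeros((rows, cols))
    half_r = int(rows * keep_fraction / 2)
    half_c = int(cols * keep_fraction / 2)
    mask[rows // 2 - half_r: rows // 2 + half_r,
         cols // 2 - half_c: cols // 2 + half_c] = 1.0
    # Quantization noise and a conjugate image designed to fall outside the
    # signal window are blocked here, at the Fourier plane.
    return np.fft.ifft2(np.fft.ifftshift(spectrum * mask))
```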
[0043] FIG. 5A is a logic diagram of an example method for displaying three-dimensional (3D) objects and/or three-dimensional scenes with a plurality of selectable depth planes in a focus space. The method begins at step 320, by receiving 3D object and/or 3D scene data, further referred to as 3D data, for display in a focus space and continues at step 322, by forming a set of two-dimensional (2D) layers from the 3D data. In an example, a 3D object or a 3D scene can be decomposed into a set of 2D layers, with the 2D layers being parallel to each other, by slicing up the 3D object or 3D scene at regular or irregular intervals, each slice representing a cross section of the 3D object or 3D scene; the slices forming a set of 2D layers. The method then continues at step 324, by applying random phase to each 2D layer of the set of 2D layers, representing the 3D data, and then at step 326, by generating a hologram pattern for each 2D layer of the set of 2D layers for display at infinite depth. In an example, a hologram pattern can be generated for each 2D layer of the set of 2D layers using a Fourier transform function on the data representing the 2D layer. The method continues at step 328 by applying a mathematical lens function (holographic lens function) to the hologram pattern of each 2D layer, as computed in step 326, to convert each 2D layer of the set of 2D layers from infinity to a desired depth in the focus space and then at step 330, by aggregating converted hologram patterns for the 2D layers of the set of 2D layers into a single hologram pattern. At step 332, the method applies an optical aberrations correction function to the single hologram pattern to correct for distortion(s) introduced by optical elements configured as part of an optical module or that are located outside an optical module. In an example, the optical aberrations correction function can be adapted to correct distortion related to a refractive error (such as, for example, astigmatism) of a hypothetical viewer to accommodate for vision correction. In a related example, the prescription correction of a user can be an input parameter to the method of FIG. 5A, or can be an input parameter to a holographic processor in order to adapt the hologram pattern based on the prescription correction. In an alternative example, the mathematical lens function applied to each of the 2D layers of the set of 2D layers can be adapted to also correct for optical distortion(s) introduced by optical components and to accommodate for vision correction, allowing step 332 to be eliminated. The method continues at step 334, by applying a quantization method, such as, for example, error diffusion or mask-based quantization, etc. Finally, at step 336, the method renders a completed hologram pattern on a spatial light modulator, where the spatial light modulator can be adapted to display the 3D object and/or the 3D scene, associated with the 3D data, in a visually perceptible form, based on the completed hologram pattern. In an example, when data to be displayed comprises 2D images at predetermined depth planes in a focus space, the method of FIG. 5A can be used, where step 322 can be eliminated. In another example, a holographic processor, such as the holographic processors disclosed herein, can be configured to execute the method of FIG. 5A.
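Steps 324 through 330 can be sketched as follows; this is a minimal, assumption-laden rendering of the per-layer loop (aberration correction and quantization are omitted, and all names are illustrative):

```python
import numpy as np

def layered_hologram(layers, depths, wavelength, pitch):
    """Per-layer hologram at infinite depth, per-layer lens phase for its
    own depth, then complex aggregation into one hologram pattern
    (steps 324-330 of FIG. 5A)."""
    rng = np.random.default_rng()
    rows, cols = layers[0].shape
    y = (np.arange(rows) - rows / 2) * pitch
    x = (np.arange(cols) - cols / 2) * pitch
    xx, yy = np.meshgrid(x, y)
    total = np.zeros((rows, cols), dtype=complex)
    for layer, depth in zip(layers, depths):
        field = layer * np.exp(2j * np.pi * rng.random(layer.shape))  # step 324
        h_inf = np.fft.ifft2(np.fft.ifftshift(field))                 # step 326
        lens = np.exp(-1j * np.pi * (xx**2 + yy**2) / (wavelength * depth))
        total += h_inf * lens                                         # steps 328-330
    return total
```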
[0044] FIG. 5B is a logic diagram of an example method for displaying a two-dimensional (2D) image at a selectable depth plane in a focus space. The method begins at step 340, by receiving data, such as data that includes color information for each pixel in an image and a desired display depth for the image (RGBD data), representative of a two-dimensional (2D) image for display at a predetermined depth in a focus space, and continues at step 341, by applying random phase to the data. At step 342, the method can generate a hologram pattern from the data as if the 2D image is displayed at infinite depth, with the size of the hologram pattern corresponding to a subarea of a spatial light modulator used to render hologram patterns. The method continues at step 344, by replicating the hologram pattern, generated for a subarea of the spatial light modulator, into the full area of the spatial light modulator, and continues at step 346, by applying a mathematical lens function (holographic lens function) to convert the image-to-be-displayed from infinite depth to a desired predetermined depth in the focus space. At step 348, the method can be used to correct the hologram pattern for distortion(s) introduced by optical elements configured as part of an associated optical module, or located outside the optical module, using an aberration correction function. In a related example, an aberration correction function can be adapted to accommodate for vision correction associated with a particular hypothetical user viewing the two-dimensional (2D) image. In a related example, the prescription correction of a user can be an input parameter to the method of FIG. 5B, or can be an input parameter to a holographic processor in order to adapt the hologram pattern based on the prescription correction. In another example, the mathematical lens function and the aberration correction function are implemented as one function executed in a single step. At step 350, the method applies a quantization method, such as, for example, error diffusion or mask-based quantization, to the hologram pattern of step 348. Finally, at step 352, the method renders the completed hologram pattern on a spatial light modulator, where the spatial light modulator can be adapted to display the two-dimensional (2D) image at a predetermined depth in the focus space in a visually perceptible form, based on the completed hologram pattern. In an example, a holographic processor, such as the holographic processors disclosed herein, can be configured to execute the method of FIG. 5B.
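The replication of step 344 can likewise be sketched in a few lines. The panel size, pixel pitch, wavelength and depth below are assumed values, and the subarea is assumed to divide the panel evenly:

```python
# Illustrative sketch of FIG. 5B steps 341-346 (random phase, sub-hologram,
# replication across the SLM, holographic lens). All dimensions are assumed.
import numpy as np

def replicate_and_focus(image, slm_shape=(4096, 4096),
                        pitch_m=0.5e-6, wavelength_m=532e-9, depth_m=0.5):
    field = image * np.exp(1j * 2 * np.pi * np.random.rand(*image.shape))  # step 341
    sub = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))            # step 342
    reps = (slm_shape[0] // sub.shape[0], slm_shape[1] // sub.shape[1])
    full = np.tile(sub, reps)                                              # step 344
    y = (np.arange(slm_shape[0]) - slm_shape[0] / 2) * pitch_m
    x = (np.arange(slm_shape[1]) - slm_shape[1] / 2) * pitch_m
    xx, yy = np.meshgrid(x, y)
    k = 2 * np.pi / wavelength_m
    return full * np.exp(-1j * k * (xx**2 + yy**2) / (2 * depth_m))        # step 346
```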
[0045] FIG. 5C is a logic diagram of an example method for displaying three-dimensional (3D) objects and/or three-dimensional (3D) scenes with a plurality of selectable depth planes in a focus space. The method begins at step 360, by receiving 3D object and/or 3D scene data, further referred to as 3D data, for display in the focus space and continues at step 362, by forming a set of two-dimensional (2D) layers from the 3D data. In an example, a 3D object or a 3D scene can be decomposed into a set of 2D parallel layers by slicing up the 3D object or 3D scene at regular or irregular intervals, each slice representing a cross section of the 3D object or 3D scene, the slices forming the set of 2D layers. The method then continues at step 364, by applying random phase to each 2D layer of the set of 2D layers representing the 3D data, and then at step 366, by generating a hologram pattern for each 2D layer of the set of 2D layers for display at infinite depth, with the size of each hologram pattern corresponding to a subarea of a spatial light modulator. The method continues at step 368, by replicating the hologram pattern for each 2D layer of the set of 2D layers, with a size equal to a subarea of a spatial light modulator, into the full area of the spatial light modulator to provide a set of second hologram patterns, and then at step 370, by applying a mathematical lens function (holographic lens function) to each second hologram pattern of the set of second hologram patterns to generate a set of third hologram patterns, with each third hologram pattern associated with a different desired depth. The method continues at step 372, by aggregating the set of third hologram patterns into a single aggregated hologram pattern, and at step 374, where the method corrects the aggregated hologram pattern for distortion(s) introduced by optical elements configured as part of an optical module, or by optical elements located outside an optical module, using an optical aberration correction function. In an example, the optical aberration correction function can be adapted to correct for a refractive error (such as, for example, astigmatism) to accommodate vision correction of the viewer. In a related example, the prescription correction of a user can be an input parameter to the method of FIG. 5C, or can be an input parameter to a holographic processor in order to adapt a hologram pattern based on the prescription correction. In an alternative example, the mathematical lens function applied to each of the 2D layers of the set of 2D layers can be adapted to correct for optical distortions introduced by optical components and to accommodate for vision correction, eliminating step 374. Finally, at step 376, the method completes the hologram pattern using a quantization method, such as, for example, error diffusion or mask-based quantization, and at step 378, the method renders a completed hologram pattern on a spatial light modulator, where the spatial light modulator can be adapted to display virtual content media in a visually perceptible form, based on the completed hologram pattern. In an example, when data to be displayed comprises 2D images for display at predetermined depth planes in the focus space, the method of FIG. 5C can be used, with step 362 being eliminated. In an example, a holographic processor, such as the holographic processors disclosed herein, can be configured to execute the method of FIG. 5C.
[0046] FIG. 6A is a logic diagram of an example method for executing a visual search in an augmented reality system, such as the augmented reality devices disclosed herein. The method begins at step 400, with the augmented reality system receiving a visual search request. In an example, the visual search request can be received from a user. In an alternative example, the visual search request can be received from one or more of a third party, metadata associated with the augmented reality system or another source. At step 402, the method continues with the augmented reality system capturing a scene/environment. In an example, the augmented reality system can include one or more front-facing cameras (i.e., cameras that capture a user’s view of the scene) to capture the scene.
[0047] At step 404, the augmented reality system segments the scene into distinct elements and/or objects. In an example of implementation, the augmented reality system includes one or more computer vision algorithms adapted to segment the captured scene image into elements and/or objects. In another implementation example, one or more scene images captured by the one or more front-facing cameras can be transmitted over a wireless link, such as, for example, over Bluetooth or Wi-Fi, to another (mobile) electronic device, such as a smartphone, smart watch or laptop, enabling an external computer vision algorithm to be used to segment scene images into distinct elements and/or objects. In an example, segmented scene images can be transmitted back for use in the augmented reality system. The method continues at step 406, where the augmented reality system determines a depth for one or more objects of interest in the scene. In an example, the augmented reality system can be implemented with one or more depth sensors adapted to capture a depth or a distance of the object of interest relative to a user of the augmented reality system. In a related implementation example, an augmented reality system includes one or more gaze or eye tracking sensors adapted to determine a direction of a hypothetical user’s gaze, in order to define the object of interest. In an alternative example of implementation, an augmented reality system can be implemented without gaze or eye tracking. In an example, an object in the center of the field of view of a user will be determined to be the object of interest.
[0048] At step 408, the augmented reality system outlines the one or more objects of interest with an augmented reality overlay. In an example, the augmented reality overlay can be adapted for display at the same depth as the one or more objects of interest. At step 410, the user or the augmented reality system determines whether the object of interest outlined in step 408 is the desired object of interest, and when the desired object of interest has been identified, the method continues at step 412, with object related sensor data being transmitted to a search engine. In an example, the user can indicate by one or more of a gesture, an audible sound, toggling a button, a touch or swipe on a touch screen, and/or a keystroke on a keypad that the desired object of interest has been identified. In an example, the search engine can be located in another location, with the sensor data transmitted using a wireless link, such as, for example, Bluetooth or Wi-Fi. In an alternative example, the search engine can be in a mobile device located relatively close to the augmented reality system. In yet another alternative example, the search engine can comprise an artificial intelligence engine implemented as part of the augmented reality system, implemented in a mobile device or implemented at a third party. In another example, the augmented reality system can be connected to another (mobile) electronic device, such as a smartphone, smart watch, mobile computer or desktop computer, with a cable, over which data can be transmitted between the augmented reality system and the (mobile) electronic device.
[0049] At step 414, reverse image search results are received. In an example, the reverse image search results are received at the augmented reality system. In an alternative example, the reverse image search results are received at a mobile device located in close proximity to the augmented reality system. In an example of implementation, reverse image search results can include one or more of contextual information for use by the user, metadata or a Uniform Resource Locator (URL) relating to the desired object of interest. Finally, at step 416 the augmented reality system displays the results using an augmented reality overlay at the sensed depth of the desired object of interest. In an example, contextual information can be provided with the augmented reality overlay at the sensed depth of the desired object of interest, so that both the desired object of interest and the contextual information are in focus.
[0050] Another example method begins by receiving a search request, capturing a scene to provide a captured scene and segmenting the captured scene into a plurality of elements and/or objects. The method continues by determining a depth for an object in the scene and outlining the object with an overlay. The method then continues by determining if the object is an object of interest and in response to a determination that the object is the object of interest, transmitting sensor data associated with the scene to a third party for a reverse image search. Finally, the method continues by receiving, from the third party, search results and displaying information representative of the results using an overlay at the depth of the object in the focus space.
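The flow of FIG. 6A can also be summarized in code form. Every identifier in the sketch below (ar_system, search_engine and all of their methods) is a hypothetical placeholder for device- or service-specific functionality; none of them names a real API:

```python
# Hypothetical orchestration of the visual-search flow of FIG. 6A.
def visual_search(ar_system, search_engine):
    scene = ar_system.capture_scene()                      # step 402: capture scene
    objects = ar_system.segment(scene)                     # step 404: segment into objects
    for obj in objects:
        depth = ar_system.sense_depth(obj)                 # step 406: depth of candidate object
        ar_system.draw_outline(obj, depth)                 # step 408: outline at sensed depth
        if ar_system.user_confirms(obj):                   # step 410: user confirms object of interest
            results = search_engine.reverse_image_search(  # step 412: transmit sensor data
                scene.crop(obj), metadata={"depth": depth})
            ar_system.draw_overlay(results, depth)         # steps 414-416: overlay results at depth
            return results
    return None
```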
[0051] FIG. 6B is a logic diagram of an example method for executing a visual search in an augmented reality system, such as the augmented reality devices disclosed herein. The method begins at step 500, with the augmented reality system receiving a visual search request. In an example, the visual search request can be received from a user. In an alternative example, the visual search request can be received from one or more of a third party, metadata associated with the augmented reality system or another source. At step 502, the method continues with the augmented reality system capturing a scene/environment. In an example, the augmented reality system can include one or more front-facing cameras (i.e., cameras that capture a user’s view of the scene) to capture the scene.
[0052] At step 504, the augmented reality system segments the scene into distinct elements and/or objects. In an example of implementation, the augmented reality system includes one or more computer vision algorithms adapted to segment the captured scene image into elements and/or objects. In another implementation example, one or more scene images captured by the one or more front-facing cameras can be transmitted over a wireless link, such as, for example, over Bluetooth or Wi-Fi, to another (mobile) device such as a smartphone, smart watch or laptop, enabling an external computer vision algorithm to be used to segment scene images into distinct elements and/or objects. In an example, segmented scene images can be transmitted back for use in the augmented reality system.
[0053] At step 506, the augmented reality system determines what a user is looking at. In an example of implementation, the augmented reality system includes one or more eye or gaze tracking sensors, along with eye or gaze tracking algorithms adapted to determine where a user is looking and/or what the user is looking at. The method continues at step 508, where the augmented reality system determines a depth for one or more objects of interest. In an example, the augmented reality system can be implemented with one or more depth sensors adapted to capture a depth or a distance of the object of interest relative to a user.
[0054] At step 518, the augmented reality system outlines the one or more objects of interest with an augmented reality overlay. In an example, the augmented reality overlay can be adapted for display at the same depth as the one or more objects of interest. In a related implementation example, an augmented reality system can be configured to include one or more gaze or eye tracking sensors adapted for determining a direction of a user’s gaze, in order to define the object of interest. In an alternative example of implementation, the augmented reality system can be implemented without any gaze or eye tracking. In an example, an object in the center of the field of view of a user will be determined to be the object of interest. At step 510, the user or the augmented reality system determines whether the object of interest outlined in step 518 is the desired object of interest, and when the desired object of interest has been identified, the method continues at step 512, with object related sensor data being transmitted to a search engine. In an example, the user can indicate by one or more of a gesture, an audible sound, toggling a button, a touch or swipe on a touch screen, and/or a keystroke on a keypad that the desired object of interest has been identified. In an example, the search engine can be located in another location, with the sensor data transmitted using a wireless link, such as, for example, Bluetooth or Wi-Fi. In an alternative example, the search engine can be in a mobile device located relatively close to the augmented reality system. In yet another alternative example, the search engine can comprise an artificial intelligence engine implemented as part of the augmented reality system, implemented in a mobile device or implemented at a third party. In another example, the augmented reality system can be connected to another (mobile) electronic device, such as a smartphone, smart watch, mobile computer or desktop computer, with a cable, over which data can be transmitted between the augmented reality system and the (mobile) electronic device.
[0055] At step 514, reverse image search results are received. In an example, the reverse image search results are received at the augmented reality system. In an alternative example, the reverse image search results are received at a mobile device located in close proximity to the augmented reality system. In an example of implementation, reverse image search results can include one or more of contextual information for use by the user, metadata or a Uniform Resource Locator (URL) relating to the desired object of interest. Finally, at step 516 the augmented reality system displays the results using an augmented reality overlay at the sensed depth of the desired object of interest. In an example, contextual information can be provided with the augmented reality overlay at the sensed depth of the desired object of interest, so that both the desired object of interest and the contextual information are in focus.
[0056] Another example method begins by receiving a search request, capturing a scene to provide a captured scene and segmenting the captured scene into a plurality of elements and/or objects. The method continues by identifying an object, wherein the identifying can be based on information representative of tracking a user’s gaze and/or eyes. The method then continues by determining if the object is an object of interest and in response to a determination that the object is the object of interest, transmitting sensor data associated with the scene to a third party for a reverse image search. Finally, the method continues by receiving, from the third party, search results and displaying information representative of the results using an overlay at the depth for the object in the focus space.
[0057] FIG. 7 is a schematic block diagram of an embodiment of a system for implementing augmented reality system 440, which includes an optical module 424, a holographic processor 438 with associated memory, an application processor 436, a radio transceiver 428, camera input 430 and depth sensor 432, along with a power management unit 422 and one or more batteries 426 that together comprise power unit 420. In an example, holographic processor 438 and one or more spatial light modulator devices 442 can be adapted to compute and/or generate hologram patterns for projection and/or display on an associated augmented reality device, such as augmented reality system 440. Optical module 424 is described in greater detail herein. (See, for example, optical module 100 with reference to FIG. 11.)
[0058] In an example of implementation, additional sensors, in addition to the sensors provided above, can be adapted for, and/or integrated in, an augmented reality device, such as augmented reality system 440. In a related example, an augmented reality device, such as augmented reality system 440, can be adapted to provide inputs for an additional sensor. In various examples, additional sensors include, but are not limited to, one or more of:
[0059] a camera for capturing the environment around the user;
[0060] eye or gaze tracking sensors;
[0061] one or more microphones;
[0062] one or more speakers;
[0063] one or more haptic sensors;
[0064] a Global Positioning System (GPS) device;
[0065] one or more accelerometers;
[0066] one or more gyroscopes;
[0067] one or more magnetometers (compasses) to provide compass functionality for navigation;
one or more pedometers;
[0068] one or more inertial measurement units;
[0069] one or more ambient light sensors;
[0070] one or more thermometers and/or temperature sensors;
[0071] one or more humidity sensors;
[0072] one or more barometer sensors and/or altimeter sensors; and
[0073] one or more touch screens (or touch pads) (e.g., located on the periphery of AR glasses, such as on a temple).
[0074] In various related examples, sensor data collected by an associated augmented reality device, such as augmented reality system 440, can be used for various functions including, but not limited to, image sensing, depth sensing, audio capture, user attention (such as eye tracking), determining geographic location and local environmental factors (as determinable based on any of the sensors listed above). In yet another related example, simultaneous localization and mapping (SLAM) can be used to construct and update a map of a user’s environment, while simultaneously tracking a user’s location within the environment, thereby providing yet another source of sensor information.
[0075] In an example of implementation, optical module 424 can be implemented with light source 444, where light source 444 can include one or more individual light sources adapted to provide illumination for the spatial light modulator(s) 442. In a specific example of implementation, light source 444 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model. In another specific example of implementation, light source 444 can be implemented to cover a substantially full spectrum of wavelengths as, for example, a single white light emitter.
[0076] In a related example of operation and implementation, holographic processor 438 can be configured as one or more compute elements adapted to execute processor functions for computing diffraction (hologram) patterns for display on one or more spatial light modulators embedded in an associated augmented reality device, such as augmented reality system 440. In an example, diffraction patterns can include digital media content adapted for observation by a user of augmented reality system 440. In a specific example, hologram patterns for display/projection can be generated based on generally understood functions, such as Computer Generated Hologram algorithms based on one or more of a Fresnel Transform-Based model, an Angular Spectrum model, a Point Source method, a Gerchberg-Saxton Algorithm, an Iterative Fourier Transform Algorithm (IFTA) or a Wavefront Propagation algorithm (such as, for example, wavelet transform, wavefront recording and reconstruction and kinoform). Additional methods can include using random-phase encoding/decoding techniques to represent complex scenes and/or deep learning techniques, such as convolutional neural networks.
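As one concrete instance of the algorithm families named above, a textbook Gerchberg-Saxton iteration can be written in a few lines of Python. This is a generic illustration of the published algorithm, not a representation of holographic processor 438 itself:

```python
# Textbook Gerchberg-Saxton iteration for a phase-only hologram.
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50):
    """Return a phase-only hologram whose far field approximates
    `target_amplitude` (2D float array)."""
    field = np.exp(1j * 2 * np.pi * np.random.rand(*target_amplitude.shape))
    for _ in range(iterations):
        img = np.fft.fft2(field)
        img = target_amplitude * np.exp(1j * np.angle(img))  # enforce target amplitude in image plane
        field = np.fft.ifft2(img)
        field = np.exp(1j * np.angle(field))                 # enforce unit amplitude (phase-only SLM)
    return np.angle(field)
```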
[0077] Radio transceiver 428 can include one or more receiver units and/or transmitter units configured to enable an exchange of information between an augmented reality system, such as augmented reality system 440, and a wide area network, a local area network and/or a mobile electronic device, such as a smartphone, smart watch, mobile computer or desktop computer. In an example, power unit 420 can be provided with a power management unit 422 comprising one or more power management integrated circuits (PMICs), to manage and control power use by the separate elements of augmented reality system 440 to provide efficient power consumption and performance.
[0078] In an example of implementation, augmented reality system 440 can include one or more memory devices, with each memory device comprising, for example, one or more of a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner. In an additional example of implementation, augmented reality system 440 can include one or more control modules or control units configured for driving and/or synchronizing various functions associated with augmented reality system 440. In various examples, the one or more control units can be implemented using one or more application processing devices. In yet another example of implementation, optical module 424 can include one or more optical pre-path(s) and/or one or more optical post-path(s). In an example, optical pre-path(s) and/or optical post-path(s), together with light source 444 and spatial light modulator(s) 442, comprise optical module 424.
[0079] In an example, various components included in augmented reality system 440, such as power management unit 422, the control unit, holographic processor 438 and radio transceiver 428, can be implemented as individual components on one or more printed circuit boards. In an alternative example, the power management unit 422, the control unit, holographic processor 438 and radio transceiver 428 can be implemented on a single integrated circuit, or consolidated on a plurality of integrated circuits. Moreover, in yet another example, the various components of augmented reality system 440 can be adapted for implementation in the frame of an augmented reality device, such as in the temple (or temples) of the augmented reality device. In an example of operation, integration of some or all of the components of augmented reality system 440 can be used to reduce the overall footprint of augmented reality system 440, providing for a potentially more compact implementation, while potentially increasing performance and/or power efficiency. In another example of implementation, integrating some or all of the components of augmented reality system 440 can enable reduced manufacturing costs for augmented reality system 440, while providing an overall lower cost for an associated augmented reality viewing device.
[0080] An example optical module for displaying images includes a first interface for interfacing with a network, a depth sensor, a second interface for interfacing with a camera, a light source, a spatial light modulator, a holographic processing module, a combiner mirror, memory; and a processing module operably coupled to the interface and to the memory. In an example, the processing module can be operable to receive an image of a scene via the second interface, determine a focus depth for an object using the depth sensor and transmit, via the first interface, information representative of the scene and the focus depth to a third party. In a continuing example, the processing module can be further operable to receive information representative of the object from the third party and provide the information representative of the object to the holographic processing module, where the holographic processing module can be operable to generate a hologram pattern based on the information representative of the object and display the hologram pattern on the spatial light modulator, where the spatial light modulator can be configured to display digital media content based on the hologram pattern using the combiner mirror. In various examples, the combiner mirror can be replaced by another semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles. In a specific related example, the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays. In another example of implementation, final optical elements can include a semi-transparent, reflective coating adapted as part of the lenses of augmented reality glasses.
[0081] FIG. 8 is a schematic block diagram of an embodiment of a system for implementing augmented reality. In the example, augmented reality system 520 includes an optical module 526, a holographic processor 548 with associated memory, an application processor 530, an artificial intelligence engine 534, a radio transceiver 532, camera input 550 and depth sensor 552, along with a power management unit 524 and one or more batteries 528 that together comprise power unit 522. In an example, optical module 526 can include one or more spatial light modulator devices 542 that together are configured to project and/or display digital media content in an associated augmented reality device, such as augmented reality system 520, from hologram patterns displayed and/or rendered using one or more spatial light modulation elements. Optical module 526 is described in greater detail herein. (See, for example, optical module 100 with reference to FIG. 11.) In various specific examples, an associated augmented reality device, such as augmented reality system 520, can be configured to include additional sensors and/or inputs for additional sensors. In various examples, additional sensors include, but are not limited to, one or more of:
[0082] a camera for capturing the environment around the user;
[0083] eye or gaze tracking sensors;
[0084] one or more microphones;
[0085] one or more speakers;
[0086] one or more haptic sensors;
[0087] a Global Positioning System (GPS) device;
[0088] one or more accelerometers;
[0089] one or more gyroscopes;
[0090] one or more magnetometers (compasses) to provide compass functionality for navigation;
one or more pedometers;
[0091] one or more inertial measurement units;
[0092] one or more ambient light sensors;
[0093] one or more thermometers and/or temperature sensors;
[0094] one or more humidity sensors;
[0095] one or more barometer sensors and/or altimeter sensors; and
[0096] one or more touch screens (or touch pads) (e.g., located on the periphery of AR glasses, such as on a temple).
[0097] In a related example of implementation and operation, sensors associated with an associated augmented reality device, such as augmented reality system 520, can be adapted to capture data for subsequent processing by an artificial intelligence engine, such as a neural network processor and/or an inference engine. In a specific example, an artificial intelligence engine can be trained for processing sensor data collected by an augmented reality device, such as augmented reality system 520, using one or more of the sensors listed above. In an example, an artificial intelligence engine can be embedded in an associated augmented reality device, such as augmented reality system 520. In an alternative example, all or part of the sensor data collected by an augmented reality device can be transmitted over a wireless link via the World Wide Web for processing by a remote artificial intelligence engine. In an example, classification results from remote artificial intelligence processing can be transmitted back to an augmented reality device and used, for example, for the display of contextual information to the user. In an example, contextual information can include an augmented reality overlay of one or more objects relevant to a user. In a related example of implementation and operation, processed sensor data associated with an augmented reality device, such as augmented reality system 520, can be used to provide additional contextual information for an image or object search.
[0098] In a related specific example of implementation and operation, sensor data from a plurality of augmented reality devices can be used to train an artificial intelligence engine to generate a trained (neural network) model. In various examples, the trained model can be packaged for transfer back to an augmented reality device, such as augmented reality system 520, using, for example, a standardized format, such as a TensorFlow SavedModel or ONNX (Open Neural Network Exchange). In another related example, trained model parameters can be serialized into a file or a set of files, using a format such as HDF5, JSON, or a custom binary format, with the artificial intelligence engine in an augmented reality device, such as augmented reality system 520, then used to deserialize and load the model for inference use by the augmented reality device.
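A minimal round trip through one of the named formats might look as follows. The two-layer network and file name are placeholders, and onnxruntime merely stands in for whatever inference engine the device embeds:

```python
# Illustrative serialize/deserialize cycle for a trained model via ONNX.
import numpy as np
import torch
import onnxruntime as ort

# Placeholder model: a tiny classifier standing in for a trained network.
model = torch.nn.Sequential(torch.nn.Linear(16, 4), torch.nn.Softmax(dim=-1))
dummy = torch.randn(1, 16)
torch.onnx.export(model, dummy, "scene_classifier.onnx")   # serialize for transfer

session = ort.InferenceSession("scene_classifier.onnx")    # deserialize on device
input_name = session.get_inputs()[0].name
scores = session.run(None, {input_name: np.random.randn(1, 16).astype(np.float32)})[0]
```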
[0099] In an additional example of implementation and operation, a trained (neural network) model can include training from a variety of sources (in addition to augmented reality devices), such as data sets from virtually any relevant resource. In another example, a plurality of augmented reality systems can together be used to produce a crowd-sourced trained model. In yet another example, the system of FIG. 8 can be used to provide additional functions, such as segmenting elements associated with a particular environment and encoding these elements as descriptors for parameterization, with the goal of reducing the relative size of sensor data stores.
[0100] In an example of implementation, optical module 526 can be implemented with light source (or light sources) 544, where light source 544 can include one or more individual light sources adapted to provide illumination for the spatial light modulator(s) 542. In a specific example of implementation, light source 544 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model. In another specific example of implementation, light source 544 can be implemented to cover a substantially full spectrum of wavelengths as, for example, a single white light emitter.
[0101] In a related example of operation and implementation, holographic processor 548 can be configured as one or more compute elements adapted to execute processor functions for computing diffraction patterns for display and/or projection by one or more spatial light modulators 542 embedded in an augmented reality device, such as augmented reality system 520. In an example, diffraction patterns can include hologram patterns adapted to display and/or project digital media content for observation by a user of augmented reality system 520. In a specific example, hologram patterns for display/projection can be generated based on generally understood functions, such as Computer Generated Hologram algorithms based on one or more of a Fresnel Transform-Based model, an Angular Spectrum model, a Point Source method, a Gerchberg-Saxton Algorithm, an Iterative Fourier Transform Algorithm (IFTA) or a Wavefront Propagation algorithm (such as, for example, wavelet transform, wavefront recording and reconstruction and kinoform). Additional methods can include using random-phase encoding/decoding techniques to represent complex scenes and/or deep learning techniques, such as convolutional neural networks.
[0102] Radio transceiver 532 can include one or more receiver units and/or transmitter units configured to enable an exchange of information between an augmented reality system, such as augmented reality system 520, and a wide area network, a local area network and/or a mobile electronic device, such as a smartphone, smart watch, mobile computer or desktop computer. In an example, power unit 522 can be provided with a power management unit 524 comprising one or more power management integrated circuits (PMICs), to manage and control power use by the separate elements of augmented reality system 520, in order to provide efficient power consumption and performance.
[0103] In an example of implementation, augmented reality system 520 can include one or more memory devices, with each memory device comprising, for example, one or more of a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner. In an additional example of implementation, augmented reality system 520 can include one or more control modules or control units configured for driving and/or synchronizing various functions associated with augmented reality system 520. In various examples, the one or more control units can be implemented using one or more application processing devices. In yet another example of implementation, optical module 526 can include one or more optical pre-path(s) and/or one or more optical post-path(s). In an example, optical pre-path(s) and/or optical post-path(s), together with light source(s) 544 and spatial light modulator(s) 542, comprise optical module 526.
[0104] In an example, various components included in augmented reality system 520, such as power unit 522, the control unit, holographic processor 548, memory and radio transceiver 532, can be implemented as individual components on one or more printed circuit boards. In an alternative example, the power unit 522, the control unit, holographic processor 548, memory and radio transceiver 532 can be implemented on a single integrated circuit or consolidated on a plurality of integrated circuits. Moreover, in yet another example, the various components of augmented reality system 520 can be adapted for implementation in the frame of an augmented reality device, such as in the temple (or temples) of the augmented reality device. In an example of operation, integrating some or all of the integrated circuits/components of augmented reality system 520 on a single integrated circuit (e.g., SoC) can reduce the footprint of augmented reality system 520, enabling compact implementations, while increasing performance and/or power efficiency. In another example of implementation, integrating some or all of the components of augmented reality system 520 on a single integrated circuit (e.g., SoC) can enable reduced manufacturing costs for augmented reality system 520, while enabling lower cost for an associated augmented reality viewing device.
[0105] An example optical module for displaying images includes a first interface for interfacing with a network, one or more sensors, a second interface for interfacing with a camera, a light source, a spatial light modulator, a holographic processing module, an artificial intelligence engine, a combiner mirror, memory; and a processing module operably coupled to the interface and to the memory. In an example, the processing module can be operable to receive an image of a scene via the second interface, receive sensor information from the one or more sensors and classify, by the artificial intelligence engine, the scene based on the image of the scene and the sensor information to provide a classified result. In an example, the artificial intelligence engine can be operable to provide the classified result to the holographic processing module, where the holographic processing module can be operable to generate a hologram pattern based on the classified result and display the hologram pattern on the spatial light modulator. In a related example, the spatial light modulator can be configured to display digital media content based on the hologram pattern using the combiner mirror. In various examples, the combiner mirror can be replaced by another semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles. In a specific related example, the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays. In another example of implementation, final optical elements can include a semi-transparent, reflective coating adapted as part of the lenses of augmented reality glasses.
[0106] In various examples, hologram patterns can be represented in a binary format using a binarization process. In an example of operation, hologram patterns can be computed using computer algorithms, such as Computer-Generated Holography (CGH) algorithms. In a related example, hologram patterns can be adapted to align with a number of available optical states provided by light modulating elements of a spatial light modulator using a quantization method, enabling mapping of hologram patterns to a given spatial light modulator. In an example, the number of available optical states provided by light modulating elements of a spatial light modulator can be finite. In a related example, light modulating elements of a spatial light modulator can adopt two or more optical states, each state interacting differently with incident light. In another example, error diffusion can be used as a quantization method, distributing the quantization error for each pixel of a holographic display to neighboring pixels to minimize the impact of the quantization error on the visual quality of the display. In a related example, the holographic display can be implemented using one or more spatial light modulators. Error diffusion algorithm types include, but are not limited to, the Floyd-Steinberg, Jarvis-Judice-Ninke and Stucki algorithms, each of which defines a specific pattern for distributing the error to neighboring pixels of a holographic display. In a related example, a holographic processor, such as the holographic processors disclosed herein, can be adapted to execute an error diffusion algorithm on hologram patterns.
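For reference, the Floyd-Steinberg variant mentioned above distributes the error with the standard fixed weights of 7/16, 3/16, 5/16 and 1/16. The Python sketch below applies it to a real-valued hologram normalized to [-1, 1], using the two-state (-1/+1) convention adopted later in this disclosure; it is written for readability rather than speed:

```python
# Floyd-Steinberg error diffusion adapted to binary (+/-1) hologram states.
import numpy as np

def floyd_steinberg_binary(hologram):
    """Quantize a real-valued hologram in [-1, 1] to {-1, +1}."""
    h = hologram.astype(np.float64).copy()
    rows, cols = h.shape
    for y in range(rows):
        for x in range(cols):
            old = h[y, x]
            new = 1.0 if old >= 0.0 else -1.0
            h[y, x] = new
            err = old - new
            # Standard Floyd-Steinberg weights, pushed to unvisited neighbors.
            if x + 1 < cols:
                h[y, x + 1] += err * 7 / 16
            if y + 1 < rows:
                if x > 0:
                    h[y + 1, x - 1] += err * 3 / 16
                h[y + 1, x] += err * 5 / 16
                if x + 1 < cols:
                    h[y + 1, x + 1] += err * 1 / 16
    return h
```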
[0107] FIG. 9A illustrates an example dither mask derived from a set of dot patterns, each dot pattern representing a different gray scale. Dither masks can be used in image processing to create an illusion of tonal depth in images with a limited number of pixel states. In various relevant examples, a dither mask, such as dither mask 602, can be used in a dither mask process to introduce a controlled form of noise into an image so that quantization errors appear relatively random, rather than structured. In an example of implementation and operation, an example dither mask can be used to quantize pixel values in a traditional 2D image. In an example, a small static dither mask (for example, 128 x 128 pixels) can be used to provide a set of threshold values for dithering (quantizing) a traditional 2D image, where the traditional 2D image has a size larger than 128 x 128 pixels. In an example, a dither mask for quantizing traditional 2D images can be derived from a set of dot patterns, each dot pattern representing a different gray scale, where the dot patterns for each of the desired gray scales are designed using simulated annealing to have blue noise properties (blue noise contains more energy at higher frequencies and less energy at lower frequencies, making it less noticeable to human vision). In an example of implementation, a dithering process can be configured for use in hardware by comparing pixel values of, for example, a traditional 2D image to threshold values in a dither mask, enabling relatively low computational requirements. In an example where dithering is enabled for two available states, an upper state and a lower state, when a pixel value of, for example, a traditional 2D image is larger than the corresponding threshold value of the dither mask, the pixel is quantized to the upper state; when the pixel value is smaller than the corresponding threshold value in the dither mask, the pixel is quantized to the lower state. In a related example, a smaller dither mask (for example, 128 x 128 pixels) can be tiled across an image, with the image having a size larger than the dither mask, enabling the dithering of a relatively large image with a smaller dither mask, with the advantage of a small memory size requirement for hardware implementation. In another related example, because dithering can be achieved by a comparison of pixel values to threshold values in a dither mask, each pixel can be processed in any order, enabling a more relaxed hardware design.
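Because mask-based dithering reduces to a per-pixel comparison, it can be expressed compactly. In the sketch below the 128 x 128 mask is random, standing in only for a mask optimized (e.g., by simulated annealing) for blue-noise or signal-window properties:

```python
# Threshold dithering against a tiled mask: one comparison per pixel.
import numpy as np

def dither_with_tiled_mask(image, mask):
    """Quantize `image` (floats in [0, 1]) to {0, 1} by comparison with
    `mask` tiled across the image."""
    rows, cols = image.shape
    my, mx = mask.shape
    tiled = np.tile(mask, (rows // my + 1, cols // mx + 1))[:rows, :cols]
    return (image > tiled).astype(np.uint8)

mask = np.random.rand(128, 128)   # stand-in for an optimized dither mask
out = dither_with_tiled_mask(np.random.rand(1080, 1920), mask)
```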
[0108] FIG. 9B illustrates the use of a dither mask on a traditional 2D input image, where the dither mask can be designed to have blue noise properties. In an example, dithering using a dither mask with blue noise properties can be suitable for traditional 2D images because the quantization noise is moved into high spatial frequencies, where it is easier for the human visual system to integrate. In an example, a given dither mask can be configured so that quantization noise for a 2D image is moved into relatively high spatial frequencies, where it can be easier for a human visual system to integrate for visual interpretation. In a specific relevant example of operation, a dither mask can be adapted for use to quantize holograms. In a specific related example, while a dither mask used to quantize traditional 2D images incorporates blue noise properties, a dither mask used to quantize hologram patterns can be designed to incorporate appropriate properties in the frequency domain, where quantization noise is moved to a region outside a desired signal window in the frequency domain. Quantizing hologram patterns using a dither mask enables relatively low computational requirements in hardware, as the process reduces to a relatively simple comparison operation.
[0109] FIG. 9C illustrates the use of a dither mask designed with a desired signal window in the frequency domain. The dither mask of FIG. 9C can be designed for quantizing hologram patterns, where quantization noise is moved outside a desired signal window in the frequency domain. In an example, a dither mask optimized to quantize hologram patterns can be non-optimal for quantizing traditional 2D images, because a dither mask optimized to quantize hologram patterns does not incorporate blue noise properties.
[0110] FIG. 9D illustrates the use of a dither mask on a hologram pattern, where the dither mask can be designed with a desired signal window in the frequency domain. In the example of FIG. 9D, a Fast Fourier Transform of a dithered hologram pattern includes a relatively well-defined signal window in the frequency domain, where quantization noise can be largely moved outside the well-defined signal window. In an example, a dither mask designed to quantize hologram patterns can have a size that is substantially the same as the size of the hologram pattern to be quantized. In an example, a real-valued hologram pattern can be normalized so that each pixel of the real-valued hologram pattern maintains a value between -1 and 1 (with -1 and 1 included). In a related example, a dither mask can be provided to quantize the pixel values of such a normalized real-valued hologram pattern to either -1 or 1, where the dither mask can be designed to provide a well-defined signal window in the frequency domain with quantization noise largely moved outside the well-defined signal window. In an example, a dither mask optimized for quantizing hologram patterns has the same size as the hologram patterns to be quantized. In an example, an augmented reality device, such as the augmented reality devices disclosed herein, can be adapted with one or more mask-based dithering methods for the dithering of hologram patterns that are to be rendered on one or more spatial light modulators integrated as part of the augmented reality device. In a related example, a dither mask can be precomputed and stored on one or more memory devices of the augmented reality device, where the dither mask can be designed to provide a well-defined signal window with quantization noise largely moved outside the well-defined signal window.
[0111] FIG. 9E illustrates an example use of a dither mask optimized for quantizing hologram patterns, where the size of the dither mask can be smaller than the size of the hologram patterns to be quantized. In an example, the pixels of a hologram pattern are quantized to either -1 or 1. In an example, a dither mask can be adapted to quantize the pixels of a hologram pattern to either -1 or 1, where the dither mask can be smaller than the hologram pattern to be quantized and where the dither mask can be designed to provide a relatively well-defined signal window in the frequency domain with quantization noise largely moved outside the well-defined signal window. The threshold values in the dither mask maintain a value between -1 and 1 (with -1 and 1 included). In a related example, a second dither mask can be provided by taking, for each location in the second dither mask, the threshold value at the corresponding location in the first dither mask and reversing the sign. In other words, the second dither mask can be obtained from the first dither mask by applying a sign flip to each value in the first dither mask. In a related example, both the first and the second dither mask have an equal size smaller than the size of the to-be-quantized hologram pattern. In an example, a hologram pattern can be divided into four quadrants. The top left and bottom right quadrants are quantized by tiling the first dither mask across each of these quadrants. For the top right and bottom left quadrants, a sign flip is applied to each value in the quadrant, after which the second dither mask is tiled across the sign-flipped quadrants for quantization, resulting in an unfinished quantized hologram pattern; a sign flip is then applied to the unfinished quantized hologram pattern to arrive at a finished quantized hologram pattern for the top right and bottom left quadrants. In an example of implementation, a dithering operation implemented in quadrants in this manner, so that a pixel value of a continuous real hologram Hr is quantized to a binary hologram Hb pixel value as illustrated in FIG. 9E, can produce acceptable quantization results on a hologram pattern.
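The quadrant scheme can be sketched as follows, under stated assumptions: pixel and threshold values lie in [-1, 1], a pixel strictly above its threshold maps to +1 and otherwise to -1, and the hologram side lengths are exact multiples of the mask side lengths. The quadrant/mask pairing follows the description above:

```python
# Sketch of the FIG. 9E quadrant dithering with a sign-flipped second mask.
import numpy as np

def tile_like(mask, shape):
    """Tile `mask` to cover an array of the given shape (exact multiples assumed)."""
    return np.tile(mask, (shape[0] // mask.shape[0], shape[1] // mask.shape[1]))

def quantize_quadrants(hologram, mask):
    """Binarize `hologram` (values in [-1, 1]) to {-1, +1} per FIG. 9E."""
    n0, n1 = hologram.shape
    h0, h1 = n0 // 2, n1 // 2

    def binarize(block, m):
        return np.where(block > tile_like(m, block.shape), 1.0, -1.0)

    out = np.empty_like(hologram)
    # Top-left and bottom-right quadrants: first mask, used directly.
    out[:h0, :h1] = binarize(hologram[:h0, :h1], mask)
    out[h0:, h1:] = binarize(hologram[h0:, h1:], mask)
    # Top-right and bottom-left quadrants: sign-flip the quadrant, quantize
    # against the second (sign-flipped) mask, then sign-flip the result back.
    mask2 = -mask
    out[:h0, h1:] = -binarize(-hologram[:h0, h1:], mask2)
    out[h0:, :h1] = -binarize(-hologram[h0:, :h1], mask2)
    return out
```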
[0112] FIG. 9F illustrates an example use of mask-based dithering to quantize a hologram pattern. In the example, a 4096 x 4096 hologram pattern can be quantized using a 256 x 256 dither mask, providing a well-defined signal window in the frequency domain with quantization noise largely moved outside the well-defined signal window. In an example of implementation and operation, a memory device can be required to store the dither mask (256 x 256 x 8 bits in this example); however, the dithering operation itself requires almost no additional computation. In yet another related example, the comparison of values of a hologram pattern to threshold values in a dither mask can be processed in an arbitrary order.
[0113] An example method comprises receiving a hologram pattern representing one of an image, a three-dimensional (3D) object, or a three-dimensional (3D) scene for display. In an example, the hologram pattern can be spatially divided into four subspace quadrants. In a related example, the hologram pattern can be quantized using a first and a second dither mask, where the values of the second dither mask are those of the first dither mask, but with the associated signs reversed. In an example, the first dither mask can be applied to two subspace quadrants of the four subspace quadrants, such as, for example, the top left and bottom right subspace quadrants, while the second dither mask can be adapted for use with the remaining two subspace quadrants, such as, for example, the top right and bottom left subspace quadrants, where the values in these two subspace quadrants are sign flipped before tiling the second dither mask across them, resulting in an unfinished quantized hologram pattern for each of these two subspace quadrants. In an example, a sign flip can then be applied to each value of the unfinished quantized hologram of each of these two subspace quadrants to obtain a finished quantized hologram for all four quadrants. In a related example, each of the four subspace quadrants can be aligned to a common first axis and a common second axis, wherein the first axis and the second axis cross at a quadripoint of the four subspace quadrants.
[0114] FIG. 10 illustrates an example schematic block diagram of an embodiment of an ecosystem for implementing augmented reality. In an example, augmented reality module 104 includes an optical module 104-3 configured for displaying images, virtual objects and/or virtual scenes with an associated augmented reality device, a processor 104-2, such as a holographic processor, for processing images, three-dimensional (3D) objects and three-dimensional (3D) scenes for display using optical module 104-3, and a wireless transceiver 104-1 for enabling communication with wireless networks, such as Wide Area Network (WAN) 108.
[0115] In an example of implementation, augmented reality module 104 can be adapted for use on an augmented reality device, such as augmented reality glasses. In a related example, augmented reality module 104 can be adapted to receive media, such as, but not limited to, one or more of images, partial images or audiovisual content for use with augmented reality glasses. In specific related, but non-limiting, examples, augmented reality module 104 can be adapted to receive three-dimensional (3D) data in the form of one or more point clouds, and/or red, green, blue & depth (RGBZ) datasets, and/or one or more 2D images, with each image having an associated depth value (RGBD). In a further related example, media, such as, for example, two-dimensional (2D) images, three-dimensional (3D) objects and/or three-dimensional (3D) scenes, can be provided to the augmented reality module 104 from one or more third party resources, such as third party media resource 106, over a wireless network, where the wireless network can be one or more of a Wide Area Network, such as WAN 108, a Wireless Local Area Network (LAN), the World Wide Web or a cellular network.
[0116] In yet another related example, augmented reality module 104 can be wirelessly coupled to a mobile device, such as mobile device 102. In the example, mobile device 102 can be adapted to provide media, such as, for example, two dimensional (2D) images, three dimensional (3D) objects and/or three dimensional (3D) scenes, for augmented reality module 104 and/or to provide processing functionality for use with augmented reality module 104.
[0117] FIG. 11 illustrates an optical module for generating and displaying augmented reality content in a viewing device, such as augmented reality glasses. The terms optical module and optical engine are used interchangeably herein. In an example of implementation, optical module 100 provides a structural assembly for positioning various optical elements. In an example, an optical module can include one or more illumination sources, such as illuminator 110, configured to provide illumination for a spatial light modulator device, such as spatial light modulator 112. In a specific example, illuminator 110 can be a light source of a single predetermined wavelength. In an alternative example, illuminator 110 can be a light source configured to provide a limited range of wavelengths, where the limited range can include a plurality of wavelengths in a predetermined wavelength range. In another example, illuminator 110 can be adapted to provide a white light source, where the white light source can be a combination of visible wavelengths.
[0118] In yet another example, illuminator 110 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model. Other example color models (in addition to RGB) include, but are not limited to: 1) Cyan, Magenta, Yellow (Key/Black) (CMY(K)); 2) Hue, Saturation, Value (or Brightness) (HSV); 3) Hue, Saturation, Lightness (HSL); 4) YCbCr, which separates luminance (brightness) information (Y) from chrominance (color) information (Cb and Cr); 5) LAB (and variants), which can be described as a three-component model with L* (lightness), a* (green to red) and b* (blue to yellow); and 6) the XYZ color model. In an alternative example, illuminator 110 can be adapted to provide light wavelengths according to color models incorporating four or more separate color channels.
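As a worked example of one of the color models listed above, the common BT.601 full-range conversion separating luminance from chrominance is:

```python
# Standard BT.601 full-range RGB -> YCbCr conversion, included purely as a
# worked example of separating luminance (Y) from chrominance (Cb, Cr).
import numpy as np

def rgb_to_ycbcr(rgb):
    """rgb: float array with shape (..., 3), components in [0, 255]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 + 0.564 * (b - y)
    cr = 128.0 + 0.713 * (r - y)
    return np.stack([y, cb, cr], axis=-1)
```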
[0119] In an example of implementation and operation, spatial light modulator 112 can be one or more spatial light modulator devices implemented using one or more integrated circuits. In an example, one or more holographic processors can be configured to compute and/or generate hologram patterns (such as interference patterns) based on the execution of algorithmic models. In a related example, hologram patterns can be rendered on one or more spatial light modulators of spatial light modulator 112 for displaying, among others, images, virtual objects and/or virtual scenes with a viewing device, such as augmented reality glasses. In an example, spatial light modulator 112 can be configured to implement a pixel pitch close to or smaller than the wavelength of the light to be used with the spatial light modulator 112. In an example, light to be used with the spatial light modulator 112 can be visible light. In another specific example, the one or more spatial light modulator chips can be configured to implement a pixel pitch in the range of half a wavelength or less than half a wavelength of the light to be used with the spatial light modulator, which can be visible light in the example of augmented reality glasses. In an example of operation, a spatial light modulator with a pixel pitch close to or smaller than a wavelength of visible light can enable a relatively large field of view (FoV), whereas a pixel pitch larger than a wavelength of visible light can result in a reduced field of view (FoV). In the example, a smaller field of view (FoV) can result in a less immersive experience for a viewing device user.
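The relationship between pixel pitch and field of view follows from the grating equation: the maximum diffraction half-angle theta of a modulator with pitch p at wavelength lambda satisfies sin(theta) = lambda / (2p), so the pitch bounds the achievable field of view. A quick numeric check, assuming 532 nm green light (the pitches are illustrative):

```python
# Maximum field of view implied by the grating equation sin(theta) = lambda / (2p).
import math

def max_fov_degrees(pitch_m, wavelength_m=532e-9):
    s = wavelength_m / (2.0 * pitch_m)
    if s >= 1.0:
        return 180.0  # pitch at or below half a wavelength: full hemisphere
    return 2.0 * math.degrees(math.asin(s))

print(max_fov_degrees(8.0e-6))    # ~3.8 deg for a typical 8 um LCoS pitch
print(max_fov_degrees(0.25e-6))   # 180 deg for a sub-half-wavelength pitch
```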
[0120] In an example of implementation and operation, optical module 100 can be configured to direct light emitted from illuminator 110 toward spatial light modulator 112. Example implementations can include one or more optical elements in the optical path between illuminator 110 and spatial light modulator 112, such as, for example, a collimator lens to provide collimated light to spatial light modulator 112. In an alternative example, optical module 100 can be configured to provide alignment for illumination provided by illuminator 110 relative to the spatial light modulator based on the physical position of illuminator 110 relative to the physical position of spatial light modulator 112. In another example of implementation and operation, optical module 100 can be configured to guide a wavefront generated by spatial light modulator 112 toward one or more optical elements located outside optical module 100, by the use of one or more optical elements, such as mirror 114, which are configured as part of optical module 100, where the one or more optical elements located outside optical module 100 can be used to direct wavefronts, generated by spatial light modulator 112 and transmitted via an optical path of optical module 100, to a user's eyes in a perceptible form. Examples of the one or more optical elements located outside optical module 100 can include, but are not limited to, reflective and partially reflective optical elements, projection lenses and polarizing elements. In an example, the one or more optical elements located outside optical module 100 can be a single optical combiner enabling images, virtual objects and virtual scenes generated by optical module 100 to be overlaid onto a real world environment. In another example pertaining to augmented reality glasses, one or more optical elements are located outside optical module 100, with the optical elements configured to steer a wavefront delivered by optical module 100 to a hypothetical user’s eye, where the optical elements are implemented as part of the augmented reality glasses lenses.
[0121] FIG. 12 provides an illustration of smart/augmented reality (AR) glasses adapted for overlaying holographic content onto a real-world environment. In an example, the smart glasses can be configured to render dynamic and/or full color holographic content in front of a user's eyes overlaid onto a real-world environment.
[0122] In an example of implementation, smart glasses can be configured with various example components, such as one or more illumination sources, one or more spatial light modulators, one or more optical subsystems and one or more compute elements / compute chips (such as holographic processor devices). In an example embodiment, smart glasses can be configured with one illumination source, one spatial light modulator, one optical subsystem and one compute element / compute chip (such as a holographic processor device) per eye. Accordingly, smart glasses can be configured to include two illumination sources, two spatial light modulators, two optical subsystems and two compute elements (holographic processor devices). Configurations where the number of illumination sources, the number of spatial light modulators, the number of optical subsystems and/or the number of compute chips (holographic processor chips) is more or less than two are also possible. In a specific example, an optical module 122 can be configured to combine many of the elements described above in a single unit.
[0123] Example smart glasses can be adapted to include additional components, such as a control management subsystem, memory, a power management subsystem, one or more embedded batteries, and/or one or more connectors adapted for connection to an external battery. In another example, various components of the smart glasses of FIG. 12 can be incorporated in the frame of the smart glasses. [0124] In an example of implementation, a smart glasses configuration can include an illumination source, a spatial light modulator, an optical subsystem and a compute element (such as a holographic processor device) for each of a user’s eyes. In an example of implementation, an illumination source (e.g., a miniaturized laser source providing coherent light) can be adapted to provide illumination for a spatial light modulator. In a related example, a spatial light modulator comprises an array of individually programmable optical pixels adapted to generate dynamic wavefronts for displaying/projecting holographic content. In an example, a pixel can be sized to be half the wavelength of visible light or shorter. In an example, an optical pixel of the array can be adapted to interact with a portion of an incident light beam provided by an illumination source, where the amplitude and phase of a resultant light wave generated by an optical pixel can be dependent on a programming state of the optical pixel.
[0125] In an example of implementation, an optical pixel can be adapted to modulate the amplitude of the light wave it generates as a function of the amplitude of the incident light beam. In an example, an optical pixel can be adapted to modulate the phase of the light wave it generates as a function of the phase of the incident light beam. In an example, an optical pixel can be adapted to modulate the amplitude and the phase of the light wave it generates as a function of the amplitude and phase of the incident light beam.
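A minimal way to model the pixel behavior described in the two paragraphs above is to treat the programming state as an amplitude factor and a phase delay applied to the incident field; the function and parameter names below are illustrative assumptions, not a definitive implementation:

```python
import numpy as np

def pixel_response(incident: complex, amplitude: float, phase_rad: float) -> complex:
    """Resultant light wave of one optical pixel, modeled as a complex
    modulation of the incident field by the pixel's programming state."""
    return amplitude * np.exp(1j * phase_rad) * incident

# A phase-only pixel keeps amplitude at 1.0 and programs only the phase delay.
out = pixel_response(incident=1.0 + 0.0j, amplitude=1.0, phase_rad=np.pi / 2)
```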
[0126] In a further example, individual light waves generated by each optical pixel can form wavefronts to display or project holographic content. In an example, wavefronts resulting from a spatial light modulator are provided to the optical subsystem. Example optical systems can be configured with a variety of elements, including combinations of lenses (e.g., pancake lenses, metalenses, etc.), mirrors (e.g., freeform mirrors) and diffractive optical elements for (re)directing, filtering and/or magnifying/demagnifying static and/or dynamic wavefronts provided by a spatial light modulator. In a specific example, a wavefront provided by an optical subsystem can be directed to a partially reflective mirror (or partially reflective coating) that can be configured for location in front of the user’s eye and configured for use with the lenses (such as optical correction lenses) of a pair of smart glasses. In an example of operation, a partially reflective mirror / coating can be adapted to reflect static and/or dynamic wavefronts from an optical subsystem toward a user’s eye, so that the user’s eye can capture the wavefronts as holographic content.
[0127] In a specific example of implementation, a reflective mirror / coating (lens treatment 124) in front of a hypothetical user’s eye can be adapted to be partially reflective, so that incident light rays from a real-world environment are transmitted through the optical lenses for perception at the user’s eye. Accordingly, a user can thereby be enabled to see holographic content overlaid onto the real-world environment. In an alternative example, the reflective mirror can be adapted to be substantially reflective, so that substantially no light rays incident from the real-world environment are received at the user's eye. In the example, a user will receive only the holographic content, without it being overlaid on the real-world environment. In yet another alternative example, a mirror can be adapted to switch between a partially reflective state and completely reflective state. In a generally applicable example, optical lenses of smart glasses can be configured as corrective lenses, for example where a user would normally wear prescription glasses.
[0128] In various examples, compute chip(s) and/or holographic processor chip(s) can be adapted to execute Computer Generated Holography (CGH) algorithms. In an example, holographic interference patterns, calculated by one or more compute chip(s) (such as holographic processor chips) executing Computer Generated Holography (CGH) algorithms, can be used to determine the programming of the optical pixels associated with a spatial light modulator. In an example, CGH algorithms can be used to compute the digital holograms for desired holographic content. In a related example, a control management subsystem can be adapted to receive and transmit digital data and/or to control subsystems and components for optimal interaction with each other. [0129] Example smart glasses can be configured to include a control management subsystem, memory, a power management subsystem, one or more embedded batteries, and/or one or more connectors for connection to external batteries. In an example, the control management subsystem receives and emits digital data and/or controls various smart glasses subsystems and components so that they interact with each other in a desired manner. In various examples, memory included in example smart glasses can be implemented using virtually any electrical storage technology. In a related example, pre-determined/pre-computed holographic information can be adapted for storage in the memory so that the information can be directly loaded onto a spatial light modulator without additional/excessive computation.
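One step in determining the programming of the optical pixels is quantizing a continuous-valued hologram pattern to the discrete states the pixels support. The following is a minimal sketch of error diffusion quantization, assuming uniformly spaced phase levels and the common Floyd-Steinberg weights; both choices are assumptions for illustration:

```python
import numpy as np

def quantize_error_diffusion(phase: np.ndarray, levels: int) -> np.ndarray:
    """Quantize a continuous phase hologram (radians) to discrete pixel
    states, diffusing the residual error to neighboring pixels."""
    work = phase.astype(np.float64).copy()
    h, w = work.shape
    step = 2.0 * np.pi / levels
    out = np.zeros((h, w), dtype=np.int64)
    for y in range(h):
        for x in range(w):
            q = int(np.round(work[y, x] / step))
            out[y, x] = q % levels                # wrap to a valid pixel state
            err = work[y, x] - q * step           # residual quantization error
            if x + 1 < w:
                work[y, x + 1] += err * 7 / 16    # Floyd-Steinberg weights
            if y + 1 < h:
                if x > 0:
                    work[y + 1, x - 1] += err * 3 / 16
                work[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    work[y + 1, x + 1] += err * 1 / 16
    return out
```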
[0130] Example uses for smart glasses, such as those described above, include viewing photos and videos and/or reading text messages rendered in front of a user’s eyes. Other example uses include enabling navigation with smart glasses, where information can be provided in front of a user’s eyes to assist the user with navigating (e.g., rendering street names or rendering arrows indicating the direction the user should follow to reach their destination). Another example use includes a visual search that can be used if a user desires information pertaining to a physical thing in a real-world environment. In a related example, smart glasses can be adapted to indicate that a given user would like to receive information about a physical thing by, for example, looking at it. In the example, image recognition can be implemented to identify a physical thing the user can be looking at, which can then be used as input to a search engine. In a further related example, information received from a search engine can be displayed in front of the user’s eyes. In an example, smart glasses could be configured with one or more cameras to enable capturing the real-world environment the user is looking at and eye-tracking to determine a direction the user is looking.
[0131] In an example of implementation, augmented reality glasses 120 are configured with an optical module 122 (such as optical module 100 referring to Fig. 11) coupled to one of the temples of augmented reality glasses 120. In the example of FIG. 12, optical module 122 can be configured to direct a wavefront at a lens (or lenses) of augmented reality glasses 120. In a specific example, the lens (or lenses) of augmented reality glasses 120 can be adapted with a “see through” combiner element with reflective and/or partially reflective properties, such as lens treatment 124, to combine “virtual media content” with a “real-world” scene, allowing a user to see both the virtual media content and the real-world scene simultaneously. In the various examples throughout this document, virtual media content can be defined as media, including but not limited to two dimensional (2D) images, three dimensional (3D) objects and three dimensional (3D) scenes, delivered by one or more spatial light modulators for visual perception. In some examples, virtual media content can require a display surface in order to be perceived by a hypothetical user of an augmented reality device; however, “virtual media content” is intended to encompass both the output of a spatial light modulator, such as the spatial light modulators illustrated in multiple FIGs. described herein, and the output of a spatial light modulator on a display surface.
[0132] In a specific example, lens treatment 124 can be adapted to provide a partially reflective surface balancing the transmission and reflection of light, allowing a user to view virtual media content, through reflection by lens treatment 124, overlaid onto a real-world scene transmitted through lens treatment 124. Example lens treatments include holographic optical element coatings or meta surface coatings, the coatings adapted to steer a wavefront delivered by an optical module toward one or both eyes of a hypothetical viewer for the viewer to perceive virtual media content, while still being see-through for perception of the real-world environment. Additional example lens treatments include: 1) dielectric coatings designed to enhance reflectivity at specific wavelengths; 2) dichroic coatings to selectively reflect or transmit light based on color; 3) beam splitter coatings designed to split incident light into two components (reflecting one part while transmitting the other); 4) anti-reflective coatings to minimize unwanted glare or ghosting effects in the digital media content; 5) polarizing coatings used in combination with polarized light sources to reproduce digital media content; and 6) hybrid coatings combining a plurality of coating types.
[0133] FIG. 13 is an example schematic block diagram of an embodiment of a system for implementing augmented reality that includes an optical module 136, a holographic processor 140, memory, a control unit, a data transceiver 138, as well as power unit 132 comprising management unit 134 and one or more batteries 130. In an example, holographic processor 140 can be configured to compute and/or generate hologram patterns to be rendered on the one or more spatial light modulators 142 to display virtual media content on an associated augmented reality device, such as augmented reality system 146. Optical module 136 is described in greater detail above (see, for example, optical module 100 with reference to FIG. 11) and elsewhere herein.
[0134] In an example of implementation, optical module 136 can be implemented with illumination source 144, where illumination source 144 can include one or more individual light sources adapted to provide illumination for the spatial light modulator(s) 142. In a specific example of implementation, illumination source 144 can be adapted to provide light wavelengths in separate color channels, such as the channels used in the Red, Green, Blue (RGB) color model. In another specific example of implementation, illumination source 144 can be implemented to cover a substantially full spectrum of wavelengths as, for example, a single white light emitter.
[0135] In a related example of operation and implementation, holographic processor 140 can be configured as one or more compute elements, where the compute elements can be configured as integrated circuits, adapted to execute processing functions for computing hologram patterns for rendering on the one or more spatial light modulators 142 in order for the spatial light modulators to deliver virtual media content for display with an associated augmented reality device, such as augmented reality system 146. In a specific example, hologram patterns for rendering on one or more spatial light modulators 142 can be generated based on generally understood functions, such as Computer Generated Hologram algorithms based on one or more of a Fresnel Transform-Based model, an Angular Spectrum model, a Point Source method, a Gerchberg-Saxton Algorithm, an Iterative Fourier Transform Algorithm (IFTA) and/or a Wavefront Propagation algorithm (such as, for example, wavelet transform, wavefront recording and reconstruction, and kinoform). Additional methods can include using random-phase encoding/decoding techniques to represent complex scenes and/or deep learning techniques, such as convolutional neural networks.
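As one concrete illustration of the algorithms listed above, the following is a minimal Gerchberg-Saxton sketch in a Fourier-hologram formulation; the phase-only constraint, random initialization and iteration count are assumptions for this example:

```python
import numpy as np

def gerchberg_saxton(target_amplitude: np.ndarray, iterations: int = 50) -> np.ndarray:
    """Iteratively retrieve a phase-only hologram whose far-field
    amplitude approximates the target image."""
    rng = np.random.default_rng(0)
    # Start with the target amplitude and a random phase in the image plane.
    field = target_amplitude * np.exp(
        1j * rng.uniform(0.0, 2.0 * np.pi, target_amplitude.shape))
    for _ in range(iterations):
        slm = np.fft.ifft2(field)                  # back-propagate to the hologram plane
        slm = np.exp(1j * np.angle(slm))           # enforce phase-only modulation
        field = np.fft.fft2(slm)                   # propagate to the image plane
        field = target_amplitude * np.exp(1j * np.angle(field))  # enforce target amplitude
    return np.angle(np.fft.ifft2(field))           # final hologram phase pattern
```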
[0136] In a related example, virtual media content can be configured for display at a single depth plane selectable in space between a near eye position (relative to a user) and infinity. In an example, a holographic processor can be configured to compute hologram patterns from input media, with the input media comprising two dimensional (2D) and/or three dimensional (3D) data, where the holographic processor is adapted to execute one or more methods to correct for optical aberrations, such as optical aberrations introduced by the one or more optical elements configured as part of the optical module and/or located outside the optical module, one or more methods to accommodate for vision correction, and/or one or more methods for minimizing quantization noise in displayed virtual media content. Data transceiver 138 can include one or more receiver units and/or transmitter units, potentially implemented as one or more integrated circuits configured to enable an exchange of information between an augmented reality system, such as augmented reality system 146, and a wireless network, such as a wide area network (WAN) and/or a local area network (LAN) and/or between an augmented reality system, such as augmented reality system 146, and a (mobile) electronic device, such as a smartphone, smart watch, mobile computer or desktop computer. In an example, power unit 132 can be provided with a power management unit 134 comprising one or more power management integrated circuits (PMICs), to manage and control power use by augmented reality system 146, in order to provide relatively efficient power consumption and performance of the system. [0137] In an example of implementation, augmented reality system 146 can include one or more memory devices, with the memory devices comprising, for example, one or more of a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner. In an example of implementation and operation, one or more pre-computed hologram patterns are stored on the one or more memory devices. In another example, one or more hologram patterns are computed external to augmented reality system 146 and transmitted to augmented reality system 146 over data transceiver 138 for storage on one or more memory devices of augmented reality system 146. In an additional example of implementation, augmented reality system 146 can include one or more control units or control modules, potentially implemented as one or more integrated circuits, configured for driving and/or synchronizing various augmented reality system 146 functions. In various examples, the one or more control units can be implemented using one or more application processing devices. In yet another example of implementation, optical module 136 can include one or more optical pre-path(s) and/or one or more optical post-path(s). In an example, optical pre-path(s) and/or optical post-path(s), together with illumination source 144 and spatial light modulator(s) 142, comprise optical module 136.
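Placement of content at a selectable depth plane, as described above and in the claims below, can be sketched by multiplying a hologram pattern computed for infinite depth by a quadratic thin-lens phase; the thin-lens formulation and parameter names here are assumptions for illustration, not a definitive implementation:

```python
import numpy as np

def apply_lens_function(hologram: np.ndarray, wavelength_m: float,
                        pixel_pitch_m: float, focal_length_m: float) -> np.ndarray:
    """Multiply a complex hologram (computed for infinite depth) by a
    quadratic lens phase so the content reconstructs at a finite depth."""
    h, w = hologram.shape
    y = (np.arange(h) - h / 2.0) * pixel_pitch_m
    x = (np.arange(w) - w / 2.0) * pixel_pitch_m
    xx, yy = np.meshgrid(x, y)                     # per-pixel coordinates in meters
    lens_phase = -np.pi / (wavelength_m * focal_length_m) * (xx**2 + yy**2)
    return hologram * np.exp(1j * lens_phase)
```

In a layered approach, a lens function with a different focal length can be applied per 2D scene layer before the per-layer patterns are aggregated (for example, by complex summation) and then quantized.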
[0138] In an example, various components included in augmented reality system 146, such as power management unit 134, the control unit, holographic processor 140, memory and data transceiver 138 can be implemented as individual components on one or more printed circuit boards. In another example, various components of augmented reality system 146 can be adapted for implementation in the frame of an augmented reality device, such as in the temple (or temples) of augmented reality glasses. In an alternative example, some or all of the power management unit 134, the control unit, holographic processor 140, memory and data transceiver 138 can be implemented on a single electronic chip (such as a System on Chip or SoC), or consolidated on a plurality of electronic chips. In an example of operation, integration of some or all of the components of augmented reality system 146, including the power management unit 134, the control unit, holographic processor 140, memory and data transceiver 138, on a single electronic chip (such as a System on Chip or SoC) can be used to, for example, reduce the overall footprint of augmented reality system 146 and/or to provide a potentially more compact implementation, all while potentially increasing performance and/or power efficiency. In another example of implementation, integrating some or all of the components of augmented reality system 146 on a SoC can enable reduced manufacturing costs for augmented reality system 146, enabling overall lower cost for an associated augmented reality viewing device.
[0139] An example augmented reality device includes an interface for interfacing with a network, an illumination source, a spatial light modulator, a holographic processor, an optical combiner element, memory and a processing module operably coupled to the interface and to the memory. In an example, the processing module can be operable to receive, via the interface, data, which can represent two dimensional (2D) images, three dimensional (3D) objects and/or three dimensional (3D) scenes, provide the data to the holographic processor, receive, from the holographic processor, a hologram pattern, and provide the hologram pattern to the spatial light modulator for the hologram pattern to be rendered on the spatial light modulator, such that the media represented by the data received via the interface can be perceived by a hypothetical user via the optical combiner element. In various examples, the optical combiner element can be a semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles. In a specific related example, the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays. In another example of implementation, the optical combiner element can include a semi-transparent, reflective coating adapted as part of the lenses of augmented reality glasses. [0140] In an example of implementation, an illumination source, a spatial light modulator and a holographic processor can be implemented in an optical module. In another example, the augmented reality device can be implemented as augmented reality glasses, with an optical module coupled to a temple of the augmented reality glasses.
[0141] In another example of implementation, a spatial light modulator has a respective top and a respective bottom surface, where the illumination source can be configured to direct light at the top surface of the spatial light modulator and where the top surface of the spatial light modulator can be overlaid with a spatially-varying pattern of color filters formed on the top surface. In an example of implementation, the spatially-varying pattern of color filters comprises red color filters, green color filters and blue color filters; wherein the red color filters are transparent to red light, but blocking/absorptive for green and blue light, wherein the green color filters are transparent to green light, but blocking/absorptive for red and blue light, and wherein the blue color filters are transparent to blue light, but blocking/absorptive for red and green light. In an example, the proportion of red, green and blue color filters in the spatially-varying pattern of color filters can be equal. In another example, the proportion of red, green and blue color filters in the spatially-varying pattern of color filters can be unequal. In an example of implementation, the proportion of red color filters in the spatially-varying pattern of color filters can be 1/4, the proportion of green color filters can be 2/4 and the proportion of blue color filters can be 1/4. In an example, a higher proportion can be given to the green color filters in the spatially-varying pattern of color filters as the human visual system can be more sensitive to green. In another example of implementation, the proportion of red color filters in the spatially-varying pattern of color filters can be 2/6, the proportion of green color filters can be 3/6 and the proportion of blue color filters can be 1/6. In another example, the spatially-varying pattern of color filters comprises red color filters, green color filters transparent for a first wavelength of green light, green color filters transparent for a second wavelength of green light different from the first wavelength of green light, and blue color filters, where the illumination source of the system can be adapted to deliver both the first and second wavelengths of green light. In another example, the spatially-varying pattern of color filters can be adapted with color filters, wherein a first set of color filters can be transparent for a first wavelength of red light and a second set of color filters can be transparent for a second wavelength of red light different from the first wavelength of red light; and/or with color filters, wherein a first set of color filters can be transparent for a first wavelength of blue light and a second set of color filters can be transparent for a second wavelength of blue light different from the first wavelength of blue light. In all examples herein, the proportion of the different sets of color filters in the spatially-varying pattern of color filters can be equal or unequal.
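A Bayer-like 2x2 tile is one arrangement that satisfies the 1/4 red, 2/4 green, 1/4 blue proportion described above; the specific tile layout in this sketch is an illustrative assumption, not the only possible pattern:

```python
import numpy as np

def color_filter_mosaic(rows: int, cols: int) -> np.ndarray:
    """Spatially-varying color filter pattern built from a repeating
    2x2 tile with one red, two green and one blue filter."""
    tile = np.array([["R", "G"],
                     ["G", "B"]])
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(tile, reps)[:rows, :cols]

# Each filter is transparent to its own color band and blocks the other
# two, so pixels under red filters modulate only the red channel, etc.
mosaic = color_filter_mosaic(4, 6)
```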
[0142] In another example of implementation and operation, an optical module for displaying virtual media content on a viewing device comprises an interface, an illumination source, and a wireless transceiver configured to receive data representing two dimensional (2D) images, three dimensional (3D) objects and/or three dimensional (3D) scenes, and/or hologram patterns for rendering on one or more spatial light modulators for displaying of virtual media content by the optical module. In an example, the optical module includes a spatial light modulator and one or more holographic processors, where the illumination source can be configured to direct illumination to the spatial light modulator and where the one or more holographic processors are adapted to compute hologram patterns for rendering on the spatial light modulator. In a related example, the optical module can be adapted to generate virtual media content for display using an optical combiner element, where the optical combiner element, potentially implemented as a combiner mirror, can be configured to combine virtual media content with a real-world environment so that the virtual media content and the real-world environment are viewable at the same time when looking at the optical combiner element. In various examples, the optical combiner element can take the form of any of a combiner mirror, a holographic optical element or a meta-surface. In one or more examples, the optical combiner element can be a semi-transparent angle-selective combiner element implemented as an optical component and configured to allow light to pass at specific angles, while reflecting or blocking light at other angles. In a specific related example, the semi-transparent angle-selective combiner can be used to selectively transmit or reflect light based on the incident angle of the incoming light rays. In another example of implementation, the optical combiner element can include a semi-transparent, reflective coating adapted as part of the lenses of augmented reality glasses.
[0143] FIG. 14A illustrates an example implementation of augmented reality headset/glasses 172 that includes spatial light modulator 152, data transceiver 158, processor 156, main board 162 and battery 164 integrated on a temple of augmented reality glasses. In various examples, some or all of the electronic components comprising augmented reality headset/glasses 172, implemented as integrated circuits, along with additional electronic components, also implemented as integrated circuits, can be combined on a common System on Chip (SoC) or as two or more electronic chips integrated using advanced packaging techniques. Example packaging includes multi-chip modules, three-dimensional integrated circuit (3D IC) packages using multi-chip stacking and through silicon vias (TSVs), as well as system-in-package (SiP) and package-on-package (PoP). Any of the foregoing can be used to reduce the footprint of the various hardware elements, while potentially increasing efficiency and improving thermal management performance for an augmented reality system. In an example, advanced packaging techniques can enable one or more of lower cost, compact design and/or comfort of wear, while increasing aesthetic appeal of augmented reality glasses. In the examples of FIG. 14A and FIG. 14B, lens treatment 160 can be provided to enable a user to perceive virtual media content overlaid onto the real-life environment. In another example, lens treatment 160 can include one of a holographic optical element or a meta-surface adapted to the lenses of the augmented reality glasses. In an example, lens treatment 160, implemented as a holographic optical element or a metasurface, can be configured as a coating for application onto the lenses of augmented reality glasses.
[0144] FIG. 14B illustrates an example implementation of an augmented reality headset/glasses 170 showing an optical module (such as optical module 100 referring to Fig. 11), comprising spatial light modulator 152 coupled to a secondary board 154 (daughter printed circuit board), all implemented in the temple of augmented reality headset/glasses 170.
[0145] FIG. 14C provides an expanded view of the optical module of augmented reality headset/glasses 170 of FIG. 14B, with the optical module (such as optical module 100 referring to Fig. 11) comprising spatial light modulator 152 coupled to a secondary board 154 (daughter printed circuit board), all implemented in the temple of augmented reality headset/glasses 170.
[0146] It is noted that terminologies as may be used herein such as bit stream, stream, signal sequence, etc. (or their equivalents) have been used interchangeably to describe digital information whose content corresponds to any of a number of desired types (e.g., data, video, speech, text, graphics, audio, etc. any of which may generally be referred to as ‘data’).
[0147] As may be used herein, the terms “substantially” and “approximately” provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. For some industries, an industry-accepted tolerance is less than one percent and, for other industries, the industry-accepted tolerance is 10 percent or more. Other examples of industry-accepted tolerances range from less than one percent to fifty percent. Industry-accepted tolerances correspond to, but are not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, thermal noise, dimensions, signaling errors, dropped packets, temperatures, pressures, material compositions, and/or performance metrics. Within an industry, tolerance variances of accepted tolerances may be more or less than a percentage level (e.g., dimension tolerance of less than +/- 1%). Some relativity between items may range from a difference of less than a percentage level to a few percent. Other relativity between items may range from a difference of a few percent to magnitudes of differences.
[0148] As may also be used herein, the term(s) “configured to”, “operably coupled to”, “coupled to”, and/or “coupling” includes direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for an example of indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As may further be used herein, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as “coupled to”.
[0149] As may even further be used herein, the term “configured to”, “operable to”, “coupled to”, or “operably coupled to” indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform, when activated, one or more of its corresponding functions and may further include inferred coupling to one or more other items. As may still further be used herein, the term “associated with”, includes direct and/or indirect coupling of separate items and/or one item being embedded within another item.
[0150] As may be used herein, the term “compares favorably”, indicates that a comparison between two or more items, signals, etc., indicates an advantageous relationship that would be evident to one skilled in the art in light of the present disclosure, and based, for example, on the nature of the signals/items that are being compared. As may be used herein, the term “compares unfavorably”, indicates that a comparison between two or more items, signals, etc., fails to provide such an advantageous relationship and/or provides a disadvantageous relationship. Such an item/signal can correspond to one or more numeric values, one or more measurements, one or more counts and/or proportions, one or more types of data, and/or other information with attributes that can be compared to a threshold, to each other and/or to attributes of other information to determine whether a favorable or unfavorable comparison exists. Examples of such an advantageous relationship can include: one item/signal being greater than (or greater than or equal to) a threshold value, one item/signal being less than (or less than or equal to) a threshold value, one item/signal being greater than (or greater than or equal to) another item/signal, one item/signal being less than (or less than or equal to) another item/signal, one item/signal matching another item/signal, one item/signal substantially matching another item/signal within a predefined or industry accepted tolerance such as 1%, 5%, 10% or some other margin, etc. Furthermore, one skilled in the art will recognize that such a comparison between two items/signals can be performed in different ways. For example, when the advantageous relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1. Similarly, one skilled in the art will recognize that the comparison of the inverse or opposite of items/signals and/or other forms of mathematical or logical equivalence can likewise be used in an equivalent fashion. For example, the comparison to determine if a signal X > 5 is equivalent to determining if -X < -5, and the comparison to determine if signal A matches signal B can likewise be performed by determining -A matches -B or not(A) matches not(B). As may be discussed herein, the determination that a particular relationship is present (either favorable or unfavorable) can be utilized to automatically trigger a particular action. Unless expressly stated to the contrary, the absence of that particular condition may be assumed to imply that the particular action will not automatically be triggered. In other examples, the determination that a particular relationship is present (either favorable or unfavorable) can be utilized as a basis or consideration to determine whether to perform one or more actions. Note that such a basis or consideration can be considered alone or in combination with one or more other bases or considerations to determine whether to perform the one or more actions. In one example where multiple bases or considerations are used to determine whether to perform one or more actions, the respective bases or considerations are given equal weight in such determination. In another example where multiple bases or considerations are used to determine whether to perform one or more actions, the respective bases or considerations are given unequal weight in such determination.
[0151] As may be used herein, one or more claims may include, in a specific form of this generic form, the phrase “at least one of a, b, and c” or of this generic form “at least one of a, b, or c”, with more or less elements than “a”, “b”, and “c”. In either phrasing, the phrases are to be interpreted identically. In particular, “at least one of a, b, and c” is equivalent to “at least one of a, b, or c” and shall mean a, b, and/or c. As an example, it means: “a” only, “b” only, “c” only, “a” and “b”, “a” and “c”, “b” and “c”, and/or “a”, “b”, and “c”.
[0152] As may also be used herein, a hologram pattern or hologram refers to a light interference pattern; when illuminated, a hologram pattern acts as a diffraction pattern that diffracts incident light. Further, a holographic image refers to the visual result perceivable by a viewer when a hologram pattern is properly illuminated. As such, the visual result perceivable by a viewer includes two-dimensional (2D) images, two-dimensional (2D) representations, two-dimensional (2D) information, three-dimensional (3D) objects and three-dimensional (3D) scenes.
[0153] As may also be used herein, the terms “processing module”, “processing circuit”, “processor”, “processing circuitry”, and/or “processing unit” may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module, module, processing circuit, processing circuitry, and/or processing unit may be, or further include, memory and/or an integrated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of another processing module, module, processing circuit, processing circuitry, and/or processing unit. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that if the processing module, module, processing circuit, processing circuitry, and/or processing unit includes more than one processing device, the processing devices may be centrally located (e.g., directly coupled together via a wired and/or wireless bus structure) or may be distributedly located (e.g., cloud computing via indirect coupling via a local area network and/or a wide area network). Further note that if the processing module, module, processing circuit, processing circuitry and/or processing unit implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. Still further note that, the memory element may store, and the processing module, module, processing circuit, processing circuitry and/or processing unit executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in one or more of the Figures. Such a memory device or memory element can be included in an article of manufacture.
[0154] One or more embodiments have been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claims. Further, the boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality.
[0155] To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claims. One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.
[0156] In addition, a flow diagram may include a “start” and/or “continue” indication. The “start” and “continue” indications reflect that the steps presented can optionally be incorporated in or otherwise used in conjunction with one or more other routines. In addition, a flow diagram may include an “end” and/or “continue” indication. The “end” and/or “continue” indications reflect that the steps presented can end as described and shown or optionally be incorporated in or otherwise used in conjunction with one or more other routines. In this context, “start” indicates the beginning of the first step presented and may be preceded by other activities not specifically shown. Further, the “continue” indication reflects that the steps presented may be performed multiple times and/or may be succeeded by other activities not specifically shown. Further, while a flow diagram indicates a particular ordering of steps, other orderings are likewise possible provided that the principles of causality are maintained.
[0157] The one or more embodiments are used herein to illustrate one or more aspects, one or more features, one or more concepts, and/or one or more examples. A physical embodiment of an apparatus, an article of manufacture, a machine, and/or of a process may include one or more of the aspects, features, concepts, examples, etc. described with reference to one or more of the embodiments discussed herein. Further, from figure to figure, the embodiments may incorporate the same or similarly named functions, steps, modules, etc. that may use the same or different reference numbers and, as such, the functions, steps, modules, etc. may be the same or similar functions, steps, modules, etc. or different ones.
[0158] Unless specifically stated to the contrary, signals to, from, and/or between elements in a figure of any of the figures presented herein may be analog or digital, continuous time or discrete time, and single-ended or differential. For instance, if a signal path is shown as a single-ended path, it also represents a differential signal path. Similarly, if a signal path is shown as a differential path, it also represents a single-ended signal path. While one or more particular architectures are described herein, other architectures can likewise be implemented that use one or more data buses not expressly shown, direct connectivity between elements, and/or indirect coupling between other elements as recognized by one of average skill in the art.
[0159] The term “module” is used in the description of one or more of the embodiments. A module implements one or more functions via a device such as a processor or other processing device or other hardware that may include or operate in association with a memory that stores operational instructions. A module may operate independently and/or in conjunction with software and/or firmware. As also used herein, a module may contain one or more submodules, each of which may be one or more modules.
[0160] As may further be used herein, a computer readable memory includes one or more memory elements. A memory element may be a separate memory device, multiple memory devices, or a set of memory locations within a memory device. Such a memory device may be a read-only memory, random access memory, volatile memory, nonvolatile memory, static memory, dynamic memory, flash memory, cache memory, a quantum register or other quantum memory and/or any other device that stores data in a non-transitory manner. Furthermore, the memory device may be in a form of a solid-state memory, a hard drive memory or other disk storage, cloud memory, thumb drive, server memory, computing device memory, and/or other non-transitory medium for storing data. The storage of data includes temporary storage (i.e., data is lost when power is removed from the memory element) and/or persistent storage (i.e., data is retained when power is removed from the memory element). As used herein, a transitory medium shall mean one or more of: (a) a wired or wireless medium for the transportation of data as a signal from one computing device to another computing device for temporary storage or persistent storage; (b) a wired or wireless medium for the transportation of data as a signal within a computing device from one element of the computing device to another element of the computing device for temporary storage or persistent storage; (c) a wired or wireless medium for the transportation of data as a signal from one computing device to another computing device for processing the data by the other computing device; and (d) a wired or wireless medium for the transportation of data as a signal within a computing device from one element of the computing device to another element of the computing device for processing the data by the other element of the computing device. As may be used herein, a non-transitory computer readable memory is substantially equivalent to a computer readable memory. A non-transitory computer readable memory can also be referred to as a non-transitory computer readable storage medium.
[0161] One or more functions associated with the methods and/or processes described herein can be implemented via a processing module that operates via the non-human “artificial” intelligence (AI) of a machine. Examples of such AI include machines that operate via anomaly detection techniques, decision trees, association rules, expert systems and other knowledge-based systems, computer vision models, artificial neural networks, convolutional neural networks, support vector machines (SVMs), Bayesian networks, genetic algorithms, feature learning, sparse dictionary learning, preference learning, deep learning and other machine learning techniques that are trained using training data via unsupervised, semi-supervised, supervised and/or reinforcement learning, and/or other AI. The human mind is not equipped to perform such AI techniques, not only due to the complexity of these techniques, but also due to the fact that artificial intelligence, by its very definition - requires “artificial” intelligence - i.e., machine/non-human intelligence.
[0162] One or more functions associated with the methods and/or processes described herein can be implemented as a large-scale system that is operable to receive, transmit and/or process data on a large-scale. As used herein, large-scale refers to a large amount of data, such as one or more kilobytes, megabytes, gigabytes, terabytes or more of data that are received, transmitted and/or processed. Such receiving, transmitting and/or processing of data cannot practically be performed by the human mind on a large-scale within a reasonable period of time, such as within a second, a millisecond, microsecond, a real-time basis or other high speed required by the machines that generate the data, receive the data, convey the data, store the data and/or use the data.
[0163] One or more functions associated with the methods and/or processes described herein can require data to be manipulated in different ways within overlapping time spans. The human mind is not equipped to perform such different data manipulations independently, contemporaneously, in parallel, and/or on a coordinated basis within a reasonable period of time, such as within a second, a millisecond, microsecond, a real-time basis or other high speed required by the machines that generate the data, receive the data, convey the data, store the data and/or use the data. [0164] One or more functions associated with the methods and/or processes described herein can be implemented in a system that is operable to electronically receive digital data via a wired or wireless communication network and/or to electronically transmit digital data via a wired or wireless communication network. Such receiving and transmitting cannot practically be performed by the human mind because the human mind is not equipped to electronically transmit or receive digital data, let alone to transmit and receive digital data via a wired or wireless communication network. [0165] One or more functions associated with the methods and/or processes described herein can be implemented in a system that is operable to electronically store digital data in a memory device. Such storage cannot practically be performed by the human mind because the human mind is not equipped to electronically store digital data.
[0166] One or more functions associated with the methods and/or processes described herein may operate to cause an action by a processing module directly in response to a triggering event - without any intervening human interaction between the triggering event and the action. Any such actions may be identified as being performed “automatically”, “automatically based on” and/or “automatically in response to” such a triggering event. Furthermore, any such actions identified in such a fashion specifically preclude the operation of human activity with respect to these actions - even if the triggering event itself may be causally connected to a human activity of some kind.
[0167] While particular combinations of various functions and features of the one or more embodiments have been expressly described herein, other combinations of these features and functions are likewise possible. The present disclosure is not limited by the particular examples disclosed herein and expressly incorporates these other combinations.

Claims

What is claimed is:
1. A method for a display device, the method comprises: receiving data representative of a set of two-dimensional (2D) scene layers, each 2D scene layer having a corresponding predetermined display depth in a focus space; generating a first hologram pattern for each 2D scene layer of the set of 2D scene layers to create a set of first hologram patterns, wherein each first hologram pattern of the set of first hologram patterns is adapted to place an associated 2D scene layer at infinite depth; converting, using a mathematical lens function, each first hologram pattern of the set of first hologram patterns to a second hologram pattern to create a set of second hologram patterns, wherein each second hologram pattern is adapted to place an associated 2D scene layer at a corresponding predetermined display depth in the focus space; aggregating the set of second hologram patterns to provide an aggregated hologram pattern.
2. The method of claim 1, wherein the data is generated from at least one of a three-dimensional (3D) object or a three-dimensional (3D) scene.
3. The method of claim 1, further comprising: applying random phase to each 2D scene layer of the set of 2D scene layers.
4. The method of claim 1, further comprising: correcting, using an aberration correction function, the aggregated hologram pattern.
5. The method of claim 1, further comprising: converting, using a vision correction function, the aggregated hologram pattern, wherein the vision correction function is adapted to modify the aggregated hologram pattern to correct for a vision impairment.
6. The method of claim 1, wherein the mathematical lens function includes at least one of an aberration correction function or a vision correction function.
7. The method of claim 1, further comprising: finalizing, using a quantization function, the aggregated hologram pattern to provide a quantized hologram pattern.
8. The method of claim 7, wherein the quantization function is based on one of error diffusion quantization or mask-based quantization.
9. The method of claim 7, wherein the display device comprises one or more spatial light modulator devices, and wherein the quantized hologram pattern is rendered on the one or more spatial light modulator devices.
10. The method of claim 1, wherein each first hologram pattern of the set of first hologram patterns is generated using a Fourier transform.
11. A method for a display device, with the display device comprising at least one spatial light modulator device configured for rendering one or more hologram patterns, the method comprising: dividing the at least one spatial light modulator device into a set of subareas; generating a hologram pattern for a first subarea of the set of subareas; rendering the hologram pattern of the first subarea on each subarea of the set of subareas.
12. The method of claim 11, wherein each subarea of the set of subareas has a width equal to the width of every other subarea of the set of subareas and wherein each subarea further has a length equal to the length of every other subarea of the set of subareas.
13. A method for a display device, the method comprises: receiving data representative of media for display at a predetermined display depth in a focus space; generating, from the data, a first hologram pattern, wherein the first hologram pattern has a size corresponding to a portion of a spatial light modulator, wherein the first hologram pattern is adapted to place the media at an infinite depth; placing two or more replications of the first hologram pattern in an array creating a second hologram pattern, wherein the second hologram pattern is configured to have a width that is a multiple of a width of the first hologram pattern and wherein the second hologram pattern is further configured to have a length that is a multiple of a length of the first hologram pattern; converting, using a mathematical lens function, the second hologram pattern to a third hologram pattern, wherein the third hologram pattern is adapted to place the media at the predetermined display depth in the focus space.
14. The method of claim 13, further comprising: applying random phase to the data representative of media before generating the first hologram pattern.
15. The method of claim 13, further comprising: correcting, using an aberration correction function, the third hologram pattern.
16. The method of claim 13, further comprising: converting, using a vision correction function, the third hologram pattern, wherein the vision correction function is adapted to modify the third hologram pattern to correct for a vision impairment.
17. The method of claim 13, wherein the mathematical lens function further includes one of an aberration correction function or a vision correction function.
18. The method of claim 13, further comprising: finalizing, using a quantization function, the third hologram pattern to provide a quantized hologram pattern.
19. The method of claim 18, wherein the quantization function is based on one of error diffusion quantization or mask-based quantization.
20. The method for a display device of claim 18, wherein the display device comprises one or more spatial light modulator devices, wherein the quantized hologram pattern is rendered on the one or more spatial light modulator devices.
21. The method of claim 13, wherein the first hologram pattern is generated using a Fourier transform.
22. A method for a display device, wherein the display device comprises at least one spatial light modulator device, the method comprises: receiving data representative of a set of two-dimensional (2D) scene layers, wherein a 2D scene layer has a corresponding predetermined display depth in a focus space; generating a first hologram pattern for each 2D scene layer of the set of 2D scene layers to create a set of first hologram patterns, wherein each first hologram pattern places an associated 2D scene layer at an infinite depth; wherein each first hologram pattern has a size corresponding to a portion of the at least one spatial light modulator device; for each first hologram pattern of the set of first hologram patterns, placing two or more replications of the first hologram pattern in an array creating a second hologram pattern to create a set of second hologram patterns, wherein a width of each second hologram pattern is a multiple of a width of an associated first hologram pattern and wherein a length of each second hologram pattern is a multiple of a length of an associated first hologram pattern; converting, using a mathematical lens function, each second hologram pattern of the set of second hologram patterns to create a set of third hologram patterns, wherein each third hologram pattern is adapted to place an associated 2D scene layer at a corresponding predetermined display depth in the focus space; aggregating the set of third hologram patterns to provide an aggregated hologram pattern.
23. The method of claim 22, wherein each first hologram pattern of the set of first hologram patterns has a width equal to the width of every other first hologram pattern of the set of first hologram patterns and wherein each first hologram pattern further has a length equal to the length of every other first hologram pattern of the set of first hologram patterns.
24. The method of claim 22, wherein the data is generated from at least one of a three-dimensional (3D) object or a three-dimensional (3D) scene.
25. The method of claim 22, further comprising: applying random phase to each 2D scene layer of the set of 2D scene layers.
26. The method of claim 22, further comprising: correcting, using an aberration correction function, the aggregated hologram pattern.
27. The method of claim 22, further comprising: converting, using a vision correction function, the aggregated hologram pattern, wherein the vision correction function is adapted to modify the aggregated hologram pattern to correct for a vision impairment.
28. The method of claim 22, wherein the mathematical lens function further includes at least one of an aberration correction function or a vision correction function.
29. The method of claim 22, further comprising: finalizing, using a quantization function, the aggregated hologram pattern to provide a quantized hologram pattern.
30. The method of claim 29, wherein the quantization function is based on at least one of error diffusion quantization or mask-based quantization.
31. The method for a display device of claim 29, wherein the display device comprises one or more spatial light modulator devices, wherein the quantized hologram pattern is adapted for rendering on the one or more spatial light modulator devices.
32. The method of claim 22, wherein a first hologram pattern of the set of first hologram patterns is generated using a Fourier transform.
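Claims 22-32 extend the same pipeline to a stack of 2D scene layers, one finite depth per layer. A minimal sketch follows, reusing the `first_hologram`, `replicate`, and `lens_phase` helpers assumed above; the element-wise summation used for "aggregating" is itself an assumption, since the claims do not specify the aggregation operator.

```python
import numpy as np

def aggregate_hologram(layers, depths, pitch=0.5e-6, wavelength=520e-9,
                       ny=4, nx=4):
    """One tiled far-field hologram per 2D layer (claims 22 and 32), each
    refocused to its own depth by a lens function, then aggregated."""
    acc = None
    for layer, depth in zip(layers, depths):
        shape = (layer.shape[0] * ny, layer.shape[1] * nx)
        h3 = replicate(first_hologram(layer), ny, nx) \
             * lens_phase(shape, pitch, wavelength, depth)
        acc = h3 if acc is None else acc + h3          # aggregation step
    return acc
```

Because every first hologram pattern shares the same width and length (claim 23), the per-layer third patterns line up and can be combined element-wise before quantization (claims 29-31).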
33. A method for execution by one or more processing modules of one or more computing devices of a mobile device, the method comprises: receiving a visual search request; capturing, in response to the visual search request, a scene; segmenting the scene into a plurality of elements; determining a relative depth of one or more elements of the plurality of elements; facilitating display of an indicator associated with one or more of the one or more elements; determining whether the indicator is associated with an element of interest; in response to a determination that the indicator is associated with an element of interest, transmitting sensor data associated with the element of interest to a processing device; receiving, from the processing device, a reverse image search result; and facilitating display of an indicator representative of the reverse image search result.
34. The method of claim 33, wherein the segmenting the scene into a plurality of elements is executed by one or more processing devices, wherein the one or more processing devices are external to the mobile device and wherein segmented scene data is adapted for transmission to the mobile device by the one or more processing devices.
35. The method of claim 33, wherein the indicator associated with the element of interest is an outline of the element of interest.
36. The method of claim 35, wherein the outline is representative of an augmented reality overlay.
37. The method of claim 33, wherein the indicator associated with the element of interest is adapted for display at a predetermined depth relative to a depth of the element of interest.
38. The method of claim 37, wherein the predetermined depth is a same depth as a depth of the element of interest.
39. The method of claim 33, wherein the indicator representative of the reverse image search result includes contextual information pertaining to the element of interest.
40. The method of claim 33, wherein the indicator representative of the reverse image search result is adapted for display at a predetermined depth relative to a depth of the element of interest.
41. The method of claim 40, wherein the predetermined depth is a same depth as a depth of the element of interest.
42. The method of claim 33, wherein the processing device is a remote server.
43. The method of claim 33, wherein the mobile device is an augmented reality device.
44. The method of claim 43, wherein the mobile device is further configured to collect metadata.
45. The method of claim 44, further comprising: transmitting the metadata to the processing device.
46. The method of claim 44, wherein the metadata is captured by one or more sensors associated with the mobile device.
47. The method of claim 44, wherein the metadata comprises at least one of descriptive metadata, geospatial metadata or contextual metadata.
48. A method for execution by one or more processing modules of one or more computing devices of a mobile device, the method comprises: receiving a visual search request; capturing, in response to the visual search request, a scene; segmenting the scene into a plurality of elements; determining, based on a tracking element, a potential element of interest from the plurality of elements; determining a relative depth of the potential element of interest; facilitating display of an indicator associated with the potential element of interest; determining whether the indicator is associated with an actual element of interest; in response to a determination that the indicator is associated with the actual element of interest, transmitting sensor data associated with the actual element of interest to a processing device; receiving, from the processing device, a reverse image search result; and facilitating display of an indicator representative of the reverse image search result.
49. The method of claim 48, wherein the tracking element is at least one of an eye tracking sensor or a gaze tracking sensor.
50. The method of claim 48, wherein the segmenting the scene into a plurality of elements is executed by one or more processing devices, wherein the one or more processing devices are external to the mobile device and wherein segmented scene data is adapted for transmission to the mobile device by the one or more processing devices.
51. The method of claim 48, wherein the indicator associated with the potential element of interest is an outline of the potential element of interest.
52. The method of claim 51, wherein the outline is representative of an augmented reality overlay.
53. The method of claim 48, wherein the indicator associated with the potential element of interest is displayed at a predetermined depth relative to a depth of the potential element of interest.
54. The method of claim 53, wherein the predetermined depth is a same depth as the depth of the potential element of interest.
55. The method of claim 48, wherein the indicator representative of the reverse image search result includes contextual information pertaining to the element of interest.
56. The method of claim 48, wherein the indicator representative of the reverse image search result is adapted for display at a predetermined depth relative to a depth of the element of interest.
57. The method of claim 56, wherein the predetermined depth is a same depth as the depth of the element of interest.
58. The method of claim 48, wherein the processing device is a remote server.
59. The method of claim 48, wherein the mobile device is an augmented reality device.
60. The method of claim 48, wherein the mobile device is further configured to collect metadata.
61. The method of claim 60, further comprising: transmitting the metadata to the processing device.
62. The method of claim 60, wherein the metadata comprises at least one of descriptive metadata, geospatial metadata or contextual metadata.
63. The method of claim 60, wherein the metadata is captured by one or more sensors associated with the mobile device.
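Claims 33-47 and 48-63 recite the same visual-search flow, differing mainly in how the candidate element is selected (claim 49's eye/gaze tracking). A control-flow sketch follows; the claims name steps, not APIs, so every callable and type below is hypothetical and supplied by the caller.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class Element:
    bbox: tuple      # (x, y, w, h) in scene coordinates -- assumed layout
    depth: float     # relative depth estimate

def visual_search(capture: Callable[[], object],
                  segment: Callable[[object], Sequence[Element]],
                  pick_target: Callable[[Sequence[Element]], Element],
                  confirm: Callable[[Element], bool],
                  search: Callable[[object, Element], dict],
                  show: Callable[..., None]) -> None:
    """Flow of claims 33 and 48; pick_target is where the claim-48/49
    gaze-tracked variant differs from the claim-33 variant."""
    scene = capture()                      # capture on the visual search request
    elements = segment(scene)              # may run off-device (claims 34/50)
    target = pick_target(elements)         # potential element of interest
    show(kind="outline", element=target, depth=target.depth)   # AR overlay
    if confirm(target):                    # indicator matches element of interest
        result = search(scene, target)     # reverse image search on a server
        show(kind="result", element=target, depth=target.depth, info=result)
```

Rendering the outline and the search result at the element's own depth (claims 38, 41, 54, 57) is what distinguishes this flow from a flat 2D overlay.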
64. An augmented reality device comprising: an interface for interfacing with a network; a light source; a spatial light modulator; one or more optical elements; memory; and a processing module operably coupled to the interface and to the memory.
65. The augmented reality device of claim 64, with the processing module operable to: receive, via the interface, data representative of a hologram pattern; provide the data to the spatial light modulator, wherein the spatial light modulator is configured to render the hologram pattern; facilitate illuminating, by the light source, the rendered hologram pattern to provide media visible to a user.
66. The augmented reality device of claim 64, further comprising a holographic processing module, wherein the processing module is further operable to: receive, via the interface, data representative of media for display; transmit the data to the holographic processing module; receive, from the holographic processing module, a hologram pattern; provide the hologram pattern to the spatial light modulator for rendering to provide a rendered hologram pattern; facilitate illuminating, by the light source, the rendered hologram pattern to provide media visible to a user.
67. The augmented reality device of claim 66, wherein the holographic processing module is configured to execute a Computer-Generated Holography (CGH) algorithm.
68. The augmented reality device of claim 67, wherein the CGH algorithm comprises at least one of a Fourier transform algorithm, a Fresnel transform algorithm, an iterative Fourier transform algorithm (IFTA), a point cloud method-based algorithm, an angular spectrum method-based algorithm, or a look-up table (LUT) method-based algorithm.
69. The augmented reality device of claim 66, wherein the media for display comprises at least one of a two-dimensional (2D) image, a two-dimensional (2D) representation, two-dimensional (2D) information, a three-dimensional (3D) object or a three-dimensional (3D) scene.
70. An optical display system comprising: one or more spatial light modulators configured to display holographic images viewable by a user; one or more illumination sources; and one or more optical elements.
71. The optical display system of claim 70, wherein the holographic images comprise at least one of a two-dimensional (2D) image, a two-dimensional (2D) representation, two-dimensional (2D) information, a three-dimensional (3D) object or a three-dimensional (3D) scene.
72. The optical display system of claim 70 further comprising: one or more holographic processing modules configured to execute one or more Computer-Generated Holography (CGH) algorithms.
73. The optical display system of claim 72, wherein the one or more CGH algorithms comprise at least one of a Fourier transform algorithm, a Fresnel transform algorithm, an iterative Fourier transform algorithm (IFTA), a point cloud method-based algorithm, an angular spectrum method-based algorithm, or a look-up table (LUT) method-based algorithm.
74. The optical display system of claim 70, wherein a spatial light modulator comprises an array of light modulating elements.
75. The optical display system of claim 74, wherein a pitch of two adjacent light modulating elements of the array of light modulating elements is equal to or less than a predetermined visible light wavelength.
76. The optical display system of claim 74, wherein each light modulating element of the array of light modulating elements modulates at least one of amplitude, phase or polarization of light incident to the light modulating element.
77. The optical display system of claim 70, further comprising: one or more displays, wherein a display of the one or more displays is at least one of a head-mounted display, a head-up display, a stereoscopic display, or a holographic display.
78. A method for execution by one or more processing modules of one or more computing devices, the method comprises: generating a quantization mask, wherein the quantization mask is adapted to move noise associated with a quantization process outside a predetermined signal window in a frequency domain; quantizing, based on the quantization mask, a continuous hologram to generate a quantized hologram.
79. The method of claim 78, wherein quantizing the continuous hologram includes comparing a value of the continuous hologram to a corresponding value of the quantization mask.
80. The method of claim 78, wherein the quantization mask is configured to have a same size as the continuous hologram.
81. The method of claim 78, wherein the quantization mask is configured to have a size smaller than the continuous hologram.
82. The method of claim 81, wherein the quantization mask is replicated over the continuous hologram for quantization of the continuous hologram to create a quantized hologram.
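A minimal sketch of the mask-based quantization of claims 78-82, assuming a real-valued continuous hologram and a two-state (binary) output; the design of a mask that actually pushes quantization noise outside the signal window (e.g., a blue-noise-style mask) is not shown here.

```python
import numpy as np

def mask_quantize(hologram: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Claim 79: compare each hologram value to the corresponding mask
    value. Claims 81-82: a smaller mask is replicated (tiled) over the
    hologram before the comparison."""
    reps = (-(-hologram.shape[0] // mask.shape[0]),       # ceiling division
            -(-hologram.shape[1] // mask.shape[1]))
    tiled = np.tile(mask, reps)[:hologram.shape[0], :hologram.shape[1]]
    return np.where(hologram >= tiled, 1.0, -1.0)         # two states
```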
83. A method for execution by one or more processing modules of one or more computing devices, the method comprises: generating a first quantization mask, with the first quantization mask having a size smaller than a size of a continuous hologram, wherein the first quantization mask is adapted to move noise associated with a quantization process outside a predetermined signal window in a frequency domain; generating a second quantization mask, with the second quantization mask having a size smaller than the size of the continuous hologram, wherein the second quantization mask is adapted to move noise associated with the quantization process outside a predetermined signal window in the frequency domain; using the first and the second quantization masks to quantize a continuous hologram to generate a quantized hologram.
84. The method of claim 83, wherein the second quantization mask is different from the first quantization mask.
85. The method of claim 83, wherein using the first and the second quantization mask to quantize a continuous hologram into a quantized hologram includes comparing a value of the continuous hologram to a corresponding value of the first quantization mask or of the second quantization mask.
86. The method of claim 83, wherein the second quantization mask is configured to have a same width as the first quantization mask and is configured to have a same length as the first quantization mask.
87. The method of claim 83, wherein the first and the second quantization mask are alternately replicated along a horizontal direction or a vertical direction, or along both a horizontal and a vertical direction of the continuous hologram.
88. A method for execution by one or more processing modules of one or more computing devices, the method comprises: receiving a continuous hologram, wherein the continuous hologram is divided into an array of pixels, wherein each pixel is associated with a value corresponding to a value of the continuous hologram; using a quantization mask to quantize the pixel value of each pixel of the array of pixels to one of a plurality of states, wherein the quantization mask is configured to facilitate moving noise associated with the quantization process outside a predetermined signal window in a frequency domain.
89. The method of claim 88, the method comprising: adapting the continuous hologram so that values of the continuous hologram lie between -1 and 1, with -1 and 1 included; dividing the continuous hologram to provide four quadrants sharing a quadripoint, a first quadrant (top right quadrant) including a portion of the array of pixels, wherein each of a second quadrant (top left quadrant), a third quadrant (bottom left quadrant) and a fourth quadrant (bottom right quadrant) respectively includes another equal size portion of the array of pixels; generating a first quantization mask, wherein each value of the first quantization mask lies between -1 and 1, with -1 and 1 included; reversing the sign of each value of the first quantization mask to provide a second quantization mask; using the first quantization mask over the second and the fourth quadrant for quantization of the second and the fourth quadrant of the continuous hologram; reversing the sign of each value of the first quadrant and of the third quadrant of the continuous hologram providing a sign-reversed first quadrant and a sign-reversed third quadrant; using the second quantization mask over the sign-reversed first and the sign-reversed third quadrant of the continuous hologram providing a quantized sign-reversed first quadrant and a quantized sign-reversed third quadrant; reversing the sign of each value of the quantized sign-reversed first quadrant and of the quantized sign-reversed third quadrant providing a quantized hologram for the first and the third quadrant of the continuous hologram.
90. The method of claim 89, wherein the first and the second quantization mask have a size smaller than the size of any of the four quadrants, the method comprising: replicating the first quantization mask over the second quadrant and over the fourth quadrant for quantization of the second and the fourth quadrant of the continuous hologram providing a quantized hologram for the second and the fourth quadrant; reversing the sign of each value of the first quadrant and of the third quadrant of the continuous hologram providing a sign-reversed first quadrant and a sign-reversed third quadrant; replicating the second quantization mask over the sign-reversed first quadrant and over the sign-reversed third quadrant for quantization of the sign-reversed first and the sign-reversed third quadrant providing a quantized sign-reversed first quadrant and a quantized sign-reversed third quadrant; reversing the sign of each value of the quantized sign-reversed first quadrant and of the quantized sign-reversed third quadrant providing a quantized hologram for the first and the third quadrant of the continuous hologram.
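The quadrant scheme of claims 89-90 can be written out directly from the claim language. A sketch, assuming a real-valued hologram normalized to [-1, 1] with even dimensions; the tiling inside `binarize` covers the claim-90 case of masks smaller than a quadrant.

```python
import numpy as np

def quadrant_quantize(h: np.ndarray, m1: np.ndarray) -> np.ndarray:
    """Quadrants about the quadripoint: q1 top right, q2 top left,
    q3 bottom left, q4 bottom right."""
    r, c = h.shape[0] // 2, h.shape[1] // 2
    m2 = -m1                                  # second mask: sign-reversed first

    def binarize(block, mask):
        reps = (-(-block.shape[0] // mask.shape[0]),
                -(-block.shape[1] // mask.shape[1]))
        tiled = np.tile(mask, reps)[:block.shape[0], :block.shape[1]]
        return np.where(block >= tiled, 1.0, -1.0)

    q = np.empty_like(h)
    q[:r, :c] = binarize(h[:r, :c], m1)       # first mask on q2 (top left)
    q[r:, c:] = binarize(h[r:, c:], m1)       # first mask on q4 (bottom right)
    # Sign-reverse q1 and q3, quantize with the second mask, then reverse
    # the sign of the quantized result, per claim 89.
    q[:r, c:] = -binarize(-h[:r, c:], m2)     # q1 (top right)
    q[r:, :c] = -binarize(-h[r:, :c], m2)     # q3 (bottom left)
    return q
```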
91. A method for a display device, the method comprises: receiving data representative of media for display at a predetermined display depth in a focus space; generating, from the data, a first hologram pattern, which places the media at an infinite depth; converting, using a mathematical lens function, the first hologram pattern to a second hologram pattern, which places the media for display from the infinite depth to the predetermined display depth in the focus space.
92. The method of claim 91, wherein the media for display at a predetermined display depth in a focus space comprises one of a two-dimensional (2D) image, a two-dimensional (2D) representation or two-dimensional (2D) information.
93. The method of claim 91, further comprising: applying random phase to the data representative of media for display at a predetermined display depth in a focus space before generating the first hologram pattern.
94. The method of claim 91, wherein the mathematical lens function further includes one of an aberration correction function or a vision correction function.
95. The method of claim 91, further comprising: finalizing, using a quantization function, the second hologram pattern to provide a quantized hologram pattern.
96. The method of claim 95, wherein the quantization function is based on one of error diffusion quantization or mask-based quantization.
97. The method for a display device of claim 95, wherein the display device comprises one or more spatial light modulator devices, wherein the quantized hologram pattern is rendered on the one or more spatial light modulator devices.
98. The method of claim 91, wherein the first hologram pattern is generated using a Fourier transform.
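Claims 19 and 96 offer error diffusion as the alternative to mask-based quantization. A sketch using the common Floyd-Steinberg weights follows; the claims name the technique but not the kernel, so the 7/16, 3/16, 5/16, 1/16 weights are an assumption.

```python
import numpy as np

def error_diffusion_quantize(h: np.ndarray) -> np.ndarray:
    """Binarize a real-valued hologram in [-1, 1], diffusing each pixel's
    quantization error onto its not-yet-processed neighbours."""
    work = h.astype(float).copy()
    out = np.empty_like(work)
    rows, cols = work.shape
    for y in range(rows):
        for x in range(cols):
            q = 1.0 if work[y, x] >= 0.0 else -1.0
            out[y, x] = q
            err = work[y, x] - q
            if x + 1 < cols:
                work[y, x + 1] += err * 7 / 16
            if y + 1 < rows:
                if x > 0:
                    work[y + 1, x - 1] += err * 3 / 16
                work[y + 1, x] += err * 5 / 16
                if x + 1 < cols:
                    work[y + 1, x + 1] += err * 1 / 16
    return out
```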
99. An optical display system comprising: one or more spatial light modulators; one or more optical light modules; and one or more optical combiners.
100. The optical display system of claim 99, wherein the one or more spatial light modulators comprise an array of light modulating elements, wherein each light modulating element of the array of light modulating elements is individually addressable to control a state of the light modulating element, wherein each light modulating element can exhibit at least two different states, each state having different optical properties.
101. The optical display system of claim 100, wherein the pitch between two adjacent light modulating elements of the array of light modulating elements is equal to or smaller than a wavelength of the light incident to the one or more spatial light modulators.
102. The optical display system of claim 100, wherein the pitch between two adjacent light modulating elements of the array of light modulating elements is equal to or smaller than half a wavelength of light incident to the one or more spatial light modulators.
103. The optical display system of claim 100, wherein a light modulating element of the array of light modulating elements is configured to modulate any of amplitude, phase or polarization of light incident to the light modulating element.
104. The optical display system of claim 100, wherein a light modulating element of the array of light modulating elements of the one or more spatial light modulators comprises: a phase change material; a heater element; two or more electrodes connected to the heater element, wherein the light modulating element is configured to be individually addressable via the two or more electrodes; wherein a state of the light modulating element is configured to be alterable by changing a state of the phase change material in response to a thermal contribution from the heater element.
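The pitch limits in claims 101-102 follow from the grating equation; the derivation below is standard background rather than claim language.

```latex
% Maximum deflection half-angle of a pixelated SLM with pitch p:
\sin\theta_{\max} = \frac{\lambda}{2p}
% p \le \lambda   (claim 101): the first diffraction order, at
%                  \sin\theta = \lambda/p \ge 1, is evanescent, so
%                  higher-order replica images do not propagate.
% p \le \lambda/2 (claim 102): \sin\theta_{\max} \ge 1, i.e. the modulator
%                  can steer light over the full half-space in front of it.
```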
105. The optical display system of claim 99, wherein an optical light module of the one or more optical light modules comprises: an illumination scheme; a first group of optical elements; and a second group of optical elements.
106. The optical display system of claim 105, wherein the illumination scheme comprises: one or more illumination sources and an optical assembly, wherein the optical assembly is configured to transmit light from the one or more illumination sources to the one or more spatial light modulators.
107. The optical display system of claim 106, wherein the one or more illumination sources include an RGB laser source.
108. The optical display system of claim 106, wherein the optical assembly comprises at least one of: a beam splitter; a Total Internal Reflection (TIR) prism; a free form prism.
109. The optical display system of claim 106, wherein the optical assembly includes a collimator element configured to provide collimated illumination to the one or more spatial light modulators.
110. The optical display system of claim 105, wherein at least one of the first group of optical elements or of the second group of optical elements of the optical light module comprises one or more achromatic lenses.
111. The optical display system of claim 105, wherein at least one of the first group of optical elements or of the second group of optical elements of the optical light module comprises at least one of: a lens; a mirror; a prism; a beam splitter; an optical filter; a polarizer.
112. The optical display system of claim 105, wherein an optical light module of the one or more optical light modules comprises: one or more optical corrector elements, wherein the one or more optical corrector elements are configured to correct for aberrations introduced by optical elements associated with the optical light module.
113. The optical display system of claim 112, wherein the one or more optical corrector elements includes at least one of: a toroid optical element; or a free form optical element.
114. The optical display system of claim 99, wherein an optical combiner of the one or more optical combiners includes at least one of: a holographic optical element; a metasurface; a semi-reflective element.
115. The optical display system of claim 105, wherein a first group of optical elements of the one or more optical light modules forms a first lens group of a 4f optical system and wherein a second group of optical elements together with the one or more optical combiners are configured to form a second lens group of the 4f optical system.
116. The optical display system of claim 115, wherein the first lens group of the 4f optical system is adapted to perform a Fourier transform on information representative of a hologram at a plane of the one or more spatial light modulators, converting the information from a spatial domain to a spatial frequency domain, wherein the spatial frequency information lies at a Fourier plane of the 4f optical system, and wherein the second lens group of the 4f optical system is adapted to perform an inverse Fourier transform on the spatial frequency information at the Fourier plane, converting it from the spatial frequency domain back to the spatial domain.
117. The optical display system of claim 99, wherein the one or more optical light modules comprise: an illumination scheme; a first group of optical elements, wherein the first group of optical elements forms a first lens group of a 4f optical system and wherein the one or more optical combiners form a second lens group of the 4f optical system.
118. The optical display system of claim 99, wherein the one or more optical light modules further comprise one or more optical filter elements.
119. The optical display system of claim 118, wherein the one or more optical filter elements are configured to filter out unwanted frequency-based information at a Fourier plane.
120. The optical display system of claim 99 further comprising one or more processing units.
121. The optical display system of claim 120, wherein the one or more processing units are implemented on a system-on-chip.
122. The optical display system of claim 120, wherein the one or more processing units are configured to execute one or more Computer-Generated Holography (CGH) algorithms to generate one or more hologram patterns for rendering using the one or more spatial light modulators.
123. The optical display system of claim 120, wherein the one or more CGH algorithms comprise at least one of a Fourier transform algorithm, a Fresnel transform algorithm, an iterative Fourier transform algorithm (IFTA), a point cloud method-based algorithm, an angular spectrum method-based algorithm, or a look-up table (LUT) method-based algorithm.
124. The optical display system of claim 120, wherein the one or more processing units are adapted to modify one or more hologram patterns for rendering using the one or more spatial light modulators to provide modified hologram patterns, wherein the modified hologram patterns are adapted to correct for aberrations introduced by the one or more optical light modules or by the one or more optical combiners.
125. The optical display system of claim 120, wherein the one or more processing units are adapted to modify one or more hologram patterns for rendering using the one or more spatial light modulators to provide modified hologram patterns, wherein the modified hologram patterns are adapted to correct for a refractive error or a vision impairment associated with a user of the optical display system.
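Claims 115-119 describe a 4f relay: the first lens group Fourier-transforms the SLM-plane field, an optical filter acts at the Fourier plane, and the second lens group (together with the combiner) transforms back to the spatial domain. An idealized thin-lens sketch follows; the binary `aperture` standing in for the claim-119 filter element is an assumption.

```python
import numpy as np

def four_f_relay(slm_field: np.ndarray, aperture: np.ndarray) -> np.ndarray:
    """Ideal 4f system: forward FFT (first lens group), multiply by the
    Fourier-plane aperture (optical filter of claims 118-119), inverse FFT
    (second lens group plus combiner) back to the spatial domain."""
    fourier_plane = np.fft.fftshift(np.fft.fft2(slm_field))
    filtered = fourier_plane * aperture               # remove unwanted orders
    return np.fft.ifft2(np.fft.ifftshift(filtered))

# Usage sketch: a centered circular signal window at the Fourier plane.
n = 512
yy, xx = np.indices((n, n))
aperture = ((yy - n / 2)**2 + (xx - n / 2)**2 <= (n / 4)**2).astype(float)
image_plane = four_f_relay(np.ones((n, n), dtype=complex), aperture)
```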
PCT/US2024/044765 2023-09-01 2024-08-30 System and methods for display of 3d multi-media Pending WO2025049975A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2025/010721 WO2025151505A1 (en) 2024-01-08 2025-01-08 Method and apparatus for holographic display

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363580198P 2023-09-01 2023-09-01
US63/580,198 2023-09-01
US202463618818P 2024-01-08 2024-01-08
US63/618,818 2024-01-08

Publications (2)

Publication Number Publication Date
WO2025049975A1 true WO2025049975A1 (en) 2025-03-06
WO2025049975A4 WO2025049975A4 (en) 2025-05-08

Family

ID=94820458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/044765 Pending WO2025049975A1 (en) 2023-09-01 2024-08-30 System and methods for display of 3d multi-media

Country Status (1)

Country Link
WO (1) WO2025049975A1 (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101550934B1 (en) * 2007-05-16 2015-09-07 SeeReal Technologies S.A. Real-time Video Hologram Generation Method for Extending the 3D-Rendering Graphics Pipeline
US20120170089A1 (en) * 2010-12-31 2012-07-05 Sangwon Kim Mobile terminal and hologram controlling method thereof
US20160378062A1 (en) * 2014-03-20 2016-12-29 Olympus Corporation Hologram data generating method, hologram image reproduction method, and hologram image reproduction device
KR20160081527A (en) * 2014-12-31 2016-07-08 Electronics and Telecommunications Research Institute Method for processing and displaying a holographic image, processing apparatus for a holographic image, and display and computer-readable recording medium for a holographic image
KR20170132311A (en) * 2015-04-01 2017-12-01 SeeReal Technologies S.A. Computation of Hologram for Hologram Reconstruction of 2D and/or 3D Scenes
KR101754976B1 (en) * 2015-06-01 2017-07-06 3D Factory Co., Ltd. Contents conversion method and apparatus for layered holograms
KR101883233B1 (en) * 2017-02-21 2018-07-30 Future Technology Research Institute Co., Ltd. Method for generating and reconstructing 3-dimensional computer generated hologram
US20190361392A1 (en) * 2018-05-25 2019-11-28 International Business Machines Corporation Image and/or video capture from different viewing angles of projected mirror like reflective holographic image surface
KR20220054619A (en) * 2019-09-03 2022-05-03 Light Field Lab, Inc. Lightfield display for mobile devices
US20230236544A1 (en) * 2022-01-21 2023-07-27 Samsung Electronics Co., Ltd. Method and apparatus for modulating depth of hologram and holographic display using the same

Also Published As

Publication number Publication date
WO2025049975A4 (en) 2025-05-08

Similar Documents

Publication Publication Date Title
US11733648B2 (en) Deep computational holography
Skirnewskaja et al. Automotive holographic head-up displays
US20230005167A1 (en) Virtual and augmented reality systems and methods
CN111819798B (en) Controlling image display in a surrounding image area via real-time compression
CN102959477B (en) Display device
JP6675312B2 (en) Methods and systems for augmented reality
US20250106503A1 (en) Virtual and augmented reality systems and methods using display system control information embedded in image data
US11281003B2 (en) Near eye dynamic holography
EP3921703B1 (en) Holographic image generated based on eye position
US12019396B2 (en) Real time holography using learned error feedback
CN107390379B (en) Near-to-eye holographic three-dimensional display system and display method
US20230171385A1 (en) Methods, systems, and computer readable media for hardware-in-the-loop phase retrieval for holographic near eye displays
US10845761B2 (en) Reduced bandwidth holographic near-eye display
KR102262214B1 (en) Apparatus and method for displaying holographic 3-dimensional image
WO2020205101A1 (en) Electronic device displays with holographic angular filters
CN107479197B (en) Holographic near-eye display system
EP3712711B1 (en) Method and apparatus for processing holographic image
WO2025049975A1 (en) System and methods for display of 3d multi-media
CN116165864B (en) A method and system for realizing a binary tomographic three-dimensional scene for augmented reality
KR20220146169A (en) Holographic display apparatus including free-formed surface and operating method of the same
EP3719581B1 (en) Method and apparatus for processing hologram image data
WO2025151505A1 (en) Method and apparatus for holographic display
WO2024029212A1 (en) Stereoscopic image display device and stereoscopic image display method
TW202209026A (en) Apparatus and method for computing hologram data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24861180

Country of ref document: EP

Kind code of ref document: A1