GB2639832A - Augmented reality display device - Google Patents

Augmented reality display device

Info

Publication number
GB2639832A
Authority
GB
United Kingdom
Prior art keywords
augmented reality
pair
camera
spectacles according
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2404118.8A
Other versions
GB202404118D0 (en)
Inventor
Salisbury Richard
Davie Alan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Thermoteknix Systems Ltd
Original Assignee
Thermoteknix Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thermoteknix Systems Ltd filed Critical Thermoteknix Systems Ltd
Priority to GB2404118.8A priority Critical patent/GB2639832A/en
Publication of GB202404118D0 publication Critical patent/GB202404118D0/en
Publication of GB2639832A publication Critical patent/GB2639832A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • AHUMAN NECESSITIES
    • A42HEADWEAR
    • A42BHATS; HEAD COVERINGS
    • A42B3/00Helmets; Helmet covers ; Other protective head coverings
    • A42B3/04Parts, details or accessories of helmets
    • A42B3/0406Accessories for helmets
    • A42B3/042Optical devices
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B13/00Optical objectives specially designed for the purposes specified below
    • G02B13/001Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras
    • G02B13/008Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras designed for infrared light
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/12Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification
    • G02B23/125Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices with means for image conversion or intensification head-mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0176Head mounted characterised by mechanical features
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00Non-optical adjuncts; Attachment thereof
    • G02C11/10Electronic devices other than hearing aids
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • G02B2027/0152Head-up displays characterised by mechanical features involving arrangement aiming to get lighter or better balanced devices
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Astronomy & Astrophysics (AREA)

Abstract

A pair of augmented reality spectacles 1 include a capture lens 3, an optical connection element 4 and an optical projector unit 21. The capture lens is on a front frame 10 of the spectacles. The optical connection element 4 directs light incident upon the capture lens to a camera 51 for detection. The camera outputs image data in response to detecting incident light. The optical projector unit projects augmented reality data that is at least partially derived from the output image data of the camera on to one or both spectacle lenses 2. There may be a single projector unit projecting augmented reality data onto one or both spectacle lenses. There may be two projector units, each projector unit projecting augmented reality data on to a different spectacle lens. Each projector unit may project different augmented reality data on to each spectacle lens. The camera may be mounted on the temple of the augmented reality spectacles. The camera may be mounted separately to the augmented reality spectacles and optically coupled to the optical connection element. The camera may detect incident light in a selected detection wavelength band outside the visible wavelength band such as long/short wavelength infrared (LWIR, SWIR).

Description

AUGMENTED REALITY DISPLAY DEVICE
Technical Field of the Invention
The present invention relates to augmented reality display devices. In particular, the present invention relates to augmented reality display devices of the optical see-through type, such as augmented reality spectacles. More particularly, the present invention relates to augmented reality spectacles wherein at least part of the augmented reality data displayed relates to images captured by an associated camera, and especially to augmented reality spectacles where the associated camera is mounted on the spectacles.
Background to the Invention
Augmented reality display devices, of the "optical see-through" type comprise at least one viewing pane or lens of substantially transparent material through which a user may view the real world and at least one projection unit configured to optically project augmented reality data on to the viewing pane. In this manner the augmented reality data appears superimposed on top of the real scene visible by the user through the viewing pane. The augmented reality data may be computer generated and can comprise any relevant data including alphanumeric data, icons, or image data derived from an associated camera.
An increasingly common example of an optical see through augmented reality device is a helmet mounted head up display (HUD). These may be used by dismounted soldiers to view relevant information such as tactical, targeting or ranging data without recourse to a separate screen. For helmet mounted HUDs, an associated camera is typically also mounted on the helmet. This set up allows a user to see a combined image where data captured by the associated camera is fused with the real scene from the user's perspective. For instance, if the associated camera was an infrared camera, a user may be able to see an infrared image fused with the real scene.
Recently, there has been a desire to substitute augmented reality spectacles (alternatively referred to as smart glasses) in place of relatively cumbersome helmet mounted HUDs. Augmented reality spectacles essentially comprise a pair of spectacles having a pair of lenses fitted to a frame, the frame comprising a frame front and rearwardly extendable temples (commonly referred to as arms), and at least one optical projector unit configured to project augmented reality data on to at least one lens. Typically, but not always, the lenses are plain and non-magnifying. Augmented reality spectacles are typically lighter in weight and more comfortable to wear for extended periods than helmet mounted HUDs.
Nevertheless, there are difficulties with replacing existing helmet mounted HUDs by augmented reality spectacles where the augmented reality data is at least partially derived from an associated camera. Firstly, mounting the associated camera on a separate helmet is no longer convenient given the need for at least a data connection between the associated camera and the augmented reality spectacles. Secondly, any displacement between the viewing pane and associated camera means that the scene from the associated camera and the scene observed directly through the viewing pane will not exactly overlap. It is possible to use an image processing transformation, such as an affine or homographic transformation, to attempt to correct for these effects in a limited set of circumstances. Nevertheless, these techniques do not work effectively over a full range of object distances and cannot easily correct for some well-known parallax effects such as obscuration and foreshortening. Additionally, in view of the computational complexity involved, carrying out such image processing transformations in real time requires significant processing and energy resources.
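The depth-dependence of such a correction can be illustrated with a minimal NumPy sketch. This is not from the patent: the focal length, baseline, and function names are assumed values chosen for illustration. A homography that compensates camera-to-eye parallax reduces, for a laterally offset camera, to a pixel translation of f·B/Z, which is exact only at one chosen depth Z.

```python
import numpy as np

def apply_homography(H, pts):
    """Map 2D points through a 3x3 homography using homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # lift to homogeneous
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian

# Assumed example values: 600 px focal length, 5 cm camera-to-eye baseline.
FOCAL_PX, BASELINE_M = 600.0, 0.05

def parallax_homography(depth_m):
    """Translation-only homography correcting parallax for one object depth.

    The shift is d = f * B / Z pixels, so the correction is exact only for
    objects at depth Z; nearer or farther objects remain misaligned.
    """
    d = FOCAL_PX * BASELINE_M / depth_m
    return np.array([[1.0, 0.0, d],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])

H = parallax_homography(depth_m=2.0)    # exact only at 2 m
pt = np.array([[100.0, 50.0]])
print(apply_homography(H, pt))          # point shifted 15 px horizontally
```

An object at 1 m instead of 2 m would need a 30 px shift, twice the applied correction, which is why a single transformation cannot serve a full range of object distances.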
Whilst it would be possible to alleviate the above issues by mounting the associated camera on the spectacles, there is a conflict between the best viewing position for the camera and the most convenient position to mount the camera. The best viewing position is clearly towards the front of the spectacles frame, so as not to have an obscured forward view. In contrast, the most convenient mounting position for a camera, bearing in mind the typical size and weight of a camera (especially when the camera comprises associated power/processing units) is towards the rear of the spectacle frame. This improves the overall balance of the spectacle frame enhancing user comfort and helping the spectacles to remain in position when worn.
Accordingly, it is an object of the present invention to provide an augmented reality display device that at least partially overcomes or alleviates some of the above problems.
Summary of the Invention
According to a first aspect of the present invention there is provided a pair of augmented reality spectacles comprising: a capture lens provided on a frame front of the spectacles; an optical connection element configured to direct light incident upon the capture lens to a camera for detection, the camera configured to output image data in response to detecting incident light; and an optical projector unit configured to project augmented reality data on to at least one spectacle lens, wherein the augmented reality data is at least partially derived from the output image data of the camera.
The present invention thereby provides a pair of augmented reality spectacles with a camera mounted in an optimum position for balance but capable of capturing unobscured forward-facing images. Furthermore, as the capture lens can be located close to one or both spectacle lenses, potential parallax issues can be minimised. Indeed, in some implementations the relative positioning of the capture lens and one or both spectacle lenses can be sufficiently close to remove the requirement for real-time image processing transformations to make such parallax corrections. Furthermore, the reduction in processing and energy resources can result in a corresponding reduction in power consumption, allowing the use of a reduced power source and/or increasing the operational time for the augmented reality spectacles for a given power source. Accordingly, the present invention provides a pair of augmented reality spectacles that can be comfortable for a user to wear and which provides improved performance.
The augmented reality spectacles may comprise a single projector unit. The single projector unit may be configured to project augmented reality data on to one or other spectacle lens. In alternative embodiments, the single projector unit may be configured to project augmented reality data on to both spectacle lenses. In further embodiments, the augmented reality spectacles may comprise two projector units. In such embodiments, each projector unit may be configured to project augmented reality data on to a different spectacle lens. In some such embodiments, each projector unit may be configured to project the same augmented reality data on to each spectacle lens. In other embodiments, each projector unit may he configured to project different augmented reality data on to each spectacle lens.
The camera may be mounted on the augmented reality spectacles. The camera may be mounted on the temple of the augmented reality spectacles. The camera may be mounted on the temple tip of the augmented reality spectacles. This potentially allows the augmented reality spectacles to be used as an independent device.
In alternative embodiments, the camera may be mounted separately to the augmented reality spectacles and optically coupled to the optical connection element. This can allow the augmented reality spectacles to comprise an element of a larger system. In one such embodiment, the camera may be mounted on a separate user mounted unit. The separate user mounted unit may be mounted on a helmet or on a backpack. The separate unit may comprise a battle management unit (BMU) or part of a BMU. Given that the augmented reality spectacles may be relatively flimsy compared to other user mounted units, this can reduce the cost impact of damage to the spectacles in use.
The camera may comprise a sensing array. The sensing array may be configured to detect light incident on the camera. The camera may be configured to detect incident light in a selected detection wavelength band. The detection wavelength band may be outside the visible wavelength band. In some embodiments, the detection wavelength band may correspond to infrared or ultraviolet light. In particular embodiments, the detection wavelength band may correspond to long wavelength infrared light (LWIR) and/or short wavelength infrared light (SWIR). Typically, LWIR may be defined as wavelengths of approximately 8-15 µm. Typically, SWIR may be defined as wavelengths of approximately 1.4-3 µm. Additionally or alternatively, the detection wavelength band may comprise any one or more of: near infrared (NIR), approximately 0.75-1.4 µm; mid-wavelength infrared (MWIR), approximately 3-8 µm; or far infrared (FIR), approximately 15-1000 µm.
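The band boundaries above can be captured in a small lookup, for instance when routing image data by detection band. This is an illustrative sketch only; the dictionary and helper name are not from the patent, and the boundary values are the approximate figures given in the text.

```python
# Approximate infrared band boundaries in µm, as listed in the description.
IR_BANDS = {
    "NIR":  (0.75, 1.4),
    "SWIR": (1.4, 3.0),
    "MWIR": (3.0, 8.0),
    "LWIR": (8.0, 15.0),
    "FIR":  (15.0, 1000.0),
}

def band_of(wavelength_um):
    """Return the name of the IR band containing a wavelength, or None."""
    for name, (lo, hi) in IR_BANDS.items():
        if lo <= wavelength_um < hi:
            return name
    return None

print(band_of(10.0))  # LWIR
print(band_of(2.0))   # SWIR
```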
In some embodiments, the augmented reality spectacles may comprise a single capture lens. In such embodiments, the capture lens may be provided on a bridge of the frame front. This minimises the displacement between each spectacle lens and the capture lens. Such an arrangement is beneficial if augmented reality data based on the light incident on the capture lens is projected onto both spectacle lenses. Alternatively, in such embodiments, the capture lens may be provided on an end piece of the frame front. This minimises the displacement between one spectacle lens and the capture lens. Such an arrangement is beneficial if augmented reality data based on the light incident on the capture lens is projected only onto the spectacle lens closest to the end piece on which the capture lens is provided.
In some embodiments, the augmented reality spectacles may comprise two or more capture lenses. In such embodiments, each capture lens may be provided with a dedicated optical connection element. In such embodiments, each capture lens may be provided with a dedicated camera.
In embodiments with two capture lenses, each capture lens may be provided on a bridge of the frame front. In such embodiments, the capture lenses may be laterally displaced from each other towards the respective spectacle lenses. Alternatively, in such embodiments, each capture lens may be provided on opposing end pieces of the frame front. Such arrangements are beneficial if different augmented reality data is projected onto each spectacle lens based on the light incident on the respective capture lens closest to the spectacle lens. This minimises the displacement between each spectacle lens and the respective capture lens. This allows a user to experience stereoscopic augmented reality. Typically, such stereoscopic effects are enhanced when the respective capture lenses are provided on the opposing end pieces rather than both being on the bridge.
In a further embodiment comprising two capture lenses, each capture lens may be provided on the frame front above the centre of the respective spectacle lenses. This is beneficial in situations where close range viewing is anticipated and therefore some parallax correction is unavoidable. Such positioning removes the horizontal parallax component leaving only a vertical component to be corrected. This can provide a significant reduction of the processing required to implement parallax correction calculations. Furthermore, assuming the lenses are mounted with the same offset relative to each spectacle lens, the same vertical parallax correction factor will apply to each spectacle lens. This further reduces the complexity of the parallax correction calculations.
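The advantage of mounting each capture lens directly above the spectacle lens centre can be shown numerically. This is a sketch under assumed values (600 px focal length, 2 cm vertical offset), not figures from the patent: with no lateral offset the horizontal disparity is zero, leaving only the vertical component d = f·B/Z to correct.

```python
def parallax_px(baseline_m, depth_m, focal_px=600.0):
    """Pixel disparity between capture lens and eye viewpoint.

    d = f * B / Z, for an offset (baseline) B metres between the two
    viewpoints and an object at depth Z metres.
    """
    return focal_px * baseline_m / depth_m

# Capture lens mounted directly above the spectacle lens centre:
horizontal = parallax_px(baseline_m=0.0, depth_m=0.5)   # no lateral offset
vertical   = parallax_px(baseline_m=0.02, depth_m=0.5)  # 2 cm vertical offset
print(horizontal, vertical)  # 0.0 px horizontally, 24.0 px vertically
```

Because the horizontal term vanishes by construction, only a single vertical shift remains, and (with equal offsets on both sides) the same shift applies to both spectacle lenses, matching the reduction in correction complexity described above.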
In some embodiments, the spectacles may comprise a capture lens pair. The capture lens pair may comprise a pair of adjacent capture lenses. The lenses within the capture lens pair may be closely spaced. The lenses within the lens pair may be displaced from each other laterally, vertically or in any other direction. The displacement between the lenses in each lens pair may be of the same order as the diameter of each lens. In particular embodiments, the separation of adjacent lenses may be in the range of 0.5 to 4 times the diameter of each lens.
In such embodiments, each capture lens of the pair may be provided with a dedicated optical connection element. Each capture lens of the pair may be provided with a dedicated camera.
The capture lens pair may be provided on a bridge of the frame front. As with a single capture lens, this minimises the displacement between each spectacle lens and the capture lens pair. Such an arrangement is beneficial if augmented reality data based on the light incident on the capture lens pair is projected onto both spectacle lenses. Alternatively, in such embodiments, the capture lens pair may be provided on an end piece of the frame front. This minimises the displacement between one spectacle lens and the capture lens pair. Such an arrangement is beneficial if augmented reality data based on the light incident on the capture lens pair is projected only onto the spectacle lens closest to the end piece on which the capture lens pair is provided.
In some embodiments, the augmented reality spectacles may comprise two or more capture lens pairs. In embodiments with two capture lens pairs, each capture lens pair may be provided on a bridge of the frame front. In such embodiments, the capture lens pairs may be laterally displaced from each other towards the respective spectacle lenses. Alternatively, in such embodiments, each capture lens pair may be provided on an opposing end piece of the frame front. Such arrangements are beneficial if different augmented reality data is projected on to each spectacle lens based on the light incident on the respective capture lens pair closest to the spectacle lens. This minimises the displacement between each spectacle lens and the respective capture lens pair. This allows a user to experience stereoscopic augmented reality. Typically, such stereoscopic effects are enhanced when the respective capture lens pairs are provided on the opposing end pieces rather than both being on the bridge.
In a further embodiment comprising two capture lens pairs, each capture lens pair may be provided on the frame front above the centre of the respective spectacle lenses. This is beneficial in situations where close range viewing is anticipated and therefore some parallax correction is unavoidable. Such positioning removes the horizontal parallax component leaving only a vertical component to be corrected. This can provide a significant reduction of the processing required to implement parallax correction calculations. Furthermore, assuming the lens pairs are mounted with the same offset relative to each spectacle lens, the same vertical parallax correction factor will apply to each spectacle lens. This further reduces the complexity of the parallax correction calculations.
In embodiments with more than one camera, each camera may have the same detection wavelength band. Additionally or alternatively, in such embodiments, one or more cameras may be configured to detect light in a first detection wavelength band and one or more cameras may be configured to detect light in a second, different detection wavelength band. This allows the optical data to be captured in different wavelength bands. Accordingly, the augmented reality data may be derived from optical data from different wavelength bands. This can provide an enhanced augmented reality experience.
In embodiments with one or more capture lens pairs, the dedicated cameras for the separate capture lenses within each pair may be configured to detect light in different detection wavebands. This allows the optical data to be captured in different wavelength bands. In one particular embodiment, one camera for the lens pair may be configured to detect light in the LWIR detection wavelength band and the other camera for the lens pair may be configured to detect light in the SWIR detection wavelength band. In other embodiments, other combinations of different wavebands may be used including but not limited to: NIR, SWIR, MWIR, LWIR, FIR, infrared, visible and ultraviolet.
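One simple way to derive a single display image from two detection bands, such as the LWIR/SWIR pairing above, is a normalise-and-blend fusion. The following NumPy sketch is illustrative only; the patent does not specify a fusion method, and the weighting scheme and function names are assumptions.

```python
import numpy as np

def normalise(img):
    """Scale an image to the range [0, 1]."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

def fuse(lwir, swir, alpha=0.5):
    """Blend co-registered LWIR and SWIR frames into one display image.

    alpha weights the LWIR contribution; both frames are normalised first
    so that differing sensor dynamic ranges do not dominate the blend.
    """
    return alpha * normalise(lwir) + (1.0 - alpha) * normalise(swir)

lwir = np.array([[0, 100], [200, 400]])   # toy 2x2 frames
swir = np.array([[10, 20], [30, 40]])
fused = fuse(lwir, swir, alpha=0.5)
print(fused.shape)  # (2, 2)
```

In practice the two frames would first need co-registration, which the closely spaced lenses of a capture lens pair help to keep trivial.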
Each optical connection element may be optically coupled to the associated capture lens. In particular, each optical connection element may be optically coupled to the rear of the associated capture lens.
Each optical connection element may be an optical fibre. In some embodiments, each optical connection element may comprise a bundle of optical fibres. Each optical connection element may comprise a protective outer cover. The protective outer cover may be substantially opaque to light.
Each optical connection element may be adapted to primarily direct light in a particular detection wavelength band. In such embodiments, the detection wavelength band may match the detection wavelength band of the camera. In particular embodiments, optical connection elements may comprise an LWIR fibre or fibre bundle and/or an SWIR fibre or fibre bundle.
Each optical connection element may be provided within a corresponding duct in the frame. In embodiments comprising more than one optical connection element, two or more optical connection elements may share a duct. Additionally or alternatively, in such embodiments, a separate duct may be provided for each optical connection element or for particular sets of optical connection elements.
A first portion of the or each duct may extend through the frame front to the end pieces. A second portion of the or each duct may extend through the temples to the camera. The optical connection element may be adapted to have sufficient slack or elasticity to accommodate hinging of the temple relative to the frame front. If required, the duct may additionally be adapted to accommodate slack or elasticity in the optical connection element.
Each camera may be provided within a camera housing. In some embodiments, each camera may have a separate camera housing. In other embodiments, multiple cameras may be provided within a single camera housing. The camera housing may be mounted on the temple. In some embodiments, the camera housing may be mounted on the temple end. This provides the camera housing at or behind the ear of the user when worn. This improves the balance and comfort of the spectacles for extended wear. In some embodiments, the camera housing may be shaped to better fit to or around a user's ear.
A power source may be provided within the camera housing. The power source may be a battery. In some such embodiments, the battery may be a rechargeable battery. Additionally or alternatively, the camera housing may comprise a power connection. The power connection may comprise a power socket or cable suitable for receiving power from an external source.
A processing unit may be provided within the camera housing. The processing unit may be configured to receive image data from the camera. The processing unit may be configured to generate augmented reality data by processing the received image data. In such embodiments, the processing unit may be configured to carry out any suitable image processing operations, including but not limited to image transformations, edge detection, feature detection, feature highlighting, feature extraction, contrast variation, brightness variation or the like. In some embodiments, a dedicated processing unit may be provided for each camera. In other embodiments, a single processing unit may be provided for multiple cameras. In a further embodiment, active IR illumination may be provided through a suitable attachment to the spectacles.
A communication connection or communication unit may be provided within the camera housing. The communication connection may comprise a cable and/or a cable socket. The communication unit may comprise a wireless communication device configured according to a known communication protocol such as Bluetooth, Wi-Fi, 3G, 4G, 5G, ISW (Intra Soldier Wireless) or the like. The communication connection or communication unit can facilitate exchange of data with one or more external devices. For instance, this can enable image data from the camera to be passed to an external device for processing and/or augmented reality data to be received from the external device. The received augmented reality data may comprise processed image data from the camera and/or any other suitable information.
In some embodiments, the communication connection or communication unit may comprise an optical coupler. This can enable light from the optical connection element to be directed to a camera mounted separately to the augmented reality spectacles.
A power connection may be provided within the camera housing. The power connection may comprise a cable and/or a cable socket or connector pads. The power connection can facilitate the receipt of power from an external device to directly operate the device and/or to recharge the internal power supply.
Each optical projector unit may be configured to project augmented reality data using visible light. Each optical projector unit may comprise a display unit and a light directing system. The display unit may be any suitable form of display, including but not limited to OLED, LED, LCD and the like. The light directing system may comprise one or more curved mirrors and/or one or more waveguides. Any such waveguides may be diffractive waveguides, holographic waveguides or reflective waveguides. In suitable embodiments, a diffractive waveguide may comprise one or more slanted diffraction grating elements singly or in combination. In suitable embodiments, a holographic waveguide may comprise multiple holographic optical elements sandwiched together. In suitable embodiments, a reflective waveguide may comprise one or more semi-reflective mirrors.
The brightness of the light projected by the optical projector unit output may be controllably variable. This allows the brightness of the projected augmented reality data to be adjusted to align with ambient visible light conditions. In some embodiments, this brightness may be automatically varied in response to sensing of ambient light. Additionally or alternatively, the brightness may be varied in response to user control inputs. In some embodiments, the brightness may be varied periodically. This can make it easier for a user to distinguish between the viewed scene and the augmented reality data.
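The automatic ambient-light response described above can be sketched as a simple mapping from a light-sensor reading to a clamped brightness level. This is an assumed control scheme for illustration; the threshold, floor, and function names are not specified by the patent.

```python
def projector_brightness(ambient_lux, min_level=0.05, max_level=1.0,
                         full_at_lux=10_000.0):
    """Map an ambient light reading to a projector brightness level.

    Bright surroundings need a brighter overlay to remain visible against
    the scene; dark surroundings need a dim overlay to avoid dazzling the
    user. full_at_lux is an assumed ambient level at which full projector
    brightness is required.
    """
    level = ambient_lux / full_at_lux
    return max(min_level, min(max_level, level))  # clamp to [min, max]

print(projector_brightness(50_000))  # 1.0  (daylight: full brightness)
print(projector_brightness(100))     # 0.05 (dim room: floor level)
```

Periodic variation, as mentioned above, could be layered on top by modulating the returned level over time.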
The spectacles may comprise a frame within which the spectacle lenses are mounted. The frame may comprise a frame front and a pair of rearwardly extendable temples. The temples may be connected to the frame front via hinges. The hinges may be mounted on end pieces at either end of the frame front. The spectacle lenses may be mounted within frame rims of the frame front, which may be partial or complete frame rims. In some embodiments, the frame rims may be provided with nose pads and/or nose pad arms adapted to fit to or around a user's nose. The frame rims may be connected to each other via a bridge of the frame front. The temples may have temple tips that are covered and/or shaped to better fit a user. Such temple tips may be fixed or replaceable to accommodate personal preference, comfort, hygiene or service.
The augmented reality spectacles may be adapted to connect to one or more external devices. Suitable external devices include user wearable devices, such as battle management units or the like, or remotely located battle controllers.
Detailed Description of the Invention
In order that the invention may be more clearly understood one or more embodiments thereof will now be described, by way of example only, with reference to the accompanying drawings, of which:
Figure 1 is a schematic view of a pair of augmented reality spectacles according to the present invention;
Figure 2 is (a) a schematic block diagram of the imaging and projecting elements of the augmented reality spectacles according to figure 1, and (b) a schematic block diagram of the imaging and projecting elements of an alternative embodiment of the augmented reality spectacles according to figure 1;
Figure 3 is a schematic view of the augmented reality spectacles as worn by a user along with a battle helmet;
Figure 4 is (a) a schematic view of an alternative embodiment of a pair of augmented reality spectacles according to the present invention and (b) a schematic block diagram of the imaging and projecting elements of the alternative embodiment of the augmented reality spectacles of figure 4a;
Figure 5 is (a) a schematic view of an alternative embodiment of a pair of augmented reality spectacles according to the present invention and (b) a schematic block diagram of the imaging and projecting elements of the alternative embodiment of the augmented reality spectacles of figure 5a;
Figure 6 is (a) a schematic view of an alternative embodiment of a pair of augmented reality spectacles according to the present invention and (b) a schematic block diagram of the imaging and projecting elements of the alternative embodiment of the augmented reality spectacles of figure 6a;
Figure 7 is (a) a schematic view of an alternative embodiment of a pair of augmented reality spectacles according to the present invention, and (b) a corresponding embodiment to figure 7a where a capture lens pair is substituted for each capture lens; and
Figure 8 is a schematic block diagram of the imaging and projecting elements in an embodiment where the camera is mounted separately to the augmented reality spectacles.
Turning now to figure 1, a pair of augmented reality spectacles 1 comprise a pair of spectacle lenses 2 mounted within a frame 10. The frame 10 comprises a frame front 11 and a pair of rearwardly extending temples 16, also commonly referred to as arms.
The temples 16 are connected to the frame front 11 via hinges 15 mounted on end pieces 12 at either end of the frame front 11. The spectacle lenses 2 are enclosed within frame rims 13 of the frame front 11. The frame rims 13 are connected to each other via a bridge 14 of the frame front 11. The temples 16 have temple tips 17 shaped to better fit to or around a user's ear.
In the illustrated example, the frame rims 13 are full rims which substantially surround each spectacle lens 2 but the skilled person will appreciate that partial frame rims may be substituted if desired or if appropriate. Similarly, the skilled person will appreciate that other common spectacle elements such as nose pads and/or nose pad arms adapted to fit to or around a user's nose may be provided if desired or if appropriate.
As illustrated in figure 1, the augmented reality spectacles 1 additionally comprise a capture lens 3 provided on the bridge 14 of the frame front 11. An optical connection element 4, which may typically be an optical fibre or an optical fibre bundle, is optically coupled to the rear of the capture lens 3. In this manner, the optical connection element 4 may be configured to direct light incident upon the capture lens 3 to a camera 51 provided within a camera housing 50 for detection. In the example of figure 1, the optical connection element 4 is shown external to the frame 10. The skilled person will appreciate that the optical connection element 4 would more typically be provided within a corresponding duct (not shown) extending between the capture lens 3 and the camera housing 50.
As illustrated below, the camera housing 50 is provided on one temple tip 17 of the frame 10. The skilled person will appreciate that the external shape of the housing or temple tip 17 can be varied as required or desired to better accommodate a camera 51 (and any other components) or to better fit a user.
The camera 51 is configured to output image data in response to detected incident light. Accordingly, despite the camera 51 being mounted on the temple tip 17, it can capture light incident upon the capture lens 3, which is minimally displaced from either spectacle lens 2, and thereby the output image data substantially corresponds to the view of the user through the spectacle lens 2.
As illustrated in figure 2a, the augmented reality spectacles 1 further comprise an optical projector unit 21 configured to project augmented reality data on to one spectacle lens 2 or on to both spectacle lenses 2. The projector unit 21 can be mounted onto or integrated into the frame front 11 at any suitable position. Optionally, as shown in figure 2a, the augmented reality spectacles 1 can comprise two projector units 21, each configured to project augmented reality data on to a different spectacle lens 2. The skilled person will appreciate that projector units suitable for augmented reality spectacles are well known in the art.
In the present invention, the projected augmented reality data is at least partially derived from the output image data of the camera 51. This enables the user to view additional information derived from the image data overlaid on a scene viewable through the spectacles 1. As there is minimal displacement between each spectacle lens 2 and the capture lens 3, augmented reality data can be overlaid on the scene viewable by a user through the spectacle lens 2, either directly or with minimal transformation.
In many embodiments, the camera 51 is adapted to detect incident light in a detection wavelength band outside visible light. In this manner, the projected augmented reality data provides the user with information about a viewed scene in wavelengths outside the visible range. For example, the camera may have a detection wavelength band covering any one or more of: near infrared (NIR), ~0.75-1.4 µm; short wavelength infrared (SWIR), 1.4-3 µm; mid-wavelength infrared (MWIR), ~3-8 µm; long wavelength infrared (LWIR), 8-15 µm; or far infrared (FIR), ~15-1000 µm.
The skilled person will appreciate that other wavebands such as visible, ultraviolet or active (illuminated) infrared may also be used. In particular implementations, the camera 51 may be configured to detect light in the LWIR and/or SWIR band.
In such cases, the optical connection element 4 may be specifically adapted to better direct light in the selected detection wavelength band. In particular, the optical connection element 4 may be an optical fibre or optical fibre bundle or bundles, adapted to transmit light in the selected detection wavelength band. For instance, if the camera 51 has a detection waveband corresponding to SWIR, the optical connection element 4 may be an SWIR optical fibre or SWIR optical fibre bundle. Similarly, if the camera 51 has a detection waveband corresponding to LWIR, the optical connection element 4 may be a LWIR optical fibre or LWIR optical fibre bundle.
As illustrated in figure 2b, the camera housing 50 may comprise one or more other components. In this example, the camera housing 50 additionally comprises a processing unit 52, configured to receive image data from the camera 51 and to thereby generate augmented reality data by processing the received image data. The processing unit 52 is configured to carry out any suitable image processing operations, including but not limited to image transformations, edge detection, feature detection, feature highlighting, feature extraction, contrast variation, brightness variation or the like.
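By way of a purely illustrative sketch (not part of the claimed subject matter), an edge detection operation of the kind the processing unit 52 might perform can be expressed with plain NumPy finite differences; the function name and threshold below are invented for illustration:

```python
import numpy as np

def edge_map(image, threshold=0.25):
    """Return a binary edge map from a 2-D grayscale image using
    simple finite-difference gradients (a stand-in for the edge
    detection operation attributed to the processing unit)."""
    img = image.astype(float)
    # Horizontal and vertical gradients via forward differences.
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]
    gy[:-1, :] = img[1:, :] - img[:-1, :]
    magnitude = np.hypot(gx, gy)
    # Normalise to [0, 1] and threshold to produce the overlay mask.
    if magnitude.max() > 0:
        magnitude /= magnitude.max()
    return magnitude > threshold
```

The resulting binary mask could then serve as one input to the overlay projected by the projector unit 21.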
Further illustrated in figure 2b are a power source 53 and a communication unit 54. The power source 53 can comprise a battery and/or a power socket enabling the battery to be recharged and/or the spectacles to be powered from an external device. The communication unit 54 may comprise any suitable wired or wireless data connection, including potentially an optical data connection as discussed further in relation to figure 8 below. This can enable image data from the camera to be passed to an external device for processing and/or augmented reality data to be received from the external device. The received augmented reality data may comprise processed image data from the camera and/or any other suitable information. For instance, the received augmented reality data may comprise alphanumeric data such as messages, range or identification tags for objects viewable with the augmented reality spectacles. As with figure 2a, optionally the augmented reality spectacles 1 can comprise two projector units 21, each configured to project augmented reality data on to a different spectacle lens 2.
Turning now to figure 3, this illustrates the spectacles 1 of figure 1 as worn by a user 9. The user 9 is further wearing a battle helmet 8, provided with a battle management unit (BMU) 7 and a connector cable 6. The cable 6 may enable the supply of power to the spectacles 1 and/or the exchange of data between the BMU 7 and camera 51 or processing unit 52 as required. In this manner, the BMU 7 may be configured to carry out processing of the image data from camera 51 and/or supply the projector unit 21 with augmented reality data for projection.
The BMU 7 can comprise any suitable combination of data processing and data storage along with means for communication with one or more other devices including BMUs worn by others. The BMU 7 can also include an internal power source. Typically, the BMU 7 may be connected to a power source and/or additional processing/storage capacity worn elsewhere by the user, for instance on their helmet, chest pack, vest or back. The BMU 7 enables communication of data between multiple individuals and/or overall battle control systems. Accordingly, augmented reality data derived from such communication can be output via projector unit 21.
Turning to figure 4, an alternative embodiment of the augmented reality spectacles 1 is shown schematically (figure 4a) and in a block diagram (figure 4b). In this embodiment, the spectacles 1 are provided with two capture lenses 3a, 3b. As illustrated, each capture lens 3a, 3b is provided on one of the end pieces 12 of the frame front 11. Additionally, each capture lens 3a, 3b is connected via a dedicated optical connection element 4a, 4b to a dedicated camera 51a, 51b. The respective cameras 51a, 51b are provided within separate dedicated housings 50a, 50b. Nevertheless, the skilled person will appreciate that both cameras could be provided within a single housing if required or appropriate.
Each camera 51a, 51b in figure 4b is connected via a dedicated processing unit 52a, 52b to a dedicated projector unit 21a, 21b so as to control the projection of augmented reality data derived from the image data output by the respective camera 51a, 51b. The skilled person will appreciate that in alternative embodiments, each camera 51a, 51b can be connected directly to the dedicated projector unit 21a, 21b similarly to the arrangement shown in figure 2a.
The embodiment of figure 4 is particularly well suited to providing stereoscopic augmented reality data. The skilled person will appreciate that such a stereoscopic effect can be generated using alternative capture lens locations, for example by providing the respective capture lenses 3a, 3b on the bridge 14. Nevertheless, given the relatively small displacement between the respective capture lenses 3a, 3b in that case, this does not provide a significant stereoscopic effect.
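The dependence of the stereoscopic effect on capture lens separation can be illustrated with the standard pinhole-camera disparity relation d = f·B/Z. The focal length, spacings and distance below are illustrative values only, not figures taken from this document:

```python
def disparity_px(focal_px, baseline_m, distance_m):
    """Pinhole-model horizontal disparity, in pixels, between two
    cameras separated by baseline_m imaging a point distance_m away."""
    return focal_px * baseline_m / distance_m

# Illustrative figures: a 1000 px focal length viewing a target 50 m away.
wide = disparity_px(1000, 0.14, 50.0)    # end-piece spacing, ~14 cm baseline
narrow = disparity_px(1000, 0.02, 50.0)  # bridge-mounted spacing, ~2 cm baseline
```

With the same focal length and distance, the wider end-piece baseline yields a disparity several times larger than a bridge-mounted spacing, consistent with the limited stereoscopic effect noted above for bridge-mounted lenses.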
In a further alternative embodiment, it is possible to substitute a capture lens pair 30 for a single capture lens 3, as illustrated in figure 5a (schematically) and in a block diagram (figure 5b). Within the capture lens pair 30, each lens 31, 32 is optically coupled to a dedicated optical connection element 41, 42 and a dedicated camera 511, 512. In the illustrated example, the optical connection elements 41, 42 are contained within ducts within the frame 10 rather than being shown externally as in the earlier figures.
The individual lenses 31, 32 in the capture lens pair 30 are positioned adjacent to each other. Typically, the relative displacement is of the order of the lens 31, 32 diameter and thus each camera 511, 512 is configured to capture essentially an equivalent field of view. Accordingly, image transformation is not required between the image data captured by the respective cameras 511, 512. Whilst a lateral displacement between lenses 31, 32 is illustrated in figure 5, the skilled person will appreciate that the relative orientation between the lenses 31, 32 of the capture lens pair 30 can be varied as required or appropriate.
In this embodiment, camera 511 has a different detection wavelength band to 25 camera 512. For example, camera 511 may be configured to detect SWIR and camera 512 may be configured to detect LWIR. The skilled person will appreciate that other combinations of detection wavelength bands may be used if required or desired.
In this embodiment, the augmented reality data projected by projector unit 21 is derived from a combination of the image data from both cameras 511, 512. The user is therefore provided with augmented reality data containing information about a viewed scene in two different wavelength bands outside the visible range. This can improve performance of the device in conditions where one of the wavelength bands is blocked or attenuated, for instance by smoke or other factors.
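A minimal sketch of such a combination, assuming the two frames are already co-registered (as the adjacent lenses 31, 32 make reasonable), might take the per-pixel maximum of the normalised bands so that a feature visible in only one band survives in the output. The function below is illustrative only and not taken from the document:

```python
import numpy as np

def fuse_bands(swir, lwir):
    """Hypothetical per-pixel fusion of two co-registered frames
    (e.g. SWIR and LWIR from cameras 511 and 512). Each band is
    normalised to [0, 1] and the brighter response is kept."""
    def normalise(frame):
        frame = frame.astype(float)
        span = frame.max() - frame.min()
        return (frame - frame.min()) / span if span > 0 else np.zeros_like(frame)

    return np.maximum(normalise(swir), normalise(lwir))
```

A feature obscured by smoke in one band but visible in the other would therefore still appear in the fused overlay.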
As shown in figure 5b, each camera 511, 512 is connected to a common processing unit 52. Alternatively, the skilled person will appreciate that each camera 511, 512 could be provided with a dedicated processing unit. Similarly, the skilled person will appreciate that, optionally, the augmented reality spectacles 1 can comprise two projector units 21, each configured to project augmented reality data on to a different spectacle lens 2, rather than a single projector unit 21 configured to project augmented reality data on to one or both spectacle lenses.
Whilst the spectacles 1 in figure 5 comprise a single camera housing 50 for both cameras 511, 512, the skilled person will appreciate that separate camera housings may be provided on each temple tip 17, as shown in figure 4, if required or desired.
Turning now to figure 6, a further alternative embodiment of the augmented reality spectacles 1 is shown schematically (figure 6a) and in a block diagram (figure 6b). In this embodiment, the spectacles 1 are provided with two capture lens pairs 30a, 30b. As illustrated, each capture lens pair 30a, 30b is provided on one of the end pieces 12 of the frame front 11. Accordingly, the embodiment of figure 6 can provide a stereoscopic equivalent to the multi-wavelength function provided by the embodiment of figure 5.
As with the embodiment of figure 5, within each capture lens pair 30a, 30b, each lens 31a, 32a, 31b, 32b is optically coupled to a dedicated optical connection element 41a, 42a, 41b, 42b and a dedicated camera 511a, 512a, 511b, 512b. In the illustrated example, the optical connection elements 41a, 42a, 41b, 42b are contained within ducts within the frame 10 rather than being shown externally as in the earlier figures.
As with the embodiment of figure 5, the individual lenses 31a, 32a, 31b, 32b in each capture lens pair 30a, 30b are positioned adjacent to each other. In this respect, cameras 511a, 512a are configured to capture essentially an equivalent field of view, as are cameras 511b, 512b. Accordingly, image transformation is not required between the image data captured by the respective cameras 511a, 512a or between the respective cameras 511b, 512b. Whilst a vertical displacement between lenses 31a, 32a and between lenses 31b, 32b is illustrated in figure 6, the skilled person will appreciate that the relative orientation between the lenses 31a, 32a and 31b, 32b of the capture lens pairs 30a, 30b can be varied as required or appropriate.
As in the embodiment of figure 5, cameras 511a, 511b have a different detection wavelength band to cameras 512a, 512b. For example, cameras 511a, 511b may be configured to detect SWIR and cameras 512a, 512b may be configured to detect LWIR. The skilled person will appreciate that other combinations of detection wavelength bands may be used if required or desired.
The output of each camera 511a, 512a, 511b, 512b is passed to a dedicated processing unit 521a, 522a, 521b, 522b. This enables augmented reality data derived from the image data of each camera 511a, 512a, 511b, 512b to be generated. The skilled person will appreciate that whilst the embodiment of figure 6 provides dedicated processing units 521a, 522a, 521b, 522b for each camera 511a, 512a, 511b, 512b, it would be possible for each camera 511a, 512a, 511b, 512b to be connected to a single processing unit as in figure 5.
In this embodiment, the augmented reality data projected by projector unit 21a is derived from a combination of the image data from both cameras 511a and 512a.
Similarly, the augmented reality data projected by projector unit 21b is derived from a combination of the image data from both cameras 511b and 512b. The user is therefore provided with augmented reality data containing information about a viewed scene in two different wavelength bands outside the visible range. This can improve performance of the device in conditions where one of the wavelength bands is blocked or attenuated, for instance by smoke or other factors. The use of two lens pairs 30a, 30b further provides stereoscopic augmented reality data.
Whilst the spectacles 1 in figure 6 comprise a single camera housing 50 for all cameras 511a, 512a, 511b, 512b, the skilled person will appreciate that separate camera housings may be provided on each temple tip 17, as shown in figure 4, if required or desired. Similarly, the skilled person will also appreciate that the housing 50 in figure 6 could additionally comprise a power source 53 and a communication unit 54, if required or desired. These are omitted in the present figure for clarity.
Turning now to figure 7, further alternative embodiments of the augmented reality spectacles 1 are shown schematically. In the embodiment of figure 7a, two capture lenses 3a, 3b are provided. Each capture lens 3a, 3b is provided on the frame front 11 vertically above the optical centre of the respective spectacle lens 2. In the embodiment of figure 7b, two capture lens pairs 30a, 30b are provided. Each capture lens 31a, 32a, 31b, 32b is provided on the frame front 11 vertically above the optical centre of the respective spectacle lens 2. This positioning for a lens 3a, 3b (or lens pair 30a, 30b) substantially eliminates the horizontal parallax component between the spectacle lens 2 and the respective lens 3a, 3b (or lens pair 30a, 30b). This is beneficial in situations where close range viewing is anticipated and therefore some parallax correction is unavoidable. In particular, these embodiments remove the horizontal parallax component, leaving only a vertical component to be corrected. This can provide a significant reduction in the processing required to implement parallax correction calculations. The skilled person will appreciate that these embodiments could utilise camera 51, processing unit 52 and projector arrangements of the types described in relation to figures 4, 5 and 6 as required or as appropriate.
Turning now to figure 8, a block diagram of a further embodiment of the invention is shown. In this embodiment, the camera 51 is mounted separately to the augmented reality spectacles 1.
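The vertical-only correction described for the figure 7 geometry can be sketched under a pinhole-camera assumption: with the capture lens a small vertical baseline directly above the spectacle lens centre, the residual offset is f·b/Z pixels in the vertical direction only, so the overlay need only be shifted along one axis. The focal length, baseline and distances below are invented for illustration and are not taken from the document:

```python
import numpy as np

def vertical_parallax_shift(overlay, focal_px, baseline_m, distance_m):
    """Shift an overlay by the vertical parallax (in whole pixels)
    between a capture lens mounted baseline_m directly above the
    spectacle lens centre and an object distance_m away. No
    horizontal correction is needed for this geometry."""
    shift = int(round(focal_px * baseline_m / distance_m))
    if shift == 0:
        return overlay.copy()
    corrected = np.zeros_like(overlay)
    # Move every overlay row down by the computed parallax offset.
    corrected[shift:, :] = overlay[:-shift, :]
    return corrected
```

At longer viewing distances the offset rounds to zero pixels, which is why the description treats parallax correction as mainly a close-range concern.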
More specifically, the communication unit 54 comprises an optical coupler within housing 50. This can allow light from the optical connection element to be directed to a camera 51 mounted separately. In the example illustrated, the camera 51 is mounted on or within a BMU 7, which could be helmet 8 mounted as shown in figure 3. The light is coupled to the BMU mounted camera 51 by a cable 6 which incorporates a suitable optical fibre or optical fibre bundle. If appropriate, the cable 6 can additionally include electrical wires to facilitate the return transmission of augmented reality data to the projector unit 21 (or a pair of projector units 21). Whilst this embodiment has been illustrated in respect of a simple single capture lens/single camera implementation, the skilled person will appreciate that the same principles can be applied to a multi-capture lens/capture lens pair and/or multi-camera implementation.
The one or more embodiments are described above by way of example only. Many variations are possible without departing from the scope of protection afforded by the appended claims.

Claims (27)

  1. A pair of augmented reality spectacles comprising: a capture lens provided on a frame front of the spectacles; an optical connection element configured to direct light incident upon the capture lens to a camera for detection, the camera configured to output image data in response to detecting incident light; and an optical projector unit configured to project augmented reality data on to at least one spectacle lens, wherein the augmented reality data is at least partially derived from the output image data of the camera.
  2. A pair of augmented reality spectacles according to claim 1, wherein the augmented reality spectacles comprise a single projector unit configured to project augmented reality data on to one spectacle lens or on to each spectacle lens.
  3. A pair of augmented reality spectacles according to claim 1, wherein the augmented reality spectacles comprise two projector units, each projector unit configured to project augmented reality data on to a different spectacle lens.
  4. A pair of augmented reality spectacles according to claim 3, wherein each projector unit is configured to project different augmented reality data on to each spectacle lens.
  5. A pair of augmented reality spectacles according to any preceding claim, wherein the camera is mounted on the temple of the augmented reality spectacles.
  6. A pair of augmented reality spectacles according to any one of claims 1 to 5, wherein the camera is mounted separately to the augmented reality spectacles and optically coupled to the optical connection element.
  7. A pair of augmented reality spectacles according to any preceding claim, wherein the camera is configured to detect incident light in a selected detection wavelength band outside the visible wavelength band.
  8. A pair of augmented reality spectacles according to claim 7, wherein the detection wavelength band is long wavelength infrared (LWIR) and/or short wavelength infrared (SWIR).
  9. A pair of augmented reality spectacles according to any preceding claim, wherein the augmented reality spectacles comprise a single capture lens on a bridge of the frame front or on an end piece of the frame front.
  10. A pair of augmented reality spectacles according to any one of claims 1 to 8, wherein the augmented reality spectacles comprise two or more capture lenses, each capture lens provided with a dedicated optical connection element and a dedicated camera.
  11. A pair of augmented reality spectacles according to claim 10, wherein there are two capture lenses provided on opposing end pieces of the frame front or two capture lenses provided on the frame front above the centre of the respective spectacle lenses.
  12. A pair of augmented reality spectacles according to any one of claims 1 to 8, wherein the spectacles comprise a capture lens pair, each capture lens of the pair provided with a dedicated optical connection element and a dedicated camera.
  13. A pair of augmented reality spectacles according to claim 12, wherein the capture lens pair is provided on a bridge of the frame front or is provided on an end piece of the frame front.
  14. A pair of augmented reality spectacles according to any one of claims 12 or 13, wherein there are two or more capture lens pairs, each capture lens pair provided on an opposing end piece of the frame front or each capture lens pair provided on the frame front above the centre of the respective spectacle lenses.
  15. A pair of augmented reality spectacles according to any one of claims 12 to 14, wherein the dedicated cameras for the separate capture lenses within each pair are configured to detect light in different detection wavebands.
  16. A pair of augmented reality spectacles according to claim 15, wherein one camera for the lens pair is configured to detect light in the LWIR detection wavelength band and the other camera for the lens pair is configured to detect light in the SWIR detection wavelength band.
  17. A pair of augmented reality spectacles according to any preceding claim, wherein each optical connection element is an optical fibre or a bundle of optical fibres.
  18. A pair of augmented reality spectacles according to any preceding claim, wherein each optical connection element is adapted to primarily direct light in a wavelength band matching the detection wavelength band of the associated camera.
  19. A pair of augmented reality spectacles according to claim 18, wherein each optical connection element is an LWIR optical fibre or an SWIR optical fibre or a bundle of LWIR optical fibres or a bundle of SWIR optical fibres.
  20. A pair of augmented reality spectacles according to any preceding claim, wherein each optical connection element is provided within a corresponding duct in the frame.
  21. A pair of augmented reality spectacles according to claim 20, wherein each optical connection element is adapted to have sufficient slack or elasticity to accommodate hinging of the temple relative to the frame front.
  22. A pair of augmented reality spectacles according to any preceding claim, wherein each camera is provided within a camera housing mounted on the temple or the temple end.
  23. A pair of augmented reality spectacles according to claim 22, wherein a processing unit is provided within the camera housing, the processing unit configured to receive image data from the camera and thereby generate augmented reality data by processing the received image data.
  24. A pair of augmented reality spectacles according to claim 23, wherein a dedicated processing unit is provided for each camera.
  25. A pair of augmented reality spectacles according to any one of claims 22 to 24, wherein a communication connection or communication unit is provided within the camera housing enabling connection to one or more external devices.
  26. A pair of augmented reality spectacles according to claim 25, wherein the communication connection or communication unit comprises an optical coupler.
  27. A pair of augmented reality spectacles according to any preceding claim, wherein each optical projector unit comprises a display unit and a light directing system together configured to project augmented reality data using visible light.
  28. A pair of augmented reality spectacles according to any preceding claim, wherein the brightness of the light projected by the optical projector unit is controllably variable.
GB2404118.8A 2024-03-22 2024-03-22 Augmented reality display device Pending GB2639832A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2404118.8A GB2639832A (en) 2024-03-22 2024-03-22 Augmented reality display device


Publications (2)

Publication Number Publication Date
GB202404118D0 GB202404118D0 (en) 2024-05-08
GB2639832A (en) 2025-10-08

Family

ID=90923755

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2404118.8A Pending GB2639832A (en) 2024-03-22 2024-03-22 Augmented reality display device

Country Status (1)

Country Link
GB (1) GB2639832A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6057966A (en) * 1997-05-09 2000-05-02 Via, Inc. Body-carryable display devices and systems using E.G. coherent fiber optic conduit
CA2388766A1 (en) * 2002-06-17 2003-12-17 Steve Mann Eyeglass frames based computer display or eyeglasses with operationally, actually, or computationally, transparent frames
CN107811706B (en) * 2017-11-27 2019-02-26 东北大学 A Surgical Navigation System Based on Image Transmitting Optical Fiber
CN113709410A (en) * 2020-05-21 2021-11-26 幻蝎科技(武汉)有限公司 Method, system and equipment for enhancing human eye visual ability based on MR glasses


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DISPLAYS : FUNDAMENTALS AND APPLICATIONS, SECOND EDITION, 2016, HAINICH ROLF R ET AL, "Near-Eye Displays (NED)", pages 417-494 *

