
WO2025067746A1 - Method for calibrating augmented reality glasses comprising a virtual retinal display, computing unit, and augmented reality glasses - Google Patents


Info

Publication number
WO2025067746A1
Authority
WO
WIPO (PCT)
Prior art keywords
data glasses
calibration
wearer
glasses
smartglasses
Prior art date
Legal status
Pending
Application number
PCT/EP2024/072083
Other languages
German (de)
English (en)
Inventor
Christian Adam Grafenburg
Carsten Reichert
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of WO2025067746A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0181 Adaptation to the pilot/driver
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/0103 Head-up displays characterised by optical features comprising holographic elements

Definitions

  • a method for calibrating data glasses having a virtual retinal display has already been proposed, wherein scanned images are generated by a laser projector of the virtual retinal display and projected as beamlets into an eyebox of the data glasses via an optical system of the data glasses, wherein calibration images and/or calibration patterns are output to a wearer of the data glasses by means of the beamlets, and wherein user feedback, which is provided by the wearer of the data glasses based on his perception, is recorded and used to calibrate the data glasses.
  • the invention is based on a method for calibrating data glasses having a virtual retinal display (retinal scan display), wherein scanned images are generated by a laser projector of the virtual retinal display and projected as beamlets into an eyebox of the data glasses via an optical system of the data glasses, wherein calibration images and/or calibration patterns are output to a wearer of the data glasses by means of the beamlets, and wherein user feedback, in particular conscious user feedback, which is given by the wearer of the data glasses based on his, in particular conscious, perception, is recorded and used to calibrate the data glasses.
  • User feedback is therefore particularly dependent on conscious observation and evaluation by the user.
  • an automated mechanical actuator for adjusting the position of the beamlet is controlled by a computing unit of the data glasses, or an actuation of a manual mechanical actuator for manually adjusting the position of the beamlet is initiated, or a manual conversion of the data glasses is initiated for manually adjusting the position of the beamlet, or the laser projector and/or the optical system for adjusting the brightness, sharpness, or hue of the reproduced calibration images and/or calibration patterns is controlled by the computing unit of the data glasses.
  • Data glasses should be understood in particular as a wearable (head-mounted display) by means of which information can be added to a user's field of vision.
  • Data glasses preferably enable augmented reality, virtual reality and/or mixed reality applications.
  • Data glasses are also commonly referred to as smart glasses or VR glasses or AR glasses.
  • the data glasses have the virtual retinal display (also called retinal scan display or light field display), which is particularly familiar to those skilled in the art.
  • the virtual retinal display is particularly designed to sequentially scan an image content by deflecting at least one visible laser beam from at least one temporally modulated light source, such as one or more (RGB) laser diodes of a laser projector, and to project it directly onto the retina of the user's eye using optical elements.
  • the virtual retinal display is generally based on the use of at least one combiner.
  • the combiner is an element that transmits light from the environment to the eye (real image) and simultaneously overlays/combines the transmitted ambient light with light from artificially generated image content (augmented image/scanned image) of the laser projector of the virtual retinal display.
  • a well-known type of combiner is the so-called free-space combiner.
  • the free-space combiner can be integrated into a spectacle lens of the data glasses. Alternatively, the free-space combiner can also be designed as a separate optical element, separate from the spectacle lens.
  • the free-space combiner overlays/combines the augmented image/scanned image of the virtual retinal display with the ambient image by serving as a reflective or diffracting surface for the image content (the scanned image) projected by the laser projector of the virtual retinal display.
  • the free-space combiner allows other wavelengths, which are predominantly contained in the ambient image, to pass through essentially unhindered.
  • This reflection or diffraction can be generated by reflection at a refractive index transition and/or by diffractive structures (such as holograms).
  • a holographic optical element (HOE) can be integrated into the lens as a free-space combiner. HOEs possess a high angular and wavelength selectivity, which is advantageous for achieving the desired selective reflection effect.
  • the laser projector emits, in particular, at least laser light in the red, green, and blue spectral ranges.
  • the laser projector can also emit an infrared laser beam, which can be used, for example, to track the eye movement of a user of the data glasses.
  • the laser projector could, for example, have a laser feedback interferometry (LFI) sensor.
  • the scanned image is then generated, for example, with the aid of a two-dimensional deflection of the laser light emitted by the laser projector by a MEMS mirror system of the laser projector with a tiltable element (2D MEMS mirror)/with tiltable elements (two 1D MEMS mirrors).
  • This laser light thus writes a rasterized image onto the retina of the user of the data glasses.
  • a laser light beam generated by the laser projector is then preferably aligned via the MEMS mirror system and the free-space combiner so that it has a small beam diameter/circle of confusion at the location of the entrance pupil of the user's eye.
  • This small circle of confusion in the pupil plane of the user's eye is also called a beamlet.
  • a beamlet is preferably a small area in the pupil of the user's eye through which the individual rays of all pixels of the scanned images pass. The beamlet therefore contains all the image information (pixels) of each scanned image. If the pupil moves due to the user's eye movement, the user will only see a non-vignetted scanned image/augmented image as long as the beamlet is still within the pupil area of their eye.
  • the area in which the center of the user's pupil can move so that a complete image is still visible to the user is called the eyebox.
  • An "eyebox" is therefore preferably understood to be a spatial area within which all light rays of a scanning projection of the laser projector can pass through the entrance pupil of the user's eye.
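As a rough illustration of the beamlet/pupil geometry just described, the following sketch checks whether a beamlet still falls entirely within the pupil; the function name, coordinates, and dimensions are hypothetical and not taken from the patent.

```python
# Hypothetical geometry check: the wearer sees an unvignetted image only
# while the beamlet lies within the pupil area. Coordinates are in
# millimeters in the pupil plane; all values are illustrative.

def beamlet_visible(beamlet_center, pupil_center, pupil_radius, beamlet_radius):
    """Return True if the beamlet disc lies entirely inside the pupil disc."""
    dx = beamlet_center[0] - pupil_center[0]
    dy = beamlet_center[1] - pupil_center[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return distance + beamlet_radius <= pupil_radius

# A 0.5 mm beamlet offset 1 mm from the center of a 2 mm-radius pupil:
visible = beamlet_visible((1.0, 0.0), (0.0, 0.0), 2.0, 0.5)
```

As soon as the distance plus the beamlet radius exceeds the pupil radius, parts of the scanned image would be vignetted, which is exactly the situation the eyebox definition above describes.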
  • the data glasses can obtain information about the current pupil position and other parameters of the user's eye.
  • the beamlet can then be continuously moved by a 2D tilt mirror system of the data glasses, which is different from the MEMS mirror system of the laser projector, and thus track the movement of the user's pupil (moving beamlet principle).
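The moving beamlet principle above can be sketched as a simple steering law for the 2D tilt mirror system; the linear mirror-to-beamlet mapping and the scale factor are assumptions made purely for illustration.

```python
# Illustrative steering law for the moving beamlet principle: the 2D tilt
# mirror angles are chosen so that the beamlet follows the tracked pupil
# center. The linear mapping and the mm-per-degree scale are assumptions.

def mirror_angles_for_pupil(pupil_center_mm, mm_per_degree=1.0):
    """Tilt angles (degrees) that place the beamlet at the tracked pupil center."""
    x_mm, y_mm = pupil_center_mm
    return (x_mm / mm_per_degree, y_mm / mm_per_degree)

angles = mirror_angles_for_pupil((2.0, -1.0))
```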
  • the use of a lens with a variable focal length (varifocal lens) in the beam path of the virtual retinal display can optimize the imaging performance and/or image sharpness of the scanned images/augmented images.
  • Calibration images and/or calibration patterns are, in particular, images or patterns that are specifically provided for carrying out a calibration procedure, in particular a field calibration procedure.
  • the calibration images and/or calibration patterns are generated using the laser light of the laser projector.
  • the user feedback can be provided by the wearer of the data glasses in various ways. For example, by an acoustic signal, by a (hand) gesture, by an eye movement, by an eyelid movement, by a head movement, by actuating an actuating element of the data glasses or another device that is in communication with the data glasses, e.g. a smartphone, or by any other conceivable means of communication.
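The many feedback channels listed above could, in one conceivable software design, be funneled into a single command vocabulary; the channel names and commands below are purely illustrative assumptions, not part of the patent.

```python
# Hypothetical normalization of the various feedback channels (voice, gesture,
# eyelid movement, push button, external device) into one calibration command.
FEEDBACK_MAP = {
    ("voice", "brighter"): "increase_brightness",
    ("voice", "darker"): "decrease_brightness",
    ("gesture", "thumbs_up"): "confirm",
    ("blink", "double"): "confirm",
    ("button", "press"): "confirm",
    ("touchscreen", "slider_up"): "increase_brightness",
}

def normalize_feedback(channel, raw_input):
    """Map a raw input from any supported channel to one calibration command."""
    return FEEDBACK_MAP.get((channel, raw_input), "unknown")
```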
  • the calibration itself can be a software and/or hardware setting change on the data glasses, in particular on one or more components of the data glasses, such as the optical system, the laser projector, or elements for positioning the data glasses on the head.
  • Configured or “intended” is understood to mean, in particular, that an object is specifically programmed, designed, and/or equipped.
  • the fact that an object is intended or configured for a specific function is understood to mean, in particular, that the object fulfills and/or performs this specific function in at least one application and/or operating state.
  • the automated mechanical actuator can, for example, be an electrically actuated linear actuator or an electric motor of the data glasses.
  • the manual mechanical actuator can, for example, comprise a manually operable adjustment wheel or a manually operable adjustment slide of the data glasses.
  • the automated and/or manual mechanical actuator can, for example, be configured to adjust elements of the optical system of the data glasses relative to one another or internally.
  • the mechanical actuator can, for example, be configured to adjust elements of the laser projector of the data glasses relative to one another or internally.
  • the mechanical actuator can, for example, be configured to adjust elements for positioning the data glasses relative to the head/eye of the wearer of the data glasses.
  • the manual conversion of the data glasses can, in particular, comprise a replacement or manual deformation of one or more parts of the data glasses, such as nose pads, temples, or lens spacers.
  • the repositioning of the data glasses relative to the head/eye of the wearer of the data glasses also produces an adjustment of the position of the beamlet.
  • the manual mechanical actuator could, for example, be an adjusting screw for adjusting the distance between the eyes.
  • the manual mechanical actuator could, for example, be an adjusting screw on the eyeglass frame, e.g., on the ear, for changing the geometry of the eyeglasses, such as the inclination of a temple to a lens.
  • the manual mechanical actuator could, for example, be an adjusting screw intended for moving optical components, e.g., projection optics, lenses, etc., of the optical system, e.g., within the eyeglass frame.
  • a “computing unit” is understood to mean, in particular, a unit with an information input, an information processing unit, and an information output unit.
  • the computing unit comprises at least one processor, a memory, input and output means, further electrical components, an operating program, control routines, and/or calculation routines.
  • the components of the computing unit are preferably arranged on a common circuit board and/or advantageously arranged in a common housing, in particular the data glasses.
  • the computing unit can also be arranged at least partially external to the data glasses (e.g., on a connected mobile device) or distributed (e.g., in a cloud).
  • the data glasses issue instructions to the wearer of the data glasses, in particular via a visual display generated by the virtual retinal display or via an acoustic sound output, as to how the manual mechanical actuator must be operated to calibrate the data glasses or which manual modification of the data glasses must be carried out to calibrate the data glasses.
  • This can advantageously simplify and/or accelerate calibration considerably.
  • a high level of user-friendliness can advantageously be achieved.
  • the risk of incorrect calibration can advantageously be reduced.
  • Field calibration which can in particular be carried out by the user of the data glasses themselves, can advantageously be enabled.
  • the data glasses can transmit visual and/or acoustic instructions similar to: "Please turn screw number two three turns to the right."
  • a position and/or inclination of a temple of the data glasses relative to a lens of the data glasses, an adjustment of a hinge of the data glasses, an adjustment of a distance between the lenses of the data glasses, or a relative positioning and/or inclination of optical elements of the optical system of the data glasses to one another can be adjusted.
  • a nose pad of the data glasses is modified or replaced.
  • the nose pad comprises at least one bridge plate or at least two bridge plates or is formed by one or more bridge plates.
  • alternative designs of nose pads for data glasses without bridge plates are also conceivable.
  • a position of at least one tilt mirror, in particular of the 2D tilt mirror system, of the optical system of the data glasses is adjusted, in particular one that modifies a position of the beamlet.
  • the tilt mirror is different from the MEMS mirrors of the MEMS mirror system of the laser projector.
  • the main task of the tilt mirror is to move the beamlet, preferably to track the wearer's pupil with the beamlet within the framework of the moving beamlet principle.
  • the calibration image and/or the calibration pattern displayed to the wearer of the data glasses comprise a plurality of preferably geometrically identical white surfaces, in particular rectangles, circles or other shapes, the brightnesses of which are adjusted to one another, in particular individually, by the wearer of the data glasses for the purpose of calibrating a surface brightness of an image output from the data glasses.
  • This advantageously enables a particularly simple calibration, in particular a field calibration, of the surface brightness of the image output from the data glasses. Due to various factors, including an angle-, wavelength- and/or temperature-dependent diffraction efficiency of holograms, which can be used as free-space combiners in the data glasses, the homogeneity of the scanned images may not always be ensured.
  • white rectangles can be displayed to the user next to one another within the user's field of view.
  • One of the rectangles can now be selected and made brighter or darker by commands or user input until it is as bright as a neighboring rectangle.
  • A change in the brightness of a rectangle can be achieved, for example, by an adjustment on the laser projector or the optical system of the data glasses. This is repeated with several rectangles until relative brightness information is available for the entire image. This can then be used to homogeneously illuminate the entire field of view during operation of the virtual retinal display, in particular without having to rely on the factory calibration, which could possibly have become obsolete in the meantime.
  • the size of the white areas can be selected so that the probability of local brightness jumps or deviations is as low as possible. It is conceivable that the user can change the size or position of the white areas if perceptible differences in brightness occur within one of the white areas.
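The iterative brighter/darker matching described in the bullets above can be sketched as follows; the perceived-brightness model, the step size, and the tolerance are assumptions made for this sketch, not values from the patent.

```python
# Illustrative sketch of the surface-brightness calibration: per-patch gains
# are nudged, emulating the wearer's repeated brighter/darker feedback, until
# each patch matches a reference patch.

def calibrate_patch_gains(perceived, reference_index=0, step=0.01, tolerance=0.01):
    """perceived[i] is the brightness the wearer reports for patch i at gain 1.0.
    Returns per-patch gains that equalize all patches to the reference patch."""
    target = perceived[reference_index]
    gains = [1.0] * len(perceived)
    for i in range(len(perceived)):
        # Each pass of this loop emulates one "brighter"/"darker" command.
        while abs(gains[i] * perceived[i] - target) > tolerance:
            gains[i] += step if gains[i] * perceived[i] < target else -step
    return gains

# Three patches; the middle one looks too dark, the last one too bright.
gains = calibrate_patch_gains([1.0, 0.8, 1.2])
```

Once the relative gains are known, they could be applied during normal operation to homogeneously illuminate the field of view, as the description suggests.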
  • the calibration image and/or the calibration pattern displayed to the wearer of the data glasses comprise a plurality of, preferably geometrically identical, colored areas, in particular rectangles, circles, or other shapes, whose color tones are adjusted to one another by the wearer of the data glasses, in particular individually, for the calibration of a planar color homogeneity of an image output from the data glasses, preferably by means of a surface-by-surface change of an RGB color mix of the image output from the data glasses.
  • This advantageously enables a particularly simple calibration, in particular a field calibration, of the color homogeneity of the image output from the data glasses.
  • the calibration of the color homogeneity using the colored areas can proceed analogously to the previously described calibration of the area brightness, wherein in particular only the white areas are replaced by the colored areas.
  • each RGB color is individually adjusted/calibrated to one another and/or that the RGB laser beams are superimposed in the color homogeneity calibration, thus allowing the user to aim for the highest possible "whiteness" of the colored areas during calibration.
  • the RGB color mix is optimally adjusted when the superimposed RGB laser beams appear white to the wearer of the data glasses.
  • the user can mark individual colored areas and, using commands or user inputs, adjust the respective colors red, green, and blue to one another, e.g., by brightening or darkening the color or by changing the wavelength, until the color impression of the colored surface is pure white for the user.
  • a change in brightness or a change in the wavelength of a color component can be achieved, for example, by adjusting the laser projector of the data glasses or the optical system of the data glasses.
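A minimal sketch of the white-point criterion described above, under the simplifying assumption that a patch appears white when its three channel intensities are equal; the function name and values are illustrative.

```python
# Sketch under the assumption that a colored area appears white when its
# R, G and B intensities are equal: compute per-channel gain factors that
# raise each channel to the level of the brightest one.

def white_balance_gains(rgb):
    """Per-channel gain factors that equalize the three channel intensities."""
    peak = max(rgb)
    return tuple(peak / channel for channel in rgb)

patch = (0.9, 1.0, 0.7)            # perceived R, G, B of one colored area
gains = white_balance_gains(patch)
balanced = tuple(g * c for g, c in zip(gains, patch))  # channels now equal
```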
  • if the calibration pattern displayed to the wearer of the data glasses comprises a plurality of areas with, preferably high-contrast, lines or shapes, for example line-pair bars, whose image sharpnesses are adjusted to one another, in particular individually, by the wearer of the data glasses for the calibration of a planar image-sharpness impression of an image output from the data glasses, a simple and/or rapid sharpness calibration can advantageously be enabled.
  • a change in the sharpness impression of the calibration pattern can be achieved in particular by an adjustment on the laser projector and preferably by an adjustment of the optical system, e.g. via an electrically actuated lens with variable focus/a varioptic lens.
  • the change in the sharpness impression of the calibration pattern can be achieved by shifting optical components in the beam path of the optical system of the data glasses, e.g. by means of adjusting screws or automated actuators.
  • the image representing the calibration pattern could also be softened by software, e.g., if an image area is 'too sharp' and thus stands out compared to other image areas.
  • high-contrast patterns, e.g., alternating white and black lines, are displayed to the user in one part of the field of view. The user then changes the sharpness of the lines via the settings made or instructed until they show a particularly appealing (high) contrast.
  • Calibration is preferably performed at several points in the user's field of view until the system can determine a good compromise sharpness setting for the entire image, e.g., through interpolation or averaging.
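The compromise sharpness setting mentioned above could, for example, be obtained by averaging the per-region settings; the settings scale and the weighting scheme are illustrative assumptions for this sketch.

```python
# Sketch of the compromise-sharpness step: the per-region focus settings
# chosen by the wearer are combined into one global setting by (weighted)
# averaging, as one of the options named in the description.

def compromise_focus(settings, weights=None):
    """Weighted average of per-region focus settings, e.g. varifocal lens values."""
    if weights is None:
        weights = [1.0] * len(settings)
    return sum(s * w for s, w in zip(settings, weights)) / sum(weights)

focus = compromise_focus([0.2, 0.4, 0.3])
```

Weighting could, for instance, favor the central regions of the field of view over the periphery, though the patent does not prescribe this.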
  • the calibration pattern displayed to the wearer of the data glasses comprises a grid structure and/or a scale, wherein the wearer of the data glasses is requested to provide user feedback regarding which parts of the grid structure and/or the scale are currently perceptible, wherein based on this user feedback coordinates of a center of a current field of view are determined and wherein based on the determined coordinates a center of a future image output of the data glasses is calibrated, in particular via a control of the laser projector and/or the optical system.
  • the beamlet is preferably held stationary and displays, for example, the grid structure.
  • the user tells the system/data glasses, for example, which intersection point of the grid structure is centrally located in front of them.
  • the data glasses, in particular the virtual retinal display, can now, for example, adapt the image content of the output scanned image or a beam path of the virtual retinal display such that the new future center of the field of view lies at this point and/or the image information to be displayed is arranged around the determined center of the field of view. It is also conceivable that the user could use this calibration or a similar process to determine where the field of view should be located, for example not centered on their line of sight but offset slightly to the right, left, top, or bottom.
  • the user could read the desired off-center coordinate of the calibration pattern, which is designed as a grid structure, and communicate it to the smart glasses.
  • the grid structure could be designed as a Cartesian grid.
  • the grid structure could be formed by markings in the corners of the field of view.
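The grid-based centering step above can be sketched as converting the wearer's reported central grid node into an angular image offset; the grid pitch, node indexing, nominal center, and function name are hypothetical.

```python
# Hypothetical conversion of the reported central grid node into an angular
# offset for re-centering the future image output. Grid pitch, node indexing
# and the nominal center node are assumptions for this sketch.

def center_offset(reported_node, grid_pitch_deg, nominal_center_node=(0, 0)):
    """Angular offset (degrees) between the reported and the nominal center node."""
    dx = (reported_node[0] - nominal_center_node[0]) * grid_pitch_deg
    dy = (reported_node[1] - nominal_center_node[1]) * grid_pitch_deg
    return (dx, dy)

# Wearer reports node (1, -2) as central on a 2-degree grid:
offset = center_offset((1, -2), 2.0)
```

The same conversion would also cover the deliberately off-center placement mentioned above: the wearer simply reports a node away from the nominal center.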
  • if the user feedback of the wearer of the data glasses is recorded by means of at least one microphone for recording voice commands, by means of a push button on the data glasses, by means of a camera for gesture recognition, by means of an eye-tracking system of the data glasses for recording gaze commands, or by means of an input into an external mobile device, such as a smartphone or a tablet, a particularly high level of user-friendliness and/or suitability for field calibration can advantageously be achieved.
  • the computing unit for implementing the above-described method and/or the data glasses with the virtual retinal display and the computing unit are proposed. This advantageously allows the provision of data glasses with a high degree of user-friendliness, in particular with a field calibration option.
  • the method according to the invention, the computing unit according to the invention, and the data glasses according to the invention are not intended to be limited to the application and embodiment described above.
  • the method according to the invention, the computing unit according to the invention, and the data glasses according to the invention can have a number of individual elements, components, units, and method steps that differs from the number stated herein in order to fulfill a functionality described herein.
  • values within the stated limits are also to be considered disclosed and can be used arbitrarily.
  • Fig. 1a shows a schematic representation of a top view of data glasses with a virtual retinal display
  • Fig. 1b shows a schematic representation of the data glasses from the front
  • Fig. 2 shows a schematic flow diagram of a method for calibrating the data glasses
  • Fig. 3 shows an exemplary calibration image for a surface brightness calibration
  • Fig. 4 shows an exemplary calibration image for a color homogeneity calibration
  • Fig. 5 shows an exemplary calibration image for an image sharpness calibration
  • Fig. 6 shows a first exemplary calibration image for a field-of-view calibration
  • Fig. 7 shows a second exemplary calibration image for the field-of-view calibration.
  • Figs. 1a and 1b schematically show different views of data glasses 12.
  • the data glasses 12 comprise a spectacle frame 62.
  • the data glasses 12 comprise spectacle temples 32, 32'.
  • the data glasses 12 comprise spectacle hinges 34, 34'. Each of the spectacle temples 32, 32' is hinged to the spectacle frame 62 via one of the spectacle hinges 34, 34'.
  • the data glasses 12 comprise spectacle lenses 30, 30'.
  • the spectacle lenses 30, 30' are enclosed in the spectacle frame 62.
  • the data glasses 12 comprise a nose pad 38.
  • the nose pad 38 is replaceable. Different nose pads 38 can be provided/suitable for different users of the data glasses 12.
  • the data glasses 12 comprise an adjustment function for adjusting a distance 36 between the spectacle lenses 30, 30'.
  • the data glasses 12 comprise a computing unit 26.
  • the computing unit 26 is integrated into the data glasses 12. Alternatively, the computing unit 26 could also be integrated into an external mobile device 58, which is connected to the data glasses 12 at least for communication purposes.
  • the data glasses 12, in particular the computing unit 26, have a memory unit 60.
  • Memory unit 60 is provided for storing wearer-specific calibration settings. Using the memory data of memory unit 60, wearer-specific calibration settings can be reset or alternately set between different wearers of the same data glasses 12. It is also conceivable for the data glasses to automatically detect which of the stored calibration settings are currently appropriate, e.g., based on user recognition.
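A conceivable minimal structure for the wearer-specific settings storage described for memory unit 60; the profile layout, keys, and default values are illustrative assumptions, not part of the patent.

```python
# Hypothetical sketch of wearer-specific calibration profiles: one profile
# per wearer, with a factory-default fallback, so settings can be reset or
# switched between different wearers of the same data glasses.

class CalibrationStore:
    """One calibration profile per wearer, with a factory-default fallback."""

    def __init__(self, factory_default):
        self.factory_default = dict(factory_default)
        self.profiles = {}

    def save(self, wearer_id, settings):
        self.profiles[wearer_id] = dict(settings)

    def load(self, wearer_id):
        # Unknown wearers fall back to the factory default.
        return dict(self.profiles.get(wearer_id, self.factory_default))

store = CalibrationStore({"brightness_gain": 1.0, "focus": 0.0})
store.save("wearer_a", {"brightness_gain": 1.1, "focus": 0.2})
```

The automatic selection of the appropriate stored profile mentioned above (e.g., via user recognition) would simply determine which `wearer_id` is passed to `load`.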
  • the data glasses 12 have a virtual retinal display 10.
  • the virtual retinal display 10 is designed to project an artificially generated image content/scanned images 16 (see Figs. 3 to 7) directly onto the retina of an eye of a user of the data glasses 12.
  • the data glasses 12 have a laser projector 14.
  • the laser projector 14 is integrated, for example, into one of the temples 32, 32' of the data glasses 12.
  • the laser projector 14 is designed to generate visible laser light.
  • the visible laser light generates the scanned images 16.
  • the laser projector 14 comprises a MEMS mirror system (not shown) for scanning the visible laser light of the laser projector 14 to generate the two-dimensional scanned image 16.
  • the data glasses 12 comprise a tilt mirror 40.
  • the tilt mirror 40 is provided for controlling a position of the laser light emitted by the laser projector 14, in particular a beamlet of the virtual retinal display 10.
  • the data glasses 12 have an optical system 18.
  • the optical system 18 comprises a deflection element 64 (cf. the synopsis of Figures 1a and 1b).
  • the deflection element 64 is designed as a holographic optical element (HOE).
  • the deflection element 64 is integrated into one of the lenses 30, 30'.
  • the deflection element 64 is provided for deflecting the laser light emitted by the laser projector 14 toward a pupil plane 66 of the data glasses 12.
  • the pupil plane 66 of the data glasses 12 is defined by a plane in which an eye of a wearer of the data glasses 12 is positioned when the data glasses 12 are properly handled.
  • the deflection element 64 is provided for focusing the laser light emitted by the laser projector 14 onto the pupil plane 66 of the data glasses 12.
  • the deflection element 64 is provided for generating the beamlet.
  • the optical system 18 may also comprise further optical elements (not shown here), such as a varifocal lens.
  • the scanned images 16 generated by the laser projector 14 are projected as a beamlet into an eyebox of the data glasses 12 via the optical system 18 of the data glasses 12.
  • the data glasses 12 have an automated mechanical actuator 24.
  • the automated mechanical actuator 24 is controlled by the computing unit 26 of the data glasses 12 to adjust the position of the beamlet during calibration of the data glasses 12.
  • the data glasses 12 have a manual mechanical actuator 28.
  • the manual mechanical actuator 28 is actuated by the wearer of the data glasses 12 to adjust the position of the beamlet during calibration of the data glasses 12.
  • a relative position and/or a relative inclination of the spectacle lens 30, 30' to the spectacle temples 32, 32' is set.
  • the spectacle hinge 34, 34' of the data glasses 12 is alternatively or additionally adjusted.
  • the distance 36 between the lenses 30, 30' of the data glasses 12 is alternatively or additionally adjusted.
  • the relative positioning and/or inclination of optical elements of the optical system 18 of the data glasses 12 is alternatively or additionally adjusted.
  • the data glasses 12 have a microphone 50 (see Fig. 1 b).
  • the microphone 50 is provided for recording user feedback and/or voice commands from the wearer of the data glasses 12, in particular in connection with a calibration of the data glasses 12.
  • the data glasses 12 have a loudspeaker 68 (see Fig. 1 b).
  • the loudspeaker 68 is provided for outputting setting commands to the wearer of the data glasses 12 during a calibration, in particular an actuation of the manual mechanical actuator 28.
  • the data glasses 12 have a push button switch 52 (see Fig. 1 b).
  • the push button switch 52 is provided for manually setting/adjusting a state of the data glasses 12.
  • the push button switch 52 is provided for receiving user feedback from the wearer of the data glasses 12, in particular in connection with a calibration of the data glasses 12.
  • the data glasses 12 have a camera 54 (see Fig. 1 b).
  • the camera 54 is designed for gesture recognition. Depending on a recognized gesture, a state of the data glasses 12 is changed/adjusted during the calibration.
  • the camera 54 is designed to receive user feedback from the wearer of the data glasses 12 via gesture recognition.
  • the computing unit 26 is configured to process and recognize the gestures recorded by the camera 54.
  • the data glasses 12 include an eye-tracking system 56.
  • the eye-tracking system 56 is embodied, for example, as a known LFI eye-tracking system 56 based on exploiting the "bright pupil effect". Alternative embodiments of eye-tracking systems 56 are also conceivable.
  • the eye-tracking system 56 is designed to receive user feedback from the wearer of the data glasses 12.
  • the computing unit 26 is configured to process and evaluate the eye movements recorded by the eye-tracking system 56. For example, the detection of a specific gaze direction or a blink code (so-called gaze commands) can be recognized as user feedback by the eye-tracking system 56. Furthermore, the external mobile device 58 is configured to receive user feedback, e.g., via a touchscreen control of the mobile device 58.
  • Figure 2 shows a schematic flow diagram of a method for calibrating the data glasses 12.
  • the calibration, in particular the field calibration, is started by the user.
  • the user can select from a variety of calibration options (surface brightness calibration, color homogeneity calibration, field-of-view calibration, etc.). It is also conceivable that, after the start of the calibration, all necessary calibration steps are carried out automatically one after the other.
  • scanned images 16 are generated by the laser projector 14 and projected as beamlets into an eyebox of the data glasses 12 via the optical system 18 of the data glasses 12.
  • the beamlets generated in method step 72 contain calibration images 20 and/or calibration patterns 22.
  • the calibration images 20 and/or calibration patterns 22 are output via the beamlets to the current wearer of the data glasses 12.
  • a surface brightness of the image output of the data glasses 12 is calibrated.
  • a color homogeneity of the image output of the data glasses 12 is calibrated.
  • an image sharpness impression of the image output of the data glasses 12 is calibrated.
  • a field of view of the data glasses 12 is calibrated.
  • in a method step 82, user feedback provided by the wearer of the data glasses 12 based on their perception is recorded and used to calibrate the data glasses 12.
  • the user feedback of the wearer of the data glasses 12 is recorded by means of at least the microphone 50 in the form of voice commands, by means of the push button 52 on the data glasses 12, by means of the camera 54 in the form of recognized gestures, by means of the eye tracking system 56 in the form of gaze commands or by means of an input into the external mobile device 58.
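The feedback channels listed above can be thought of as feeding one common set of calibration commands. The following minimal sketch illustrates such a normalization; the command set and the per-channel vocabularies are purely illustrative assumptions, not part of the disclosure:

```python
from enum import Enum, auto
from typing import Optional

class Adjustment(Enum):
    """Common calibration commands shared by all feedback channels (assumed)."""
    BRIGHTER = auto()
    DARKER = auto()
    NEXT_AREA = auto()
    CONFIRM = auto()

# Hypothetical vocabularies mapping each feedback modality (microphone 50,
# push button 52, camera 54, eye-tracking system 56, mobile device 58)
# onto the same command set.
VOICE = {"brighter": Adjustment.BRIGHTER, "darker": Adjustment.DARKER,
         "next": Adjustment.NEXT_AREA, "done": Adjustment.CONFIRM}
GESTURE = {"thumb_up": Adjustment.BRIGHTER, "thumb_down": Adjustment.DARKER,
           "swipe": Adjustment.NEXT_AREA}
GAZE = {"double_blink": Adjustment.CONFIRM}
BUTTON = {"short_press": Adjustment.NEXT_AREA, "long_press": Adjustment.CONFIRM}

def normalize_feedback(source: str, value: str) -> Optional[Adjustment]:
    """Map raw user feedback from any input channel to a calibration command."""
    tables = {"voice": VOICE, "gesture": GESTURE, "gaze": GAZE, "button": BUTTON}
    return tables.get(source, {}).get(value)
```

With such a mapping, the calibration logic only ever sees abstract commands and is agnostic to which channel the wearer actually used.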
  • the automated mechanical actuator 24 is controlled by the computing unit 26 of the data glasses 12 in at least one method step 84 to adjust the position of the beamlet.
  • in method step 84, by controlling the automated mechanical actuator 24, for example a position of the tilt mirror 40 of the optical system 18 of the data glasses 12 can be adjusted, in particular to modify a position of the beamlet.
  • actuation of the manual mechanical actuator 28 can be initiated to manually adjust the position of the beamlet.
  • a position and/or an inclination of the temples 32, 32' relative to one of the spectacle lenses 30, 30' of the data glasses 12 can be adjusted, an adjustment of the spectacle hinge 34 can be made, an adjustment of the distance 36 between the spectacle lenses 30, 30' of the data glasses 12 can be made, or a relative positioning and/or inclination of optical elements of the optical system 18 of the data glasses 12 to one another can be made.
  • a manual conversion of the data glasses 12 for manually adjusting the position of the beamlet can be initiated in a method step 88.
  • the nose pad 38 of the data glasses 12 can be converted or replaced during the manual conversion of the data glasses 12.
  • the data glasses 12 output instructions to the wearer of the data glasses 12 via a visual display generated by the virtual retinal display 10 or via an acoustic sound output, e.g., of the data glasses 12 or of the mobile device 58, as to how the manual mechanical actuator 28 must be actuated, or which manual conversion of the data glasses 12 must be carried out, in order to calibrate the data glasses 12 accordingly.
  • the laser projector 14 and/or the optical system 18 can be controlled by the computing unit 26 of the data glasses 12 during the calibration in order to adjust a brightness, a sharpness or a color tone of the reproduced calibration images 20 and/or calibration patterns 22.
  • a wearer-specific calibration setting is stored in the internal or external storage unit 60 of the data glasses 12 for later retrieval, e.g., for resetting the wearer-specific calibration setting.
  • a calibration image 20 is displayed to the wearer of the data glasses 12 in a sub-method step 94, which calibration image comprises a plurality of geometrically identical white surfaces 42, 42' (cf. Fig. 3).
  • the white surfaces 42, 42' in Fig. 3 are shown as rectangles by way of example. However, they could also be circles, polygons, or have other shapes.
  • the brightnesses of the white surfaces 42, 42' are individually adjusted to one another manually and/or automatically based on user feedback in order to calibrate the surface brightness of the image output of the data glasses 12.
  • one of the white areas 42 is selected by the wearer of the data glasses 12 or one of the white areas 42 is displayed to the wearer of the data glasses 12 as active. In Figure 3, this is done, for example, by displaying two small triangles above and below the active white area 42.
  • the computing unit 26 controls the laser projector 14 and/or the optical system 18 to adjust the local area brightness.
  • the brightness is iteratively adjusted (e.g., by commands or manual settings) until it matches that of an adjacent white area 42' in the wearer's field of view. This is then repeated for the entire field of view.
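The iterative matching of an active white area 42 to its neighbor 42' can be sketched as a simple control loop. In the real procedure the stopping condition is the wearer's "equal brightness" feedback; in this simplified sketch it is simulated by comparing numeric brightness values directly (an assumption for illustration only):

```python
def match_brightness(active: float, reference: float,
                     step: float = 0.05, tol: float = 0.01,
                     max_iters: int = 100) -> float:
    """Iteratively step the active area's brightness toward the reference.

    `active`/`reference` are normalized brightness values in [0, 1];
    `step` models one 'brighter'/'darker' command, `tol` models the point
    at which the wearer perceives the two areas as equally bright.
    """
    for _ in range(max_iters):
        diff = reference - active
        if abs(diff) <= tol:  # wearer would report the areas look equal
            break
        active += step if diff > 0 else -step
    return active
```

Repeating this pairwise matching across all white surfaces 42, 42' propagates a uniform brightness impression over the whole field of view.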
  • a calibration image 20 comprising a plurality of geometrically identical colored areas 44, 44' (cf. Fig. 4) is displayed to the wearer of the data glasses 12 in a sub-process step 98.
  • the colored (e.g., red, green, or blue) areas 44, 44' in Fig. 4 are shown as rectangles by way of example. However, they could also be circles, polygons, or other shapes.
  • the color tones of the colored areas 44, 44' are individually adjusted to one another manually and/or automatically based on user feedback in order to calibrate the color homogeneity of the image output of the data glasses 12.
  • one of the colored areas 44 is selected by the wearer of the data glasses 12, or one of the colored areas 44 is displayed to the wearer of the data glasses 12 as active. In Figure 4, this is done, for example, by displaying two small triangles above and below the active colored area 44.
  • the computing unit 26 controls the laser projector 14 and/or the optical system 18 to adjust the local color tone.
  • the color tone is iteratively adjusted (e.g., by commands or manual settings) until it is aligned with a neighboring colored area 44' in the wearer's field of view. This is then repeated for the entire field of view.
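The analogous hue adjustment can be sketched per color channel. Again, the wearer's "more red / less red" style feedback is replaced here by a direct numeric comparison with the reference area, which is a simplifying assumption for illustration:

```python
def match_color(active_rgb, reference_rgb, step=4, tol=2, max_iters=200):
    """Step each 8-bit channel of the active colored area toward the reference.

    `step` models one hue-adjustment command per channel; `tol` models the
    point at which the wearer perceives the two areas as matching in hue.
    """
    active = list(active_rgb)
    for _ in range(max_iters):
        done = True
        for c in range(3):  # R, G, B channels
            diff = reference_rgb[c] - active[c]
            if abs(diff) > tol:
                active[c] += step if diff > 0 else -step
                done = False
        if done:  # wearer would report the hues as matching
            break
    return tuple(active)
```

As with the brightness case, this pairwise matching is repeated across the colored areas 44, 44' until the color impression is homogeneous over the field of view.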
  • a calibration pattern 22 comprising a plurality of regions 46 with high-contrast lines or shapes, for example, line pair bars, is displayed to the wearer of the data glasses 12 in a sub-method step 102 (see Fig. 5).
  • the sharpness impression of the regions 46 is manually and/or automatically adjusted individually or in groups to calibrate the image sharpness impression of the image output of the data glasses 12 based on user feedback.
  • the image sharpness of the calibration pattern 22 in an area 46 is iteratively adjusted (e.g., by commands or manual settings) until it is aligned with the image sharpness of an adjacent area 46 in the wearer's field of view.
  • an optical element, e.g., a varifocal lens, could also be used to adjust the local image sharpness by executing one of the method steps 84, 86, in which an actuator 24, 28 is actuated automatically or manually. This is then repeated for the entire field of view.
  • a calibration pattern 22 comprising a grid structure 104 (see Fig. 6) or a scale 106 (see Fig. 7) is displayed to the wearer of the data glasses 12 in a sub-method step 108.
  • the wearer of the data glasses 12 is prompted by means of the virtual retinal display 10 to provide user feedback regarding which parts of the grid structure 104 or the scale 106 are perceptible to him/her and/or at which coordinates of the grid structure 104 or the scale 106 a center of the field of view currently perceptible to the wearer of the data glasses 12 lies.
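From the wearer's feedback on which grid cells are perceptible, a field-of-view extent and center can be derived. A minimal sketch, assuming the grid structure 104 is encoded as (column, row) coordinates reported back by the wearer:

```python
def field_of_view_extent(visible_cells):
    """Estimate the perceptible field of view from wearer feedback.

    `visible_cells` is the set of (column, row) grid coordinates the wearer
    reports as perceptible (an assumed encoding of the grid structure 104).
    Returns the bounding box (min_col, min_row, max_col, max_row) and its
    center in grid coordinates.
    """
    cols = [c for c, _ in visible_cells]
    rows = [r for _, r in visible_cells]
    box = (min(cols), min(rows), max(cols), max(rows))
    center = ((box[0] + box[2]) / 2, (box[1] + box[3]) / 2)
    return box, center
```

The resulting center can then be compared with the intended center of the eyebox, and the offset used to drive the actuator-based beamlet repositioning of method steps 84 and 86.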

Abstract

The invention relates to a method for calibrating data glasses (12) having a virtual retinal display (10) (retinal scan display), wherein scanned images (16) are generated by a laser projector (14) of the virtual retinal display (10) and projected as individual beamlets into an eyebox of the data glasses (12) via an optical system (18) of the data glasses (12), wherein calibration images (20) and/or calibration patterns (22) are output to a wearer of the data glasses (12) by means of the beamlet, and wherein user feedback, which is provided by the wearer of the data glasses (12) on the basis of his or her perception, is recorded and used for calibrating the data glasses (12). According to the invention, for calibrating the data glasses (12), an automated mechanical actuator (24) is controlled by a computing unit (26) of the data glasses (12) to adjust a position of the beamlet, or an actuation of a manual mechanical actuator (28) is initiated for manually adjusting the position of the beamlet, or a manual conversion of the data glasses (12) is initiated for manually adjusting the position of the beamlet, or the laser projector (14) and/or the optical system (18) are controlled by the computing unit (26) of the data glasses (12) to adapt a brightness, a sharpness or a color tone of the reproduced calibration images (20) and/or calibration patterns (22).
PCT/EP2024/072083 2023-09-28 2024-08-02 Procédé d'étalonnage de lunettes à réalité augmentée comportant un affichage rétinien virtuel, unité de calcul et lunettes à réalité augmentée Pending WO2025067746A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102023209501.7 2023-09-28
DE102023209501.7A DE102023209501A1 (de) 2023-09-28 2023-09-28 Verfahren zu einer Kalibrierung einer eine virtuelle Netzhautanzeige aufweisenden Datenbrille, Recheneinheit und Datenbrille

Publications (1)

Publication Number Publication Date
WO2025067746A1 true WO2025067746A1 (fr) 2025-04-03

Family

ID=92212844

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/072083 Pending WO2025067746A1 (fr) 2023-09-28 2024-08-02 Procédé d'étalonnage de lunettes à réalité augmentée comportant un affichage rétinien virtuel, unité de calcul et lunettes à réalité augmentée

Country Status (2)

Country Link
DE (1) DE102023209501A1 (fr)
WO (1) WO2025067746A1 (fr)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140375540A1 (en) * 2013-06-24 2014-12-25 Nathan Ackerman System for optimal eye fit of headset display device
EP2499962B1 (fr) * 2011-03-18 2015-09-09 SensoMotoric Instruments Gesellschaft für innovative Sensorik mbH Dispositif de mesure optique et procédé pour capturer au moins un paramètre d'au moins un oeil dans lequel une caractéristique d'éclairage est réglable
WO2016105281A1 (fr) * 2014-12-26 2016-06-30 Koc University Dispositif d'affichage proche de l'oeil
US20170221273A1 (en) * 2016-02-03 2017-08-03 Disney Enterprises, Inc. Calibration of virtual image displays
US20180129050A1 (en) * 2015-05-19 2018-05-10 Maxell, Ltd. Head-mounted display, head-up display and picture displaying method
US20210148697A1 (en) * 2015-11-04 2021-05-20 Magic Leap, Inc. Light field display metrology
US20210263307A1 (en) * 2020-02-21 2021-08-26 Fotonation Limited Multi-perspective eye acquisition
CN113614674A (zh) * 2018-12-19 2021-11-05 视觉系统有限责任公司 用于通过光学系统产生和显示虚拟对象的方法
WO2023023661A1 (fr) * 2021-08-20 2023-02-23 Ardalan Heshmati Système d'affichage de projection rétinienne

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010062479A1 (fr) * 2008-11-02 2010-06-03 David Chaum Système et appareil pour plateforme de dispositif de lunettes
US10852838B2 (en) * 2014-06-14 2020-12-01 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
DE102018105917A1 (de) * 2018-03-14 2019-09-19 tooz technologies GmbH Verfahren zur benutzerindividuellen Kalibrierung einer auf den Kopf eines Benutzers aufsetzbaren Anzeigevorrichtung für eine augmentierte Darstellung
DE102021206073A1 (de) * 2021-06-15 2022-12-15 Robert Bosch Gesellschaft mit beschränkter Haftung Optisches System für eine virtuelle Netzhautanzeige (Retinal Scan Display), Datenbrille und Verfahren zum Projizieren von Bildinhalten auf die Netzhaut eines Nutzers
DE102022207025A1 (de) * 2022-07-11 2024-01-11 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zu einem Betrieb einer eine virtuelle Netzhautanzeige umfassenden Datenbrille, Recheneinheit und Datenbrille

Also Published As

Publication number Publication date
DE102023209501A1 (de) 2025-04-03

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24751529

Country of ref document: EP

Kind code of ref document: A1