US20170343819A1 - Display device which can be placed on the head of a user - Google Patents
- Publication number
- US20170343819A1
- Authority
- US
- United States
- Prior art keywords
- display device
- image
- optical system
- imaging optical
- boundary surface
- Prior art date
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; no legal analysis has been performed)
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/18—Optical objectives specially designed for the purposes specified below with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B25/00—Eyepieces; Magnifying glasses
- G02B25/001—Eyepieces
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Definitions
- the present invention relates to a display device with a holder that can be fitted on the head of a user and a first imaging optical system connected mechanically to the holder, which is formed to image an image generated in an image plane as a virtual image in such a way that, when the holder is fitted on his head, the user can perceive it with a first eye.
- Display devices can be formed in particular in such a way that the user can only perceive the images generated by the display device and can no longer perceive the surroundings.
- Such display devices are also referred to as HMD devices (Head Mounted Display devices) or as VR spectacles (Virtual Reality spectacles). Since the user can only perceive the images generated by the display device and can no longer perceive the surroundings, he is submerged as it were in a virtual reality, which leaves a strong impression.
- Since they are worn on the head, such display devices should be as light as possible; at the same time, imaging of the virtual image that is as good as possible should be provided with a large field of view, which usually leads to the first imaging optical system having an increased weight since, for this purpose, it comprises e.g. several lenses.
- the disclosure includes a display device with a holder that can be placed on the head of a user and a first imaging optical system connected mechanically to the holder, which is formed to image an image generated in an image plane as a virtual image in such a way that, when the holder is placed on his head, the user can perceive it with a first eye, in such a way that the difficulties named at the start can be overcome as completely as possible.
- the disclosure also includes a display device which comprises a holder that can be placed on the head of a user and a first imaging optical system connected mechanically to the holder, which is formed to image an image generated in an image plane as a virtual image in such a way that, when the holder is placed on his head, the user can perceive it with a first eye, wherein the first imaging optical system comprises as imaging element precisely one first lens with a first and a second boundary surface, wherein both boundary surfaces are in each case aspherically curved.
- Since the first imaging optical system comprises precisely one first lens (and no further imaging optical elements), the weight of the display device can be kept low.
- the first and second boundary surfaces preferably have different vertex curvatures.
- the ratio of the value of the vertex curvature of the first boundary surface, which is facing the image plane, to the value of the vertex curvature of the second boundary surface can be in the range of from 1.2 to 1.8. It is thus possible to adequately correct the astigmatism, whereby the desired imaging quality can be provided.
- the second boundary surface can be formed as a hyperboloid without higher-order deformations and/or the first boundary surface can be formed as a hyperboloid with higher-order deformations.
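Such surfaces are conventionally described by the standard even-asphere sag equation; the form below is the usual convention in optical design, not a formula given in this disclosure:

```latex
z(r) = \frac{c\, r^{2}}{1 + \sqrt{1 - (1+k)\, c^{2} r^{2}}} \;+\; \sum_{i=2}^{N} A_{2i}\, r^{2i}
```

Here $c$ is the vertex curvature (the quantity entering the ratio of 1.2 to 1.8 named above), $k$ is the conic constant (a hyperboloid corresponds to $k < -1$), and the $A_{2i}$ are the higher-order deformation coefficients, all of which vanish for a hyperboloid "without higher-order deformations".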
- the first imaging optical system can be free from deflecting elements for beam path folding.
- the first lens can be formed from plastic. This leads to a low weight. It can be formed from PMMA for example. This leads to a reduction in the undesired chromatic aberrations since PMMA has a low dispersion.
- a diaphragm can be positioned in front of the first boundary surface and thus between the first boundary surface and the image plane. Beams with a strong aberration, which would otherwise be directed from the first lens in an undesired way to the exit pupil of the first imaging optical system, can thus be shadowed.
- the distance between the second boundary surface and the exit pupil of the first imaging optical system can be in the range of from 5 to 25 mm, in particular in the range of from 13 to 20 mm.
- the field of view provided can have a value between 65° and 105° in the diagonal direction.
- the distance between the exit pupil of the first imaging optical system and the image plane can be in the range of from 50 to 70 mm.
- the exit pupil of the first imaging optical system can have a maximum size of greater than or equal to 3 mm, preferably of greater than or equal to 5 mm.
- the display device can be designed in such a way that the image generated in the image plane has a diagonal of 50 mm or less.
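As a rough consistency check, the field of view, screen diagonal and pupil-to-image distances quoted above can be related by a simple thin-lens magnifier model (screen placed at the focal plane so that the virtual image appears at infinity). The focal length used below is an assumed illustrative value, not a figure from this disclosure:

```python
import math

def diagonal_fov_deg(screen_diagonal_mm: float, focal_length_mm: float) -> float:
    """Approximate diagonal field of view of a simple magnifier under a
    thin-lens model with the screen at the focal plane (virtual image at
    infinity)."""
    return math.degrees(2.0 * math.atan((screen_diagonal_mm / 2.0) / focal_length_mm))

# Illustrative values: a 50 mm screen diagonal (the stated maximum) and an
# assumed effective focal length of 30 mm give roughly 80 degrees diagonally,
# which lies inside the 65 to 105 degree range named above.
fov = diagonal_fov_deg(50.0, 30.0)
```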
- the first imaging optical system can be designed in such a way that the astigmatism can be regarded as corrected, whereas a distortion and a lateral chromatic aberration are allowed.
- the image generated in the image plane is then preferably generated with a corresponding opposite distortion and a corresponding opposite lateral chromatic aberration with the result that the imaged virtual image is imaged without distortion or with little distortion and without chromatic aberration or with little lateral chromatic aberration.
- the first imaging optical system can be optimized in such a way that the field curvature still present can be compensated by accommodation by means of the eye.
- the aspherically curved boundary surfaces are formed in such a way that they correct the spherical aberration. Preferably they can also correct higher-order aberrations.
- the first imaging optical system can be designed in such a way that its Petzval radius is in the range of from −55 to −49.
- the first imaging optical system can be designed in such a way that, during imaging, the angular deviation of the beams caused by the first boundary surface is compensated by the angular deviation of the beams caused by the second boundary surface.
- the absolute value of this term is preferably less than 0.1, in particular less than 0.07, less than 0.05 or less than 0.01.
- the display device can furthermore comprise an image module connected mechanically to the holder, which comprises a screen, on which the image is generated in the image plane.
- the image module can comprise a control unit for actuating the screen and a sensor unit for detecting a movement of the image module.
- the sensor unit can be equipped to supply signals to the control unit.
- the control unit can be equipped to recognize from the signals a tipping of the holder which has been carried out and to bring about, in dependence on the recognized tipping, a change in the image generated on the screen and/or the control of an application executed on the image module which provides contents which are contained in the image generated on the screen.
- Such a tipping or striking of the holder can be carried out discreetly and also in noisy surroundings without the input being distorted thereby.
- the display device can thus be controlled discreetly and securely.
- An input can thus be carried out by tipping, striking or knocking against the holder.
- the holder per se is used as an input interface which is sensitive to the holder being touched (in particular tipping, striking or knocking against the holder), wherein the holder does not need to be specifically adapted for this.
- the holder can be used as an input interface as it is since the measurement of the interaction (the tipping, striking or knocking etc.) carried out with the holder takes place by means of the sensor unit and the evaluation of the measurement signals takes place by means of the control unit, which are both part of the image module.
- the control unit can recognize from the supplied signals the position of the tipping which has been carried out on the holder, the strength of the tipping of the holder which has been carried out, the time interval between at least two successive tipping events and/or recognize the direction in which the holder was struck or knocked against during the tipping, and, in dependence on the recognized position, the recognized strength of the tipping, the recognized time interval and/or the recognized direction, can bring about a change in the image generated on the screen and/or control the application.
- the control unit can recognize from the signals a spatial and/or time pattern of one or more tipping events and, in dependence thereon, can bring about a change in the image generated on the screen and/or control the application.
- the spatial and/or time pattern is thus converted into a control command.
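The conversion of such a time pattern of tipping events into a control command can be sketched as a threshold detector on accelerometer-magnitude samples. The threshold, sample rate and debounce window below are assumptions for illustration, not values from this disclosure:

```python
def detect_taps(samples, threshold=2.5, refractory=10):
    """Return sample indices of detected tap events in a stream of
    accelerometer magnitudes (in g). A tap is a sample exceeding
    `threshold`; after a detection, `refractory` samples are skipped so
    one physical knock is not counted twice."""
    taps = []
    i = 0
    while i < len(samples):
        if samples[i] > threshold:
            taps.append(i)
            i += refractory  # debounce: ignore the ringing after the impact
        else:
            i += 1
    return taps

def classify_double_tap(tap_indices, sample_rate_hz=100, max_gap_s=0.4):
    """True if the first two taps are close enough in time to count as a
    double tap, which the control unit could map e.g. to a menu command."""
    if len(tap_indices) < 2:
        return False
    return (tap_indices[1] - tap_indices[0]) / sample_rate_hz <= max_gap_s
```

A real implementation would additionally use the sign and axis of the acceleration spike to recover the position and direction of the tipping, as described above.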
- a selectable menu option can be represented in the image on the left-hand and on the right-hand side of the screen, which can be selected by tipping the associated left-hand or right-hand side of the holder. It is also possible for the display device to represent a selectable menu option at least in one corner of the generated image, which can be selected by tipping the holder in the area of this corner.
- the display device according to the invention can be formed as an HMD device and/or as VR spectacles.
- the image module can be replaceably connected to the holder.
- the holder can comprise a front part in which the image module and the first imaging optical system are arranged, wherein, when the holder is fitted on the head, the front part is preferably positioned in front of the user's eyes in such a way that the user can only perceive the virtual image and not the surroundings.
- the front part can lie against the face in a light-proof manner in an area surrounding the eyes.
- the screen of the image module can be essentially perpendicular to the direction of forward view.
- the angle between the direction of forward view and the screen is 90° or almost 90° (for example with a maximum deviation of ±10° in both directions, and thus the angle can have values of from 80° to 100°, or preferably with a maximum deviation of ±5° in both directions, and thus the angle can have values of from 85° to 95°).
- the display device can comprise a second imaging optical system, which can be formed in the same way as the first imaging optical system.
- the second imaging optical system is in particular formed to provide a second image generated on the screen as a virtual image, when the user wears the holder on his head.
- the first imaging optical system can provide the virtual image for a first eye and the second imaging optical system can provide the virtual image for the second eye.
- a binocular display device is provided.
- the first and/or second imaging optical system can be formed as a magnifying optical system.
- the distance between the optical axes of the first imaging optical system and of the second imaging optical system is preferably between 60 and 65 mm, in particular between 61 and 63 mm.
- the distance between the imaging optical systems can be set to a desired distance which can correspond to the distance between the user's eyes.
- the sensor unit can comprise an inertia sensor, such as e.g. a gyroscope, a tilt sensor, an acceleration sensor and/or another type of sensor.
- the mechanical connection of the image module to the holder is preferably without play.
- the application executed on the image module can actuate the screen in such a way that, on the basis of the supplied image data of a first image, the first image is generated both in a first section of the screen and also in a second section of the screen separate from the first section.
- Since the application is executed directly on the image module, the desired image generation can advantageously be carried out rapidly.
- image data which are saved in the image module or have been generated by the latter can be used for the representation.
- the image module can be a portable device with a control unit for executing program instructions and for actuating the screen.
- a portable device is e.g. a mobile phone, such as e.g. a smartphone.
- the property of the portable image module of being able to execute program instructions is thus used in order to execute the application on the image module with the corresponding instructions, which application carries out the actuation of the screen and preferably the division of the screen into two sections with the representation of the same image in both sections.
- the application can actuate the screen in such a way that the two sections are spaced apart from each other and the area between the two sections is switched to dark.
- the application can generate the first image in the first and second sections in such a way that there is no stereo effect when looking at the two images.
- the images can also be generated in such a way that there is a stereo effect.
- the image data can be supplied in such a way that the application accesses image data saved in the image module (also called image-generating device below) and/or that the image data are streamed onto the image-generating device and/or that the image data are generated on the image-generating device.
- the application can represent image data of other applications on the image-generating device according to the invention.
- It is possible for the first image to be generated pre-distorted in at least one of the sections and thereby for an imaging error of the imaging optical system to be at least partially compensated.
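Such a pre-distortion is commonly implemented as a radial remapping of the image coordinates. The sketch below assumes a one-coefficient radial model; the coefficient value is illustrative and not taken from this disclosure:

```python
def predistort_radius(r, k1=-0.18):
    """Map a normalized image radius r (0..1) to the pre-distorted radius
    using the one-coefficient radial model r' = r * (1 + k1 * r**2).
    A negative k1 applies barrel pre-distortion, which can cancel the
    pincushion distortion a single magnifying lens typically introduces."""
    return r * (1.0 + k1 * r * r)

def predistort_point(x, y, k1=-0.18):
    """Pre-distort a point given in normalized coordinates centred on the
    optical axis of the imaging optical system."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2
    return x * scale, y * scale
```

On the image module, such a remap would in practice be applied per pixel (or per vertex of a warp mesh) by the rendering pipeline of the application.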
- the captured surroundings can be subjected to a further image data processing before they are represented.
- This can be a brightness adjustment, a colour adjustment, a false-colour representation, etc.
- the application can also be referred to as an app.
- the attached image data can in particular be image data of a capture of the surroundings of a user wearing the display device on his head and/or another type of image data. It is thus possible, for example, to show the user the surroundings as the first image. In addition, the surroundings can be shown with further items of image information superimposed with the result that more than the surroundings is displayed to the user. These can, for example, be information which is generated depending on the context and is represented in the generated image.
- a live image of the surroundings can be displayed to the user.
- another camera is preferably provided which captures the surroundings.
- the camera can be a separate camera which is preferably secured to the display device (e.g. to the holder).
- Alternatively, a camera which is available on the image-generating device can be used.
- In smartphones, as a rule, a camera is provided on the rear side (the side facing away from the screen), with the result that this camera can be used to capture the surroundings.
- The described representation of the surroundings superimposed with a further item of image information is also often referred to as an Augmented Reality representation.
- the application can generate the first image in the first and second sections in such a way that image content of the first image is generated in the first and second sections either with a first spacing or with a second spacing which is different therefrom. It is thus possible to adapt the image generation to users' different interpupillary distances.
- the application can comprise an interface via which the spacing can be input, wherein the first image is then generated in the first and second sections in dependence on the input spacing in such a way that the input spacing is present on the screen between the image content of the image.
- the interface can, in particular, be the input interface according to the invention of the holder, which can be operated by tipping, knocking or striking. In particular, it is possible to change the spacing continuously by means of the application.
- The complete first image is then not represented in the two sections, but only a cropped first image.
- Any areas of the screen not filled by the first image can be represented black.
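The adaptation of the spacing to a user's interpupillary distance can be sketched as a conversion from the input distance to the pixel centres of the two sections. The screen width and pixel density used below are illustrative assumptions:

```python
def section_centres_px(ipd_mm: float, screen_width_px: int, px_per_mm: float):
    """Horizontal pixel centres of the left and right image copies so that
    the spacing between their image content on the screen equals the input
    interpupillary distance. Returns (left_centre, right_centre)."""
    half_ipd_px = ipd_mm * px_per_mm / 2.0
    mid = screen_width_px / 2.0
    return (round(mid - half_ipd_px), round(mid + half_ipd_px))
```

The application would then draw the (possibly cropped) first image centred on each of the two returned positions and fill the remaining screen area with black.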
- the front part of the display device can comprise a receptacle into which the image module can be replaceably inserted.
- the receptacle can comprise a mount which is adapted to the image module and can be reversibly separated from the receptacle, into which mount the image module can be inserted.
- a simplified separation of the image module from the display device is thus achieved in that the image module itself is not removed but rather the image module together with the mount.
- the mount when separated from the display device, can thus also be used as a permanent protective cover for the portable image module for protection against mechanical damage.
- the mount is adapted to the portable image module and/or the receptacle in such a way that, when the mount is secured to the holder, the image module is automatically positioned in the image plane.
- the mount can be individually adapted to hold a predetermined product of image module, in particular a predetermined model of a mobile phone in such a way that, when the mount is secured to the receptacle, the image module is automatically positioned in the image plane.
- different image modules can thus be combined with the display device in a simple manner.
- the mount can comprise a recess complementary to the external dimensions of the image module, in which the image module can be embedded fitting precisely in such a way that an image-generating front side of the image module (i.e. the screen side in the case of a mobile phone) remains essentially uncovered by the mount.
- the mount is adapted to the receptacle.
- Inserting the mount into the receptacle can comprise, in particular, sliding in and/or snapping in.
- the mount can comprise at least one holding element which holds the image module in such a way that it can be removed when it is inserted in the mount.
- the holding element can be formed by a flanged edge which at least partially engages around an outer edge of the image-generating front side of the image module.
- the mount can comprise a frame.
- the frame can have internal dimensions for the (precisely fitting) insertion of the image module and external dimensions for (precisely fitting) insertion into the holder.
- the frame can comprise a cover on the rear side which forms a lower flat closure of the frame and can therefore also be referred to as the frame base.
- the rear-side cover can comprise an optically transparent partial area.
- the optically transparent partial area of the cover can comprise an opening or an optically transparent material, for example a transparent plastic.
- the optically transparent partial area is preferably provided in the area of the frame base which is opposite a rear-side camera of the image module. It is thus made possible in particular to access a camera of the image module even when the image module is inserted.
- This allows the camera of a mobile phone, or of any other image module equipped with a camera, to capture images (of the surroundings) while the user is wearing the display device with the image module inserted on his head.
- By an optically transparent material is meant a material which is (partially) transparent to visible light, wherein the degree of transparency can be chosen freely and, in particular, can be in the range between 60% and 100%.
- the optically transparent material can comprise an optically effective layer.
- a partially reflective layer can be provided, the reflectivity of which can be chosen freely, but can preferably be between 5 and 30%.
- Alternatively, it is possible for the partial area to be left open as an opening in the frame base.
- the frame can comprise a non-transparent central bar which separates the image-generating front side of the image module inserted into the frame into a first optically accessible area and a second optically accessible area.
- the mount can comprise an insert which is adapted to the image module and can be separated from the mount.
- the insert can in turn be designed as a frame adapted to the image module, which frame can in turn further comprise a frame base with the features already explained above.
- the mount can be inserted in the receptacle by sliding in. In this way, a particularly user-friendly way of connecting the image module to the mount is provided.
- the receptacle can comprise a first slot with the result that the mount can be slid into the receptacle for example from the side or from the top, in the same way as a drawer.
- a further slot is provided opposite the first slot via which the mount can be removed from the receptacle again, or vice versa.
- the receptacle can comprise a guide along which the mount can be slid into the receptacle.
- the guide is preferably arranged on a narrow side of the receptacle.
- the guide can comprise a guide bar.
- the guide can comprise a detent with a locking position.
- the detent can advantageously realize the second positioning device.
- the imaging system can thus be automatically aligned relative to an optical axis of the first imaging optical system in a simple manner.
- the detent can be realized by a spring element secured to the front part, which engages for example in a recess in the mount, provided for this purpose.
- the receptacle can be formed at the front end (i.e. the end facing away from the user) of the front part, with the result that the inserted image module or, in some embodiment examples, the frame base forms the front closure of the display device.
- the display device can thus be designed particularly compact.
- When the image module is slid into the receptacle like a drawer, in particular together with a mount, the receptacle can also be formed spaced apart from the front end of the display device.
- an optically transparent partial area can be arranged at the front end of the display device.
- the optically transparent partial area of the front end can be made from an optically transparent material, for example a transparent plastic.
- the optically transparent material can comprise an optically effective layer.
- a partially reflective layer can be provided, the reflectivity of which can be chosen freely, but can preferably be between 5 and 30%.
- Alternatively, it is possible for the partial area to be left open as an opening in the cover.
- the optically transparent partial area is preferably provided in the region of a rear-side camera of the image module, whereby advantageously the access to the camera already described above is also made possible when the imaging system is inserted.
- FIG. 1 is a schematic perspective representation of an embodiment of the display device.
- FIG. 2 is a schematic top view of the embodiment from FIG. 1 .
- FIG. 3 is a schematic top view of the image-generating device arranged in the display device.
- FIG. 4 is a top view of the image-generating device according to FIG. 3 to explain the division of the screen into two sections and the representation of the same images in both sections.
- FIG. 5 is a sectional representation through the first imaging optical system according to FIG. 2 .
- FIG. 6 is a representation to explain the field of view S provided.
- FIG. 7 is diagrams of the beam aberrations for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 8 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion.
- FIG. 9 is diagrams of the beam aberrations of a second embodiment of the first imaging optical system for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 10 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion for the second embodiment of the first imaging optical system.
- FIG. 11 is diagrams of the beam aberrations of a third embodiment of the first imaging optical system for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 12 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion of the third embodiment of the first imaging optical system.
- FIG. 13 is diagrams of the beam aberrations of a fourth embodiment of the first imaging optical system for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 14 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion of the fourth embodiment of the first imaging optical system.
- FIG. 15 is diagrams of the beam aberrations of a fifth embodiment of the first imaging optical system for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 16 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion of the fifth embodiment of the first imaging optical system.
- FIG. 17 is diagrams of the beam aberrations of a sixth embodiment of the first imaging optical system for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 18 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion of the sixth embodiment of the first imaging optical system.
- FIG. 19 is diagrams of the beam aberrations of a seventh embodiment of the first imaging optical system for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 20 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion of the seventh embodiment of the first imaging optical system.
- FIG. 21 is diagrams of the beam aberrations of an eighth embodiment of the first imaging optical system for different field points for the y- and x-direction for the three wavelengths 656.27 nm (curves W 1 ), 587.56 nm (curves W 2 ) and 486.13 nm (curves W 3 ).
- FIG. 22 is diagrams which show the longitudinal spherical aberration, astigmatic field curves and the distortion of the eighth embodiment of the first imaging optical system.
- In FIG. 1, a first embodiment of the display device 1 according to the invention is shown schematically in a perspective representation.
- the display device 1 comprises a front part 2 formed essentially box-shaped with an open side 3 . All the other sides of the front part are at least essentially closed.
- the contour of the open side 3 is formed in such a way that it can be placed on the head of a user in such a way that the user can wear the display device 1 on his head like a pair of spectacles.
- the contour has a nose pad 4 on one side and, on the two lateral ends of the front part 2 , a retaining strap 5 is secured.
- When the display device 1 according to the invention is worn, the retaining strap 5 is guided around the head of the user in such a way that the desired contact pressure is present, with the result that the display device 1 can be worn on the head ergonomically and preferably in a light-proof manner.
- the retaining strap 5 can be formed e.g. as an elastic strap and/or as a strap with an adjustable length. Together with the retaining strap 5 , the front part 2 forms a holder that can be fitted on the head of the user.
- the display device 1 comprises a first imaging optical system 6 for a left eye LA of the user and a second imaging optical system 7 for a right eye RA of the user, which in each case image an image generated in an image plane E enlarged in such a way that the user can perceive it as a virtual image.
- FIG. 2 shows a schematic top view of the display device 1 according to the invention, wherein, for a better representation, in FIG. 2 the front part 2 is shown to be open at the top, which is not actually the case however.
- the front part 2 is formed essentially closed except for the side 3 .
- these sides are completely closed, which seals off light from the outside.
- ventilation slots and/or ventilation holes are introduced into these sides, which are particularly preferably designed in such a way that the light passing through them from the outside is minimized.
- Since the front part 2 is formed essentially closed except for the side 3, when the user is wearing the display device 1 on his head as intended, he can only perceive the images generated in the image plane E and can no longer perceive the surroundings.
- the display device can comprise a portable image module 8 with a screen 9 , which is arranged in the display device 1 in such a way that the screen 9 of the portable device 8 lies in the image plane E.
- a mount is provided consisting of a frame with a frame base (not shown).
- a recess is provided in the frame base, which is opposite a rear-side camera of the portable image module 8 .
- the frame can be slid sideways into a receptacle which is provided in the front part 2 .
- Two sprung elements are present in the receptacle, which engage in a locking manner on a detent of the frame in such a way that, when the frame is slid into the receptacle, the correct seating of the frame in the receptacle is provided and the frame is replaceably fixed in this position.
- two small permanent magnets or magnetizable small metal discs are provided in the receptacle at the top and the bottom as well as at the corresponding positions of the frame.
- the front flat closure of the front part 2 is formed by a partially transparent plastic disc.
- the disc has a partially reflective coating with the result that the inserted image module 8 cannot or can hardly be seen from the outside but a rear-side camera of the image module 8 can look outwards.
- such a portable device 8 is represented by way of example, which comprises a screen 9 , a control unit 10 as well as a sensor unit 11 for detecting a movement of the device 8 . Further elements necessary for operating the device 8 are not shown.
- the control unit 10 and the sensor unit 11 are represented with dotted lines since they are built into the device 8 and are not normally visible from the outside.
- the control unit 10 can execute program instructions and serves to actuate the screen 9 .
- the sensor unit 11 can comprise an inertia sensor, such as e.g. a single-axis, double-axis or triple-axis gyroscope, a tilt sensor, an acceleration sensor and/or another type of sensor with which it is possible to detect a movement of the device 8 .
- the sensor unit 11 generates corresponding measurement signals which are transferred to the control unit 10 (preferably continuously), as is represented schematically in FIG. 3 by the dotted connecting line 12 .
- the portable device 8 can, for example, be a mobile phone (e.g. a so-called smartphone) or another type of device with a corresponding screen (such as e.g. the so-called iPod touch from Apple Inc., California, USA) and is preferably replaceably arranged in the front part 2 .
- each of the two imaging optical systems 6 , 7 only images a partial area of the screen 9 . So that a user wearing the display device 1 on his head can perceive an object to be represented with both eyes, this object must thus be generated in both partial areas of the screen 9 , which are imaged by the two imaging optical systems 6 and 7 .
- an application or a program is provided on the portable device 8 , which is executed by the control unit 10 and actuates the screen 9 in such a way that, on the basis of the supplied image data for the object to be represented or for a first image to be represented, the object or the first image 13 with schematically represented image elements 13 1 , 13 2 and 13 3 is generated both in a first section 14 of the screen 9 and also in a second section 15 of the screen separate from the first section 14 , as is shown in FIG. 4 .
- image data saved in the device 8 or image data supplied to the device 8 can advantageously be used in order to generate the representation of the images in the two sections 14 and 15 .
- image data originating from other applications running on the device 8 can be processed by the application according to the invention in such a way that the same image is always represented in both sections 14 and 15 .
- Images and films can thus be offered to the user wearing the display device 1 on his head in such a way that he can perceive them enlarged as virtual images with both eyes. The user can thus perceive e.g. videos from YouTube or other video platforms, videos which are saved on the device or other videos or images enlarged as desired. It is also possible to represent enlarged images from games or other applications installed on the device.
- the two sections 14 and 15 can be chosen in such a way that they border each other directly. Alternatively, it is possible for them to be spaced apart from each other.
- the spacing area can, in particular, be represented or actuated as an area which is switched dark.
- the application can represent the images 13 in the two sections 14 , 15 in such a way that there is no stereo effect for the user. However, it is also possible to generate a stereo effect.
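The two-section rendering described above can be sketched as follows. This is a minimal illustration, not the patent's actual application; the buffer layout, the dtype handling and the `render_side_by_side` name are assumptions:

```python
import numpy as np

def render_side_by_side(frame: np.ndarray, gap: int = 0) -> np.ndarray:
    """Duplicate one frame into two side-by-side screen sections.

    `frame` is an (H, W, C) image filling one section; the result is an
    (H, 2*W + gap, C) screen buffer whose spacing area is switched dark.
    """
    h, w, c = frame.shape
    screen = np.zeros((h, 2 * w + gap, c), dtype=frame.dtype)  # dark background
    screen[:, :w] = frame          # first section 14 (left eye)
    screen[:, w + gap:] = frame    # second section 15 (right eye)
    return screen
```

Rendering identical content into both sections yields no stereo effect; shifting the content horizontally between the two sections by a small disparity would produce one.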
- the screen 9 is preferably touch-sensitive and forms the primary input interface.
- this primary input interface is no longer accessible to the user since the device is then inserted in the front part 2 .
- the display device 1 according to the invention is therefore formed in such a way that tipping of the front part 2 is measured by means of the sensor unit 11 and is evaluated by means of the control unit 10, which can thereupon actuate the screen 9 in such a way that the first image 13 is changed.
- Tipping of the front part 2 can be detected easily by means of the device 8 since, when it is inserted in the front part 2 , the device 8 is mechanically connected to the front part 2 (preferably free from play), so that it maintains the preset position relative to the imaging optical systems 6 and 7 during the intended use of the display device 1 . Tipping of the front part 2 is thus transferred directly to the device 8 and can then be measured by means of the sensor unit 11 . This leads to a change in the measurement signals generated by the sensor unit 11 , which are transferred to the control unit 10 with the result that the control unit 10 can recognize a tipping which has been carried out in dependence on the change in the measurement signals.
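Recognizing a tipping action from the change in the measurement signals can be sketched as a simple spike test on the acceleration magnitude. The two-axis sample format, the threshold value and the `detect_tap` name are illustrative assumptions, not taken from the patent:

```python
def detect_tap(accel_samples, threshold=2.5):
    """Return the index of the first sample whose acceleration magnitude
    exceeds the threshold (a short spike = the housing was struck),
    or None if no tap occurred in the sample window."""
    for i, (ax, ay) in enumerate(accel_samples):
        if (ax * ax + ay * ay) ** 0.5 > threshold:
            return i
    return None
```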
- the simplest input is thus a single tipping, which corresponds for example to clicking on a button with a mouse in a conventional computer.
- the control unit 10 can thus be formed in particular in such a way that it recognizes tipping.
- the number of several tipping actions and/or the time spacing thereof can also be evaluated as input signals.
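Evaluating the number of tipping actions and their time spacing can be sketched as follows; the grouping window `max_gap` and the function name are illustrative assumptions:

```python
def count_taps(tap_times, max_gap=0.4):
    """Count how many taps belong to the first gesture: consecutive tap
    timestamps (in seconds) closer together than max_gap are grouped,
    so e.g. two close taps can be interpreted as a double click."""
    if not tap_times:
        return 0
    count = 1
    for prev, cur in zip(tap_times, tap_times[1:]):
        if cur - prev <= max_gap:
            count += 1
        else:
            break
    return count
```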
- the application which has just been executed on the device 8 and the image 13 of which is represented can offer individual menu options or functions for selection in the four corners of the image, as is indicated by the quadrants M 1 , M 2 , M 3 and M 4 in FIG. 4 .
- Tipping of the front part 2 in the upper left-hand area 16 ( FIG. 1 ) then leads to the selection of the menu option M 1 .
- Tipping in the upper right-hand area 17 of the front part 2 leads to the selection of the menu option M 2 .
- tipping in the lower left-hand area of the front part 2 leads to the selection of the menu option M 3 and tipping in the lower right-hand area of the front part 2 leads to the selection of the menu option M 4 .
- Tipping of the front part 2 from different directions brings about, in particular when using a “multi-axis” sensor, a signal characteristic in each axis with the result that conclusions can be drawn about the location and/or the direction of tipping by analyzing the signals in the individual axes.
- tipping the left-hand side brings about, for example, a positive signal in the horizontal sensor, while tipping the right-hand side then brings about, for example, a negative signal in the horizontal sensor.
- the vertical sensor shows no signal.
- Tipping the upper right-hand corner in the direction of the bottom left brings about, for example, a negative signal in the horizontal sensor and, for example, a negative signal in the vertical sensor. Tipping the lower right-hand corner in the direction of the top left then brings about, for example, a negative signal in the horizontal sensor but a positive signal, for example, in the vertical sensor. And so on. In this way, the direction of the tipping can be determined. Since the shape of the front part 2 is known and tipping is in practice only possible from outside, it is thus also clear (at least approximately) which point of the front part 2 was tipped.
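The sign analysis above can be sketched as a mapping from the two sensor signals to the selected menu option. The sign convention follows the examples in the text; the dead-zone value `eps` and the `menu_from_signs` name are assumptions:

```python
def menu_from_signs(h_signal, v_signal, eps=0.2):
    """Map the signs of the horizontal/vertical sensor signals to the
    tapped corner and thus to a menu option. Per the convention above:
    positive horizontal = left side struck, negative = right side;
    negative vertical = upper area struck, positive = lower area."""
    left, right = h_signal > eps, h_signal < -eps
    up, down = v_signal < -eps, v_signal > eps
    if left and up:
        return "M1"   # upper left-hand corner
    if right and up:
        return "M2"   # upper right-hand corner
    if left and down:
        return "M3"   # lower left-hand corner
    if right and down:
        return "M4"   # lower right-hand corner
    return None       # pure side tap or signal below the dead zone
```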
- the determined direction of tipping, i.e. the direction in which the front part 2 was struck during tipping, can likewise be evaluated as an input.
- Striking in a first direction R 1 from top right to bottom left ( FIG. 1 ) can e.g. be interpreted and processed as selection of the menu option M 2 .
- the position of tipping can be disregarded with the result that tipping in the same direction at another point of the front part 2 , as is shown by the arrow R 1 ′ in FIG. 1 , would also lead to the selection of the menu option M 2 .
- the position of the tipping can additionally be taken into account, of course.
- the direction of the tipping can be taken into account together with its strength, and thus with the momentum generated by the tipping or striking.
- the display provided or the images 13 provided by the device 8 can thus be influenced.
- the display device according to the invention thus provides an easy-to-use and well-functioning input interface with which the device 8 can be operated.
- the two imaging optical systems 6 , 7 can be formed the same or different. In the embodiment described here they are formed the same with the result that only the first imaging optical system 6 is described in detail below.
- the first imaging optical system 6 comprises precisely one first lens 18, which is formed as a biconvex lens made of plastic (e.g. PMMA) ( FIG. 5 ).
- the first lens 18 comprises a first and a second boundary surface 20 , 21 , wherein the first boundary surface 20 faces the screen 9 or the image plane E.
- in FIG. 5 the exit pupil 22 of the first imaging optical system 6, two bundles of rays S 1 and S 2 with corresponding main beams H 1 and H 2, as well as a glass cover 23 of the screen 9 are also shown, wherein the side of the glass cover 23 facing the first lens 18 is denoted by the reference number 19.
- Both boundary surfaces 20 and 21 are formed as aspherical surfaces. In particular they can be formed as rotationally symmetrical aspheres, which are rotationally symmetrical with respect to the optical axis OA.
- the optical axis OA is perpendicular to the image plane E.
- the two boundary surfaces 20, 21 can be described by the following surface equation, wherein r is the radial height on the surface, k is the conic constant, c indicates the curvature of the corresponding surface at the vertex, and A, B and C are the fourth-, sixth- and eighth-order deformation coefficients:

  z(r) = c·r² / (1 + √(1 − (1 + k)·c²·r²)) + A·r⁴ + B·r⁶ + C·r⁸
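Assuming the standard even-asphere sag form with the parameters named above (vertex curvature c, conic constant k, deformation coefficients A, B, C), the surface height z(r) can be evaluated as in this sketch:

```python
import math

def asphere_sag(r, c, k, A, B, C):
    """Sag z(r) of a rotationally symmetric asphere: conic base term
    plus fourth-, sixth- and eighth-order deformation terms."""
    conic = c * r * r / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * r * r))
    return conic + A * r**4 + B * r**6 + C * r**8
```

With k = 0 and A = B = C = 0 the expression reduces to the exact sag of a sphere of radius 1/c.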
- the distances along the optical axis OA are indicated in the following Table 2.
- the first lens 18 is formed from PMMA with a first refractive index n1 of 1.492 and a first Abbe number ν1 of 57.2.
- the refractive index and Abbe number are indicated for the wavelength of 589 nm.
- the first imaging optical system 6 is designed in such a way that astigmatism is corrected and that the field of view as well as the distance between the second boundary surface 21 and the exit pupil 22 are as large as possible. Distortion and lateral chromatic aberration, however, are deliberately retained in the optics and compensated by the representation on the screen 9: a correspondingly distorted representation is generated on the screen 9, which, after imaging by means of the first lens 18, yields a virtual image that is as undistorted as possible. The same is done for the lateral chromatic aberration.
- the application which is executed on the portable device 8 can be correspondingly formed. The application thus generates from the supplied image data those images 13 which, without looking through the first lens 18 , are distorted and have a lateral chromatic aberration.
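A minimal sketch of such a pre-distorting rendering step, using a simple per-channel radial model r' = r·(1 + k·r²) with nearest-neighbor sampling; the model, the per-channel coefficients and the `predistort` name are illustrative assumptions, not the patent's actual compensation:

```python
import numpy as np

def predistort(image: np.ndarray, k_r: float, k_g: float, k_b: float) -> np.ndarray:
    """Pre-distort each color channel with its own radial coefficient so
    that lens distortion and lateral chromatic aberration can cancel in
    the imaged virtual image."""
    h, w, _ = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    dx, dy = (xs - cx) / cx, (ys - cy) / cy   # normalized coordinates
    r2 = dx * dx + dy * dy
    out = np.zeros_like(image)
    for ch, k in enumerate((k_r, k_g, k_b)):
        scale = 1.0 + k * r2
        # sample the source image at the radially scaled position
        sx = np.clip(np.rint(cx + dx * scale * cx), 0, w - 1).astype(int)
        sy = np.clip(np.rint(cy + dy * scale * cy), 0, h - 1).astype(int)
        out[..., ch] = image[sy, sx, ch]
    return out
```

With all coefficients zero the mapping is the identity; differing per-channel coefficients shift the color channels radially against each other, which is the effect needed to pre-compensate a lateral chromatic aberration.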
- by forming the two boundary surfaces as aspherical surfaces, it is possible to correct spherical aberrations and higher-order aberrations.
- the first imaging optical system 6 is designed in such a way that an angular deviation of the individual beams caused by the first boundary surface 20 is compensated (as far as possible) by the second boundary surface 21 . This is the case when BGL is almost zero, wherein BGL is to be calculated as follows:
- BGL
- the angles σ1, σ2 and σ3 are the angles which a main beam H 1 at the edge of the field or of the image 13 to be imaged forms with an axis O 1 or O 2 shifted in parallel with respect to the optical axis, after the exit pupil 22 (σ1), after the first boundary surface 20 (σ2) and after the second boundary surface 21 (σ3), as drawn in schematically in FIG. 5; n_air and n_substrate are the refractive indices of the surrounding air and of the lens material, respectively.
- BGL has a value of 0.00246.
- the value for BGL as well as for further parameters characterizing the first imaging optical system 6 are indicated in the following Table 3:
- K1 denotes the focal length
- K2 the ratio of the vertex curvature of the first boundary surface 20 to the vertex curvature of the second boundary surface 21
- K3 the distortion
- K4 the lateral chromatic aberration
- K5 the astigmatism
- K6 the Petzval radius
- K7 the defocusing.
- the beam aberrations are represented in mm in the known way for different field points for the y-direction and the x-direction for three different wavelengths.
- the curves for the wavelength 656.27 nm are denoted W 1 .
- the curves for the wavelength 587.56 nm are denoted W 2 and the curves for the wavelength 486.13 nm are denoted W 3 . It can occur that the curves partially coincide in the representation.
- the curves can partially coincide in FIG. 8 too. Further properties of the first imaging optical system 6 according to the described embodiment can be inferred from the representations in FIGS. 7 and 8 .
- the exit pupil 22 (or the eyebox 22 ) of the first imaging optical system 6 has a diameter of 16 mm.
- the exit pupil 22 is spaced apart from the image plane E and thus from the screen 9 by a distance d of 60 mm.
- a person wearing spectacles can wear the display device 1 according to the invention as intended on his head together with his spectacles.
- the two imaging optical systems 6 and 7 for the left and right eye LA, RA of the user are spaced apart from each other by 61 mm. This corresponds approximately to the average interpupillary distance of the world population.
- a diaphragm 24 can be arranged in front of the first boundary surface 20 and thus between the first boundary surface 20 and the image plane E, which shadows beams with strong aberrations and so contributes to an improvement in the imaging.
- in FIGS. 9 and 10 the beam aberrations, the longitudinal spherical aberration, the astigmatic field curves and the distortion of a second embodiment of the first imaging optical system 6 are represented in the same way as in FIGS. 7 and 8.
- the aspherical constants for the surfaces 20 and 21 of the second embodiment are indicated in the following Table 4.
- the distances along the optical axis OA for the second embodiment are indicated in the following Table 5.
- in FIGS. 11 and 12 the beam aberrations, the longitudinal spherical aberration, the astigmatic field curves and the distortion of a third embodiment of the first imaging optical system 6 are represented in the same way as in FIGS. 7 and 8.
- the aspherical constants for the surfaces 20 and 21 of the third embodiment are indicated in the following Table 7.
- the distances along the optical axis OA of the third embodiment are indicated in the following Table 8.
- in FIGS. 13 and 14 the beam aberrations, the longitudinal spherical aberration, the astigmatic field curves and the distortion of a fourth embodiment of the first imaging optical system 6 are represented in the same way as in FIGS. 7 and 8.
- the aspherical constants for the surfaces 20 and 21 for the fourth embodiment are indicated in the following Table 10.
- the distances along the optical axis OA of the fourth embodiment are indicated in the following Table 11.
- in FIGS. 15 and 16 the beam aberrations, the longitudinal spherical aberration, the astigmatic field curves and the distortion of a fifth embodiment of the first imaging optical system 6 are represented in the same way as in FIGS. 7 and 8.
- the aspherical constants for the surfaces 20 and 21 of the fifth embodiment are indicated in the following Table 13.
- the distances along the optical axis OA of the fifth embodiment are indicated in the following Table 14.
- in FIGS. 17 and 18 the beam aberrations, the longitudinal spherical aberration, the astigmatic field curves and the distortion of a sixth embodiment of the first imaging optical system 6 are represented in the same way as in FIGS. 7 and 8.
- the aspherical constants for the surfaces 20 and 21 of the sixth embodiment are indicated in the following Table 16.
- the distances along the optical axis OA of the sixth embodiment are indicated in the following Table 17.
- in FIGS. 19 and 20 the beam aberrations, the longitudinal spherical aberration, the astigmatic field curves and the distortion of a seventh embodiment of the first imaging optical system 6 are represented in the same way as in FIGS. 7 and 8.
- the aspherical constants for the surfaces 20 and 21 of the seventh embodiment are indicated in the following Table 19.
- the distances along the optical axis OA of the seventh embodiment are indicated in the following Table 20.
- in FIGS. 21 and 22 the beam aberrations, the longitudinal spherical aberration, the astigmatic field curves and the distortion of an eighth embodiment of the first imaging optical system 6 are represented in the same way as in FIGS. 7 and 8.
- the aspherical constants for the surfaces 20 and 21 of the eighth embodiment are indicated in the following Table 22.
- the distances along the optical axis OA of the eighth embodiment are indicated in the following Table 23.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102014017534.0 | 2014-11-24 | ||
| DE102014017534.0A DE102014017534A1 (de) | 2014-11-24 | 2014-11-24 | Anzeigevorrichtung, die auf den Kopf eines Benutzers aufsetzbar ist |
| PCT/EP2015/075882 WO2016083095A1 (de) | 2014-11-24 | 2015-11-06 | Anzeigevorrichtung, die auf den kopf eines benutzers aufsetzbar ist |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20170343819A1 true US20170343819A1 (en) | 2017-11-30 |
Family
ID=54478751
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/529,083 Abandoned US20170343819A1 (en) | 2014-11-24 | 2015-11-24 | Display device which can be placed on the head of a user |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US20170343819A1 (de) |
| CN (1) | CN107003504A (de) |
| DE (1) | DE102014017534A1 (de) |
| WO (1) | WO2016083095A1 (de) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11536957B2 (en) * | 2018-02-24 | 2022-12-27 | Beijing Boe Optoelectronics Technology Co., Ltd. | Method and apparatus for optimizing a lens of a virtual reality device, and computer readable storage medium |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102014116665B4 (de) | 2014-09-22 | 2024-07-25 | Carl Zeiss Ag | Verfahren und Vorrichtungen zur Bestimmung der Augenrefraktion |
| DE102016218582A1 (de) | 2016-09-27 | 2018-03-29 | Bayerische Motoren Werke Aktiengesellschaft | Projektionsanzeigevorrichtung mit einer Darstellung in mehreren Anzeigenebenen |
| CN107275904B (zh) * | 2017-07-19 | 2024-05-24 | 北京小米移动软件有限公司 | 虚拟现实眼镜的数据连接线 |
| CN107238930A (zh) * | 2017-07-19 | 2017-10-10 | 北京小米移动软件有限公司 | 虚拟现实眼镜 |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130033756A1 (en) * | 2011-08-02 | 2013-02-07 | Google Inc. | Method and apparatus for a near-to-eye display |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB643938A (en) * | 1948-06-01 | 1950-09-27 | Christian Ellis Coulman | Lenses |
| GB2200763B (en) * | 1987-01-28 | 1991-02-13 | Combined Optical Ind Ltd | Stand magnifiers and lens |
| JP2005062803A (ja) * | 2003-07-31 | 2005-03-10 | Olympus Corp | 撮像光学系及びそれを用いた光学装置 |
| US8957835B2 (en) * | 2008-09-30 | 2015-02-17 | Apple Inc. | Head-mounted display apparatus for retaining a portable electronic device with display |
| GB2499102B (en) * | 2013-01-11 | 2013-12-25 | Mvr Global Ltd | Head-mounted display device |
- 2014-11-24: DE application DE102014017534.0A filed (published as DE102014017534A1, not active, withdrawn)
- 2015-11-06: CN application CN201580061628.5A filed (published as CN107003504A, pending)
- 2015-11-06: PCT application PCT/EP2015/075882 filed (published as WO2016083095A1, ceased)
- 2015-11-24: US application US15/529,083 filed (published as US20170343819A1, abandoned)
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130033756A1 (en) * | 2011-08-02 | 2013-02-07 | Google Inc. | Method and apparatus for a near-to-eye display |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11536957B2 (en) * | 2018-02-24 | 2022-12-27 | Beijing Boe Optoelectronics Technology Co., Ltd. | Method and apparatus for optimizing a lens of a virtual reality device, and computer readable storage medium |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2016083095A1 (de) | 2016-06-02 |
| CN107003504A (zh) | 2017-08-01 |
| DE102014017534A1 (de) | 2016-05-25 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: CARL ZEISS AG, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LERNER, SCOTT, DR.;GAENGLER, DIETMAR;KERWIEN, NORBERT, DR.;AND OTHERS;SIGNING DATES FROM 20170502 TO 20170523;REEL/FRAME:042881/0279 |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |