
WO2016046123A1 - Display device which can be placed on the head of a user, and method for controlling such a display device - Google Patents

Display device which can be placed on the head of a user, and method for controlling such a display device

Info

Publication number
WO2016046123A1
WO2016046123A1 (PCT/EP2015/071587)
Authority
WO
WIPO (PCT)
Prior art keywords
image
screen
holding device
display device
generated
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2015/071587
Other languages
German (de)
English (en)
Inventor
Simon Brattke
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Carl Zeiss AG
Original Assignee
Carl Zeiss AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Carl Zeiss AG filed Critical Carl Zeiss AG
Publication of WO2016046123A1
Current legal status: Ceased

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/34 Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0132 Head-up displays characterised by optical features comprising binocular systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1636 Sensing arrangement for detection of a tap gesture on the housing

Definitions

  • Display device which can be placed on the head of a user, and methods for controlling such a display device
  • The present invention relates to a display device with a holding device which can be placed on the head of a user, an image module mechanically connected to the holding device, which has a screen on which an image is generated, and a first imaging optics mechanically connected to the holding device, which images the image generated on the screen as a virtual image so that the user can perceive it when the holding device is placed on the head. Furthermore, the present invention relates to a method for controlling such a display device.
  • Such display devices may in particular be designed so that the user can only perceive the images generated by the display device and no longer the environment.
  • Such display devices are also referred to as HMD (head-mounted display) devices or as VR glasses (virtual-reality glasses). Since the user can only perceive the images generated by the display device and no longer the environment, he is virtually immersed in a virtual reality, which leaves a strong impression.
  • A common user interface in such mobile phones is an audio interface through which the mobile phone can be controlled by voice.
  • However, the audio interface cannot be operated quietly and discreetly.
  • Moreover, not too many other audio sources may be present if adequately error-free use of the audio interface is to be ensured. Proceeding from this, it is an object of the invention to provide a display device of the type mentioned above that can be used reliably and discreetly.
  • According to the invention, this object is achieved by a display device with a holding device which can be placed on the head of a user, an image module mechanically connected to the holding device, which has a screen on which an image is generated, and a first imaging optics mechanically connected to the holding device, which images the image generated on the screen as a virtual image so that the user can perceive it when the holding device is placed on the head, wherein the image module further comprises a control unit for driving the screen and a sensor unit for detecting a movement of the image module, the sensor unit supplying signals to the control unit, which detects from these signals a tapping of the holding device and, depending on the detected tap, causes a change of the image generated on the screen and/or the control of an application running on the image module that provides content contained in the image generated on the screen.
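  • As an illustration of this signal evaluation, the following is a minimal sketch (in Python) of how a control unit could detect a tap from accelerometer samples; the class, threshold and debounce values are illustrative assumptions, not taken from the patent.

```python
class TapDetector:
    """Detect a tap on the holding device as a short spike in the
    accelerometer magnitude of the image module (illustrative sketch)."""

    GRAVITY = 9.81        # m/s^2, magnitude baseline when the device is at rest
    TAP_THRESHOLD = 3.0   # spike above the baseline that counts as a tap (tunable)
    DEBOUNCE_S = 0.15     # ignore further spikes for this long after a tap

    def __init__(self):
        self._last_tap = float("-inf")

    def feed(self, ax: float, ay: float, az: float, t: float) -> bool:
        """Return True if the sample (ax, ay, az) at time t is a new tap."""
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        spike = abs(magnitude - self.GRAVITY)
        if spike > self.TAP_THRESHOLD and (t - self._last_tap) > self.DEBOUNCE_S:
            self._last_tap = t
            return True
        return False
```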
  • Such tapping on or knocking against the holding device can be carried out discreetly, and also in a noisy environment, without corrupting the input.
  • A discreet and reliable control of the display device is thus possible.
  • An input can thus be made by tapping, pushing or knocking against the holding device.
  • The holding device is used as an input interface which is sensitive to a touch of the holding device (in particular a tap, push or knock against the holding device), for which the holding device does not need to be specially adapted.
  • The holding device can be used as it is as an input interface, since the interaction performed with the holding device (tapping, pushing, knocking, etc.) is measured by means of the sensor unit and the measured signals are evaluated by means of the control unit, both of which are part of the image module.
  • The control unit can recognize from the supplied signals the position of the tap on the holding device, the strength of the tap, the time interval between at least two consecutive tap events and/or the direction in which the holding device was pushed or knocked during the tap, and, depending on the detected position, strength, time interval and/or direction, cause a change of the image generated on the screen and/or the control of the application.
  • The control unit can detect from the signals a spatial and/or temporal pattern of one or more tap events and, depending thereon, effect a change of the image generated on the screen and/or the control of the application.
  • The spatial and/or temporal pattern is thus translated into a control command.
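  • How such a temporal pattern could be translated into control commands is sketched below for the simplest case, distinguishing single from double taps; the time window is an assumed, tunable value.

```python
DOUBLE_TAP_WINDOW_S = 0.4  # max gap between the two taps of a double tap (assumed)

class TapPatternMapper:
    """Map a stream of tap timestamps to 'single_tap' / 'double_tap' commands."""

    def __init__(self):
        self._pending = None  # timestamp of a tap awaiting classification

    def on_tap(self, t: float):
        """Feed a detected tap; returns 'double_tap' when a pair completes."""
        if self._pending is not None and (t - self._pending) <= DOUBLE_TAP_WINDOW_S:
            self._pending = None
            return "double_tap"   # e.g. select the highlighted menu item
        self._pending = t
        return None

    def poll(self, t: float):
        """Call periodically; emits 'single_tap' once the window has expired."""
        if self._pending is not None and (t - self._pending) > DOUBLE_TAP_WINDOW_S:
            self._pending = None
            return "single_tap"   # e.g. advance to the next menu item
        return None
```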
  • A selectable menu item can be displayed in the image, which can be selected by tapping the associated left or right side of the holding device. It is also possible for the display device to display a selectable menu item in at least one corner of the generated image, which can be selected by tapping the holding device in the region of that corner.
  • The display device according to the invention can be designed as an HMD device and/or as VR glasses.
  • The holding device may have a front part in which the image module and the first imaging optics are arranged, wherein, when the holding device is mounted on the head, the front part is preferably positioned in front of the eyes of the user such that the user can perceive only the virtual image and not the environment.
  • The front part may rest light-tight on the face in an area surrounding the eyes.
  • When the holding device is mounted on the head, the screen of the image module can be oriented substantially perpendicular to the straight-ahead viewing direction.
  • The first imaging optics may comprise at least one optical element.
  • The optical element may, for example, be a lens, a Fresnel lens, a lens array and/or a diffractive optical element (DOE).
  • In particular, the first imaging optics may comprise a lens having an optically effective surface, on which a diffractive structure may be arranged.
  • The display device may have, in addition to the first imaging optics, a second imaging optics, which may be formed in the same way as the first imaging optics.
  • The second imaging optics is in particular designed to provide a second image generated on the screen as a virtual image that the user can perceive when the holding device is placed on the head.
  • The first imaging optics can provide the virtual image for a first eye of the user and the second imaging optics the virtual image for the second eye.
  • In this way, a binocular display device is provided.
  • the first and / or second imaging optics can be designed as magnifying glass optics.
  • The sensor unit may include an inertial sensor, e.g. a gyroscope, a tilt sensor, an acceleration sensor and/or another sensor.
  • the mechanical connection of the image module with the holding device is preferably free of play.
  • the image module may have a control unit for controlling the screen and a sensor unit for detecting a movement of the image module, wherein the sensor unit supplies signals to the control unit, by means of which the performed tap can be recognized.
  • The method according to the invention can comprise the method steps described in connection with the display device according to the invention.
  • the application executed on the image module can control the screen in such a way that, based on supplied image data of a first image, the first image is generated both in a first section of the screen and in a second section of the screen separate from the first section.
  • Since the application is executed directly on the image module, the desired image generation can advantageously be carried out quickly. Furthermore, image data that is stored on the image module or generated by it can be used for display.
  • the image module may be a portable device having a controller for executing program instructions and driving the display.
  • Such an image generation device is, for example, a mobile phone such as a smartphone.
  • By means of corresponding program instructions, the application is executed on the image module; it performs the screen control and preferably the division of the screen into two sections with the same image displayed in both sections.
  • The application can control the screen so that the two sections are spaced apart and the area between the two sections is darkened. Further, the application may generate the first image in the first and second sections so that there is no stereo effect when viewing the two images; a sketch of such a rendering step is given below. Of course, the images can also be generated so that a stereo effect is present.
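  • A minimal sketch of such a rendering step, using the Pillow library; the screen dimensions and gap width are illustrative values, and the same source image is drawn into both sections without stereo disparity.

```python
from PIL import Image

def render_side_by_side(src: Image.Image,
                        screen_w: int = 1920, screen_h: int = 1080,
                        gap: int = 40) -> Image.Image:
    """Draw the same image into a left and a right screen section,
    separated by a darkened gap (no stereo effect between the copies)."""
    frame = Image.new("RGB", (screen_w, screen_h), "black")  # dark background
    section_w = (screen_w - gap) // 2
    copy = src.resize((section_w, screen_h))  # naive fit; real code would keep the aspect ratio
    frame.paste(copy, (0, 0))                 # first section (left eye)
    frame.paste(copy, (section_w + gap, 0))   # second section (right eye)
    return frame
```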
  • The image data may be supplied in such a way that the application accesses image data stored on the image module (hereinafter also referred to as the image generation device), and/or the image data is streamed to the image generation device, and/or the image data is generated on the image generation device.
  • The application may also process image data of other applications on the image generation device for display.
  • The first image can be predistorted in at least one of the sections, thereby at least partially compensating an aberration of the imaging optics; a sketch of such a predistortion follows below.
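  • The patent does not specify how the predistortion is computed; a common approach is an inverse radial mapping, sketched here with an assumed first-order polynomial distortion model.

```python
def predistort(x: float, y: float, k1: float = -0.25) -> tuple[float, float]:
    """Map normalized section coordinates (origin at the section center,
    radius 1 at the edge) through a radial predistortion chosen to
    (partially) cancel the distortion of the imaging optics."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2  # first-order polynomial model; k1 is an assumed coefficient
    return x * scale, y * scale
```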
  • the application can also be referred to as an app.
  • The added image data may in particular be image data of a recording of the surroundings of a user wearing the display device on his head and/or other image data. It is thus possible, for example, to present the environment to the user as the first image. In addition, the environment can be overlaid with other image information so that the user is shown more than the environment. These can be, for example, hints that are generated context-sensitively and displayed in the generated image.
  • The user can thus be shown a live image of the environment.
  • For this purpose, a camera is provided which records the environment.
  • The camera may be a separate camera, which is preferably attached to the display device (e.g. to the holding device).
  • In the case of a mobile phone, a camera is usually provided on the back (the side facing away from the screen), so that this camera can be used for recording the environment.
  • The application can generate the first image in the first and second sections in such a way that an image content of the first image in the first and second sections is generated either at a first distance or at a different second distance. This makes it possible to adapt the image generation, for example to different eye pupil distances.
  • The application may have an interface via which the distance can be entered; the first image in the first and second sections is then generated as a function of the entered distance, so that the entered distance is present on the screen between the image contents.
  • The interface can be the inventive input interface of the holding device, which can be operated by tapping, knocking or bumping.
  • a continuous change in distance by means of the application is possible.
  • In this case, not the complete first image is shown in the two sections, but only a cropped first image. Any areas of the screen not filled by the first image may be displayed in black.
  • A camera may be provided which records the environment and supplies it as image data to the image generation device.
  • the camera can be part of the image module.
  • the application can generate the first image, which is an image of the surroundings, in superposition with at least one further image element on the two sections of the screen.
  • The at least one further picture element may be, for example, a context-sensitively generated hint or other additional image information.
  • The image data may be subjected to image data processing before being displayed. This can be a brightness adjustment, a color adjustment, a false-color representation, etc.
  • The described representation of the environment in superposition with further image information is often referred to as an augmented-reality representation; a sketch of such an overlay step follows below.
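  • A sketch of such an overlay step, again using Pillow; the brightness factor and the overlay element (e.g. a context-sensitive hint icon) are illustrative assumptions.

```python
from PIL import Image, ImageEnhance

def compose_ar_frame(camera_frame: Image.Image,
                     overlay: Image.Image,        # RGBA picture element, e.g. a hint icon
                     position: tuple[int, int],
                     brightness: float = 1.2) -> Image.Image:
    """Adjust the camera image of the environment and superimpose one
    further picture element on it (augmented-reality representation)."""
    frame = ImageEnhance.Brightness(camera_frame).enhance(brightness)
    frame.paste(overlay, position, mask=overlay)  # alpha-blend the overlay
    return frame
```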
  • Fig. 1 is a schematic perspective view of an embodiment of the display device according to the invention;
  • Fig. 2 is a schematic plan view of the embodiment of Fig. 1;
  • Fig. 3 is a schematic plan view of a portable device arranged in the display device 1;
  • Fig. 4 is a plan view of the image generation device of Fig. 3 for explaining the image generation;
  • Fig. 5 is a sectional view through the first imaging optics 6 according to Fig. 2;
  • Fig. 6 is a diagram for explaining the provided field of view S;
  • Figs. 7-9 are views of the image generation device according to Fig. 4 for explaining the adaptation to different eye pupil distances.
  • In Fig. 1, a first embodiment of the display device 1 is shown schematically in a perspective view.
  • the display device 1 comprises a substantially box-shaped front part 2 with an open side 3. All other sides of the front part are at least substantially closed.
  • the contour of the open side 3 is designed such that it can be applied to the head of a user so that the user can wear the display device 1 like a spectacle on his head.
  • A tether 5 of the display device 1 is guided around the head of the user so that the desired contact pressure is present, in order to be able to wear the display device 1 ergonomically and preferably light-tight on the head.
  • The tether 5 may, for example, be formed as an elastic band and/or as a band with adjustable length.
  • The front part 2 together with the tether 5 forms a holding device which can be placed on the head of the user.
  • The display device 1 comprises a first imaging optics 6 for a left eye LA of the user and a second imaging optics 7 for a right eye RA of the user, each of which magnifies an image generated in an image plane E so that the user can perceive it as a virtual image.
  • Fig. 2 shows a schematic plan view of the display device 1 according to the invention; for better illustration, the front part 2 is shown open at the top in Fig. 2, which is not actually the case.
  • the front part 2 is substantially closed except for the side 3.
  • Preferably, these sides are completely closed, which shields light from the outside.
  • Since the front part 2 is formed substantially closed except for the side 3, the user, when wearing the display device 1 as intended on the head, can perceive only the images generated in the image plane E and no longer the environment.
  • The display device according to the invention may comprise a portable device 8 with a screen 9, which is arranged in the display device 1 so that the screen 9 of the portable device 8 is located in the image plane E.
  • In FIG. 2, only the screen 9 is shown schematically in order to simplify the illustration; FIG. 1 likewise shows only the image plane E and not the device 8.
  • FIG. 3 shows by way of example such a portable device 8, which has a screen 9, a control unit 10 and a sensor unit 11 for detecting a movement of the device 8. Further elements necessary for the operation of the device 8 are not shown.
  • The control unit 10 and the sensor unit 11 are shown in dashed lines, since they are installed in the device 8 and are normally not visible from the outside.
  • The control unit 10 can execute program instructions and serves to control the screen 9.
  • The sensor unit 11 may comprise an inertial sensor, such as a 1-axis, 2-axis or 3-axis gyroscope, an inclination sensor, an acceleration sensor and/or another sensor with which a movement of the device 8 can be detected.
  • The sensor unit 11 generates corresponding measurement signals, which are transmitted to the control unit 10 (preferably continuously), as shown schematically by the dashed connecting line 12 in FIG. 3.
  • The portable device 8 may be, for example, a mobile telephone (e.g. a so-called smartphone) or another device with a corresponding screen (such as the so-called iPod touch from Apple Inc., California, USA) and is preferably arranged interchangeably in the front part 2.
  • As can be seen from the illustration in FIG. 2, each of the two imaging optics 6, 7 images only a partial area of the screen 9.
  • For a user wearing the display device 1 to be able to perceive an object to be displayed with both eyes, the object must thus be displayed in both subregions of the screen 9 that are imaged by the two imaging optics 6 and 7.
  • For this purpose, an application or a program is provided on the portable device 8, which is executed by the control unit 10 and controls the screen 9 so that, based on supplied image data for the object to be displayed or for a first image to be displayed, the object or the first image 13 with schematically represented picture elements 13₁, 13₂ and 13₃ is generated both in a first section 14 of the screen 9 and in a second section 15 of the screen separated from the first section 14, as shown in FIG. 4.
  • Image data stored on the device 8 or image data supplied to the device 8 can advantageously be used to produce the images in the two sections 14 and 15.
  • image data originating from other applications running on the device 8 can be processed by the application according to the invention such that the same image is always displayed in both sections 14 and 15.
  • The user wearing the display device 1 on the head can thus be presented with images and films in such a way that he can perceive them enlarged with both eyes as virtual images.
  • The user can thus view, for example, videos from YouTube or other video-sharing platforms, videos stored on the device, or other videos or images enlarged in the desired manner.
  • the two sections 14 and 15 may be chosen so that they directly adjoin one another. Alternatively, it is possible that they are spaced from each other.
  • The area between them can in particular be displayed as a darkened area.
  • The application can display the images 13 in the two sections 14, 15 in such a way that there is no stereo effect for the user. However, it is also possible to create a stereo effect.
  • The screen 9 is preferably touch-sensitive and forms the primary input interface of the device 8. This primary input interface is, however, not accessible when the device 8 is inserted in the display device 1.
  • The display device 1 is therefore designed so that a tapping of the front part 2 is measured by means of the sensor unit 11 and evaluated by the control unit 10, which can then control the screen 9 so that the first image 13 is changed.
  • A tapping of the front part 2 can be detected well by means of the device 8, since the device 8, in the inserted state, is mechanically connected to the front part 2 (preferably free of play), so that during the intended use of the display device 1 it maintains the predetermined position relative to the imaging optics 6 and 7. A tap on the front part 2 is thus transmitted directly to the device 8, where it can be measured by means of the sensor unit 11. This leads to a change in the measurement signals generated by the sensor unit 11, which are transmitted to the control unit 10, so that the control unit 10 can detect a performed tap as a function of the change in the measurement signals.
  • The simplest input is thus a single tap, which corresponds, for example, to clicking a button with a mouse on a conventional computer.
  • The control unit 10 can thus in particular be designed so that it detects a tapping. Also, the number of multiple tap operations and/or their temporal spacing can be evaluated as part of the input interface.
  • The application currently executed on the device 8 and displaying its image 13 may offer individual menu items or functions for selection in the four corners of the image, as indicated by the quadrants M1, M2, M3 and M4 in FIG. 4.
  • a tapping of the front part 2 in the left upper area 16 ( Figure 1) then leads to the selection of the menu item M1.
  • a tap in the upper right area 17 of the front part 2 (Fig. 1) then leads to the selection of the menu item M2.
  • a tap in the lower left portion of the front panel 2 leads to the selection of the menu item M3 and a tap in the lower right portion of the front panel 2 to select the menu item M4.
  • Tapping the front part 2 from different directions causes, especially when a multi-axis sensor is used, a characteristic signal in each axis, so that the location and/or direction of the tap can be deduced by analyzing the signals in the individual axes.
  • A tap on the lower right corner directed toward the upper left then causes, for example, a negative signal in the horizontal sensor axis but a positive signal in the vertical sensor axis, and so on.
  • In this way, the direction of the tap can be determined. Since the shape of the front part 2 is known and tapping is practically possible only from the outside, it is also (at least approximately) clear which part of the front part 2 was tapped. It is also possible to use the determined direction of the tap, i.e. the direction in which the front part 2 was struck, as an input criterion.
  • Striking in a first direction R1 from top right to bottom left may, for example, be interpreted and processed as a selection of the menu item M2.
  • In this case, the position of the tap can remain unconsidered, so that even a tap in the same direction at a different location on the front part 2, as shown by the arrow R1' in Fig. 1, would lead to the selection of the menu item M2.
  • Alternatively, the position of the tap can also be taken into account. Further, for example, the direction of the tap can be evaluated together with the strength of the tap and thus the tap impulse; a sketch of such a direction-based mapping is given below.
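  • A sketch of how the axis signals of a two-axis acceleration sensor could be mapped to the four menu quadrants M1-M4; the axis orientation and sign conventions are assumptions.

```python
def classify_tap(ax: float, ay: float) -> str:
    """Map the dominant spike of a tap to one of the menu quadrants
    M1 (top left), M2 (top right), M3 (bottom left), M4 (bottom right).
    Assumed convention: +x points right and +y points up on the worn device;
    the tapped side is opposite to the direction of the measured acceleration,
    because the tap pushes the device away from the point of impact."""
    horizontal = "right" if ax < 0 else "left"
    vertical = "top" if ay < 0 else "bottom"
    return {("top", "left"): "M1", ("top", "right"): "M2",
            ("bottom", "left"): "M3", ("bottom", "right"): "M4"}[(vertical, horizontal)]
```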
  • the display device according to the invention thus provides an easy-to-use and well-functioning input interface with which the device 8 can be operated.
  • the two imaging optics 6, 7 may be the same or different. In the embodiment described here, they are identical, so that only the first imaging optics 6 will be described in detail below.
  • The first imaging optics 6 comprises a first lens 18 with negative refractive power and a second lens 19 with positive refractive power, both lenses 18, 19 being spaced apart (Figure 5).
  • The first lens 18 is formed as a convex-concave lens and has a first and a second boundary surface 20, 21, the concave side 20 facing the screen 9 and thus the image plane E.
  • The second lens 19 is designed as a convex-convex lens and has a first and a second boundary surface 22, 23. All four boundary surfaces 20 to 23 are formed as aspherical surfaces. In particular, they can be designed as aspheres that are rotationally symmetric with respect to the optical axis OA.
  • the optical axis OA is perpendicular to the image plane E.
  • The first lens 18 is formed of polycarbonate having a first refractive index n1 of 1.585 and a first Abbe number ν1 of 30.0.
  • The second lens 19 is formed of PMMA having a second refractive index n2 of 1.492 and a second Abbe number ν2 of 57.2.
  • the refractive indices and Abbe numbers are given here for the wavelength of 589 nm.
  • The exit pupil 24 (or eyebox 24) of the first imaging optics 6 has a diameter of 8 mm.
  • the exit pupil 24 is spaced from the image plane E and thus from the screen 9 by a distance d of 60.52 mm.
  • The first and second sections 14, 15 each have an aspect ratio of 4:3, the diagonal being 62.5 mm.
  • the two imaging optics 6 and 7 for the left and right eye LA, RA of the user are spaced apart by 61 mm. This corresponds approximately to the mean pupillary distance of the world population.
  • Since the two imaging optics 6, 7 each have a circular exit pupil 24 with a diameter of 8 mm, users with eye pupil distances in the range of 55 mm to 69 mm can perceive the virtual images of the images 13 on the screen 9 without cropping of the image field.
  • Preferably, the two images 13 are generated on the screen 9 so that the distance of the picture elements 13₁-13₃ is adapted to the individual eye pupil distance of the respective user; in particular, the distance of the picture elements 13₁-13₃ can correspond or nearly correspond to the individual eye pupil distance.
  • For this purpose, the application may have a function with which the distance of the images 13, and in particular of the picture elements 13₁-13₃, can be changed.
  • In Fig. 7, the portable device 8 is shown in the same manner as in Fig. 4, with the distance D of the picture elements 13₁-13₃ indicated. It is assumed that the distance D is 61 mm, i.e. adapted to the 61 mm spacing of the two imaging optics 6 and 7. A user with an eye pupil distance of 61 mm can thus perceive the images optimally with the display device put on.
  • If the extension of the images 13 in the horizontal direction is smaller, two strip-shaped areas of the screen 9 on the left and right remain unused. These are shown hatched in Fig. 8 and may, for example, be switched dark (shown in black).
  • For a greater eye pupil distance, the images 13 can be displayed such that the distance of the picture elements 13₁-13₃ is greater than in the state of Fig. 7. This state is shown in Fig. 9.
  • In this case, the two images 13 can be displayed at a distance from each other, so that in the central region of the screen 9 there is an unused strip-shaped region, which is shown hatched in Fig. 9 and which in turn can be switched dark.
  • The described input interface with tapping of the front part 2 can be used to pass inputs regarding the distance D of the picture elements 13₁-13₃ to the application. Depending on this, the application then controls the image generation according to the states described in connection with Figs. 7-9. It is also possible to store a desired eye pupil distance in the application, which is then used; this too can be done via the described input interface with tapping of the front part 2. A sketch of such a mapping follows below.
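  • A sketch of how the application could convert an entered eye pupil distance into the horizontal centers of the two images; the screen width and pixel pitch are assumed values, while the 55-69 mm range corresponds to the 8 mm eyeboxes described above.

```python
def image_centers_px(ipd_mm: float,
                     screen_w_px: int = 1920,
                     px_per_mm: float = 1920 / 125.0) -> tuple[float, float]:
    """Return the horizontal pixel centers of the left and right images so
    that their separation D on the screen matches the entered eye pupil
    distance (clamped to the range usable without cropping)."""
    ipd_mm = max(55.0, min(69.0, ipd_mm))  # range supported by the 8 mm exit pupils
    half_d_px = (ipd_mm * px_per_mm) / 2.0
    mid = screen_w_px / 2.0
    return mid - half_d_px, mid + half_d_px
```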

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention relates to a display device comprising a holding device (2, 5) which can be placed on the head of a user, an image module (8) mechanically connected to the holding device (2, 5), the module having a screen (9) on which an image (13) is generated, and a first imaging optics (6, 7) mechanically connected to the holding device (2, 5), which images the image (13) generated on the screen as a virtual image so that the user can perceive it when the holding device (2, 5) is placed on the head, the image module (8) further comprising a control unit (10) for controlling the screen (9) and a sensor unit (11) for detecting a movement of the image module (8), the sensor unit (11) supplying signals to the control unit (10), which detects from these signals a contact with the holding device (2, 5) and, depending on the detected contact, causes a change of the image (13) generated on the screen (9) and/or the control of an application running on the image module, which application provides content contained in the image generated on the screen.
PCT/EP2015/071587 2014-09-22 2015-09-21 Display device which can be placed on the head of a user, and method for controlling such a display device Ceased WO2016046123A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102014113685.3 2014-09-22
DE102014113685.3A DE102014113685A1 (de) 2014-09-22 Display device which can be placed on the head of a user, and method for controlling such a display device

Publications (1)

Publication Number Publication Date
WO2016046123A1 (fr)

Family

ID=54150421

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2015/071587 Ceased WO2016046123A1 (fr) Display device which can be placed on the head of a user, and method for controlling such a display device

Country Status (2)

Country Link
DE (1) DE102014113685A1 (fr)
WO (1) WO2016046123A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106227343A (zh) * 2016-07-20 2016-12-14 深圳市金立通信设备有限公司 Virtual reality glasses and control method therefor
WO2018005131A1 (fr) * 2016-07-01 2018-01-04 Google Llc Head-mounted display with display panels having asymmetric panel borders for improved nasal field of view (FOV)
WO2019071590A1 (fr) * 2017-10-13 2019-04-18 华为技术有限公司 Face-contacting sponge pad for a VR device, method for preparing same, and VR device


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100079356A1 (en) * 2008-09-30 2010-04-01 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
US8319746B1 (en) * 2011-07-22 2012-11-27 Google Inc. Systems and methods for removing electrical noise from a touchpad signal
US20140194163A1 (en) * 2013-01-04 2014-07-10 Apple Inc. Fine-Tuning an Operation Based on Tapping

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Oculus Rift - Wikipedia, the free encyclopedia", 21 September 2014 (2014-09-21), XP055230902, Retrieved from the Internet <URL:https://en.wikipedia.org/w/index.php?title=Oculus_Rift&oldid=626489213> [retrieved on 20151124] *
ANONYMOUS: "Taps as Input on the Rift | Input Devices | Oculus VR Forums", 9 July 2013 (2013-07-09), XP055230882, Retrieved from the Internet <URL:https://forums.oculus.com/viewtopic.php?t=2439> [retrieved on 20151124] *
BEN LANG: "Google: 'Cardboard is a Placeholder,' 6,000 Kits and 50,000+ App Downloads in First Week", 1 July 2014 (2014-07-01), XP055230945, Retrieved from the Internet <URL:http://www.roadtovr.com/google-cardboard-virtual-reality-6000-kits-50000-app-downloads-first-week/> [retrieved on 20151124] *
DABLET00: "DIY google cardboard straps head mount, hands free gaming", 20 August 2014 (2014-08-20), XP054976221, Retrieved from the Internet <URL:https://www.youtube.com/watch?v=o_tDXYa9ti4> [retrieved on 20151125] *


Also Published As

Publication number Publication date
DE102014113685A1 (de) 2016-04-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15766821

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15766821

Country of ref document: EP

Kind code of ref document: A1