
WO2018045985A1 - Augmented reality display system - Google Patents

Augmented reality display system

Info

Publication number
WO2018045985A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
display system
light
reality display
see
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2017/100933
Other languages
English (en)
Chinese (zh)
Inventor
钟张翼
毛颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Dreamworld Smart Technology Co Ltd
Original Assignee
Shenzhen Dreamworld Smart Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201610812210.XA external-priority patent/CN107797278A/zh
Priority claimed from CN201710079190.4A external-priority patent/CN108427193A/zh
Application filed by Shenzhen Dreamworld Smart Technology Co Ltd filed Critical Shenzhen Dreamworld Smart Technology Co Ltd
Publication of WO2018045985A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays

Definitions

  • the embodiments of the present application relate to the field of augmented reality technologies, and in particular, to an augmented reality display system.
  • The augmented reality display system is a technology developed in recent years. According to the specific application, such display systems are mainly divided into VR (Virtual Reality) and AR (Augmented Reality).
  • The principle of AR is to simulate virtual imagery with an augmented reality display system and superimpose it on the user's normal vision.
  • An AR display system can be implemented as optical see-through or video see-through; the main difference lies in the optical combiner used.
  • The optical combiner in an optical see-through augmented reality display system may be a partially transmissive, partially reflective component: light from the real environment is partially transmitted through the component, while projected virtual image information is reflected by it into the user's eyes, combining real and virtual image information.
  • The inventors have found at least the following problems in the related art: the field of view of existing augmented reality display systems is usually small, the user cannot interact efficiently with the virtual image information, and a 3D virtual scene cannot be formed.
  • Existing augmented reality display systems are also bulky and inconvenient to carry.
  • The technical problem to be solved by the embodiments of the present application is therefore to provide an augmented reality display system that has a large field of view, can form a 3D virtual scene, is compact, and is easy to carry.
  • To this end, an embodiment of the present invention provides an augmented reality display system including a head-worn frame, a display module, two see-through light guiding elements, and a main board; the display module, the two see-through light guiding elements, and the main board are all mounted on the head-worn frame;
  • the head-worn frame is for wearing on the user's head;
  • the main board is provided with a processor configured to process virtual image information and display it on the display module;
  • the display module is detachably or fixedly mounted on the head-worn frame, and is used to display virtual image information and emit it as a first light ray and a second light ray;
  • the two see-through light guiding elements each have a concave surface disposed toward the user's eyes; the first light ray reflected by the concave surface of one see-through light guiding element enters the user's left eye, and the second light ray reflected by the concave surface of the other see-through light guiding element enters the user's right eye, forming the vision of a 3D virtual scene; the first light ray contains left-eye virtual image information and the second light ray contains right-eye virtual image information.
  • The beneficial effect of the embodiments of the present application is that the first light ray containing left-eye virtual image information and the second light ray containing right-eye virtual image information are reflected into the user's eyes by the concave surfaces of the two see-through light guiding elements,
  • so that the visual experience of a 3D virtual scene is formed in the user's brain, and the visual area is large.
  • In addition, a third light ray containing external image information is transmitted through the convex and concave surfaces of the see-through light guiding elements into the user's eyes, so the user can see the real scene of the outside world, forming the visual experience of a 3D virtual scene mixed with the real scene.
  • FIG. 1a is a schematic structural diagram of an augmented reality display system according to Embodiment 1 of the present application.
  • Figure 1b is a schematic view of the see-through light guiding element shown in Figure 1a when it is placed on the head frame;
  • Figure 1c is a first relationship diagram between a side view angle and a display brightness of the display module shown in Figure 1a;
  • Figure 1d is a second relationship diagram between the side view angle and the display brightness of the display module shown in Figure 1a;
  • Figure 1e is a third relationship diagram between a side view angle and a display brightness of the display module shown in Figure 1a;
  • FIG. 2a is a schematic diagram showing the positional relationship between the display module and the user's face when the augmented reality display system shown in FIG. 1a is worn;
  • Figure 2b is a schematic view showing the rotation of the display module shown in Figure 1a;
  • FIG. 3 is a schematic diagram of an imaging principle of the augmented reality display system shown in FIG. 1a;
  • FIG. 3a is a schematic structural view of a see-through light guiding element provided with the light shielding layer shown in FIG. 1a;
  • FIG. 3b is a schematic structural diagram of an augmented reality display system according to an embodiment of the present invention.
  • FIG. 3c is a schematic structural view of an embodiment of a display screen in a display module;
  • Figure 3d is a schematic structural view of still another embodiment of the display screen in the display module.
  • FIG. 3e is a schematic structural view of still another embodiment of a display screen in a display module;
  • Figure 3f is a schematic structural view of still another embodiment of the display screen in the display module.
  • Figure 4 is a cross-sectional view of a see-through light guiding element illustrating the sag of the concave surface;
  • Figure 4a is a plan view of a see-through light guiding element illustrating the surface sag value;
  • Figure 4b is a cross-sectional view of the see-through light guiding element illustrating the sag of the convex surface;
  • Figure 5 is a schematic view of the augmented reality display system of Figure 1a when a diopter correction lens is provided;
  • FIG. 6 is a schematic diagram showing the distance relationship between the diagonal field of view area and the farthest end of the head frame to the foremost end of the user's head of the augmented reality display system shown in FIG. 1a;
  • FIG. 7 is a schematic diagram of the augmented reality display system shown in FIG. 1a connected to an external device;
  • FIG. 8 is a schematic structural diagram of an augmented reality display system according to Embodiment 2 of the present application.
  • FIG. 9 is a schematic diagram of the augmented reality display system shown in FIG. 8 connected to an external device;
  • FIG. 10 is another schematic diagram of the augmented reality display system shown in FIG. 8 when the external device is connected to work;
  • Figure 11 is a schematic illustration of the augmented reality display system of Figure 8 in operation.
  • Figure 12 is a schematic view showing the arrangement angle and light reflection of the partial structure of the augmented reality display system shown in Figure 1a.
  • An augmented reality display system provided by an embodiment of the present application has a total weight of less than 350 grams and includes a head frame 11, two display modules 12, and two see-through light guiding elements 13. The see-through light guiding element 13 is a partially transmissive, partially reflective optical combiner.
  • the display module 12 and the see-through light guiding elements 13 are all disposed on the head frame 11.
  • The head frame 11 fixes the display module 12 and the see-through light guiding element 13.
  • The display module 12 is disposed on the upper side of the see-through light guiding element 13, so that the light emitted by the display module 12 can be reflected by the see-through light guiding element 13.
  • the display module 12 may also be located at a side of the see-through light guiding element 13 .
  • the augmented reality display system further includes a main board 17 disposed on the head frame 11 and located between the two display modules 12.
  • the main board 17 is provided with a processor for processing a virtual image signal and displaying the virtual image information on the display module 12.
  • the head frame 11 is used for wearing on the head of the user, and each of the see-through light guiding elements 13 has a concave surface which is disposed toward the eyes of the user.
  • The first light ray reflected by the concave surface of one see-through light guiding element 13 enters the user's left eye, and the second light ray reflected by the concave surface of the other see-through light guiding element 13 enters the user's right eye, forming the vision of a 3D virtual scene in the user's mind.
  • The first light ray is emitted by the display module 12 and contains left-eye virtual image information; the second light ray is likewise emitted by the display module 12 and contains right-eye virtual image information.
  • two see-through light guiding elements 13 are disposed on the head frame 11 and are independently embedded in the head frame 11, respectively.
  • Alternatively, two regions corresponding to the user's left and right eyes may be formed on a single piece of the raw material used to fabricate the see-through light guiding elements; each region has the same shape, size, and effect as an independently arranged see-through light guiding element 13, so the final result is one large see-through light guiding element provided with two areas corresponding to the user's left and right eyes.
  • In this case the two see-through light guiding elements 13 are integrally formed, and the integrally formed element providing the left-eye and right-eye regions is embedded in the head frame 11.
  • The display module 12 is detachably mounted on the head frame 11 (for example, the display module is a smart display terminal such as a mobile phone or tablet computer), or fixedly mounted on the head frame (for example, the display module and the head-worn frame are of an integrated design).
  • Two display modules 12 can be mounted on the headgear frame 11.
  • The user's left eye and right eye each have a corresponding display module 12: one display module 12 is configured to emit the first light ray containing left-eye virtual image information, and the other display module 12 is configured to emit the second light ray containing right-eye virtual image information.
  • The two display modules 12 can be located above the two see-through light guiding elements 13 in one-to-one correspondence;
  • that is, when the system is worn, the two display modules 12 are located above the user's left eye and right eye, respectively.
  • The display module 12 can also be located at the side of the see-through light guiding element, that is, the two see-through light guiding elements are located between the two display modules; in that case, when the augmented reality display system is worn on the user's head,
  • the two display modules are located at the side of the user's left eye and right eye, respectively, in one-to-one correspondence.
  • a single display module 12 can also be mounted on the headgear frame 11.
  • The single display module 12 has two display areas: one emits the first light ray containing left-eye virtual image information, and the other emits the second light ray containing right-eye virtual image information.
  • the display module includes, but is not limited to, an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), an LCOS (Liquid Crystal On Silicon), or the like.
  • the headgear frame may be a spectacle frame structure for hanging on the ear and nose of the user, or a helmet frame structure for wearing on the top and nose of the user's head.
  • Since the main function of the head-worn frame is to be worn on the user's head and to support optical and electrical components such as the display module and the see-through light guiding elements, the head frame includes but is not limited to the above forms; provided this main function is preserved, those skilled in the art can make modifications to the head frame according to the needs of practical applications.
  • the horizontal axis represents the side view angle and the vertical axis represents the display brightness.
  • the display module 12 is an LCD
  • The brightness of the display module 12 varies with the viewer's observation angle.
  • For a general-purpose LCD, as shown in Figure 1c, the side viewing angle θ at which display brightness falls to 50% is relatively large.
  • When an LCD is applied to an augmented reality display system, a smaller side viewing angle is preferable, so that the brightness of the display module 12 is concentrated in an angular region near the center. Because the augmented reality display system mainly uses this central angular region, the brightness of the first and second light rays projected into the user's eyes is then relatively high. Referring to Figure 1d, for an LCD suited to an augmented reality display system, the side viewing angle at 50% display brightness is comparatively small; moreover, the brightness distribution of the first and second light rays is symmetric about the 0-degree side viewing angle and is confined within a side viewing angle of 60 degrees.
  • The brightness of the first and second light rays is greatest at a 0-degree side viewing angle; as the viewing angle shifts to either side, the brightness gradually decreases, and beyond 60 degrees the display brightness is 0.
  • Alternatively, as in Figure 1e, the brightness distribution of the first and second light rays emitted by an LCD applied to an augmented reality display system may be asymmetric about the 0-degree side viewing angle, with the brightest side viewing angle not at 0 degrees.
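  • As a purely illustrative profile (not given in the patent) matching the symmetric case described above, one can take

$$L(\theta)=L_{0}\cos^{2}(1.5\,\theta),\qquad \lvert\theta\rvert\le 60^{\circ},$$

which is maximal at $\theta=0$, falls to 50% of the peak at $\theta=\pm 30^{\circ}$, and reaches zero at $\theta=\pm 60^{\circ}$.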
  • The two display modules 12 are respectively located above the two see-through light guiding elements 13 in one-to-one correspondence.
  • As shown in FIG. 2a, when worn, the display module 12 forms an angle a with the horizontal plane of the user's face; the angle a ranges from 0 to 180 degrees and is preferably an obtuse angle.
  • the projection of the display module 12 on the horizontal plane is perpendicular to the normal plane.
  • The position of the see-through light guiding element 13 can be rotated by an angle b around a rotation axis perpendicular to the horizontal plane; the angle b ranges from 0 to 180 degrees, preferably from 0 to 90 degrees.
  • the see-through light guiding elements 13 corresponding to the left and right eyes can be adjusted in pitch by the mechanical structure on the head frame 11 to accommodate the user's interpupillary distance, ensuring comfort and imaging quality in use.
  • The farthest distance between the outer edges of the two see-through light guiding elements 13 is less than 150 mm; that is, the distance from the left edge of the see-through light guiding element 13 corresponding to the left eye to the right edge of the see-through light guiding element 13 corresponding to the right eye is less than 150 mm.
  • Correspondingly, the display modules 12 are connected by a mechanical structure so that the distance between them can also be adjusted; alternatively, the same effect can be achieved by adjusting the position of the displayed content on the display module 12.
  • the headgear frame 11 may be an eyeglass frame structure for hanging on the ear and nose of the user, on which the nose pad 111 and the temple 112 are disposed, and the nose pad 111 and the temple 112 are fixed to the user's head.
  • the temple 112 is a foldable structure, wherein the nose pad 111 is correspondingly fixed on the nose bridge of the user, and the temple 112 is correspondingly fixed on the user's ear.
  • The temples 112 can also be connected by an elastic band; when worn, the elastic band tightens the temples and helps fix the frame on the head.
  • the nose pad 111 and the temple 112 are telescopic mechanisms that adjust the height of the nose pad 111 and the telescopic length of the temple 112, respectively.
  • the nose pad 111 and the temple 112 can also be of a detachable structure, and the nose pad 111 or the temple 112 can be replaced after disassembly.
  • In other embodiments, the head frame 11 may include a nose pad and an elastic band and be fixed to the user's head by both, or include only an elastic band and be fixed to the user's head by the elastic band alone.
  • the headgear frame 11 may also be a helmet-type frame structure for wearing on the top and nose of the user's head.
  • Again, since the main function of the head frame 11 is to be worn on the user's head and to support optical and electrical components such as the display module 12 and the see-through light guiding elements 13, the head frame includes but is not limited to the above forms; provided this main function is preserved, those skilled in the art can make modifications to the head frame according to the needs of practical applications.
  • The display module 12 emits a first light ray 121 containing left-eye virtual image information; the first light ray 121 reflected by the concave surface 131 of the see-through light guiding element 13 enters the user's left eye 14. Similarly, the display module emits a second light ray containing right-eye virtual image information; the second light ray reflected by the concave surface of the other see-through light guiding element enters the user's right eye. The visual perception of a 3D virtual scene is thereby formed in the user's brain.
  • Unlike Google Glass, which places a small display screen directly in front of the user's right eye and therefore offers only a small visual area, here the first and second light rays emitted by the display module are reflected by the two see-through light guiding elements into the user's eyes, so the visual area is large.
  • When the augmented reality display system realizes the augmented reality function, each see-through light guiding element 13 further has a convex surface disposed opposite its concave surface; a third light ray containing external image information, transmitted through the convex and concave surfaces of the see-through light guiding element 13, enters the user's eyes to form a visual blend of the 3D virtual scene and the real scene.
  • One see-through light guiding element 13 has a convex surface 132 disposed opposite the concave surface 131, and the third light ray 151 containing external image information, transmitted through the convex surface 132 and the concave surface 131 of the see-through light guiding element 13, enters the user's left eye 14.
  • Similarly, the other see-through light guiding element also has a convex surface disposed opposite its concave surface, and the third light ray containing external image information transmitted through that convex surface and concave surface enters the user's right eye.
  • the user can see the real scene of the outside world, thereby forming a visual experience of mixing the 3D virtual scene and the real scene.
  • The surface of each see-through light guiding element disposed opposite its concave surface includes, but is not limited to, a convex shape.
  • When the system is instead to realize a virtual reality function, the third light ray containing external image information must be blocked from entering the user's eyes, that is, the user must not see the real scene of the outside world.
  • To this end, the surface of the see-through light guiding element 13 opposite the concave surface 131 may be plated or affixed with a light shielding layer 16, as shown in FIG. 3a.
  • Alternatively, a hood 171 for blocking the third light ray containing external image information from entering the user's eyes may be disposed on the head frame, so that only the first light ray containing left-eye virtual image information and the second light ray containing right-eye virtual image information, emitted by the display module, enter the user's eyes; the visual experience of a 3D virtual scene is then formed in the user's brain, realizing a virtual reality function.
  • the display module 12 includes a display screen.
  • As shown in FIG. 3c, the display screen may be a display screen 18 having a spherical surface whose radius of curvature is positive, that is, the light emitting surface 181 of the display screen 18 is convex;
  • as shown in FIG. 3d, the display screen may be a display screen 19 having a spherical surface whose radius of curvature is negative, that is, the light emitting surface 191 of the display screen 19 is concave.
  • the display screen may also be a display screen 20 having a cylindrical surface.
  • the radius of curvature of the cylinder surface of the display screen 20 is positive, that is, the light emitting surface 201 of the display screen 20 is convex.
  • The display screen may also be a display screen 21 having a cylindrical surface whose radius of curvature is negative, that is, the light emitting surface 211 of the display screen 21 is a concave cylindrical surface.
  • To form a good image, the concave surfaces of the two see-through light guiding elements need to balance the aberrations of the user's eyes and the aberrations introduced by tilting the see-through light guiding elements; on this basis, the concave surface of the see-through light guiding element is designed according to one of four special functions, explained below.
  • The surface sag value refers to the distance, in the Z-axis direction, between different regions of the surface of an optical element and the center point O of that surface.
  • Here, the optical element refers to the see-through light guiding element,
  • the surface of the optical element refers to the concave surface of the see-through light guiding element,
  • the sag of the concave surface of the see-through light guiding element is denoted sag(x, y), as shown in Figure 4,
  • and the coordinates of the projection of a point of the concave surface onto the XY coordinate plane are (x, y).
  • the concave surface of the see-through light guiding element is designed according to the following power series polynomial function:
  • c is the basic curvature of the concave and/or convex surface;
  • k is the basic conic coefficient of the concave and/or convex surface;
  • N is the number of polynomial terms;
  • a_i is the coefficient of the i-th polynomial term;
  • E_i(x, y) is the i-th standard power-series polynomial in the two variables (x, y).
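  • The equation itself did not survive extraction from the source page; the standard power-series (extended polynomial) sag consistent with the parameter definitions above (a reconstruction, not the patent's verbatim equation) is

$$\mathrm{sag}(x,y)=\frac{c\,(x^{2}+y^{2})}{1+\sqrt{1-(1+k)\,c^{2}\,(x^{2}+y^{2})}}+\sum_{i=1}^{N}a_{i}\,E_{i}(x,y).$$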
  • the concave surface of the see-through light guiding element is designed according to the following Chebyshev polynomial function:
  • N is the number of polynomial terms in the x direction;
  • M is the number of polynomial terms in the y direction;
  • a_ij is the coefficient of the (i, j)-th polynomial term;
  • x̄ and ȳ are the normalized coordinates obtained by rescaling the x and y coordinates to the interval [-1, 1]:
  • x̄ = x / max|x| and ȳ = y / max|y|, where max|x| is the maximum of the absolute values of x and max|y| is the maximum of the absolute values of y.
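  • The equation image is missing here as well; a standard Chebyshev-polynomial sag consistent with these definitions (a reconstruction, with the conic base term in c and k assumed by analogy with the other three functions) is

$$\mathrm{sag}(x,y)=\frac{c\,(x^{2}+y^{2})}{1+\sqrt{1-(1+k)\,c^{2}\,(x^{2}+y^{2})}}+\sum_{i=0}^{N}\sum_{j=0}^{M}a_{ij}\,T_{i}(\bar{x})\,T_{j}(\bar{y}),$$

where $T_{n}$ is the Chebyshev polynomial of the first kind of degree $n$.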
  • the concave surface of the see-through light guiding element is designed according to the following standard Zernike polynomial function:
  • c is the basic curvature of the concave and/or convex surface
  • k is the basic conic coefficient of the concave and/or convex surface
  • a_i is the coefficient of the i-th aspheric term;
  • N is the number of standard Zernike polynomials;
  • ρ and φ are the polar coordinates corresponding to the x and y coordinates, respectively;
  • the interval range of ρ is [0, 1], and the interval range of φ is [0, 2π];
  • Z_i(ρ, φ) is the i-th standard Zernike polynomial.
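  • Again the equation image did not survive extraction; a standard Zernike sag form consistent with the definitions above (a reconstruction, not the patent's verbatim equation) is

$$\mathrm{sag}(x,y)=\frac{c\,(x^{2}+y^{2})}{1+\sqrt{1-(1+k)\,c^{2}\,(x^{2}+y^{2})}}+\sum_{i=1}^{N}a_{i}\,Z_{i}(\rho,\varphi).$$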
  • the concave surface of the see-through light guiding element is designed according to the following Anamorphic function:
  • c_x is the basic curvature of the concave and/or convex surface in the x direction;
  • k_x is the basic conic coefficient of the concave and/or convex surface in the x direction;
  • c_y is the basic curvature of the concave and/or convex surface in the y direction;
  • k_y is the basic conic coefficient of the concave and/or convex surface in the y direction;
  • α_4, α_6, α_8, and α_10 are the 4th-, 6th-, 8th-, and 10th-order higher-order coefficients of axial symmetry;
  • β_4, β_6, β_8, and β_10 are the 4th-, 6th-, 8th-, and 10th-order higher-order coefficients of axial asymmetry.
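  • The equation image is likewise missing; the standard anamorphic asphere consistent with the parameter definitions above (a reconstruction, not the patent's verbatim equation) is

$$\mathrm{sag}(x,y)=\frac{c_{x}x^{2}+c_{y}y^{2}}{1+\sqrt{1-(1+k_{x})\,c_{x}^{2}x^{2}-(1+k_{y})\,c_{y}^{2}y^{2}}}+\sum_{n=2}^{5}\alpha_{2n}\left[(1-\beta_{2n})\,x^{2}+(1+\beta_{2n})\,y^{2}\right]^{n},$$

where the sum expands into the $\alpha_{4},\beta_{4}$ through $\alpha_{10},\beta_{10}$ terms defined above.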
  • the optical element refers to a see-through light guiding element
  • the surface of the optical element refers to a convex surface of the see-through light guiding element
  • the sag of the convex surface of the see-through light guiding element is Sag(x, y), as shown in Figure 4b;
  • the coordinates of the projection of a point of the convex surface onto the XY coordinate plane are (x, y). To ensure that the third light ray containing external image information enters the user's eyes correctly,
  • the convex surface of the see-through light guiding element is likewise designed according to any one of the power series polynomial function, the Chebyshev polynomial function, the standard Zernike polynomial function, or the Anamorphic function.
  • a diopter correcting lens 16 is disposed between the human eye and the see-through light guiding element 13, the diopter correcting lens 16 being disposed perpendicular to the horizontal plane.
  • the plane of the diopter correction lens may also be at an angle of 30 degrees to 90 degrees from the horizontal plane.
  • different degrees of diopter correcting lenses may be arbitrarily set.
  • The display module 12 emits a first light ray 121 containing left-eye virtual image information. The first light ray 121, reflected by the concave surface 131 of the see-through light guiding element 13, and the third light ray 151 containing external image information, transmitted through the convex surface 132 and concave surface 131 of the see-through light guiding element 13, both pass through the diopter correction lens 16 before entering the user's left eye 14.
  • The diopter correction lens 16 is a concave lens; the first light ray 121 and the third light ray 151 passing through it are diverged, so that the focal points of the first light ray 121 and the third light ray 151 on the left eye 14 are shifted backward.
  • the refractive correction lens 16 can also be a convex lens that converges the first light ray 121 and the third light ray 151 thereon to advance the focus of the first light ray 121 and the third light 151 on the left eye 14.
  • Similarly, the second light ray containing right-eye virtual image information, reflected by the concave surface of the other see-through light guiding element, and the third light ray containing external image information, transmitted through the convex and concave surfaces of that see-through light guiding element, also pass through a diopter correction lens before entering the user's right eye.
  • With the user's eyeball as the apex, the angle subtended diagonally by the virtual display area of the virtual image seen through the see-through light guiding element 13 is the diagonal field of view. The distance from the farthest end of the head frame to its contact position with the foremost end of the user's head is c, and the length c can be adjusted as needed.
  • The angular extent of the diagonal field of view is inversely related to the distance from the farthest end of the head frame 11 to its contact position with the foremost end of the head.
  • In this embodiment, the distance from the farthest end of the head frame to its contact position with the foremost end of the head is less than 80 mm while the diagonal field of view remains greater than 55 degrees.
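  • The inverse relationship follows from simple geometry. In an idealized pinhole model (an illustration, not the patent's exact optical layout), a virtual display region of fixed diagonal size $D$ viewed from an eye at distance $d$ subtends a diagonal field of view

$$\theta = 2\arctan\!\left(\frac{D}{2d}\right),$$

which shrinks as $d$ grows; keeping $\theta > 55^{\circ}$ therefore bounds how far the optics can sit from the eye, consistent with the stated 80 mm limit.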
  • The display modules 12 are connected to the main board 17 by cables.
  • the main board 17 is also provided with a video interface and a power interface.
  • the video interface is used to connect a computer, a mobile phone, or other device to receive a video signal.
  • The video interface may be an HDMI, DisplayPort, Thunderbolt, USB Type-C, Micro-USB, or MHL (Mobile High-Definition Link) interface, among others.
  • The processor is configured to decode the received video signal and display it on the display module 12.
  • the power interface includes a USB interface or other interfaces.
  • When the augmented reality display system includes only the head frame 11, the two display modules 12, the two see-through light guiding elements 13, and the main board 17 as described above, all 3D virtual scene rendering and generation of the images corresponding to both eyes are performed in an external device connected to the augmented reality display system.
  • the external device includes: a computer, a mobile phone, a tablet computer, and the like.
  • the augmented reality display system receives the video signal of the external device through the video interface, and displays it on the display module 12 after decoding.
  • Interaction with the user is handled by application software on the external device, such as a computer, mobile phone, or tablet; the user can interact with the augmented reality display system through the mouse, keyboard, touchpad, or buttons on the external device.
  • Examples of such basic structures include, but are not limited to, large screen portable displays.
  • For example, the augmented reality display system can project a display screen at a fixed location within the user's field of view; the user adjusts the size, position, and other properties of the projected screen through software on the device connected to the augmented reality display system.
  • The augmented reality display system reflects the first light ray containing left-eye virtual image information and the second light ray containing right-eye virtual image information into the user's eyes through the concave surfaces of the two see-through light guiding elements, thereby forming the visual experience of a 3D virtual scene in the user's brain, with a large visual area.
  • In Embodiment 2, a plurality of sensors are further disposed to sense the surrounding environment.
  • Embodiment 2 of the present application provides an augmented reality display system as follows.
  • the augmented reality display system has a total weight of less than 350 grams, and includes a head frame 21, two display modules 22, two see-through light guiding elements 23, and a main board 24.
  • the display module 22, the see-through light guiding element 23 and the main board 24 are all disposed on the head frame 21.
  • the head frame 21 fixes the display module 22, the see-through light guiding element 23 and the main board 24.
  • the display module 22 is disposed on the upper side of the see-through light guiding element 23, and the light emitted by the display module 22 can be reflected by the see-through light guiding element 23.
  • The main board 24 is located between the two display modules 22; a processor is provided on the main board 24 for processing the virtual image signal and displaying the virtual image information on the display module 22.
  • The specific functions, structures, and positional relationships of the head frame 21, the two display modules 22, the two see-through light guiding elements 23, and the main board 24 are the same as those of the head frame 11, the two display modules 12, the two see-through light guiding elements 13, and the main board 17 described in Embodiment 1, and are not repeated here.
  • a diopter correcting lens is disposed between the human eye and the see-through light guiding element 23, the diopter correcting lens being disposed perpendicular to the horizontal plane.
  • different degrees of diopter correcting lenses may be arbitrarily set.
  • The head frame 21 is further provided with a monocular camera 211, a binocular/multi-view camera 212, an eyeball tracking camera 213, a gyroscope 214, an accelerometer 215, a magnetometer 216, a depth of field sensor 217, an ambient light sensor 218, and/or a distance sensor 219.
  • The monocular camera 211, binocular/multi-view camera 212, eyeball tracking camera 213, gyroscope 214, accelerometer 215, magnetometer 216, depth of field sensor 217, ambient light sensor 218, and/or distance sensor 219 are all electrically connected to the main board 24.
  • the monocular camera 211 is a color monocular camera placed at the front of the head frame 21.
  • When the user wears the augmented reality display system, the monocular camera 211 faces away from the user's face; the camera can be used for photographing.
  • It can also be used for positioning: using computer vision techniques, the augmented reality display system detects known feature points in the environment through this camera to locate itself.
  • The monocular camera 211 can also be a high-resolution camera for taking photos or video; the captured video can further be composited in software with the virtual objects seen by the user, reproducing the content the user sees through the augmented reality display system.
  • The binocular/multi-view camera 212 may be a monochrome or color camera disposed on the front or side of the head frame 21, located on one side, both sides, or around the monocular camera 211. Further, the binocular/multi-view camera 212 may be fitted with an infrared filter. With a binocular camera, depth information can be obtained for the imaged environment in addition to the environment image; with a multi-view camera, the viewing angle can be expanded further, yielding more environment image and depth information.
  • The environment images and distance information captured by the binocular/multi-view camera 212 can be used to: (1) fuse with the data of the gyroscope 214, the accelerometer 215, and the magnetometer 216 to calculate the pose of the augmented reality display system; (2) capture user gestures, palm prints, and the like for human-computer interaction.
  • each of the above-mentioned monocular camera or binocular/multi-view camera may be one of an RGB camera, a monochrome camera or an infrared camera.
  • The eyeball tracking camera 213 is disposed at the side of the see-through light guiding element 23; when the user wears the augmented reality display system, the eyeball tracking camera 213 faces the user's face so that it can observe the eyes.
  • The eyeball tracking camera 213 is used to track the focus of the human eye, so that the specific part of a virtual object or virtual screen that the eye is gazing at can be tracked and given special treatment; for example, detailed information about an object can be displayed automatically next to the object the eye is gazing at.
  • In addition, a high-definition virtual object image can be rendered in the region the eye is gazing at while only a low-definition image is rendered elsewhere; this effectively reduces the image rendering workload without affecting the user experience.
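  • This strategy is commonly known as foveated rendering. The sketch below illustrates only the region selection it implies; the function name, the circular two-level high/low split, and the parameter values are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def foveation_mask(width, height, gaze_x, gaze_y, fovea_radius):
    """Boolean mask that is True where full-resolution rendering is needed.

    Pixels within `fovea_radius` of the tracked gaze point are rendered in
    high definition; everything else can be rendered at low definition and
    upscaled, reducing the image rendering workload.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    dist_sq = (xs - gaze_x) ** 2 + (ys - gaze_y) ** 2
    return dist_sq <= fovea_radius ** 2

# Example: a 1280x720 eye buffer with the gaze slightly right of center.
mask = foveation_mask(1280, 720, gaze_x=700, gaze_y=380, fovea_radius=200)
print(f"high-resolution pixels: {100.0 * mask.mean():.1f}%")
```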
  • the gyroscope 214, the accelerometer 215, and the magnetometer 216 are disposed between the two display modules 22.
  • the relative pose between the user's head and the initial position of the system can be obtained by fusing the data of the gyroscope 214, the accelerometer 215, and the magnetometer 216.
  • the raw data of these sensors can be further fused with the data of the binocular/multi-view camera 212 to obtain the position and attitude of the augmented reality display system in a fixed environment.
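  • The patent does not spell out the fusion algorithm; a minimal sketch of one common approach (a complementary filter, here reduced to pitch and roll from the gyroscope and accelerometer, with the magnetometer/yaw branch omitted and axis conventions assumed) is:

```python
import math

def complementary_filter(pitch, roll, gyro_pitch_rate, gyro_roll_rate,
                         accel, dt, alpha=0.98):
    """One fusion step, angles in radians.

    The gyroscope branch (integration of angular rates) is accurate over
    short intervals but drifts; the accelerometer branch (direction of
    gravity) is noisy but stable long-term. Blending the two gives a
    drift-free, low-noise orientation estimate.
    """
    ax, ay, az = accel
    # Gyroscope branch: integrate angular rates over the time step.
    pitch_g = pitch + gyro_pitch_rate * dt
    roll_g = roll + gyro_roll_rate * dt
    # Accelerometer branch: tilt from the measured gravity direction.
    pitch_a = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_a = math.atan2(ay, az)
    # Complementary blend: high-pass the gyro, low-pass the accelerometer.
    return (alpha * pitch_g + (1.0 - alpha) * pitch_a,
            alpha * roll_g + (1.0 - alpha) * roll_a)

# Example: one 200 Hz update with the headset nearly level.
pitch, roll = complementary_filter(0.0, 0.0, 0.01, -0.02,
                                   accel=(0.0, 0.3, 9.7), dt=0.005)
print(f"pitch={pitch:.4f} rad, roll={roll:.4f} rad")
```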
  • The depth of field sensor 217 is disposed at the front of the head frame 21 and can directly obtain depth information of the environment. Compared with the binocular/multi-view camera 212, the depth of field sensor obtains more accurate, higher-resolution depth data. Similarly, these data can be used to: (1) fuse with the data of the gyroscope 214, the accelerometer 215, and the magnetometer 216 to calculate the pose of the augmented reality display system; (2) capture user gestures, palm prints, and the like for human-computer interaction; (3) detect three-dimensional information about objects around the user.
  • the ambient light sensor 218 is disposed on the head frame 21, and can monitor the intensity of ambient light in real time.
  • the augmented reality display system adjusts the brightness of the display module 22 in real time according to the change of the ambient light to ensure the consistency of the display effect under different ambient light.
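  • A minimal sketch of such a control loop, assuming a hypothetical lux reading from the ambient light sensor and a logarithmic mapping with smoothing (neither is specified in the patent):

```python
import math

def target_brightness(ambient_lux, min_level=0.05, max_level=1.0):
    """Map ambient illuminance (lux) to a display brightness level.

    A logarithmic mapping roughly matches the eye's response:
    about 1 lux (dark room) -> minimum, 10000 lux (daylight) -> maximum.
    """
    lux = max(ambient_lux, 1.0)
    t = min(math.log10(lux) / 4.0, 1.0)  # 0 at 1 lux, 1 at 10^4 lux
    return min_level + t * (max_level - min_level)

def smooth(current, target, rate=0.1):
    """Step a fraction of the way toward the target each update so that
    changes in ambient light do not cause visible jumps in brightness."""
    return current + rate * (target - current)

level = 0.5
for lux in (300, 320, 5000, 4800):  # simulated ambient light samples
    level = smooth(level, target_brightness(lux))
    print(f"{lux:5d} lux -> display brightness {level:.2f}")
```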
  • the distance sensor 219 is disposed at a position where the augmented reality display system is in contact with the user's face for detecting whether the augmented reality display system is worn on the user's head. If the user removes the augmented reality display system, the power can be saved by turning off the display module 22, the processor, and the like.
  • The augmented reality display system further comprises an infrared/near-infrared LED electrically connected to the main board 24; the infrared/near-infrared LED provides a light source for the binocular/multi-view camera 212.
  • Specifically, the infrared/near-infrared LED emits infrared light; when the infrared light reaches an object within view of the binocular/multi-view camera 212, the object reflects it, the photosensitive element on the binocular/multi-view camera 212 receives the reflected infrared light and converts it into an electrical signal, and imaging processing follows.
  • the operations that the augmented reality display system can perform when performing human-computer interaction include the following:
  • the augmented reality display system can project the display screen at a fixed position within the user's field of view.
  • the user can adjust the size, position, and the like of the projection screen through sensors on the augmented reality display system.
  • Buttons, joysticks, touchpads, and the like on a remote control, connected to the augmented reality display system by wired or wireless means, can serve as a human-computer interaction interface.
  • An audio decoding and power amplifier chip can be added to the main board, integrating an earphone jack, earbuds, or a speaker together with a microphone, allowing the user to interact with the augmented reality display system by voice.
  • a video interface and a processor are provided on the motherboard.
  • When the augmented reality display system includes only the head frame 21, the two display modules 22, the two see-through light guiding elements 23, the main board 24, and the plurality of sensors described above, all 3D virtual scene rendering, generation of the images corresponding to both eyes, and processing of the data acquired by the plurality of sensors can be performed in an external device connected to the augmented reality display system.
  • the external device includes: a computer, a mobile phone, a tablet computer, and the like.
  • The augmented reality display system receives the video signal of the external device through the video interface and, after decoding, displays it on the display module 22.
  • The external device receives the data acquired by the plurality of sensors on the augmented reality display system and processes them; the images displayed for the two eyes are adjusted according to these data, and the adjustments are reflected in the images displayed on the display module 22.
  • the processor on the augmented reality display system is only used to support the transmission and display of video signals and the transmission of sensor data.
  • Alternatively, a processor with stronger computing power is disposed on the main board, and some or all of the computer vision algorithms are executed in the augmented reality display system itself.
  • The augmented reality display system receives the video signal of the external device through the video interface and, after decoding, displays it on the display module 22.
  • The external device receives the data acquired by some of the sensors on the augmented reality display system and processes them; the images displayed for the two eyes are adjusted according to these sensor data, and the adjustments are reflected in the images displayed on the display module 22.
  • The data acquired by the remaining sensors are processed on the augmented reality display system itself. For example, data acquired by the monocular camera 211, the binocular/multi-view camera 212, the gyroscope 214, the accelerometer 215, the magnetometer 216, and the depth of field sensor 217 are processed in the augmented reality display system,
  • while the data acquired by the eyeball tracking camera 213, the ambient light sensor 218, and the distance sensor 219 are processed in the external device.
  • the processor on the augmented reality display system is used to support the transmission and display of video signals, the processing of part of the sensor data, and the transfer of the remaining sensor data.
  • the motherboard is equipped with a high-performance processor and an image processor to perform all operations in the augmented reality display system.
  • The augmented reality display system then operates as a stand-alone system without needing to connect to an external device.
  • The augmented reality display system processes the data acquired by the sensors, adjusts the images displayed for the two eyes accordingly, and displays them on the display module 22 after rendering.
  • the processor on the augmented reality display system is used for decoding processing and display of video signals and processing of sensor data.
  • the concave surface of the see-through light guiding element is plated with a reflective film.
  • The concave surface of the see-through light guiding element coated with the reflective film has a reflectance of 20% to 80%.
  • Alternatively, the concave surface of the see-through light guiding element is plated with a polarizing reflective film; the angle between the polarization direction of the polarizing reflective film and the polarization directions of the first and second light rays is greater than 70° and less than or equal to 90°.
  • When the polarization direction of the polarizing reflective film is perpendicular to the polarization directions of the first and second light rays, nearly 100% reflection is achieved.
  • Since the third light ray containing external image information is unpolarized, part of it can still pass through the concave surface plated with the polarizing reflective film; in addition, the convex surface of the see-through light guiding element is coated with an anti-reflection film.
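  • The quoted angle range is consistent with an idealized Malus's-law model (an illustration, not stated in the patent): if $\theta$ is the angle between the polarization direction of the polarizing reflective film and that of the first and second light rays, the reflected fraction of the display light is approximately

$$R(\theta)\approx R_{\max}\sin^{2}\theta,$$

giving $R(70^{\circ})\approx 0.88\,R_{\max}$ and $R(90^{\circ})=R_{\max}$, i.e. nearly total reflection when the directions are perpendicular.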
  • Alternatively, the concave surface of the see-through light guiding element is provided with a pressure sensitive reflective film; by changing the voltage applied to the pressure sensitive reflective film, its reflectance can be adjusted between 0 and 100%. When the reflectance of the pressure sensitive reflective film is 100%, the augmented reality display system can realize a virtual reality function.
  • To controllably adjust the transmittance of the third light ray containing external image information, a pressure sensitive black sheet may be disposed on the surface of the see-through light guiding element opposite the concave surface; by changing the voltage applied to the pressure sensitive black sheet, its light transmittance can be adjusted.
  • As shown in Figure 12, the display module 12 is disposed at an angle of between 5° and 70° with respect to the horizontal direction. Within the first light ray, the angle between the reflected ray 521 entering the upper edge of the field of view of the user's left eye 14 and its incident ray 522 is less than 90°, and the angle between the reflected ray 531 entering the lower edge of the field of view of the user's left eye 14 and its incident ray 532 is greater than 35°; hence, for the first light ray entering the user's left eye 14, the angle between the reflected ray and the incident ray anywhere between the upper and lower edges of the field of view lies between 35° and 90°. It should be noted that those skilled in the art can adjust these angles, by adjusting the placement angle of the display module 12 with respect to the horizontal direction and the placement angle of the see-through light guiding element 13, according to the needs of the actual application, so as to achieve the best effect, improve the effective utilization of the left-eye and right-eye virtual image information, and improve the user experience.
  • The augmented reality display system reflects the first light ray containing left-eye virtual image information and the second light ray containing right-eye virtual image information into the user's eyes through the concave surfaces of the two see-through light guiding elements, thereby forming the visual experience of a 3D virtual scene in the user's brain, with a large visual area.
  • In addition, a plurality of sensors are arranged on the augmented reality display system; after the sensors sense the surrounding environment, the sensing results can be reflected in the images shown on the display module, giving a stronger sense of presence and a better user experience.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

The invention relates to an augmented reality display system, comprising: a head-mounted frame (11), a display module (12), two see-through light guiding elements (13), and a main board (17). The display module (12), the two see-through light guiding elements (13), and the main board (17) are disposed on the head-mounted frame (11). The beneficial effects of the embodiments include: a greater amount of a first light ray containing left-eye virtual image information and of a second light ray containing right-eye virtual image information can be reflected respectively into the user's eyes by the concave surfaces of the two see-through light guiding elements (13), thereby forming the visual experience of a 3D virtual scene with a large visual area in the user's brain. In addition, a third light ray containing external image information enters the user's eyes by transmission through the convex and concave surfaces of the see-through light guiding elements (13), so that the user can see the real scene of the outside world, thereby forming a visual experience combining the 3D virtual scene with the real scene.
PCT/CN2017/100933 2016-09-07 2017-09-07 Augmented reality display system Ceased WO2018045985A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201610812210.X 2016-09-07
CN201610812210.XA CN107797278A (zh) 2016-09-07 2016-09-07 Head-mounted display
CN201710079190.4A CN108427193A (zh) 2017-02-14 2017-02-14 Augmented reality display system
CN201710079190.4 2017-02-14

Publications (1)

Publication Number Publication Date
WO2018045985A1 true WO2018045985A1 (fr) 2018-03-15

Family

ID=61561720

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/100933 Ceased WO2018045985A1 (fr) Augmented reality display system

Country Status (1)

Country Link
WO (1) WO2018045985A1 (fr)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8873148B1 (en) * 2011-12-12 2014-10-28 Google Inc. Eyepiece having total internal reflection based light folding
CN102937745A (zh) * 2012-11-13 2013-02-20 京东方科技集团股份有限公司 Open-type head-mounted display device and display method thereof
CN203658670U (zh) * 2013-10-23 2014-06-18 卫荣杰 Head-mounted see-through display device
US9366869B2 (en) * 2014-11-10 2016-06-14 Google Inc. Thin curved eyepiece for see-through head wearable display
CN204595328U (zh) * 2014-12-26 2015-08-26 成都理想境界科技有限公司 Head-mounted display device
CN206497255U (zh) * 2017-02-14 2017-09-15 毛颖 Augmented reality display system

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109298533A (zh) * 2018-12-07 2019-02-01 北京七鑫易维信息技术有限公司 Head-mounted display device
CN109298533B (zh) * 2018-12-07 2023-12-26 北京七鑫易维信息技术有限公司 Head-mounted display device
US11412310B2 (en) * 2020-05-18 2022-08-09 Qualcomm Incorporated Performing and evaluating split rendering over 5G networks
CN111522141A (zh) * 2020-06-08 2020-08-11 歌尔光学科技有限公司 Head-mounted device
CN115396656A (zh) * 2022-08-29 2022-11-25 歌尔科技有限公司 Augmented reality method, system, device and medium based on an AR SDK

Similar Documents

Publication Publication Date Title
CN206497255U (zh) Augmented reality display system
US11385467B1 2022-07-12 Distributed artificial reality system with a removable display
CN111602082B (zh) Position tracking system for head-mounted displays including a sensor integrated circuit
JP6083880B2 (ja) Wearable device having an input/output mechanism
CN108427193A (zh) Augmented reality display system
KR20210004776A (ko) Apparatus and method for displaying augmented reality
CN108421252B (zh) Game implementation method based on an AR device, and AR device
KR20150116814A (ko) Eye tracking wearable devices and methods for use
WO2019001575A1 (fr) Wearable display device
US20160097929A1 (en) See-through display optic structure
CN116209942A (zh) Goggles with strain gauge calculation
US20180335635A1 (en) Head mounted display device, control method for head mounted display device, and computer program
WO2018149267A1 (fr) Augmented reality-based display method and device
WO2018045985A1 (fr) Augmented reality display system
WO2016169339A1 (fr) Image-enhancing eyeglasses structure
TWM512138U (zh) Vertical-projection near-eye display module
CN107111143B (zh) Vision system and film viewer
CN118591753A (zh) Display system with gratings oriented to reduce the appearance of ghost images
US20250231409A1 Folded Optics for a Head Mounted Display
CN206638889U (zh) Head-mounted display
CN116009253A (zh) Optical device
WO2018149266A1 (fr) Augmented reality-based information processing method and device
CN204439936U (zh) Video glasses
TW201805689A (zh) Add-on near-eye display device
KR20240116909A (ko) Eyewear including a non-uniform push-pull lens set

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17848158

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 06/08/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17848158

Country of ref document: EP

Kind code of ref document: A1