WO2024202682A1 - Display device and display method
- Publication number
- WO2024202682A1 (PCT/JP2024/005653)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display device
- camera
- optical
- feature points
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C7/00—Optical parts
-
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C7/00—Optical parts
- G02C7/02—Lenses; Lens systems ; Methods of designing lenses
- G02C7/04—Contact lenses for the eyes
Definitions
- the technology disclosed herein (hereinafter also referred to as "the technology") relates to a display device and a display method.
- Display devices that display images by irradiating light (more specifically, image light) onto the observer's eyeball are known (see, for example, Patent Document 1).
- the display device described in Patent Document 1 detects the relative positions of the scanning unit (light projection system) and the deflection unit (ocular optical system) from two-dimensional position information of each of two feature points on the contact lens, and controls the scanning unit based on the detection results.
- the main objective of this technology is to provide a display device that can accurately detect the relative position between the light projection system and the eyepiece optical system.
- the present technology provides a display device including a light projection system including a light source, and an eyepiece optical system that guides the light projected from the light projection system to an eyeball, the eyepiece optical system including an optical element and three or more feature points having fixed relative positions with respect to the optical element.
- the display device further includes an acquisition unit that acquires at least two-dimensional position information of at least three of the three or more feature points.
- the at least three feature points may not be collinear.
- the optical element may be provided on a contact lens that is fitted to the eye.
- the at least three feature points may be provided on the contact lens.
- the acquisition unit may be capable of acquiring at least the two-dimensional position information of each of the at least three feature points even if a direction of the eyeball changes.
- the optical element may be provided on a lens attached to an eyeglass frame.
- the at least three feature points may all be provided on the lens or the eyeglass frame, or some of the at least three feature points may be provided on the lens and others may be provided on the eyeglass frame.
- the at least three feature points and the optical center of the optical element may be on the same plane.
- the acquisition unit may include at least one camera.
- the at least one camera may include a visible light camera and/or a Time-of-Flight camera.
- the at least one camera may include a plurality of cameras constituting a stereo camera.
- the optical projection system may include the acquisition unit.
- the acquisition unit may include an infrared light source, the camera may be sensitive to infrared wavelengths, and the at least three feature points may be made of a retroreflective material.
- the optical projection system may include a projection optical system disposed on an optical path of light from the light source, and an optical axis of the projection optical system may be coaxial with an optical axis of the camera.
- the optical projection system may include a projection optical system arranged on an optical path of light from the light source, and a control unit that controls the light source and/or the projection optical system based on the acquisition result of the acquisition unit.
- the projection optical system includes a movable deflection element and a focus lens
- the acquisition unit acquires two-dimensional position information of each of the at least three feature points in a plane perpendicular to the optical axis direction of the camera, and distance information between the camera and each of the at least three feature points in the optical axis direction of the camera
- the control unit may simultaneously control the movable deflection element based on the two-dimensional position information, control the light source based on at least the two-dimensional position information and the distance information, and/or control the focus lens based on the distance information.
- at least two of the at least three feature points may be integrally provided.
- the optical projection system and the eyepiece optical system may be separate entities.
- the present technology also provides a display method including a step of acquiring at least two-dimensional position information of each of at least three feature points among three or more feature points whose relative positions are fixed with respect to an optical element that guides light projected from a light projection system to an eyeball, and a step of controlling a part of the light projection system based on the acquisition results of the acquiring step.
- the optical projection system may include a light source, a movable deflection element, and a focus lens
- the step of acquiring the at least two-dimensional position information may include a step of acquiring two-dimensional position information of each of the at least three feature points in a plane perpendicular to the optical axis direction of the camera, and a step of acquiring distance information between the camera and each of the at least three feature points in the optical axis direction of the camera
- the step of controlling may include a step of controlling the movable deflection element based on at least the two-dimensional position information, and a step of controlling the light source based on the two-dimensional position information and the distance information and/or a step of controlling the focus lens based on the distance information.
- FIG. 1 is a diagram for explaining the principle of the present technology.
- FIG. 2 is a diagram for explaining a method of calculating a normal vector.
- FIG. 3 is a diagram showing a display device according to a first embodiment of the present technology.
- FIGS. 4A to 4C are diagrams for explaining a first configuration example of the eyepiece optical system of the display device of FIG. 3.
- FIGS. 5A to 5C are diagrams for explaining a second configuration example of the eyepiece optical system of the display device of FIG. 3.
- FIGS. 6A to 6C are diagrams for explaining a third configuration example of the eyepiece optical system of the display device of FIG. 3.
- FIGS. 7A to 7E are diagrams for explaining a fourth configuration example of the eyepiece optical system of the display device of FIG. 3.
- FIGS. 8A to 8E are diagrams for explaining a fifth configuration example of the eyepiece optical system of the display device of FIG. 3.
- FIG. 9 is a diagram for explaining the incidence angle dependency of the diffraction efficiency in a HOE.
- FIG. 10 is a block diagram showing an example of a basic configuration for controlling the display device shown in FIG. 3.
- FIG. 11 is a flowchart showing the flow of overall control in the basic configuration example of FIG. 10.
- FIG. 12 is a block diagram showing a first example of a control configuration for the display device shown in FIG. 3.
- FIG. 13 is a flowchart showing the flow of overall control in the configuration example 1 of FIG. 12.
- FIG. 14 is a flowchart showing a movable mirror control process in the configuration example 1 of FIG. 12.
- FIG. 15 is a flowchart showing a focus lens control process in the configuration example 1 of FIG. 12.
- FIG. 16 is a flowchart showing a projection image control process in the configuration example 1 of FIG. 12.
- FIG. 17 is a block diagram showing a second example of a control configuration for the display device shown in FIG. 3.
- FIG. 18 is a flowchart showing the flow of overall control in the configuration example 2 of FIG. 17.
- FIG. 19 is a diagram showing a display device according to a second embodiment of the present technology.
- FIG. 20 is a flowchart showing a flow of control of a display device according to the second embodiment of the present technology.
- FIG. 21 is a diagram showing a display device according to a third embodiment of the present technology.
- FIG. 22 is a diagram showing a display device according to a fourth embodiment of the present technology.
- FIG. 23 is a diagram showing a display device according to a fifth embodiment of the present technology.
- FIG. 24 is a diagram for explaining the principle of a stereo camera.
- FIG. 25 is a diagram showing a display device according to a sixth embodiment of the present technology.
- FIG. 26 is a diagram for explaining retroreflection.
- FIG. 27 is a block diagram showing an example of the configuration of a TOF camera.
- the present technology relates to a retinal direct imaging device (display device) that projects light (specifically, image light) from an optical projection system and directs the projected light to the pupil of an observer via an eyepiece optical system worn by the observer to display an image.
- in a retinal direct imaging device in which the positional relationship (relative position) between the optical projection system and the eyepiece optical system is not fixed, it is necessary to recognize the state of the observer from the optical projection system side.
- Such a retinal direct imaging device has a feature that the relative position between the eyepiece optical system and the optical projection system changes in real time depending on the movement of the observer, so it is mainly desired to perform the following recognition and correction.
- optical elements are positioned so that their optical centers overlap with the observer's pupil, so it is desirable to recognize the position of the optical center of the optical element (e.g., two-dimensional position information) from the light projection system.
- the diffraction efficiency of a diffraction element depends on the angle of incidence, and when the angle of incidence of the projection light on the HOE changes, the diffraction efficiency changes independently for each of R, G, and B. In order to correctly reproduce the image and brightness intended for the observer, it is desirable to correct the brightness of the original image depending on the angle of incidence.
- the relative position between the scanning unit (light projection system) and the deflection unit (eyepiece optical system) is detected from two-dimensional position information of each of two feature points provided in the deflection unit (eyepiece optical system).
- the inventors developed a display device according to this technology that can accurately detect the relative position between the light projection system and the eyepiece optical system (for example, the relative position of the eyepiece optical system as viewed from the light projection system).
- in a display device in which the optical element of the eyepiece optical system is mounted on a contact lens worn on the observer's eyeball, the contact lens and the optical element move in accordance with the movement of the pupil, so that if the position of the pupil is recognized, the position of the optical element can also be recognized, and if the direction of the pupil (direction of gaze) is recognized, the angle (posture) of the optical element can also be recognized.
- This makes it possible to detect the relative position between the light projection system and the eyepiece optical system with greater precision.
- the display device described in Patent Document 1 is thought to be configured so that the scanning unit (light projection system) is placed on the observer's head, and it would be extremely difficult for it to handle a configuration requiring high detection accuracy, in which the scanning unit (light projection system) and the deflection unit (ocular optical system) are arranged separately and spatially separated by a certain long distance, so that their positional relationship changes dynamically.
- the display device can accurately detect changes in the relative position between a spatially separated (separate) optical projection system and an eyepiece optical system, and is therefore fully capable of handling configurations in which the positional relationship between such an optical projection system and an eyepiece optical system changes dynamically.
- the eyepiece optical system EOS has three or more (for example, three) feature points FP whose relative positions with respect to the optical element OE are fixed.
- three feature points FP are provided on the spectacle lens GL on which the optical element OE is provided.
- At least one of the three feature points FP may be provided, for example, on the spectacle frame GF.
- These three feature points FP are recognized by a camera C mounted on a light projection system LPS that is spatially separated from the eyepiece optical system EOS.
- the three-dimensional coordinates (x, y, z) of the optical center OC of the optical element OE as seen from the light projection system LPS and the angle (θx, θy) of the optical element OE as seen from the light projection system LPS can be detected in real time.
- θx indicates the rotation angle (pitch) around the x-axis extending to the left and right of the observer.
- θy indicates the rotation angle (yaw) around the y-axis extending above and below the observer.
- the three-dimensional coordinates (x, y, z) of each of the three feature points FP are obtained. If the positional relationship between the three feature points FP and the optical center OC of the optical element OE is known in advance, the three-dimensional coordinates (x, y, z) that are the relative position of the optical center OC of the optical element OE as seen from the light projection system LPS can be calculated by interpolation or extrapolation of the three points.
- the three-dimensional coordinates (x, y, z) of each feature point FP are obtained using a camera C.
- the main acquisition method is the stereo camera method, but a configuration using only a TOF camera or a configuration using a visible light camera and a TOF camera in combination is also possible.
- the stereo camera method also includes a method using an infrared camera, an infrared light source, and a retroreflective material.
- (Calculation of normal vector) FIG. 2 is a diagram for explaining a method of calculating a normal vector. From the three-dimensional coordinates (x, y, z) of each of the three feature points FP obtained as described above, a normal vector of the plane including these three feature points FP is calculated using a vector cross product. This normal vector corresponds to the orientation of the optical element OE, and the angle (θx, θy) of the optical element OE as viewed from the optical projection system LPS can be calculated by taking the inner product with the axis vectors of the reference coordinate system set in the optical projection system LPS.
- the angles θx, θy, and θz between the x-axis, y-axis, and z-axis and the normal vector N can be calculated using the dot product of vectors as shown in the following equations (2) to (7), where the direction vectors of the axes are Ex, Ey, and Ez:
- cosθx = (N · Ex) / (|N| |Ex|) ...(2)
- cosθy = (N · Ey) / (|N| |Ey|) ...(3)
- cosθz = (N · Ez) / (|N| |Ez|) ...(4)
- θx = arccos(cosθx) ...(5)
- θy = arccos(cosθy) ...(6)
- θz = arccos(cosθz) ...(7)
- the three-dimensional coordinates of at least three feature points FP are required. If there are four or more feature points FP, at least three of them can be selected to perform the calculations.
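- As an illustration of the calculation described above, a minimal Python sketch follows: it computes the plane normal from three feature-point coordinates by a cross product, the tilt angles per equations (2) to (7), and the optical center OC as an affine combination of the three points. The function name and the default weights are assumptions for illustration, not values from the patent.

```python
import numpy as np

def tilt_angles_and_center(p1, p2, p3, center_weights=(1/3, 1/3, 1/3)):
    """Illustrative sketch: plane normal from three feature points FP and the
    tilt angles (theta_x, theta_y, theta_z) per equations (2) to (7).

    p1, p2, p3     : (x, y, z) coordinates of the feature points as seen from the
                     light projection system (camera) coordinate frame.
    center_weights : assumed, pre-calibrated weights expressing the optical center
                     OC as an affine combination of the three feature points.
    """
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))

    # Normal vector of the plane containing the three feature points (cross product).
    n = np.cross(p2 - p1, p3 - p1)

    # Direction vectors Ex, Ey, Ez of the reference coordinate system.
    ex, ey, ez = np.eye(3)

    def angle_deg(axis):
        # Equations (2)-(7): cos(theta) from the dot product, then arccos.
        return np.degrees(np.arccos(np.dot(n, axis) / (np.linalg.norm(n) * np.linalg.norm(axis))))

    angles = (angle_deg(ex), angle_deg(ey), angle_deg(ez))

    # Optical center OC by interpolation of the three points (relationship known in advance).
    w = np.asarray(center_weights)
    oc = w[0] * p1 + w[1] * p2 + w[2] * p3
    return angles, oc
```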
- FIG. 3 is a diagram showing a display device 1 according to a first embodiment of the present technology.
- the display device 1 is a retinal direct imaging type display device that directly images an image on the retina of a user who is an observer by light.
- the display device 1 includes a light projection system 10 and an eyepiece optical system 20 that guides light projected from the light projection system 10 to an eyeball EB.
- the light projection system 10 further includes an acquisition unit 300 including at least one camera 300a and a control unit 400 (see FIG. 10, for example). Note that the acquisition unit 300 does not have to be included in the light projection system 10. Specifically, the acquisition unit 300 may be provided separately from the light projection system 10 (spatially separated). Even in this case, it is desirable that the positional relationship between the light projection system 10 and the acquisition unit 300 remains unchanged.
- the optical projection system 10 and the eyepiece optical system 20 are provided separately (spatially separated).
- the optical projection system 10 may be placed, for example, on a desk, a table, etc., or hung on a wall, or may be attached to a portable item such as a user's wristwatch or smartphone, or may be attached to a moving object ridden by the user, such as a car, motorcycle, or bicycle.
- the eyepiece optical system 20 is provided at or near the observer's eyeball EB.
- Image data is input to the control unit 400.
- the control unit 400 generates modulation signals for each color of RGB according to the input image data, and outputs the modulation signals to a light source driving unit 100d (described later) of the light projection system 10 (see, for example, FIG. 12).
- the optical projection system 10 has two cameras 300a, but it may have only one camera 300a, or it may have three or more cameras 300a. (The same applies to other figures in which multiple cameras 300a are shown.)
- the optical projection system 10 includes, for example, an image light generation unit 100 and a projection optical system 200 that projects the image light IL generated by the image light generation unit 100.
- the image light generation unit 100, the projection optical system 200, and at least one camera 300a are, for example, integrally provided in a housing H.
- the image light generating unit 100 includes, as an example, a light source 100a, a scanning mirror 100b, and a light source driving unit 100d (see FIG. 12).
- the light source 100a has a laser, such as an edge-emitting laser (LD) or a surface-emitting laser (VCSEL).
- the light source 100a has, for example, a red laser, a green laser, and a blue laser.
- a light-emitting diode (LED) may be used as the light source 100a.
- the light source 100a may have only a laser or LED of a single color. This allows the configuration of the light source 100a to be simplified.
- the light source driving unit 100d generates a driving signal based on the modulation signal for each color of RGB input from the control unit 400, and applies the driving signal to the corresponding laser of the light source 100a to drive the laser.
- the light source 100a and the light source driving unit 100d are collectively referred to as the "light source system.”
- the light source driving unit 100d receives luminance information of the projected image from the control unit 400.
- Scanning mirror 100b deflects and scans the light emitted from light source 100a.
- a MEMS mirror that can be driven around two axes is used as scanning mirror 100b. This allows for a reduction in the number of parts and a smaller size.
- when a mirror that can be driven around two axes is used as scanning mirror 100b, distortion may occur in the projected image due to the driving of the mirror.
- alternatively, two MEMS mirrors that can each be driven around one axis may be used in combination.
- the projection optical system 200 has, as an example, a focus lens 200c as a projection lens and a movable mirror 200a as a movable deflection element.
- the projection optical system 200 is controlled by the control unit 400.
- a projection window 200b is provided behind the movable mirror 200a.
- the focus lens 200c is disposed on the optical path of the image light IL via the scanning mirror 100b.
- the focus lens 200c is provided so as to be movable in the optical axis direction, and is driven in the optical axis direction by a focus lens drive unit.
- the focus lens drive unit is controlled by the control unit 400. If the observer moves in the z-axis direction (a direction parallel to the projection optical axis), the image will not be in focus. Therefore, it is preferable to adjust the position of the focus lens 200c in the optical axis direction (focus adjustment) so that the focus of the image light IL coincides with the observer's retina according to the position of the optical element OE in the z-axis direction.
- the movable mirror 200a reflects the image light IL through the focus lens 200c and guides it to the projection window 200b.
- the image light IL reflected by the movable mirror 200a and passing through the projection window 200b is projected to the eyepiece optical system 20.
- the movable mirror 200a is arranged to be swingable around two axes, for example, and is driven by a movable mirror drive unit.
- as the movable mirror drive unit, for example, a combination of two galvanometer mirrors that can each swing around one axis, or a gimbal mirror that can swing around two axes, is used.
- the movable mirror drive unit is controlled by the control unit 400.
- the projection direction of the image light IL can be changed two-dimensionally.
- the light projection system 10 and the eyepiece optical system 20 are spatially separated, and the relative positional relationship changes dynamically in real time depending on the orientation of the observer's eyeball, head, and body. If the projection optical axis of the light projection system 10 (more specifically, the optical axis of the projection optical system 200, hereinafter also simply referred to as the "projection optical axis") does not pass through the optical element OE, the observer will not be able to see the image.
- it is preferable to control the angle (posture) of the movable mirror 200a so that the projection optical axis passes through the optical element OE, based on the two-dimensional coordinates (x, y) of the optical element OE, i.e., based on two-dimensional position information in a plane perpendicular to the projection optical axis.
- This allows the observer to continue viewing the image at all times.
- a non-mechanical deflection element (one without mechanically moving parts) can also be used as the movable deflection element.
- the eyepiece optical system 20 includes an optical element OE and three or more feature points FP.
- the eyepiece optical system 20 is located within the imaging field of view (within the angle of view) of the camera 300a when the display device 1 is in use (when the image light IL is projected from the optical projection system 10 toward the eyeball EB).
- the optical element OE is positioned so that its focal point is located near the observer's pupil.
- the image light IL projected from the light projection system 10 is focused by the optical element OE and passes through the observer's pupil, thereby being imaged directly on the retina. This allows the observer to recognize the image.
- the optical element OE is provided in a contact lens CL that is attached to the observer's eyeball EB.
- the contact lens CL may or may not be a component of the eyepiece optical system 20.
- the contact lens CL has the advantage of self-alignment, in that the center of the contact lens CL moves to coincide with the center of the pupil of the eyeball EB. In other words, the orientation of the contact lens CL coincides with the direction of the line of sight, which is the direction of the eyeball EB.
- if the optical element OE is provided in glasses, there is a possibility that a positional deviation occurs between the glasses and the pupil.
- the optical element OE is, for example, a holographic optical element (HOE), which diffracts and focuses the image light IL projected from the optical projection system 10 and guides it to the retina of the eyeball EB. If the angle (posture) of the HOE with respect to the projection optical axis changes, the color of the image seen by the observer changes. This is because the diffraction efficiency of the HOE depends on the angle of incidence.
- the HOE has a characteristic that the diffraction efficiency varies depending on the angle of incidence of light. Specifically, the diffraction efficiency of the HOE has an incidence angle dependency such that light other than the image light IL projected from the light projection system 10 is transmitted without being diffracted (see FIG. 9). This allows the observer to see the background image and the projected image superimposed, as in AR. On the other hand, it is physically impossible for the diffraction efficiency of the HOE to be 1 (100%) only in a certain incidence angle range (see the graph on the left side of FIG. 9), and the diffraction efficiency has a characteristic that it always has a peak incidence angle and then drops with an angle width (see the right diagram of FIG. 9).
- this characteristic is independent for each of R, G, and B, and the peak angles and half-widths do not necessarily match between the colors.
- therefore, it is necessary to recognize the angle (θx, θy) of the HOE with respect to the projection optical axis and generate the projected image taking into account the diffraction efficiency at that angle.
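- As a sketch of such a correction, the following Python example scales the RGB drive levels by per-channel diffraction-efficiency curves evaluated at the current incidence angle. The Gaussian efficiency model and its parameters are assumptions for illustration; a real device would use efficiency data measured for the actual HOE.

```python
import numpy as np

# Assumed, illustrative diffraction-efficiency model for each color channel:
# a peak efficiency at some incidence angle and a finite angular width.
# A real device would use curves measured for the actual HOE.
HOE_EFFICIENCY = {
    "R": {"peak_deg": 0.0,  "width_deg": 4.0, "peak_eff": 0.80},
    "G": {"peak_deg": 0.5,  "width_deg": 3.5, "peak_eff": 0.85},
    "B": {"peak_deg": -0.5, "width_deg": 3.0, "peak_eff": 0.75},
}

def efficiency(channel, incidence_deg):
    p = HOE_EFFICIENCY[channel]
    return p["peak_eff"] * np.exp(-((incidence_deg - p["peak_deg"]) / p["width_deg"]) ** 2)

def corrected_drive_levels(rgb, incidence_deg):
    """Scale the source drive levels (0..1) so that the diffracted image keeps its
    intended color balance at the current incidence angle (clipped to the valid range)."""
    return {ch: min(1.0, value / max(efficiency(ch, incidence_deg), 1e-6))
            for ch, value in zip("RGB", rgb)}
```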
- each feature point FP has different light reflection characteristics from the surrounding areas so that it can be distinguished from the surrounding areas in the image captured by camera 300a.
- each feature point FP may be made of a reflective material with a higher reflectance than the surrounding areas, or may be made of a light-transmitting or light-absorbing material with a lower reflectance than the surrounding areas.
- each feature point FP may be made to include irregularities obtained by processing a part of a member (e.g., a contact lens, an eyeglass lens, an eyeglass frame, etc.).
- a visible light camera may be used as the camera 300a, and markers visible at visible light wavelengths may be used as each feature point FP.
- the markers can be recognized over a wide dynamic range.
- an infrared camera may be used as the camera 300a, and markers visible at infrared wavelengths (e.g., a dichroic film) may be used as each feature point FP.
- the relative position of each feature point FP with respect to the optical element OE must be fixed. Therefore, in the display device 1, at least three feature points FP are provided on the contact lens CL, which serves as a rigid body on which the optical element OE is provided. In other words, each feature point FP and the optical element OE are integrated via the contact lens CL.
- it is preferable that the acquisition unit 300, including the camera 300a, be able to acquire at least the two-dimensional coordinates (x, y) of each of the at least three feature points FP even if the orientation of the eyeball EB changes.
- the three or more feature points FP may be arranged so that, even if some of the feature points FP are hidden by the eyelids or the like, at least three feature points FP can still be captured by the camera 300a.
- the three feature points FP do not necessarily need to be physically separated from each other (separate), and may be provided, for example, integrally with some figure (see FIGS. 29A and 29B).
- in one example (see FIG. 29A), the three feature points FP are provided integrally with a circle and spaced apart from each other in the circumferential direction.
- in another example (see FIG. 29B), two of the three feature points FP are provided integrally with an arc and spaced apart from each other in the circumferential direction, and one feature point FP is provided separately from the two feature points FP and the arc.
- a plane needs to be specified by at least three feature points FP, so it is necessary that at least three of the three or more feature points FP are not on the same line. In addition, it is preferable that at least three feature points FP and the optical center OC of the optical element OE are on the same plane.
- (Configuration example 1 of the eyepiece optical system) FIGS. 4A to 4C are diagrams for explaining a first configuration example of the eyepiece optical system 20 of the display device 1.
- in the first configuration example, as shown in FIG. 4A, three feature points FP of the same shape (e.g., circles) are provided on the contact lens CL so as to be located at the three vertices of a triangle and around the pupil (the darkest part in FIGS. 4A to 4C). This allows the three feature points FP to be captured by the camera 300a even if the eyeball EB moves left and right to some extent as shown in FIGS. 4B and 4C.
- (Configuration example 2 of the eyepiece optical system) FIGS. 5A to 5C are diagrams for explaining a second configuration example of the eyepiece optical system 20 of the display device 1.
- in the second configuration example, three feature points FP of the same shape (e.g., circles) are provided on the contact lens CL so as to be located around the pupil (the part with the darkest color in FIGS. 5A to 5C) and further inside than in the first configuration example of FIG. 4A.
- (Configuration example 3 of the eyepiece optical system) FIGS. 6A to 6C are diagrams for explaining a third configuration example of the eyepiece optical system 20 of the display device 1.
- in the third configuration example, three relatively large feature points FP of different shapes (e.g., circular, triangular, and rectangular) are provided on the contact lens CL so as to be located at the three vertices of a triangle and around the pupil (the part with the darkest color in FIGS. 6A to 6C).
- at least two of the feature points FP may be made to have different colors instead of or in addition to the shapes.
- (Configuration example 4 of the eyepiece optical system) FIGS. 7A to 7E are diagrams for explaining a fourth configuration example of the eyepiece optical system 20 of the display device 1.
- in the fourth configuration example of the eyepiece optical system 20, as shown in FIG. 7A, six feature points FP of the same shape (e.g., circles) are provided on the contact lens CL so as to be located at the six vertices of a hexagon and around the pupil (the part with the darkest color in FIGS. 7A to 7E).
- This allows at least three of the six feature points FP to be captured by the camera 300a even if the eyeball EB moves relatively widely up, down, left, and right as shown in Figs. 7B to 7E.
- FIGS. 8A to 8E are diagrams for explaining a configuration example 5 of the eyepiece optical system 20 of the display device 1.
- in the fifth configuration example, three feature points FP of the same shape (e.g., circles) are provided on the contact lens CL so as to be located at the three vertices of a triangle and to overlap with the pupil (the darkest part in FIGS. 8A to 8E).
- even in this arrangement, the three feature points FP can be captured by the camera 300a (e.g., an infrared camera).
- Fig. 10 is a block diagram showing a basic configuration example of the control of the display device 1.
- the acquisition unit 300 of the display device 1 acquires at least two-dimensional position information (e.g., three-dimensional position information) of at least three feature points FP among three or more feature points FP whose relative positions with respect to the optical element OE are fixed.
- the acquisition unit 300 has, as an example, a camera 300a and an xyz coordinate acquisition unit 300b.
- the control unit 400 has a calculation unit 400a and a display control unit 400b.
- the acquisition unit 300 and the control unit 400 are realized by hardware such as a CPU, a chipset, etc.
- the xyz coordinate acquisition unit 300b acquires the xyz coordinates (three-dimensional position information) of at least three feature points FP from the image captured by the camera 300a.
- the calculation unit 400a calculates the angle of the movable mirror 200a, the position of the focus lens 200c, and the brightness of the projected image based on the xyz coordinates of each of at least three feature points FP.
- the display control unit 400b controls the movable mirror 200a based on the calculation result of the angle of the movable mirror 200a, controls the focus lens 200c based on the calculation result of the position of the focus lens 200c, and controls the projected image based on the calculation result of the luminance of the projected image (specifically, it outputs a luminance correction value based on the calculation result of the luminance of the projected image to the light source driving unit 100d).
- Fig. 11 is a flowchart showing the flow of overall control in the basic configuration example of the control of the display device 1 shown in Fig. 10.
- Fig. 11 is based on a processing algorithm executed by the CPU. The series of processes in Fig. 11 are started when the camera 300a captures the coordinates (when the acquisition unit 300 acquires the coordinates).
- the acquisition unit 300 acquires the three-dimensional coordinates (x, y, z) of each of at least three feature points FP.
- the calculation unit 400a of the control unit 400 calculates the normal vector N based on the three-dimensional coordinates (x, y, z) of each of the at least three feature points FP.
- the calculation unit 400a of the control unit 400 calculates θx, θy, and θz using the normal vector N.
- the calculation unit 400a of the control unit 400 calculates the distance D from the camera 300a to the optical element OE (more specifically, the plane identified by the three feature points FP) based on the three-dimensional coordinates (x, y, z) of each of the at least three feature points FP.
- the display control unit 400b of the control unit 400 controls the display of the image based on x, y, θx, θy, θz, and D (specifically, controls the movable mirror 200a, the focus lens 200c, and the brightness and shape correction of the projected image).
- the control unit 400 determines whether or not to end the process. For example, when the acquisition unit 300 can acquire the three-dimensional coordinates of each feature point FP (when each feature point FP is within the angle of view of the camera 300a), the control unit 400 denies this determination (determines not to end the process) and performs the series of processes of steps S1 to S5 again. For example, when the user issues an end command or the acquisition unit 300 can no longer acquire the three-dimensional coordinates of each feature point FP (when each feature point FP falls outside the angle of view of the camera 300a), the control unit 400 affirms this determination (determines to end the process) and ends the flow.
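- A minimal Python sketch of this loop (steps S1 to S5) is shown below; acquisition_unit and control_unit and their methods are hypothetical stand-ins for the acquisition unit 300 and the control unit 400, not names taken from the patent.

```python
def control_loop(acquisition_unit, control_unit):
    """Sketch of the overall control flow of FIG. 11 (steps S1 to S5).
    acquisition_unit / control_unit are hypothetical objects standing in
    for the acquisition unit 300 and the control unit 400."""
    while True:
        points = acquisition_unit.get_feature_points_xyz()    # S1: (x, y, z) of at least 3 FPs
        if points is None or len(points) < 3:                 # FPs outside the angle of view
            break                                              # end the process
        normal = control_unit.compute_normal(points)           # S2: normal vector N
        tx, ty, tz = control_unit.compute_angles(normal)       # S3: theta_x, theta_y, theta_z
        distance = control_unit.compute_distance(points)       # S4: camera-to-plane distance D
        control_unit.update_display(points, tx, ty, tz, distance)  # S5: mirror, focus, image
```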
- (Stereo camera method) A stereo camera method can be introduced, in which multiple cameras 300a are used to obtain the z coordinates of each feature point FP (the distance from the camera 300a to each feature point FP).
- two cameras 300a are arranged so that their optical axes are parallel to each other, and each parameter is defined as follows (see FIG. 24).
- h: distance between the centers of the left and right cameras [mm]
- f: camera lens focal length [mm]
- P: coordinates (x, y, z) of the measurement object (feature point FP) [mm]
- pr: image point of P on the right camera sensor (hr, vr) [pixel]
- pl: image point of P on the left camera sensor (hl, vl) [pixel]
- xr = hr / (width × size_x)
- yr = vr / (height × size_y)
- xl = hl / (width × size_x)
- yl = vl / (height × size_y)
- the midpoint between the left and right cameras is defined as the origin O. Note that in the stereo camera method, the optical axes of the two cameras do not necessarily need to be parallel, and formulas for non-parallel arrangements are also known, but they will not be discussed here.
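- The following is a hedged sketch of the parallel-axis triangulation implied by the parameters above, using the standard relations z = h·f/(xl − xr), x = z(xl + xr)/(2f), y = z(yl + yr)/(2f) with the origin O at the midpoint between the cameras; the patent's own equations appear in FIG. 24 and may use different sign or origin conventions.

```python
def stereo_xyz(p_right_px, p_left_px, h_mm, f_mm, pixel_size_mm):
    """Illustrative parallel-axis stereo triangulation.

    p_right_px, p_left_px : (h, v) pixel coordinates of the same feature point on the
                            right and left sensors, measured from each sensor center.
    h_mm                  : distance between the left and right camera centers [mm].
    f_mm                  : camera lens focal length [mm].
    pixel_size_mm         : (size_x, size_y) pixel pitch [mm/pixel] (assumed parameter).
    Returns (x, y, z) [mm] in the frame whose origin O is midway between the cameras.
    """
    sx, sy = pixel_size_mm
    xr, yr = p_right_px[0] * sx, p_right_px[1] * sy   # metric position on the right sensor
    xl, yl = p_left_px[0] * sx, p_left_px[1] * sy     # metric position on the left sensor

    disparity = xl - xr                                # non-zero for a point at finite distance
    z = h_mm * f_mm / disparity
    x = z * (xl + xr) / (2.0 * f_mm)
    y = z * (yl + yr) / (2.0 * f_mm)
    return x, y, z
```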
- FIG. 12 is a block diagram showing a first example of a control configuration of the display device 1.
- Configuration example 1 of the control of the display device 1 is a more specific configuration of the control unit 400 compared to the basic configuration example shown in FIG. 10.
- a stereo camera SC is configured by two cameras 300a (e.g., visible light cameras).
- the calculation unit 400a of the control unit 400 has an optical center xyz coordinate calculation unit 400a1 and an optical element tilt angle calculation unit 400a2, and the display control unit 400b of the control unit 400 has a movable mirror control unit 400b1, a focus lens control unit 400b2, and a projection image control unit 400b3.
- Fig. 13 is a flowchart showing the flow of the overall control in the configuration example 1 in Fig. 12.
- Fig. 13 is based on a processing algorithm executed by a CPU. The series of processes in Fig. 13 are started when the camera 300a captures the coordinates (when the acquisition unit 300 acquires the coordinates).
- the acquisition unit 300 performs an xyz coordinate acquisition process. Specifically, the xyz coordinate acquisition unit 300b acquires the xy coordinates and z coordinates of at least three feature points FP from the images captured by the two cameras 300a, 300a that make up the stereo camera SC.
- control unit 400 performs the movable mirror control process in step S22-1, the focus lens control process in step S22-2, and the projection image control process in step S22-3 in parallel.
- step S23 the control unit 400 judges whether or not to end the process. If the judgment here is positive, the flow ends, and if it is negative, the flow returns to step S21 and the same series of processes are performed again. If the acquisition unit 300 can acquire the three-dimensional coordinates of each feature point FP (if each feature point FP is within the angle of view of the camera 300a), the control unit 400 denies the judgment here (determines not to end the process) and performs the processes of steps S21, S22-1, S22-2, and S22-3 again.
- on the other hand, for example, when the user issues an end command or the acquisition unit 300 can no longer acquire the three-dimensional coordinates of each feature point FP (when each feature point FP falls outside the angle of view of the camera 300a), the control unit 400 makes a positive judgment here (determines to end the process) and ends the flow.
- FIG. 14 is a flowchart showing the movable mirror control process (step S22-1) of FIG. 13.
- the optical center xyz coordinate calculation unit 400a1 calculates the xyz coordinates of the optical center OC of the optical element OE.
- the movable mirror control unit 400b1 determines the angle of the movable mirror 200a based on the x and y coordinates of the optical center OC. Note that the z coordinate of the optical center OC may also be used in this determination.
- the movable mirror control unit 400b1 controls the movable mirror 200a. Specifically, the movable mirror control unit 400b1 sends a mirror drive signal corresponding to the angle of the movable mirror 200a determined in step S22-1-2 to the movable mirror drive unit.
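- As an illustration of the angle determination in this process, the following sketch points the projection optical axis at the optical center OC from its (x, y, z) coordinates; the simple geometry and the halving of the beam angle for the mirror are assumptions that ignore mounting offsets, not a formula from the patent.

```python
import math

def mirror_angles_for_target(x_mm, y_mm, z_mm):
    """Illustrative: aim the projection optical axis at the optical center OC.

    (x_mm, y_mm, z_mm) are the OC coordinates in the projection-system frame.
    Returns hypothetical mirror rotation angles about the y- and x-axes; each is
    half the required beam deflection because reflection doubles the angle, and
    mounting offsets between mirror and lens are ignored in this sketch.
    """
    beam_yaw = math.degrees(math.atan2(x_mm, z_mm))     # beam rotation about the y-axis
    beam_pitch = math.degrees(math.atan2(y_mm, z_mm))   # beam rotation about the x-axis
    return beam_yaw / 2.0, beam_pitch / 2.0
```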
- FIG. 15 is a flowchart showing the focus lens control process (step S22-2) of FIG. 13.
- the optical center xyz coordinate calculation unit 400a1 calculates the z coordinate of the optical center OC of the optical element OE.
- the focus lens control unit 400b2 determines the position of the focus lens 200c in the optical axis direction based on the z coordinate of the optical center OC.
- the focus lens control unit 400b2 controls the focus lens 200c. Specifically, the focus lens control unit 400b2 sends a lens drive signal based on the position of the focus lens 200c in the optical axis direction determined in step S22-2-2 to the focus lens drive unit.
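- A minimal sketch of the position determination in this process, assuming a thin-lens relation between the distance to the optical element OE and the required focus shift; the focal length and reference distance are illustrative values, not taken from the patent.

```python
def focus_lens_shift(z_mm, f_mm=20.0, z_ref_mm=500.0):
    """Illustrative focus adjustment based on the thin-lens equation 1/f = 1/do + 1/di:
    shift of the image-side focal plane when the optical element OE moves from an
    assumed reference distance z_ref_mm to the measured distance z_mm.
    f_mm and z_ref_mm are example values, not taken from the patent."""
    def image_distance(do_mm):
        return 1.0 / (1.0 / f_mm - 1.0 / do_mm)
    return image_distance(z_mm) - image_distance(z_ref_mm)
```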
- FIG. 16 is a flowchart showing the projection image control process (step S22-3) of FIG. 13.
- the optical element tilt angle calculation unit 400a2 calculates the tilt angle (θx, θy) of the optical element OE.
- the projection image control unit 400b3 determines the brightness of the projection image based on the tilt angle (θx, θy) of the optical element OE.
- the projection image control unit 400b3 controls the projection image. Specifically, the projection image control unit 400b3 sends a light source drive signal modulated according to the brightness of the projection image determined in step S22-3-2 to the light source drive unit 100d. Note that the light source drive signal may be a signal modulated according to the brightness and shape correction of the projection image.
- FIG. 17 is a block diagram showing a second example of the control configuration of the display device 1.
- Configuration example 2 of the control of the display device 1 differs from configuration example 1 shown in FIG. 12 in that the acquisition unit 300 has an xy coordinate acquisition unit 300c and a z coordinate acquisition unit 300d instead of the xyz coordinate acquisition unit 300b, and the calculation unit 400a of the control unit 400 has an optical center xy coordinate calculation unit 400a11 and an optical center z coordinate calculation unit 400a12 instead of the optical center xyz coordinate calculation unit 400a1.
- Fig. 18 is a flowchart showing the flow of overall control in the configuration example 2 of Fig. 17.
- Fig. 18 is based on a processing algorithm executed by a CPU. The series of processes in Fig. 18 are started when the camera 300a captures the coordinates (when the acquisition unit 300 acquires the coordinates).
- in steps S31-1 and S31-2, the xy coordinate acquisition unit 300c performs an xy coordinate acquisition process and the z coordinate acquisition unit 300d performs a z coordinate acquisition process in parallel.
- the xy coordinate acquisition unit 300c acquires the xy coordinates of at least three feature points FP from the images captured by the two cameras 300a, 300a that make up the stereo camera SC.
- the z coordinate acquisition unit 300d acquires the z coordinates of at least three feature points FP from the images captured by the two cameras 300a, 300a that make up the stereo camera SC.
- Step S32-1 is the same as step S22-1 in FIG. 13.
- Step S32-2 is the same as step S22-2 in FIG. 13.
- Step S32-3 is the same as step S22-3 in FIG. 13.
- in step S33, the control unit 400 judges whether or not to end the processing each time steps S32-1, S32-2, and S32-3 are executed. If the judgment here is positive, the flow ends, and if it is negative, the flow returns to steps S31-1 and S31-2, and the same processing is performed again.
- for example, if the acquisition unit 300 can acquire the three-dimensional coordinates of each feature point FP (if each feature point FP is within the angle of view of the camera 300a), the control unit 400 denies the judgment here (determines not to end the processing) and performs the processing of steps S31-1, S31-2, S32-1, S32-2, and S32-3 again.
- on the other hand, for example, when the user issues an end command or the acquisition unit 300 can no longer acquire the three-dimensional coordinates of each feature point FP, the control unit 400 affirms the determination here (determines that processing should be ended) and ends the flow.
- the control unit 400 controls the movable mirror 200a based on the two-dimensional position information (x, y) of each of the at least three feature points FP, controls the light source 100a based on the two-dimensional position information (x, y) and the distance information of the feature points FP (distance in the z-axis direction), and controls the focus lens 200c based on the distance information, in parallel.
- it is relatively easy to obtain the x and y coordinates of each feature point FP, but obtaining the z coordinate imposes restrictions on the devices and arrangements that can be used and adds a computation load. Furthermore, among the five parameters (x, y, z, θx, θy), obtaining the x and y coordinates of each feature point FP directly affects whether the observer can see the image or not, and therefore has a higher priority than the remaining three parameters (z, θx, θy). On the other hand, the control of the projected image is limited by the frame rate of the image display, and cannot be performed at high speed compared to the control of the movable mirror 200a.
- the information acquisition period for obtaining the x and y coordinates of each feature point FP and the z coordinate, and the loop period of the subsequent calculation are independent.
- the process related to the control of the movable mirror 200a (first control loop) calculates the x and y coordinates at the highest speed and makes the projection optical axis follow the optical element OE (target).
- the process requiring the z coordinate (second control loop) updates information at a slower period than the process related to the control of the movable mirror 200a.
- the x and y coordinate information required to calculate θx and θy can be calculated using the latest information at the required timing. In this way, by dividing the control loop into two, the most important control of the movable mirror can be performed at high speed without being affected by the rate-limiting factors of the control speed.
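- A possible realization of this two-loop split is sketched below with two independent threads running at different periods; the callback names and the periods are assumptions for illustration, not values from the patent.

```python
import threading
import time

def fast_loop(get_xy, drive_mirror, period_s=0.001, stop=lambda: False):
    """First control loop: track the optical element with the movable mirror using
    only the (x, y) coordinates, at the highest possible rate."""
    while not stop():
        xy = get_xy()
        if xy is not None:
            drive_mirror(xy)
        time.sleep(period_s)

def slow_loop(get_xyz_and_tilt, drive_focus, correct_image, period_s=0.033, stop=lambda: False):
    """Second control loop: processes that need the z coordinate (focus) and the tilt
    angles (image luminance) run at a slower, frame-rate-limited period."""
    while not stop():
        state = get_xyz_and_tilt()
        if state is not None:
            drive_focus(state)
            correct_image(state)
        time.sleep(period_s)

# The two loops run independently of each other, e.g.:
# threading.Thread(target=fast_loop, args=(get_xy, drive_mirror), daemon=True).start()
# threading.Thread(target=slow_loop, args=(get_xyz_and_tilt, drive_focus, correct_image), daemon=True).start()
```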
- FIG. 19 is a diagram showing a configuration example of a display device 2 according to a second embodiment of the present technology.
- the display device 2 has a configuration substantially similar to that of the display device 1 according to the first embodiment, except that the optical axis (projection optical axis) of the projection optical system 200 and the optical axis of the camera 300a are coaxial.
- a half mirror 200d is disposed on the optical path of the image light IL between the focus lens 200c and the movable mirror 200a.
- the camera 300a is disposed at a position outside the optical path of the image light IL between the focus lens 200c and the movable mirror 200a, facing the half mirror 200d.
- the camera 300a is always in a state of capturing the position through which the projection optical axis passes within a predetermined angle of view (for example, a central angle of view). Therefore, if the projection optical axis deviates from the optical center OC of the optical element OE, the predetermined angle of view of the image captured by the camera 300a will also deviate from the optical center OC.
- therefore, by controlling the movable mirror 200a so as to eliminate this deviation, the projection image can always be directed toward the optical center OC.
- the projection optical axis and the optical axis of the camera 300a are coaxial does not necessarily mean that they are strictly coaxial, but means that deviation is allowed within a range in which the same action and effect is achieved.
- FIG. 20 is a flowchart showing the overall control flow of an example configuration of a display device 2 according to a second embodiment of the present technology.
- the flowchart in FIG. 20 is based on a processing algorithm executed by a CPU.
- the series of processes in FIG. 20 are started when the camera 300a captures coordinates (when the acquisition unit 300 acquires coordinates).
- the acquisition unit 300 acquires the two-dimensional coordinates (x, y) of each of at least three feature points FP.
- the calculation unit 400a of the control unit 400 calculates the two-dimensional coordinates (x, y) of the optical center OC of the optical element OE based on the two-dimensional coordinates (x, y) of each of the at least three feature points FP.
- the control unit 400 determines whether or not the projection optical axis passes through the optical center OC of the optical element OE. Specifically, the control unit 400 determines whether or not there is a misalignment between the projection optical axis and the optical center OC based on the position (two-dimensional coordinates) of the optical center OC in the image captured by the camera 300a. If the determination here is positive (if it is determined that there is no misalignment), the process returns to step S11, and if it is negative (if it is determined that there is a misalignment), the process proceeds to step S14.
- control unit 400 controls the movable mirror 200a so that the projection optical axis passes through the optical center OC of the optical element OE.
- the control unit 400 determines whether or not to end the process. For example, when the acquisition unit 300 can acquire the three-dimensional coordinates of each feature point FP (when each feature point FP is within the angle of view of the camera 300a), the determination here is negative (determined not to end the process) and the process returns to step S11. On the other hand, for example, when the user issues an end command or the acquisition unit 300 can no longer acquire the three-dimensional coordinates of each feature point FP (when each feature point FP falls outside the angle of view of the camera 300a), the determination here is positive (determined to end the process) and the flow ends.
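- As a sketch of the correction implied by steps S13 and S14, the following example converts the pixel offset of the optical center OC from the image center (which coincides with the projection optical axis in this coaxial arrangement) into mirror angle increments; the pixel-to-angle gains and the one-pixel threshold are assumed values for illustration.

```python
def mirror_correction(oc_px, image_size_px, deg_per_px=(0.01, 0.01)):
    """Illustrative feedback step for the coaxial-camera configuration.

    oc_px         : (u, v) pixel coordinates of the optical center OC in the captured image.
    image_size_px : (width, height) of the image; the projection optical axis is assumed
                    to map to the image center.
    deg_per_px    : assumed conversion gains from pixel error to mirror angle increment.
    Returns the mirror angle increments needed to cancel the misalignment, or (0, 0)
    when the projection optical axis already passes through OC.
    """
    cx, cy = image_size_px[0] / 2.0, image_size_px[1] / 2.0
    err_u, err_v = oc_px[0] - cx, oc_px[1] - cy
    if abs(err_u) < 1.0 and abs(err_v) < 1.0:     # within one pixel: treated as no misalignment
        return 0.0, 0.0
    return -err_u * deg_per_px[0], -err_v * deg_per_px[1]
```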
- the camera 300a is arranged to recognize the vicinity of the eyepiece optical system 20 through the movable mirror 200a, so that it is possible to achieve both wide-range and high-precision recognition.
- the imaging range of the camera 300a can cover the entire range within which the light projection system 10 can project.
- the origins of the light projection coordinate system and the camera coordinate system coincide, there is also the advantage that the initial calibration can be simplified and positional deviation errors due to each perturbation can be suppressed.
- since the recognition is via the movable mirror 200a, the detected coordinates are relative coordinates rather than absolute coordinates. Therefore, when the observer is lost, it becomes unclear in which direction the movable mirror 200a should be pointed.
- one solution is to separately provide a camera for grasping the position of the observer in absolute coordinates. This camera does not require high performance in terms of resolution and shooting speed, so a relatively inexpensive one may be used; alternatively, an area that is not coaxial with the projection optical axis may be provided in part of the imaging field of the camera 300a and that area may be cut out and used.
- Another solution is to provide two types of control methods for the movable mirror 200a: a tracking mode and a search mode. When the position of the observer is unknown, the search mode is used, and the observer is searched for by two-dimensionally scanning the movable range of the movable mirror 200a. When the position of the observer is identified, the mode is switched to the tracking mode and control is performed using the above-mentioned method.
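- A minimal sketch of this search/tracking switching is shown below; all callbacks (mirror pointing, feature-point visibility, one tracking iteration, end condition) are hypothetical stand-ins, not interfaces defined in the patent.

```python
import itertools

def run_modes(point_mirror, feature_points_visible, scan_positions, track_step, stop):
    """Sketch of the two control modes of the movable mirror 200a.

    point_mirror(yaw, pitch)   : hypothetical command that orients the mirror.
    feature_points_visible()   : True when at least three feature points FP are in view.
    scan_positions             : iterable of (yaw, pitch) angles covering the movable range.
    track_step(), stop()       : hypothetical callbacks for one tracking iteration
                                 and for the end condition.
    """
    mode = "search"
    scan = itertools.cycle(scan_positions)
    while not stop():
        if mode == "search":
            point_mirror(*next(scan))        # two-dimensionally scan the movable range
            if feature_points_visible():
                mode = "tracking"            # observer found: switch to tracking mode
        elif not feature_points_visible():
            mode = "search"                  # observer lost: go back to search mode
        else:
            track_step()                     # tracking control as described above
```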
- the at least one camera 300a may be any type, such as a visible light camera, an infrared camera, a stereo camera, or a TOF camera, and it is also possible to combine it with the display device 1 according to the first embodiment.
- <Display device according to a third embodiment of the present technology> FIG. 21 is a diagram showing a configuration example of a display device 3 according to a third embodiment of the present technology.
- the display device 3 has a similar configuration to the display device 1 according to the first embodiment, except that an optical element OE and at least three feature points FP are provided on a spectacle lens GL.
- the eyeglass lens GL may or may not be a component of the eyepiece optical system 20. If the optical element OE is provided on the eyeglass lens GL, there is a possibility that the pupil of the eyeball EB and the optical center OC of the optical element OE will not coincide, which may result in disadvantages such as the need for greater precision and measures to address changes over time, or the need to additionally detect and track the position of the pupil, but it also has advantages such as a high degree of freedom in designing the feature points FP, improved detection accuracy of the feature points FP, easy manufacture, high safety, and low barriers to wearing.
- a HOE can be mainly used for the optical element OE, but a diffractive optical element (DOE) or the like can also be used.
- DOE diffractive optical element
- At least three feature points FP are provided in the area around the optical element OE of the eyeglass lens GL fitted into the rim (frame) of the eyeglass frame GF.
- the eyeglass frame GF may be a rimless frame.
- <Display device according to a fourth embodiment of the present technology> FIG. 22 is a diagram showing a configuration example of a display device 4 according to a fourth embodiment of the present technology.
- the display device 4 has a similar configuration to the display device 3 according to the third embodiment, except that an optical element OE is provided on a spectacle lens GL and at least three feature points FP are provided on a spectacle frame GF.
- At least three feature points FP are provided on the rim of the eyeglass frame GF, as shown in FIG. 22.
- <Display device according to a fifth embodiment of the present technology> FIG. 23 is a diagram showing a configuration example of a display device 5 according to a fifth embodiment of the present technology.
- the display device 5 has a similar configuration to the display device 3 according to the third embodiment, except that the optical element OE and parts of the at least three feature points FP are provided on the eyeglass lens GL, and the other parts of the at least three feature points FP are provided on the eyeglass frame GF.
- an optical element OE and at least one feature point FP are provided on the eyeglass lens GL, and at least two feature points FP are provided on the eyeglass frame GF.
- the optical element OE and at least two feature points FP may be provided on the eyeglass lens GL, and at least one feature point FP may be provided on the eyeglass frame GF.
- <Display device according to a sixth embodiment of the present technology> FIG. 25 is a diagram showing a configuration example of a display device 6 according to a sixth embodiment of the present technology.
- the display device 6 has a configuration similar to that of the display device 1 according to the first embodiment, except that the acquisition unit 300 has an infrared light source 300e (e.g., an LED, a laser, etc.) near the focus lens 200c that irradiates at least three feature points FP with infrared light, and has an infrared camera (a camera sensitive to infrared wavelengths) as the camera 300a.
- the infrared light source 300e may be a part of the light source 100a, or may be disposed on the optical path of the light from the light source 100a.
- a visible light cut filter may be provided on the camera 300a as a way to reduce the amount of processing.
- markers made of a retroreflective material are used as each feature point FP.
- markers made of a retroreflective material have the property that the incident light and reflected light are in the same direction, unlike specular reflection (regular reflection) on a plane mirror.
- the display device 6 has the infrared light source 300e as an illumination light source, the position of the marker can be detected even if the observer is in a dark environment.
- FIG. 27 is a block diagram showing an example of the configuration of a TOF (Time of Flight) camera as the camera 300a.
- the TOF camera serving as camera 300a has a light emitting unit 300a1, a light receiving unit 300a2, and a distance calculation unit 300a3.
- the measurement method of the TOF camera may be a direct TOF method or an indirect TOF method.
- the TOF camera can be used, for example, to obtain the z coordinate (distance) of each feature point FP, and can be used in conjunction with a visible light camera or an infrared camera to obtain the xy coordinates of the feature points FP. If the TOF camera has high xy resolution, it can obtain all xyz coordinates, which is expected to reduce the number of parts, making it smaller, lighter, and more cost-effective.
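- For reference, a short sketch of the standard TOF distance relations: direct TOF uses d = c·Δt/2, and indirect TOF derives distance from the phase shift of amplitude-modulated light. These are textbook formulas, not equations quoted from the patent.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def direct_tof_distance(round_trip_time_s):
    """Direct TOF: distance from the measured round-trip time of a light pulse."""
    return C * round_trip_time_s / 2.0

def indirect_tof_distance(phase_shift_rad, modulation_freq_hz):
    """Indirect TOF: distance from the phase shift of amplitude-modulated light
    (unambiguous only up to half the modulation wavelength)."""
    return C * phase_shift_rad / (4.0 * math.pi * modulation_freq_hz)
```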
- FIG. 28 is a block diagram showing an example of the configuration of a camera 300a (event camera) having an event sensor.
- the event sensor of the event camera as camera 300a has a luminance change detection unit 300a4 that detects luminance changes in each pixel, and a luminance change data output unit 300a5 that outputs only data related to pixels whose luminance has changed, as shown in FIG. 28.
- the event sensor is also called an event-based vision sensor (EVS).
- An event sensor is a sensor that detects changes in the brightness of each pixel asynchronously and outputs only the changed data, combining it with coordinate and time information, achieving high-speed, low-latency data output.
- Event cameras equipped with event sensors can capture changes in feature points faster than normal cameras while reducing the amount of information they handle, enabling a more natural tracking experience and at the same time reducing the processing load in subsequent stages, contributing to smaller, lighter models and lower power consumption.
- the display device according to the present technology described above (for example, any of the display devices 1 to 6 according to the first to sixth embodiments) comprises an optical projection system 10 including a light source 100a, and an eyepiece optical system 20 that guides light projected from the optical projection system 10 to an eyeball EB, where the eyepiece optical system 20 includes an optical element OE and three or more feature points FP whose relative positions with respect to the optical element OE are fixed, and the optical projection system 10 includes an acquisition unit 300 that acquires at least two-dimensional position information of each of at least three of the three or more feature points FP.
- the display device can accurately detect at least two-dimensional information for each of the at least three feature points FP.
- the display device according to the present technology can provide a display device capable of detecting the relative position between the light projection system 10 and the eyepiece optical system 20 with high accuracy.
- the display device according to the present technology can detect the relative position with high accuracy, and therefore can realize a display device with excellent image tracking performance with respect to the eyepiece optical system 20 worn by the observer.
- the display device according to the present technology has excellent image tracking performance even in a usage mode in which the separate light projection system and eyepiece optical system are arranged at a considerable spatial distance (for example, farther apart than when both the light projection system and the eyepiece optical system are worn by the observer) and the positional relationship between the two changes dynamically in real time.
- the display method according to the present technology includes a step of acquiring at least two-dimensional position information of at least three of three or more feature points FP whose relative positions with respect to an optical element OE that guides light projected from an optical projection system 10 to an eyeball EB are fixed, and a step of controlling the optical projection system 10 based on the acquisition results from the acquisition step.
- the display method according to the present technology can accurately detect at least two-dimensional information for each of the at least three feature points FP.
- the display method according to the present technology can provide a display method capable of detecting the relative position between the light projection system 10 and the eyepiece optical system 20 with high accuracy.
- the display method according to the present technology can detect the relative position with high accuracy, and therefore can realize a display method with excellent image tracking for the eyepiece optical system 20 worn by the observer.
- the display method according to the present technology has excellent image tracking even in a usage mode in which the separate light projection system and eyepiece optical system are arranged at a considerable spatial distance (for example, farther apart than when both the light projection system and the eyepiece optical system are worn by the observer) and the positional relationship between the two changes dynamically in real time.
- the acquisition unit 300 may acquire only two-dimensional position information for each of at least three feature points FP. Even in this case, it is possible to draw an image on the eyeball EB of the user who is the observer by controlling the movable mirror 200a based on the two-dimensional position information.
- the acquisition unit 300 may acquire at least two-dimensional position information (e.g., two-dimensional position information only, or two-dimensional position information and distance information) for each of at least four feature points FP. In this case, the relative position between the optical projection system 10 and the eyepiece optical system 20 can be determined with greater accuracy.
- in the above embodiments, the movable deflection element (e.g., the movable mirror 200a) is controlled based on two-dimensional coordinates (x, y), but the movable deflection element may also be controlled based on three-dimensional coordinates (x, y, z).
- the projection optical system 200 does not need to have a focus lens 200c and a focus lens driving unit.
- control unit 400 does not need to have a focus lens control unit and/or a projection image control unit.
- control unit 400 may control only one of the light source 100a and the projection optical system 200.
- the image light generating unit is of the scanning type, but it may also be of the non-scanning type, including, for example, a liquid crystal display.
- the optical projection system 10 can be worn on the head of the user who is the observer.
- At least portions of the configurations of the display devices in each of the above embodiments may be combined with one another to the extent that they are not mutually inconsistent.
- the present technology can also be configured as follows.
- a display device including: a light projection system including a light source; and an eyepiece optical system that guides the light projected from the light projection system to an eyeball, wherein the eyepiece optical system includes an optical element and three or more feature points having fixed relative positions with respect to the optical element, and the display device is provided with an acquisition unit that acquires at least two-dimensional position information of at least three of the three or more feature points.
- the at least three feature points are provided on the contact lens.
- the display device according to (4), wherein the acquisition unit is capable of acquiring at least the two-dimensional position information of each of the at least three feature points even when the orientation of the eyeball changes.
- the optical element is provided on a lens attached to an eyeglass frame.
- the display device according to any one of (1) to (7), wherein the at least three feature points and the optical center of the optical element are on the same plane.
- the display device according to any one of (1) to (8), wherein the acquisition unit has at least one camera.
- the display device according to (9), wherein the at least one camera includes a visible light camera and/or a Time-of-Flight camera.
- the display device according to (9) or (10), wherein the at least one camera includes a plurality of cameras constituting a stereo camera.
- the optical projection system includes the acquisition unit.
- the display device according to any one of (9) to (13), wherein the optical projection system includes a projection optical system disposed on an optical path of light from the light source, and an optical axis of the projection optical system and an optical axis of the camera are coaxial.
- the display device according to any one of (9) to (14), wherein the optical projection system includes: a projection optical system disposed on an optical path of light from the light source; and a control unit that controls the light source and/or the projection optical system based on the result of the acquisition by the acquisition unit.
- the display device according to (15), wherein the projection optical system includes a movable deflection element and a focus lens; the acquisition unit acquires two-dimensional position information of each of the at least three feature points in a plane perpendicular to the optical axis direction of the camera, and distance information between the camera and each of the at least three feature points in a direction of an optical axis of the camera; and the control unit performs, in parallel, controlling the movable deflection element based on at least the two-dimensional position information, and controlling the light source based on the two-dimensional position information and the distance information and/or controlling the focus lens based on the distance information.
- the display method, wherein the optical projection system includes a light source, a movable deflection element, and a focus lens; the step of acquiring at least two-dimensional position information includes acquiring two-dimensional position information of each of the at least three feature points in a plane perpendicular to an optical axis direction of the camera, and acquiring distance information between the camera and each of the at least three feature points in a direction of an optical axis of the camera; and the controlling step includes controlling the movable deflection element based on at least the two-dimensional position information, and controlling the light source based on the two-dimensional position information and the distance information and/or controlling the focus lens based on the distance information.
- Reference signs: 1 to 6: Display device; 10: Light projection system; 20: Eyepiece optical system; 100a: Light source; 200: Projection optical system; 200a: Movable mirror (movable deflection element); 200c: Focus lens; 300: Acquisition unit; 300a: Camera; 300e: Infrared light source; 400: Control unit; EB: Eyeball; FP: Feature point; OE: Optical element; OC: Optical center; CL: Contact lens; GL: Glasses lens (lens); GF: Glasses frame; IL: Image light (light)
Description
The technology disclosed herein (hereinafter also referred to as "the technology") relates to a display device and a display method.
Display devices that display images by irradiating light (more specifically, image light) onto the observer's eyeball are known (see, for example, Patent Document 1).
For example, the display device described in Patent Document 1 detects the relative position between the scanning unit (light projection system) and the deflection unit (eyepiece optical system) from two-dimensional position information of each of two feature points on the contact lens, and controls the scanning unit based on the detection results.
However, conventional display devices have room for improvement in terms of accurately detecting the relative position between the light projection system and the eyepiece optical system.
The main objective of this technology is to provide a display device that can accurately detect the relative position between the light projection system and the eyepiece optical system.
The present technology provides a display device including: a light projection system including a light source; and an eyepiece optical system that guides the light projected from the light projection system to an eyeball, wherein the eyepiece optical system includes an optical element and three or more feature points having fixed relative positions with respect to the optical element, and the display device further includes an acquisition unit that acquires at least two-dimensional position information of at least three of the three or more feature points.
The at least three feature points may not be collinear.
The optical element may be provided on a contact lens that is fitted to the eye. The at least three feature points may be provided on the contact lens.
The acquisition unit may be capable of acquiring at least the two-dimensional position information of each of the at least three feature points even if a direction of the eyeball changes.
The optical element may be provided on a lens attached to an eyeglass frame.
The at least three feature points may all be provided on the lens or the eyeglass frame, or some of the at least three feature points may be provided on the lens and others may be provided on the eyeglass frame.
The at least three feature points and the optical center of the optical element may be on the same plane.
The acquisition unit may include at least one camera.
The at least one camera may include a visible light camera and/or a Time-of-Flight camera.
The at least one camera may include a plurality of cameras constituting a stereo camera.
The optical projection system may include the acquisition unit.
The acquisition unit may include an infrared light source, the camera may be sensitive to infrared wavelengths, and the at least three feature points may be made of a retroreflective material.
The optical projection system may include a projection optical system disposed on an optical path of light from the light source, and an optical axis of the projection optical system may be coaxial with an optical axis of the camera.
The optical projection system may include a projection optical system arranged on an optical path of light from the light source, and a control unit that controls the light source and/or the projection optical system based on the acquisition result of the acquisition unit.
The projection optical system includes a movable deflection element and a focus lens, and the acquisition unit acquires two-dimensional position information of each of the at least three feature points in a plane perpendicular to the optical axis direction of the camera, and distance information between the camera and each of the at least three feature points in the optical axis direction of the camera, and the control unit may simultaneously control the movable deflection element based on the two-dimensional position information, control the light source based on at least the two-dimensional position information and the distance information, and/or control the focus lens based on the distance information.
At least two of the at least three feature points may be integrally provided.
The optical projection system and the eyepiece optical system may be separate entities.
The present technology also provides a display method including: a step of acquiring at least two-dimensional position information of at least three feature points among three or more feature points whose relative positions with respect to an optical element that guides light projected from a light projection system to an eyeball are fixed; and a step of controlling a part of the light projection system based on the results obtained in the acquiring step.
The optical projection system may include a light source, a movable deflection element, and a focus lens, and the step of acquiring the at least two-dimensional position information may include a step of acquiring two-dimensional position information of each of the at least three feature points in a plane perpendicular to the optical axis direction of the camera, and a step of acquiring distance information between the camera and each of the at least three feature points in the optical axis direction of the camera, and the step of controlling may include a step of controlling the movable deflection element based on at least the two-dimensional position information, and a step of controlling the light source based on the two-dimensional position information and the distance information and/or a step of controlling the focus lens based on the distance information.
Below, a preferred embodiment of the present technology will be described in detail with reference to the attached drawings. Note that in this specification and the drawings, components having substantially the same functional configuration will be denoted with the same reference numerals to avoid repeated description. The embodiment described below shows a representative embodiment of the present technology, and is not intended to narrow the scope of the present technology. Even if this specification describes that the display device and display method related to the present technology have multiple effects, it is sufficient that the display device and display method related to the present technology have at least one effect. The effects described in this specification are merely examples and are not limiting, and other effects may also be present.
The explanation will be given in the following order:
0. Introduction
1. Display device according to a first embodiment of the present technology
2. Display device according to a second embodiment of the present technology
3. Display device according to a third embodiment of the present technology
4. Display device according to a fourth embodiment of the present technology
5. Display device according to a fifth embodiment of the present technology
6. Display device according to a sixth embodiment of the present technology
7. Effects of the display device according to the present technology
8. Modifications of the present technology
<0. Introduction>
Conventionally, there is known a retinal direct imaging device (display device) that projects light (specifically, image light) from an optical projection system and directs the projected light to the pupil of an observer via an eyepiece optical system worn by the observer to display an image. In particular, to realize a retinal direct imaging device in which the positional relationship (relative position) between the optical projection system and the eyepiece optical system is not fixed, it is necessary to recognize the state of the observer from the optical projection system side. Such a retinal direct imaging device has a feature that the relative position between the eyepiece optical system and the optical projection system changes in real time depending on the movement of the observer, so it is mainly desired to perform the following recognition and correction.
- Recognizing the position of the optical center of the optical element as seen from the light projection system: In the case of a Maxwellian-view optical system, the observer cannot see the image unless the projection optical axis passes through the pupil. Normally, the optical element is positioned so that its optical center overlaps with the observer's pupil, so it is desirable to recognize the position of the optical center of the optical element (e.g., its two-dimensional position information) from the light projection system side.
- Correcting the angle (attitude) of the optical element relative to the projection optical axis: The diffraction efficiency of a diffraction element (e.g., an HOE: hologram optical element) used as the optical element depends on the angle of incidence, and when the angle of incidence of the projection light on the HOE changes, the diffraction efficiency changes independently for each of R, G and B. To correctly reproduce the image and brightness intended for the observer, it is desirable to correct the brightness of the original image according to the angle of incidence.
Incidentally, in the display device described in Patent Document 1, the relative position between the scanning unit (light projection system) and the deflection unit (eyepiece optical system) is detected from two-dimensional position information of each of two feature points provided in the deflection unit (eyepiece optical system). However, in this display device there is room for improvement in the accuracy with which this relative position is detected.
After extensive research, the inventors therefore developed the display device according to the present technology as a display device that can accurately detect the relative position between the light projection system and the eyepiece optical system (for example, the relative position of the eyepiece optical system as viewed from the light projection system).
In particular, in the display device according to the present technology, when the optical element of the eyepiece optical system is provided on a contact lens worn on the observer's eyeball, the contact lens and the optical element move together with the pupil, so that if the position of the pupil is recognized, the position of the optical element can also be recognized, and if the direction of the pupil (the direction of the line of sight) is recognized, the angle (posture) of the optical element can also be recognized. This makes it possible to detect the relative position between the light projection system and the eyepiece optical system with greater precision.
In addition, the display device described in Patent Document 1 presumably assumes a configuration in which the scanning unit (light projection system) is placed on the observer's head, and it would be extremely difficult for it to handle a configuration requiring high detection accuracy in which the scanning unit (light projection system) and the deflection unit (eyepiece optical system) are arranged separately, spatially separated by a fairly long distance, and their positional relationship changes dynamically.
On the other hand, the display device according to the present technology can accurately detect changes in the relative position between a spatially separated (separate) light projection system and eyepiece optical system, and is therefore fully capable of handling configurations in which the positional relationship between the light projection system and the eyepiece optical system changes dynamically in this way.
FIG. 1 is a diagram for explaining the principle of this technology. As shown in FIG. 1, in this technology, the eyepiece optical system EOS has three or more (for example, three) feature points FP whose relative positions with respect to the optical element OE are fixed. In the example of FIG. 1, three feature points FP are provided on the spectacle lens GL on which the optical element OE is provided. At least one of the three feature points FP may be provided, for example, on the spectacle frame GF. These three feature points FP are recognized by a camera C mounted on a light projection system LPS that is spatially separated from the eyepiece optical system EOS. By measuring the spatial coordinates and calculating the normal vector as described below, the three-dimensional coordinates (x, y, z) of the optical center OC of the optical element OE as seen from the light projection system LPS and the angle (θx, θy) of the optical element OE as seen from the light projection system LPS can be detected in real time. Note that θx indicates the rotation angle (pitch) around the x-axis extending to the left and right of the observer, and θy indicates the rotation angle (yaw) around the y-axis extending above and below the observer. In this specification and the drawings, unless otherwise specified, "x" and "X" are used with the same meaning, "y" and "Y" are used with the same meaning, and "z" and "Z" are used with the same meaning.
(Measurement of spatial coordinates)
For example, the three-dimensional coordinates (x, y, z) of each of the three feature points FP are obtained. If the positional relationship between the three feature points FP and the optical center OC of the optical element OE is known in advance, the three-dimensional coordinates (x, y, z) that are the relative position of the optical center OC of the optical element OE as seen from the light projection system LPS can be calculated by interpolation or extrapolation of the three points. The three-dimensional coordinates (x, y, z) of each feature point FP are obtained using a camera C. The main acquisition method is the stereo camera method, but a configuration using only a TOF camera or a configuration using a visible light camera and a TOF camera in combination is also possible. In addition to a visible light camera, the stereo camera method also includes a method using an infrared camera, an infrared light source, and a retroreflective material.
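A hedged sketch of the stereo-camera approach mentioned above is given below; the rectified-geometry assumption and the parameter names (f_px, baseline_m, cx, cy) are editorial choices for illustration, not values from the disclosure.

```python
# Stereo triangulation sketch (assumes rectified left/right images).
def triangulate_rectified(u_left: float, v_left: float, u_right: float,
                          f_px: float, baseline_m: float,
                          cx: float, cy: float) -> tuple:
    """3D coordinates (x, y, z) of a feature point FP seen by a rectified stereo pair."""
    disparity = u_left - u_right          # horizontal pixel offset between the two views
    if disparity <= 0:
        raise ValueError("feature point must lie in front of both cameras")
    z = f_px * baseline_m / disparity     # depth along the camera optical axis
    x = (u_left - cx) * z / f_px          # lateral offset from the optical axis
    y = (v_left - cy) * z / f_px          # vertical offset from the optical axis
    return (x, y, z)
```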
(Calculation of normal vector)
FIG. 2 is a diagram for explaining a method of calculating a normal vector. From the three-dimensional coordinates (x, y, z) of each of the three feature points FP obtained as described above, the normal vector of the plane including these three feature points FP is calculated using a vector cross product. This normal vector corresponds to the orientation of the optical element OE, and the angle (θx, θy) of the optical element OE as viewed from the light projection system LPS can be calculated by taking its inner product with the axis vectors of the reference coordinate system set in the light projection system LPS.
If the spatial coordinate vectors of the three feature points FP are P[0], P[1], and P[2], respectively, the normal vector N of the plane containing these three feature points can be calculated using the cross product of vectors as shown in the following equation (1).
N = (P[2]-P[0]) × (P[1]-P[0]) ... (1)
The angles θx, θy, and θz between the x-axis, y-axis, and z-axis and the normal vector N can be calculated using the dot product of vectors as shown in the following equations (2) to (4), assuming that the directional vectors of each axis are Ex, Ey, and Ez.
N・Ex = |N||Ex|cos θx...(2)
N・Ey = |N||Ey|cos θy...(3)
N・Ez = |N||Ez|cos θz...(4)
From the above equations (2) to (4), θx, θy, and θz can be calculated using the following equations (5) to (7).
θx = arccos (cos θx) ...(5)
θy = arccos (cos θy) ...(6)
θz = arccos (cos θz) ...(7)
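The following is a direct transcription of equations (1) to (7) into code (an illustrative sketch; the numerical example at the end is hypothetical):

```python
import numpy as np

def plane_normal(p0, p1, p2):
    """N = (P[2] - P[0]) x (P[1] - P[0])   ... equation (1)"""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    return np.cross(p2 - p0, p1 - p0)

def axis_angles(n):
    """theta_x, theta_y, theta_z from equations (2) to (7), in radians."""
    n = np.asarray(n, dtype=float)
    norm = np.linalg.norm(n)
    return tuple(np.arccos(np.clip(np.dot(n, e) / norm, -1.0, 1.0))
                 for e in np.eye(3))  # rows of the identity are the unit axis vectors Ex, Ey, Ez

# Hypothetical example: three feature points lying in the plane z = 0.5 m.
n = plane_normal([0.0, 0.0, 0.5], [0.01, 0.0, 0.5], [0.0, 0.01, 0.5])
print(axis_angles(n))  # theta_z is 0 or pi depending on the winding order of the points
```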
To perform the above calculations, the three-dimensional coordinates of at least three feature points FP are required. If there are four or more feature points FP, at least three of them can be selected and used for the calculations.
<1. Display device according to first embodiment of the present technology>
FIG. 3 is a diagram showing a display device 1 according to a first embodiment of the present technology. As an example, the display device 1 is a retinal direct imaging type display device that directly draws an image on the retina of a user who is an observer by light. The display device 1 includes a light projection system 10 and an eyepiece optical system 20 that guides light projected from the light projection system 10 to an eyeball EB. The light projection system 10 further includes an acquisition unit 300 including at least one camera 300a, and a control unit 400 (see FIG. 10, for example). Note that the acquisition unit 300 does not have to be included in the light projection system 10. Specifically, the acquisition unit 300 may be provided separately from (spatially separated from) the light projection system 10. Even in this case, it is desirable that the positional relationship between the light projection system 10 and the acquisition unit 300 remains unchanged.
In the display device 1, as an example, the optical projection system 10 and the eyepiece optical system 20 are provided separately (spatially separated). The optical projection system 10 may be placed, for example, on a desk, a table, etc., or hung on a wall, or may be attached to a portable item such as a user's wristwatch or smartphone, or may be attached to a moving object ridden by the user, such as a car, motorcycle, or bicycle. The eyepiece optical system 20 is provided at or near the observer's eyeball EB.
Image data is input to the control unit 400. The control unit 400 generates modulation signals for each color of RGB according to the input image data, and outputs the modulation signals to a light source driving unit 100d (described later) of the light projection system 10 (see, for example, FIG. 12).
In FIG. 3, the optical projection system 10 has two cameras 300a, but it may have only one camera 300a, or it may have three or more cameras 300a. (The same applies to other figures in which multiple cameras 300a are shown.)
(Light projection system)
The optical projection system 10 includes, for example, an image light generation unit 100 and a projection optical system 200 that projects the image light IL generated by the image light generation unit 100. The image light generation unit 100, the projection optical system 200, and at least one camera 300a are, for example, integrally provided in a housing H.
The image light generating unit 100 includes, as an example, a light source 100a, a scanning mirror 100b, and a light source driving unit 100d (see FIG. 12).
The light source 100a has a laser, such as an edge-emitting laser (LD) or a surface-emitting laser (VCSEL). Here, the light source 100a has, for example, a red laser, a green laser, and a blue laser. Note that, for example, a light-emitting diode (LED) may be used as the light source 100a. The light source 100a may have only a laser or LED of a single color. This allows the configuration of the light source 100a to be simplified.
The light source driving unit 100d generates a driving signal based on the modulation signal for each color of RGB input from the control unit 400, and applies the driving signal to the corresponding laser of the light source 100a to drive the laser. The light source 100a and the light source driving unit 100d are collectively referred to as the "light source system." As an example, the light source driving unit 100d receives luminance information of the projected image from the control unit 400.
The scanning mirror 100b deflects and scans the light emitted from the light source 100a. For example, a MEMS mirror that can be driven around two axes is used as the scanning mirror 100b. This allows for a reduction in the number of parts and a smaller size. On the other hand, if a mirror that can be driven around two axes is used as the scanning mirror 100b, distortion may occur in the projected image due to the driving of the mirror. To avoid this, for example, two MEMS mirrors that can each be driven around one axis may be used in combination.
The projection optical system 200 has, as an example, a focus lens 200c as a projection lens and a movable mirror 200a as a movable deflection element. The projection optical system 200 is controlled by the control unit 400. A projection window 200b is provided behind the movable mirror 200a.
The focus lens 200c is disposed on the optical path of the image light IL via the scanning mirror 100b. The focus lens 200c is provided so as to be movable in the optical axis direction, and is driven in the optical axis direction by a focus lens drive unit. The focus lens drive unit is controlled by the control unit 400. If the observer moves in the z-axis direction (a direction parallel to the projection optical axis), the image will no longer be in focus. Therefore, it is preferable to adjust the position of the focus lens 200c in the optical axis direction (focus adjustment) so that the focus of the image light IL coincides with the observer's retina according to the position of the optical element OE in the z-axis direction.
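Purely as an illustration of such a focus adjustment (the thin-lens model and the function below are assumptions, not the disclosed control law), the image-side distance that keeps the projection focused can be recomputed from the viewer distance z:

```python
# Assumed thin-lens focus sketch: 1/f = 1/v + 1/z, solved for the image-side distance v.
def image_side_distance(focal_length_m: float, z_m: float) -> float:
    if z_m <= focal_length_m:
        raise ValueError("viewer distance must exceed the focal length in this simple model")
    return 1.0 / (1.0 / focal_length_m - 1.0 / z_m)

# Example: f = 50 mm, viewer at 1.0 m.
print(image_side_distance(0.05, 1.0))  # ~0.0526 m
```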
The movable mirror 200a reflects the image light IL that has passed through the focus lens 200c and guides it to the projection window 200b. The image light IL reflected by the movable mirror 200a and passing through the projection window 200b is projected to the eyepiece optical system 20. The movable mirror 200a is arranged to be swingable around two axes, for example, and is driven by a movable mirror drive unit. As the movable mirror 200a, for example, a combination of two galvanometer mirrors each swingable around one axis, a gimbal mirror swingable around two axes, or the like is used. The movable mirror drive unit is controlled by the control unit 400. By swinging the movable mirror 200a around two axes, the projection direction of the image light IL can be changed two-dimensionally. The light projection system 10 and the eyepiece optical system 20 are spatially separated, and their relative positional relationship changes dynamically in real time depending on the orientation of the observer's eyeball, head, and body. If the projection optical axis of the light projection system 10 (more specifically, the optical axis of the projection optical system 200, hereinafter also simply referred to as the "projection optical axis") does not pass through the optical element OE, the observer will not be able to see the image. Therefore, it is preferable to control the angle (posture) of the movable mirror 200a so that the projection optical axis passes through the optical element OE, based on the two-dimensional coordinates (x, y) of the optical element OE, i.e., its two-dimensional position information in a plane perpendicular to the projection optical axis. This allows the observer to continue viewing the image at all times. Note that instead of a movable mirror, a mechanism-less deflection element can also be used as the movable deflection element.
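A hedged sketch of such mirror steering follows; the geometry (mirror at the camera origin, beam deflection equal to twice the mirror rotation) is an assumption for illustration only, not the disclosed control law.

```python
import math

def mirror_angles(oc_x: float, oc_y: float, oc_z: float) -> tuple:
    """Target mirror rotations so the projection optical axis points at the optical center OC."""
    yaw = 0.5 * math.atan2(oc_x, oc_z)    # rotation about the vertical axis (the beam turns twice the mirror angle)
    pitch = 0.5 * math.atan2(oc_y, oc_z)  # rotation about the horizontal axis
    return yaw, pitch
```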
(Eyepiece optical system)
The eyepiece optical system 20 includes an optical element OE and three or more feature points FP. The eyepiece optical system 20 is located within the imaging field of view (within the angle of view) of the camera 300a when the display device 1 is in use (when the image light IL is projected from the optical projection system 10 toward the eyeball EB).
The optical element OE is positioned so that its focal point is located near the observer's pupil. The image light IL projected from the light projection system 10 is focused by the optical element OE and passes through the observer's pupil, thereby being drawn directly on the retina. This allows the observer to recognize the image.
The optical element OE is provided in a contact lens CL that is attached to the observer's eyeball EB. The contact lens CL may or may not be a component of the eyepiece optical system 20. The contact lens CL has the advantage of self-alignment, in that the contact lens CL moves so that its center coincides with the center of the pupil of the eyeball EB. In other words, the orientation of the contact lens CL coincides with the direction of the line of sight, which is the direction of the eyeball EB. For example, when the optical element OE is provided in glasses, a positional deviation may occur between the glasses and the pupil. In that case, it is necessary either to detect the positions of the glasses and the pupil and perform control that takes the positional deviation into account, or to keep the adjustment of the glasses and their change over time within a level at which the amount of positional deviation is not a problem. Providing the optical element OE in the contact lens CL eliminates the need to consider this positional deviation. Note that the power of the contact lens CL may be 0.
The optical element OE is, as an example, a hologram optical element (HOE), which diffracts and focuses the image light IL projected from the light projection system 10 and guides it to the retina of the eyeball EB. If the angle (posture) of the HOE with respect to the projection optical axis changes, the color tone of the image seen by the observer changes. This is because the diffraction efficiency of the HOE depends on the angle of incidence.
That is, the HOE has the characteristic that its diffraction efficiency differs depending on the angle of incidence of light. Specifically, the diffraction efficiency of the HOE has an incidence-angle dependence such that light other than the image light IL projected from the light projection system 10 is transmitted without being diffracted (see FIG. 9). This allows the observer to see the background image and the projected image superimposed, as in AR. On the other hand, it is physically impossible for the diffraction efficiency of the HOE to be 1 (100%) over only a certain incidence-angle range (see the graph on the left side of FIG. 9); the efficiency always has a peak incidence angle and falls off over some angular width away from it (see the right side of FIG. 9). Furthermore, this characteristic is independent for R, G and B, and the peak angle and half width do not necessarily coincide. To optimize the color of the image to be shown to the observer through an HOE with such characteristics, it is necessary to recognize the angle (θx, θy) of the HOE with respect to the projection optical axis and to generate the projected image taking into account the diffraction efficiency at that angle.
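The correction described above can be illustrated with the hedged sketch below; the Gaussian efficiency curves and their peak angles and widths are invented placeholders standing in for measured HOE data, since the disclosure only states that the efficiency depends on the incidence angle independently for R, G and B.

```python
import math

# Hypothetical per-channel diffraction-efficiency model: (peak angle [deg], width [deg]).
HOE_EFFICIENCY = {"R": (0.0, 6.0), "G": (1.0, 5.0), "B": (-1.0, 4.0)}

def efficiency(channel: str, theta_deg: float) -> float:
    peak, width = HOE_EFFICIENCY[channel]
    return math.exp(-((theta_deg - peak) / width) ** 2)

def corrected_drive(rgb, theta_deg: float) -> tuple:
    """Scale the source drive values so the image seen through the HOE keeps its intended color."""
    return tuple(min(1.0, value / max(efficiency(ch, theta_deg), 1e-3))
                 for ch, value in zip("RGB", rgb))

print(corrected_drive((0.5, 0.5, 0.5), 3.0))  # blue is boosted most at this hypothetical angle
```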
(Feature points)
Each feature point FP desirably has light reflection characteristics different from those of its surrounding region so that it can be distinguished from that region in the image captured by the camera 300a. For example, each feature point FP may be made of a reflective material with a higher reflectance than the surrounding region, or may be made of a light-transmitting or light-absorbing material with a lower reflectance than the surrounding region. For example, each feature point FP may be configured to include unevenness or the like obtained by processing a part of a member (e.g., a contact lens, an eyeglass lens, an eyeglass frame, etc.).
For example, a visible light camera may be used as the camera 300a, and a marker visible at visible wavelengths may be adopted as each feature point FP. By using a visible light camera, the markers can be recognized over a wide dynamic range. Also, for example, an infrared camera may be used as the camera 300a, and a marker visible at infrared wavelengths (e.g., a dichroic film) may be adopted as each feature point FP.
The relative position of each feature point FP with respect to the optical element OE must be fixed. Therefore, in the display device 1, at least three feature points FP are provided on the contact lens CL, which is a rigid body on which the optical element OE is provided. In other words, each feature point FP and the optical element OE are integrated via the contact lens CL.
If the XYZ coordinates of the three feature points FP are P[0] to P[2] (see FIG. 2), the coordinate of the optical center OC of the optical element OE is Hc, and the position vectors of the three feature points FP and the optical center OC are P[0], P[1], P[2], Hc, then the following equation (8) is established using a linear combination of vectors.
Hc = a P[0] + b P[1] + c P[2] ... (8)
Here, if the coordinates Po[0] to Po[2] of each feature point FP in the initial state (at the time of manufacture) and the coordinate Hco of the optical center OC of the optical element OE are known, the coefficients a to c can be determined. Furthermore, if each feature point FP can be arranged so that a+b+c=1 holds, then equation (8) always holds no matter what position or attitude this rigid body (e.g., the contact lens CL) takes. In other words, the optical center Hc of the optical element OE can be calculated from the three-dimensional coordinates P[0] to P[2] of the three feature points FP recognized by the camera 300a. The condition a+b+c=1 given here means that the optical center Hc of the optical element lies within the plane specified by the three feature points FP whose three-dimensional coordinates are P[0] to P[2].
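A minimal sketch of equation (8) follows (an editorial illustration, not the disclosed implementation); it solves for a, b, c once from the manufacture-time coordinates and then reuses them at run time:

```python
import numpy as np

def calibrate_coefficients(po, hco):
    """Solve Hco = a*Po[0] + b*Po[1] + c*Po[2] for (a, b, c).

    po: 3x3 array with one initial feature-point coordinate per row; hco: initial optical center.
    Assumes the three position vectors are linearly independent.
    """
    return np.linalg.solve(np.asarray(po, dtype=float).T, np.asarray(hco, dtype=float))

def optical_center(p, coeffs):
    """Hc = a*P[0] + b*P[1] + c*P[2]   ... equation (8), with the currently observed P[0..2]."""
    p = np.asarray(p, dtype=float)
    a, b, c = coeffs
    return a * p[0] + b * p[1] + c * p[2]
```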
Incidentally, in order for the camera 300a to recognize positional information (x, y, z, θx, θy) about the five axes of the optical element OE, it is desirable that at least three feature points FP are always located at positions where they can be imaged by the camera 300a (excluding, however, times when the observer has their eyes closed or is blinking, since recognition is not required at those moments). In other words, it is desirable that the acquisition unit 300 including the camera 300a be able to acquire at least the two-dimensional coordinates (x, y) of each of at least three feature points FP even if the orientation of the eyeball EB changes. Ideally, it is desirable that the acquisition unit 300 including the camera 300a be able to acquire at least the two-dimensional coordinates (x, y) of each of at least three feature points FP regardless of the orientation of the eyeball EB.
Therefore, it is conceivable to devise the arrangement, size, shape, color, etc. of the three feature points FP. In addition, for example, four or more feature points FP may be arranged so that even if some of the feature points FP are hidden by the eyelids or the like, at least three feature points FP can still be captured by the camera 300a. Furthermore, the three feature points FP do not necessarily have to be physically separated from one another (separate bodies), and may be provided, for example, integrally with some figure (see FIGS. 29A and 29B). In configuration example 6 of the eyepiece optical system 20 shown in FIG. 29A, three feature points FP are provided integrally with a circle and spaced apart from one another in the circumferential direction. In configuration example 7 of the eyepiece optical system 20 shown in FIG. 29B, two of the three feature points FP are provided integrally with an arc and spaced apart from each other in the circumferential direction, and one feature point FP is provided separately from those two feature points FP and the arc.
In order to determine the orientation of the optical element OE using the position information of each of the three feature points FP, a single plane needs to be specified by at least three feature points FP, so at least three of the three or more feature points FP must not be on the same straight line. In addition, it is preferable that the at least three feature points FP and the optical center OC of the optical element OE be on the same plane.
(Example 1 of the eyepiece optical system configuration)
FIGS. 4A to 4C are diagrams for explaining a first configuration example of the eyepiece optical system 20 of the display device 1. In configuration example 1 of the eyepiece optical system 20, as shown in FIG. 4A, three feature points FP of the same shape (e.g., circles) are provided on the contact lens CL so as to be located at the three vertices of a triangle and around the pupil (the darkest part in FIGS. 4A to 4C). This allows the three feature points FP to be captured by the camera 300a even if the eyeball EB moves left and right to some extent, as shown in FIGS. 4B and 4C.
(Example 2 of the eyepiece optical system configuration)
FIGS. 5A to 5C are diagrams for explaining a second configuration example of the eyepiece optical system 20 of the display device 1. In configuration example 2 of the eyepiece optical system 20, as shown in FIG. 5A, three feature points FP of the same shape (e.g., circles) are located at the three vertices of a triangle, and are provided on the contact lens CL so as to be located around the pupil (the darkest part in FIGS. 5A to 5C) and further inward than in configuration example 1 of FIG. 4A. This allows the three feature points FP to be captured by the camera 300a even if the eyeball EB moves relatively widely left and right, as shown in FIGS. 5B and 5C.
(Example 3 of the eyepiece optical system configuration)
FIGS. 6A to 6C are diagrams for explaining a third configuration example of the eyepiece optical system 20 of the display device 1. In configuration example 3 of the eyepiece optical system 20, three relatively large feature points FP of different shapes (e.g., circular, triangular, and rectangular) are provided on the contact lens CL so as to be located at the three vertices of a triangle and around the pupil (the darkest part in FIGS. 6A to 6C). This allows the three feature points FP to be captured by the camera 300a even if the eyeball EB moves relatively widely left and right, as shown in FIGS. 6B and 6C. In this configuration example 3, at least two of the feature points FP may be given different colors instead of, or in addition to, different shapes.
(Example 4 of the eyepiece optical system configuration)
FIGS. 7A to 7E are diagrams for explaining a fourth configuration example of the eyepiece optical system 20 of the display device 1. In configuration example 4 of the eyepiece optical system 20, as shown in FIG. 7A, six feature points FP of the same shape (e.g., circles) are provided on the contact lens CL so as to be located at the six vertices of a hexagon and around the pupil (the darkest part in FIGS. 7A to 7E). This allows at least three of the six feature points FP to be captured by the camera 300a even if the eyeball EB moves relatively widely up, down, left, and right, as shown in FIGS. 7B to 7E.
(Configuration Example 5 of Eyepiece Optical System)
FIGS. 8A to 8E are diagrams for explaining a fifth configuration example of the eyepiece optical system 20 of the display device 1. In configuration example 5 of the eyepiece optical system 20, as shown in FIG. 8A, three feature points FP of the same shape (e.g., circles) that transmit visible light and reflect infrared light are provided on the contact lens CL so as to be located at the three vertices of a triangle and to overlap the pupil (the darkest part in FIGS. 8A to 8E). As a result, even if the eyeball EB moves significantly up, down, left, and right as shown in FIGS. 8B to 8E, the three feature points FP can be captured by the camera 300a (e.g., an infrared camera).
(Basic configuration example of control of display device)
Fig. 10 is a block diagram showing a basic configuration example of the control of the display device 1. The acquisition unit 300 of the display device 1 acquires at least two-dimensional position information (e.g., three-dimensional position information) of at least three feature points FP among three or more feature points FP whose relative positions with respect to the optical element OE are fixed. As shown in Fig. 10, the acquisition unit 300 has, as an example, a camera 300a and an xyz coordinate acquisition unit 300b. The control unit 400 has a calculation unit 400a and a display control unit 400b. The acquisition unit 300 and the control unit 400 are realized by hardware such as a CPU, a chipset, etc.
The xyz coordinate acquisition unit 300b acquires the xyz coordinates (three-dimensional position information) of each of at least three feature points FP from the image captured by the camera 300a.
The calculation unit 400a calculates the angle of the movable mirror 200a, the position of the focus lens 200c, and the luminance of the projected image based on the xyz coordinates of each of the at least three feature points FP.
The display control unit 400b controls the movable mirror 200a based on the calculation result of the angle of the movable mirror 200a, controls the focus lens 200c based on the calculation result of the position of the focus lens 200c, and controls the projected image based on the calculation result of the luminance of the projected image (specifically, it outputs a luminance correction value based on the luminance calculation result to the light source driving unit 100d).
(Overall control flow in basic configuration example of control of display device)
Fig. 11 is a flowchart showing the flow of overall control in the basic configuration example of the control of the display device 1 shown in Fig. 10. Fig. 11 is based on a processing algorithm executed by the CPU. The series of processes in Fig. 11 are started when the camera 300a captures the coordinates (when the acquisition unit 300 acquires the coordinates).
In the first step S1, the acquisition unit 300 acquires the three-dimensional coordinates (x, y, z) of each of at least three feature points FP.
In the next step S2, the calculation unit 400a of the control unit 400 calculates the normal vector N based on the three-dimensional coordinates (x, y, z) of each of the at least three feature points FP.
In the next step S3, the calculation unit 400a of the control unit 400 calculates θx, θy, and θz using the normal vector N.
In the next step S4, the calculation unit 400a of the control unit 400 calculates the distance D from the camera 300a to the optical element OE (more specifically, the plane identified by the three feature points FP) based on the three-dimensional coordinates (x, y, z) of each of the at least three feature points FP.
In the next step S5, the display control unit 400b of the control unit 400 controls the display of the image based on x, y, θx, θy, θz, and D (specifically, controls the movable mirror 200a, the focus lens 200c, and the brightness and shape correction of the projected image).
In the next step S6, the control unit 400 determines whether or not to end the process. For example, when the acquisition unit 300 can acquire the three-dimensional coordinates of each feature point FP (when each feature point FP is within the angle of view of the camera 300a), the control unit 400 denies this determination (determines not to end the process) and performs the series of processes of steps S1 to S5 again. For example, when the user issues an end command or the acquisition unit 300 can no longer acquire the three-dimensional coordinates of each feature point FP (when each feature point FP falls outside the angle of view of the camera 300a), the control unit 400 affirms this determination (determines to end the process) and ends the flow.
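As an illustration only (not taken from the publication), steps S2 to S4 can be sketched as a small plane-fitting computation; the angle conventions for θx, θy, θz and the placement of the origin at the camera are assumptions made for this sketch.

```python
# Minimal sketch, assuming three non-collinear feature points given as (x, y, z)
# coordinates in a camera-centered frame with the optical axis along +z.
import numpy as np

def plane_pose_from_points(p1, p2, p3):
    """Return the unit normal N (step S2), tilt angles (step S3) and
    camera-to-plane distance D (step S4) for the plane through p1, p2, p3."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)                 # normal vector N of the feature-point plane
    n = n / np.linalg.norm(n)
    if n[2] < 0:                                   # orient N toward the camera (assumed convention)
        n = -n
    theta_x = np.arctan2(n[1], n[2])               # tilt about the x axis (assumed convention)
    theta_y = np.arctan2(n[0], n[2])               # tilt about the y axis (assumed convention)
    theta_z = np.arccos(np.clip(n[2], -1.0, 1.0))  # angle between N and the optical axis
    d = abs(np.dot(n, p1))                         # distance from the camera origin to the plane
    return n, (theta_x, theta_y, theta_z), d
```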
(Stereo camera)
A stereo camera method can be introduced, in which multiple cameras 300a are used to obtain the z coordinate of each feature point FP (the distance from the camera 300a to each feature point FP). In the stereo camera method, two cameras 300a are arranged so that their optical axes are parallel to each other, and each parameter is defined as follows (see Fig. 24).
h: distance between the centers of the left and right cameras [mm]
f: camera lens focal length [mm]
size_x: horizontal camera sensor size [mm]
size_y: vertical camera sensor size [mm]
width: number of horizontal pixels of the camera sensor
height: number of vertical pixels of the camera sensor
P: coordinates (x, y, z) of the measurement object [mm]
pr: image point of P on the right camera sensor (hr, vr) [pixel]
pl: image point of P on the left camera sensor (hl, vl) [pixel]
To convert the coordinates of the image points from [pixel] into [mm], xr, yr, xl, and yl are obtained as follows (each image point being measured from the center of its sensor):
xr = hr × size_x / width
yr = vr × size_y / height
xl = hl × size_x / width
yl = vl × size_y / height
From the above, the coordinates of the measurement object P(x, y, z) can be calculated as follows:
x = (xl + xr) * h / (2 * (xl - xr))
y = yr * h / (xl - xr) = yl * h / (xl - xr)
z = f * h / (xl - xr)
In the above formulas and in Figure 24, the midpoint between the left and right cameras is defined as the origin O. Note that in the stereo camera method, the optical axes of the two cameras do not necessarily need to be parallel, and formulas for non-parallel arrangements are also known, but they will not be discussed here.
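As a concrete illustration of the parallel-axis formulas above (a sketch only; the assumption that image points are measured in pixels from the center of each sensor, and all function and variable names, are choices made here, not taken from the publication):

```python
# Parallel-axis stereo triangulation of a feature point P from its image points
# (hr, vr) on the right sensor and (hl, vl) on the left sensor.
def triangulate(hr_px, vr_px, hl_px, vl_px,
                h_mm, f_mm, size_x_mm, size_y_mm, width_px, height_px):
    # convert image points from pixels to mm on the sensor
    xr = hr_px * size_x_mm / width_px
    yr = vr_px * size_y_mm / height_px
    xl = hl_px * size_x_mm / width_px
    disparity = xl - xr                      # [mm]
    if disparity == 0:
        raise ValueError("zero disparity: point at infinity or matching error")
    x = (xl + xr) * h_mm / (2 * disparity)
    y = yr * h_mm / disparity
    z = f_mm * h_mm / disparity
    return x, y, z                           # coordinates of P [mm], origin O midway between the cameras
```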
(Configuration Example 1 of Display Device Control)
Fig. 12 is a block diagram showing configuration example 1 of the control of the display device 1.
Configuration example 1 of the control of the display device 1 is a more specific configuration of the control unit 400 compared to the basic configuration example shown in FIG. 10. In this configuration example 1, a stereo camera SC is configured by two cameras 300a (e.g., visible light cameras).
In this configuration example 1, the calculation unit 400a of the control unit 400 has an optical center xyz coordinate calculation unit 400a1 and an optical element tilt angle calculation unit 400a2, and the display control unit 400b of the control unit 400 has a movable mirror control unit 400b1, a focus lens control unit 400b2, and a projection image control unit 400b3.
(Overall control flow in configuration example 1 of the control of the display device)
Fig. 13 is a flowchart showing the flow of the overall control in the configuration example 1 in Fig. 12. Fig. 13 is based on a processing algorithm executed by a CPU. The series of processes in Fig. 13 are started when the camera 300a captures the coordinates (when the acquisition unit 300 acquires the coordinates).
First, in step S21, the acquisition unit 300 performs an xyz coordinate acquisition process. Specifically, the xyz coordinate acquisition unit 300b acquires the xy coordinates and z coordinates of at least three feature points FP from the images captured by the two cameras 300a, 300a that make up the stereo camera SC.
Then, the control unit 400 performs the movable mirror control process in step S22-1, the focus lens control process in step S22-2, and the projection image control process in step S22-3 in parallel.
Next, in step S23, the control unit 400 judges whether or not to end the process. If the judgment here is positive, the flow ends, and if it is negative, the flow returns to step S21 and the same series of processes are performed again. If the acquisition unit 300 can acquire the three-dimensional coordinates of each feature point FP (if each feature point FP is within the angle of view of the camera 300a), the control unit 400 denies the judgment here (determines not to end the process) and performs the processes of steps S21, S22-1, S22-2, and S22-3 again. For example, when the user issues an end command or when the acquisition unit 300 can no longer acquire the three-dimensional coordinates of each feature point FP (if each feature point FP falls outside the angle of view of the camera 300a), the control unit 400 makes a positive judgment here (determines to end the process) and ends the flow.
(Movable mirror control process)
Fig. 14 is a flowchart showing the movable mirror control process (step S22-1) in Fig. 13.
In the first step S22-1-1, the optical center xyz coordinate calculation unit 400a1 calculates the xyz coordinates of the optical center OC of the optical element OE.
In the next step S22-1-2, the movable mirror control unit 400b1 determines the angle of the movable mirror 200a based on the x and y coordinates of the optical center OC. Note that the z coordinate of the optical center OC may also be used in this determination.
In the final step S22-1-3, the movable mirror control unit 400b1 controls the movable mirror 200a. Specifically, the movable mirror control unit 400b1 sends a mirror drive signal corresponding to the angle of the movable mirror 200a determined in step S22-1-2 to the movable mirror drive unit.
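A sketch of how steps S22-1-1 and S22-1-2 could be realized is shown below; approximating the optical center OC by the centroid of the feature points, the half-angle relation for the mirror, and the coordinate frame are all assumptions made for this illustration.

```python
# Minimal sketch: steer the movable mirror so the projection axis points at the
# optical center OC, here approximated as the centroid of the feature points.
import math

def mirror_angles_toward(feature_points_xyz):
    """feature_points_xyz: iterable of (x, y, z) tuples in the light projection frame."""
    pts = list(feature_points_xyz)
    n = len(pts)
    ocx = sum(p[0] for p in pts) / n       # step S22-1-1: optical center (assumed centroid)
    ocy = sum(p[1] for p in pts) / n
    ocz = sum(p[2] for p in pts) / n
    # step S22-1-2: mirror angles; a mechanical rotation deflects the beam by twice
    # that angle, hence the division by 2 (assumed single-mirror geometry)
    pan  = math.atan2(ocx, ocz) / 2
    tilt = math.atan2(ocy, ocz) / 2
    return pan, tilt
```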
(Focus lens control process)
Fig. 15 is a flowchart showing the focus lens control process (step S22-2) in Fig. 13.
In the first step S22-2-1, the optical center xyz coordinate calculation unit 400a1 calculates the z coordinate of the optical center OC of the optical element OE.
In the next step S22-2-2, the focus lens control unit 400b2 determines the position of the focus lens 200c in the optical axis direction based on the z coordinate of the optical center OC.
In the final step S22-2-3, the focus lens control unit 400b2 controls the focus lens 200c. Specifically, the focus lens control unit 400b2 sends a lens drive signal based on the position of the focus lens 200c in the optical axis direction determined in step S22-2-2 to the focus lens drive unit.
(Projection image control processing)
Fig. 16 is a flowchart showing the projection image control process (step S22-3) in Fig. 13.
In the first step S22-3-1, the optical element tilt angle calculation unit 400a2 calculates the tilt angle (θx, θy) of the optical element OE.
In the next step S22-3-2, the projection image control unit 400b3 determines the brightness of the projection image based on the tilt angle (θx, θy) of the optical element OE.
In the final step S22-3-3, the projection image control unit 400b3 controls the projection image. Specifically, the projection image control unit 400b3 sends a light source drive signal modulated according to the brightness of the projection image determined in step S22-3-2 to the light source drive unit 100d. Note that the light source drive signal may be a signal modulated according to the brightness and shape correction of the projection image.
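One possible way to map the tilt angle onto a brightness value in step S22-3-2 is sketched below; the cosine compensation model and the clamping limit are assumptions for illustration only, not a model stated in the publication.

```python
# Sketch: scale brightness up as the optical element tilts away from the projection
# axis, assuming the intercepted light falls roughly with cos(theta_x) * cos(theta_y).
import math

def brightness_gain(theta_x_rad, theta_y_rad, max_gain=4.0):
    c = math.cos(theta_x_rad) * math.cos(theta_y_rad)
    gain = 1.0 / max(c, 1.0 / max_gain)   # avoid divide-by-zero and runaway gain
    return min(gain, max_gain)
```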
(Configuration Example 2 of Display Device Control)
Fig. 17 is a block diagram showing configuration example 2 of the control of the display device 1.
Configuration example 2 of the control of the display device 1 differs from configuration example 1 shown in FIG. 12 in that the acquisition unit 300 has an xy coordinate acquisition unit 300c and a z coordinate acquisition unit 300d instead of the xyz coordinate acquisition unit 300b, and the calculation unit 400a of the control unit 400 has an optical center xy coordinate calculation unit 400a11 and an optical center z coordinate calculation unit 400a12 instead of the xyz coordinate calculation unit 400a1.
(Overall control flow in configuration example 2 of the display device control)
Fig. 18 is a flowchart showing the flow of overall control in the configuration example 2 of Fig. 17. Fig. 18 is based on a processing algorithm executed by a CPU. The series of processes in Fig. 18 are started when the camera 300a captures the coordinates (when the acquisition unit 300 acquires the coordinates).
First, in steps S31-1 and S31-2, the xy coordinate acquisition process by the xy coordinate acquisition unit 300c and the z coordinate acquisition process by the z coordinate acquisition unit 300d are performed in parallel. Specifically, the xy coordinate acquisition unit 300c acquires the xy coordinates of at least three feature points FP from the images captured by the two cameras 300a, 300a that make up the stereo camera SC. The z coordinate acquisition unit 300d acquires the z coordinates of at least three feature points FP from the images captured by the two cameras 300a, 300a that make up the stereo camera SC.
Next, the control unit 400 performs the movable mirror control process of step S32-1, the focus lens control process of step S32-2, and the projection image control process of step S32-3 in parallel. Step S32-1 is the same as step S22-1 in FIG. 13. Step S32-2 is the same as step S22-2 in FIG. 13. Step S32-3 is the same as step S22-3 in FIG. 13.
Next, in step S33, the control unit 400 judges whether or not to end the processing each time step S32-1, step S32-2, and step S32-3 are executed. If the judgment here is positive, the flow ends, and if it is negative, the flow returns to steps S31-1 and S31-2, and the same processing is performed again. When the acquisition unit 300 is able to acquire the three-dimensional coordinates of each feature point FP (when each feature point FP is within the angle of view of camera 300a), the control unit 400 denies the judgment here (determines not to end the processing) and performs the processing of steps S31-1, S31-2, S32-1, S32-2, and S32-3 again. For example, when the user issues an end command or when the acquisition unit 300 is no longer able to acquire the three-dimensional coordinates of each feature point FP (when each feature point FP falls outside the angle of view of the camera 300a), the control unit 400 affirms the determination here (determines that processing should be ended) and ends the flow.
In the configuration example 2 of the control of the display device 1 described above, the control unit 400 controls the movable mirror 200a based on the two-dimensional position information (x, y) of each of the at least three feature points FP, controls the light source 100a based on the two-dimensional position information (x, y) and the distance information of the feature points FP (distance in the z-axis direction), and controls the focus lens 200c based on the distance information, in parallel.
Here, in general, obtaining the x and y coordinates of each feature point FP is relatively easy, whereas obtaining the z coordinate imposes constraints on the devices and their arrangement and adds computational load. Furthermore, among the five parameters (x, y, z, θx, θy) of each feature point FP, the x and y coordinates directly affect whether the observer can see the image at all, and therefore have a higher priority than the remaining three parameters (z, θx, θy). On the other hand, the control of the projected image is limited by the frame rate of the image display and cannot be performed as fast as the control of the movable mirror 200a. Therefore, in configuration example 2, as described above, the information acquisition period for the xy coordinates of each feature point FP, the information acquisition period for the z coordinate, and the loop periods of the subsequent calculations are made independent. The process related to the control of the movable mirror 200a (first control loop) calculates the xy coordinates at the highest speed and makes the projection optical axis follow the optical element OE (target). The processes requiring the z coordinate (second control loop) update their information at a slower period than the process related to the control of the movable mirror 200a. The xy coordinate information required to calculate θx and θy can be taken from the latest values at the required timing. By dividing the control into two loops in this way, the most important control, that of the movable mirror, can be performed at high speed without being affected by the rate-limiting factors of the other controls.
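The two-loop structure described above can be sketched as follows; the loop periods, the threading model, and all callback names (get_xy, set_mirror, get_z, set_focus, set_brightness) are hypothetical placeholders introduced for this illustration.

```python
# Sketch of the split control loops: a fast loop that follows the target with the
# movable mirror from (x, y) only, and a slower loop that updates focus and
# brightness from z, reusing the latest (x, y).
import threading
import time

latest_xy = None                 # most recent optical-center (x, y), shared between loops
lock = threading.Lock()

def fast_mirror_loop(get_xy, set_mirror, period_s=0.001):
    """First control loop: keep the projection optical axis on the optical element."""
    global latest_xy
    while True:
        xy = get_xy()
        with lock:
            latest_xy = xy
        set_mirror(xy)
        time.sleep(period_s)

def slow_focus_loop(get_z, set_focus, set_brightness, period_s=0.033):
    """Second control loop: update focus and brightness at a lower rate."""
    while True:
        z = get_z()
        with lock:
            xy = latest_xy
        if xy is not None:
            set_focus(z)
            set_brightness(xy, z)
        time.sleep(period_s)
```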
<2. Display device according to second embodiment of the present technology>
Fig. 19 is a diagram showing a configuration example of a display device 2 according to a second embodiment of the present technology. As shown in Fig. 19, the display device 2 has a configuration substantially similar to that of the display device 1 according to the first embodiment, except that the optical axis (projection optical axis) of the projection optical system 200 and the optical axis of the camera 300a are coaxial.
In the display device 2, a half mirror 200d is disposed on the optical path of the image light IL between the focus lens 200c and the movable mirror 200a. The camera 300a is disposed at a position outside the optical path of the image light IL between the focus lens 200c and the movable mirror 200a, facing the half mirror 200d. In this case, the camera 300a is always in a state of capturing the position through which the projection optical axis passes within a predetermined angle of view (for example, a central angle of view). Therefore, if the projection optical axis deviates from the optical center OC of the optical element OE, the predetermined angle of view of the image captured by the camera 300a will also deviate from the optical center OC. Therefore, by moving the movable mirror 200a so as to cancel this deviation, the projection image can be always directed toward the optical center OC. Note that "the projection optical axis and the optical axis of the camera 300a are coaxial" does not necessarily mean that they are strictly coaxial, but means that deviation is allowed within a range in which the same action and effect is achieved.
FIG. 20 is a flowchart showing the overall control flow of an example configuration of a display device 2 according to a second embodiment of the present technology. The flowchart in FIG. 20 is based on a processing algorithm executed by a CPU. The series of processes in FIG. 20 are started when the camera 300a captures coordinates (when the acquisition unit 300 acquires coordinates).
In the first step S11, the acquisition unit 300 acquires the two-dimensional coordinates (x, y) of each of at least three feature points FP.
In the next step S12, the calculation unit 400a of the control unit 400 calculates the two-dimensional coordinates (x, y) of the optical center OC of the optical element OE based on the two-dimensional coordinates (x, y) of each of the at least three feature points FP.
In the next step S13, the control unit 400 determines whether or not the projection optical axis passes through the optical center OC of the optical element OE. Specifically, the control unit 400 determines whether or not there is a misalignment between the projection optical axis and the optical center OC based on the position (two-dimensional coordinates) of the optical center OC in the image captured by the camera 300a. If the determination here is positive (if it is determined that there is no misalignment), the process returns to step S11, and if it is negative (if it is determined that there is a misalignment), the process proceeds to step S14.
In the next step S14, the control unit 400 controls the movable mirror 200a so that the projection optical axis passes through the optical center OC of the optical element OE.
In the final step S15, the control unit 400 determines whether or not to end the process. For example, when the acquisition unit 300 can acquire the three-dimensional coordinates of each feature point FP (when each feature point FP is within the angle of view of the camera 300a), the determination here is negative (it is determined not to end the process) and the process returns to step S11. On the other hand, for example, when the user issues an end command or the acquisition unit 300 can no longer acquire the three-dimensional coordinates of each feature point FP (when each feature point FP falls outside the angle of view of the camera 300a), the determination here is positive (it is determined to end the process) and the flow ends.
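The feedback in steps S13 and S14 can be sketched as a small image-space servo; the proportional gain, the dead band, and the mirror interface are assumptions introduced here and are not taken from the publication.

```python
# Sketch: with the camera coaxial with the projection axis, the optical center OC
# appears at a fixed reference pixel when the axis passes through it; any offset
# in the image is fed back to the movable mirror to cancel the deviation.
def correct_mirror(oc_px, reference_px, mirror, gain=0.5, dead_band_px=2):
    """oc_px: detected (u, v) of OC in the coaxial camera image.
    mirror: hypothetical object with pan/tilt attributes in control units."""
    du = oc_px[0] - reference_px[0]
    dv = oc_px[1] - reference_px[1]
    if abs(du) <= dead_band_px and abs(dv) <= dead_band_px:
        return False                  # step S13: projection axis already on the optical center
    mirror.pan  -= gain * du          # step S14: steer so the offset is cancelled
    mirror.tilt -= gain * dv
    return True
```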
In the display device 2 described above, the camera 300a is arranged to recognize the vicinity of the eyepiece optical system 20 through the movable mirror 200a, so that it is possible to achieve both wide-range and high-precision recognition. The imaging range of the camera 300a can cover the entire range within which the light projection system 10 can project. In addition, since the origins of the light projection coordinate system and the camera coordinate system coincide, there is also the advantage that the initial calibration can be simplified and positional deviation errors due to each perturbation can be suppressed. However, in this case, since the recognition is via the movable mirror 200a, the detected coordinates are relative coordinates rather than absolute coordinates. Therefore, when the observer is lost, it becomes unclear in which direction the movable mirror 200a should be pointed. This can be solved by installing a camera for absolute coordinate recognition. This camera does not require high performance in terms of resolution and shooting speed, so a relatively inexpensive one may be prepared, or an area that is not coaxial with the projection optical axis may be provided in part of the imaging field of the camera 300a and used by cutting out that area. Another solution is to provide two types of control methods for the movable mirror 200a: a tracking mode and a search mode. When the position of the observer is unknown, the search mode is used, and the observer is searched for by two-dimensionally scanning the movable range of the movable mirror 200a. When the position of the observer is identified, the mode is switched to the tracking mode and control is performed using the above-mentioned method. The at least one camera 300a may be any type, such as a visible light camera, an infrared camera, a stereo camera, or a TOF camera, and it is also possible to combine it with the display device 1 according to the first embodiment.
<3. Display device according to a third embodiment of the present technology>
Fig. 21 is a diagram showing a configuration example of a display device 3 according to a third embodiment of the present technology. The display device 3 has a similar configuration to the display device 1 according to the first embodiment, except that an optical element OE and at least three feature points FP are provided on a spectacle lens GL.
The eyeglass lens GL may or may not be a component of the eyepiece optical system 20. If the optical element OE is provided on the eyeglass lens GL, there is a possibility that the pupil of the eyeball EB and the optical center OC of the optical element OE will not coincide, which may result in disadvantages such as the need for greater precision and measures to address changes over time, or the need to additionally detect and track the position of the pupil, but it also has advantages such as a high degree of freedom in designing the feature points FP, improved detection accuracy of the feature points FP, easy manufacture, high safety, and low barriers to wearing. Here, a HOE can be mainly used for the optical element OE, but a diffractive optical element (DOE) or the like can also be used.
In the display device 3, as an example, as shown in FIG. 21, at least three feature points FP are provided in the area around the optical element OE of the eyeglass lens GL fitted into the rim (frame) of the eyeglass frame GF. Note that in the display device 3, the eyeglass frame GF may be a rimless frame.
<4. Display device according to fourth embodiment of the present technology>
Fig. 22 is a diagram showing a configuration example of a display device 4 according to a fourth embodiment of the present technology. The display device 4 has a similar configuration to the display device 3 according to the third embodiment, except that an optical element OE is provided on a spectacle lens GL and at least three feature points FP are provided on a spectacle frame GF.
In the display device 4, as an example, at least three feature points FP are provided on the rim of the eyeglass frame GF, as shown in FIG. 22.
<5. Display device according to a fifth embodiment of the present technology>
Fig. 23 is a diagram showing a configuration example of a display device 5 according to a fifth embodiment of the present technology. The display device 5 has a similar configuration to the display device 3 according to the third embodiment, except that the optical element OE and some of the at least three feature points FP are provided on the eyeglass lens GL, and the others of the at least three feature points FP are provided on the eyeglass frame GF.
In the display device 5, as an example, as shown in FIG. 23, an optical element OE and at least one feature point FP are provided on the eyeglass lens GL, and at least two feature points FP are provided on the eyeglass frame GF. Note that in the display device 5, the optical element OE and at least two feature points FP may be provided on the eyeglass lens GL, and at least one feature point FP may be provided on the eyeglass frame GF.
<6. Display device according to a sixth embodiment of the present technology>
Fig. 25 is a diagram showing a configuration example of a display device 6 according to a sixth embodiment of the present technology. As shown in Fig. 25, the display device 6 has a configuration similar to that of the display device 1 according to the first embodiment, except that the acquisition unit 300 has an infrared light source 300e (e.g., an LED, a laser, etc.) near the focus lens 200c that irradiates infrared light onto at least three feature points FP, and has an infrared camera (a camera sensitive to infrared wavelengths) as the camera 300a. Note that the infrared light source 300e may be a part of the light source 100a, or may be disposed on the optical path of the light from the light source 100a.
In the display device 6, a visible light cut filter may be provided on the camera 300a as a way to reduce the amount of processing. Here, markers made of a retroreflective material are used as each feature point FP. As shown in FIG. 26, markers made of a retroreflective material have the property that the incident light and reflected light are in the same direction, unlike specular reflection (regular reflection) on a plane mirror. By utilizing this property, in principle, even if the observer moves significantly, strong light reflected by the marker will always return to the periphery of the projection optical system 200 (more specifically, the infrared light source 300e), so the position of the marker can be detected with high accuracy. In addition, since the display device 6 has the infrared light source 300e as an illumination light source, the position of the marker can be detected even if the observer is in a dark environment.
(TOF camera)
FIG. 27 is a block diagram showing an example of the configuration of a TOF (Time of Flight) camera as the camera 300a.
As shown in FIG. 27, the TOF camera serving as camera 300a has a light emitting unit 300a1, a light receiving unit 300a2, and a distance calculation unit 300a3. The measurement method of the TOF camera may be a direct TOF method or an indirect TOF method.
The TOF camera can be used, for example, to obtain the z coordinate (distance) of each feature point FP, and can be used in conjunction with a visible light camera or an infrared camera to obtain the xy coordinates of the feature points FP. If the TOF camera has high xy resolution, it can obtain all xyz coordinates, which is expected to reduce the number of parts, making it smaller, lighter, and more cost-effective.
(Event Camera)
FIG. 28 is a block diagram showing an example of the configuration of a camera 300a (event camera) having an event sensor.
The event sensor of the event camera as camera 300a has a luminance change detection unit 300a4 that detects luminance changes in each pixel, and a luminance change data output unit 300a5 that outputs only data related to pixels whose luminance has changed, as shown in FIG. 28. The event sensor is also called an event-based vision sensor (EVS).
An event sensor is a sensor that detects changes in the brightness of each pixel asynchronously and outputs only the changed data, combining it with coordinate and time information, achieving high-speed, low-latency data output. Event cameras equipped with event sensors can capture changes in feature points faster than normal cameras while reducing the amount of information they handle, enabling a more natural tracking experience and at the same time reducing the processing load in subsequent stages, contributing to smaller, lighter models and lower power consumption.
<7. Effects of the display device according to the present technology>
The display device according to the present technology described above (for example, any of the display devices 1 to 6 according to the first to sixth embodiments) comprises an optical projection system 10 including a light source 100a, and an eyepiece optical system 20 that guides light projected from the optical projection system 10 to an eyeball EB, where the eyepiece optical system 20 includes an optical element OE and three or more feature points FP whose relative positions with respect to the optical element OE are fixed, and the optical projection system 10 includes an acquisition unit 300 that acquires at least two-dimensional position information of each of at least three of the three or more feature points FP.
The display device according to this technology can accurately detect at least two-dimensional information for each of the at least three feature points FP.
As a result, the display device according to the present technology can provide a display device capable of detecting the relative position between the light projection system 10 and the eyepiece optical system 20 with high accuracy. The display device according to the present technology can detect the relative position with high accuracy, and therefore can realize a display device with excellent image tracking performance with respect to the eyepiece optical system 20 worn by the observer. The display device according to the present technology has excellent image tracking performance even in a usage mode in which the separate light projection system and eyepiece optical system are arranged at a considerable spatial distance (for example, farther apart than when both the light projection system and the eyepiece optical system are worn by the observer) and the positional relationship between the two changes dynamically in real time.
The display method according to the present technology (for example, a display method performed using any of the display devices 1 to 6 according to the first to sixth embodiments) includes a step of acquiring at least two-dimensional position information of at least three of three or more feature points FP whose relative positions with respect to an optical element OE that guides light projected from an optical projection system 10 to an eyeball EB are fixed, and a step of controlling the optical projection system 10 based on the acquisition results from the acquisition step.
The display method according to the present technology can accurately detect at least two-dimensional information for each of the at least three feature points FP.
As a result, the display method according to the present technology can provide a display method capable of detecting the relative position between the light projection system 10 and the eyepiece optical system 20 with high accuracy. The display method according to the present technology can detect the relative position with high accuracy, and therefore can realize a display method with excellent image tracking for the eyepiece optical system 20 worn by the observer. The display method according to the present technology has excellent image tracking even in a usage mode in which the separate light projection system and eyepiece optical system are arranged at a considerable spatial distance (for example, farther apart than when both the light projection system and the eyepiece optical system are worn by the observer) and the positional relationship between the two changes dynamically in real time.
<8. Modifications of the present technology>
The display device and the display method using the display device according to each embodiment of the present technology described above can be modified as appropriate.
For example, in the display device according to each of the above embodiments, the acquisition unit 300 may acquire only two-dimensional position information for each of at least three feature points FP. Even in this case, it is possible to draw an image on the eyeball EB of the user who is the observer by controlling the movable mirror 200a based on the two-dimensional position information.
For example, in the display device according to each of the above embodiments, the acquisition unit 300 may acquire at least two-dimensional position information (e.g., two-dimensional position information only, or two-dimensional position information and distance information) for each of at least four feature points FP. In this case, the relative position between the optical projection system 10 and the eyepiece optical system 20 can be determined with greater accuracy.
For example, in the display devices according to the above embodiments, the movable deflection element (e.g., the movable mirror 200a) is controlled based on two-dimensional coordinates (x, y), but the movable deflection element may also be controlled based on three-dimensional coordinates (x, y, z).
For example, in the display device according to each of the above embodiments, the projection optical system 200 does not need to have a focus lens 200c and a focus lens driving unit.
For example, in the display device according to each of the above embodiments, the control unit 400 does not need to have a focus lens control unit and/or a projection image control unit.
For example, in the display device according to each of the above embodiments, only one of focus lens position control and projected image brightness control may be performed.
For example, in the display device according to each of the above embodiments, the control unit 400 may control only one of the light source 100a and the projection optical system 200.
For example, in the display devices according to the above embodiments, the image light generating unit is of the scanning type, but it may also be of the non-scanning type, including, for example, a liquid crystal display.
For example, in the display devices according to the above embodiments, the optical projection system 10 can be worn on the head of the user who is the observer.
At least a portion of the configuration of the eyeball information detection device in each of the above embodiments may be combined to the extent that they are not mutually inconsistent.
The present technology can also be configured as follows.
(1) a light projection system including a light source;
an eyepiece optical system that guides the light projected from the light projection system to an eyeball;
Equipped with
The eyepiece optical system includes:
An optical element;
Three or more feature points having fixed relative positions with respect to the optical element;
Including,
The display device is provided with an acquisition unit that acquires at least two-dimensional position information of at least three of the three or more feature points.
(2) The display device according to (1), wherein the at least three feature points are not on the same line.
(3) The display device according to (1) or (2), wherein the optical element is provided in a contact lens that is attached to the eyeball.
(4) The display device according to (3), wherein the at least three feature points are provided on the contact lens.
(5) The display device according to (4), wherein the acquisition unit is capable of acquiring at least the two-dimensional position information of each of the at least three feature points even when the orientation of the eyeball changes.
(6) The display device according to (1) or (2), wherein the optical element is provided on a lens attached to an eyeglass frame.
(7) The display device described in (6), wherein all of the at least three feature points are provided on the lens or the eyeglass frame, or some of the at least three feature points are provided on the lens and others are provided on the eyeglass frame.
(8) The display device according to any one of (1) to (7), wherein the at least three characteristic points and the optical center of the optical element are on the same plane.
(9) The display device according to any one of (1) to (8), wherein the acquisition unit has at least one camera.
(10) The display device according to (9), wherein the at least one camera includes a visible light camera and/or a Time-of-Flight camera.
(11) The display device according to (9) or (10), wherein the at least one camera includes a plurality of cameras constituting a stereo camera.
(12) The display device according to any one of (1) to (11), wherein the optical projection system includes the acquisition unit.
(13) A display device described in any one of (9) to (12), wherein the acquisition unit has an infrared light source, the camera is sensitive to infrared wavelengths, and the at least three feature points are made of a retroreflective material.
(14) The optical projection system includes:
a projection optical system disposed on an optical path of light from the light source;
The display device according to any one of (9) to (13), wherein an optical axis of the projection optical system and an optical axis of the camera are coaxial.
(15) The optical projection system includes:
a projection optical system disposed on an optical path of light from the light source;
a control unit that controls the light source and/or the projection optical system based on the result of the acquisition by the acquisition unit;
The display device according to any one of (9) to (14),
(16) The projection optical system includes a movable deflection element and a focus lens,
The acquisition unit is
Two-dimensional position information of each of the at least three feature points in a plane perpendicular to the optical axis direction of the camera;
Distance information between the camera and each of the at least three feature points in a direction of an optical axis of the camera;
Get
The control unit is
controlling the movable deflection element based on at least the two-dimensional position information;
controlling the light source based on the two-dimensional position information and the distance information, and/or controlling the focus lens based on the distance information;
The display device according to (15), wherein the above steps are performed in parallel.
(17) The display device according to any one of (1) to (16), wherein at least two of the at least three characteristic points are integrally provided.
(18) The display device according to any one of (1) to (17), wherein the optical projection system and the eyepiece optical system are separate entities.
(19) acquiring at least two-dimensional position information of at least three of the three or more feature points whose relative positions with respect to an optical element that guides light projected from a light projection system to an eyeball are fixed;
controlling a part of the optical projection system based on the results of the acquiring step;
A display method including the above steps.
(20) The optical projection system includes a light source, a movable deflection element, and a focus lens;
The step of acquiring at least two-dimensional position information includes:
acquiring two-dimensional position information of each of the at least three feature points in a plane perpendicular to an optical axis direction of the camera;
acquiring distance information between the camera and each of the at least three feature points in a direction of an optical axis of the camera;
Including,
The controlling step includes:
controlling the movable deflection element based on at least the two-dimensional position information;
controlling the light source based on the two-dimensional position information and the distance information, and/or controlling the focus lens based on the distance information;
The display method according to (19),
1, 2, 3, 4, 5, 6: Display device
10: Light projection system
20: Eyepiece optical system
100a: Light source
200: Projection optical system
200a: Movable mirror (movable deflection element)
200c: Focus lens
300: Acquisition unit
300a: Camera
300e: Infrared light source
400: Control unit
EB: Eyeball
FP: Feature point
OE: Optical element
OC: Optical center
CL: Contact lens
GL: Eyeglass lens (lens)
GF: Eyeglass frame
IL: Image light (light)
Claims (20)
a light projection system including a light source;
an eyepiece optical system that guides the light projected from the light projection system to an eyeball;
Equipped with
The eyepiece optical system includes:
An optical element;
Three or more feature points having fixed relative positions with respect to the optical element;
Including,
The display device is provided with an acquisition unit that acquires at least two-dimensional position information of at least three of the three or more feature points.
The acquisition unit has an infrared light source,
the camera is sensitive to infrared wavelengths;
The display device of claim 9, wherein the at least three feature points are made of a retroreflective material.
The light projection system includes:
a projection optical system disposed on an optical path of light from the light source;
10. The display device according to claim 9, wherein an optical axis of the projection optical system and an optical axis of the camera are coaxial.
The light projection system includes:
a projection optical system disposed on an optical path of light from the light source;
a control unit that controls the light source and/or the projection optical system based on the result of the acquisition by the acquisition unit;
The display device according to claim 9 ,
the projection optical system includes a movable deflection element and a focus lens;
The acquisition unit is
Two-dimensional position information of each of the at least three feature points in a plane perpendicular to the optical axis direction of the camera;
Distance information between the camera and each of the at least three feature points in a direction of an optical axis of the camera;
Get
The control unit is
controlling the movable deflection element based on at least the two-dimensional position information;
controlling the light source based on the two-dimensional position information and the distance information, and/or controlling the focus lens based on the distance information;
The display device according to claim 15 , wherein the above steps are performed in parallel.
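The claim above states that steering of the movable deflection element (driven by the two-dimensional position information) and adjustment of the light source and/or focus lens (driven by the distance information) are performed in parallel. The sketch below is illustrative only; the mirror, laser, and focus-lens driver objects are hypothetical placeholders and do not correspond to any disclosed device interface.

```python
# Illustrative sketch (not from the publication) of the parallel control flow:
# one branch steers the movable deflection element from the 2D feature-point
# positions, the other adjusts the light source and focus lens from the
# camera-axis distance information.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def control_frame(points_2d, distances, mirror, laser, focus_lens):
    """points_2d: (N, 2) image positions; distances: (N,) camera-axis distances."""
    offset = np.mean(np.asarray(points_2d, dtype=float), axis=0)
    mean_distance = float(np.mean(distances))

    def steer_deflector():
        mirror.set_angles(*offset)                # hypothetical driver call

    def adjust_source_and_focus():
        laser.set_timing(offset, mean_distance)   # hypothetical driver call
        focus_lens.set_focus(mean_distance)       # hypothetical driver call

    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(steer_deflector), pool.submit(adjust_source_and_focus)]
        for future in futures:
            future.result()                       # both branches run in parallel
```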
A display method comprising:
acquiring at least two-dimensional position information of each of at least three feature points among three or more feature points whose relative positions are fixed with respect to an optical element that guides light projected from a light projection system to an eyeball; and
controlling a part of the light projection system based on an acquisition result of the acquiring step.
20. The display method according to claim 19, wherein the light projection system includes a light source, a movable deflection element, and a focus lens,
the acquiring step includes:
acquiring two-dimensional position information of each of the at least three feature points in a plane orthogonal to an optical axis direction of the camera; and
acquiring distance information between the camera and each of the at least three feature points with respect to the optical axis direction of the camera, and
the controlling step includes:
controlling the movable deflection element based on at least the two-dimensional position information; and
controlling the light source based on the two-dimensional position information and the distance information, and/or controlling the focus lens based on the distance information.
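Read together, the method claims amount to a per-frame loop: detect the feature points, derive their two-dimensional positions and camera-axis distances, then drive the projection optics. The sketch below only ties the earlier illustrative sketches together; the camera and depth-sensor interfaces and the 60 Hz frame rate are assumptions for illustration.

```python
# Illustrative end-to-end loop (not from the publication), reusing
# find_feature_points() and control_frame() from the sketches above.
import time

def run_display_loop(camera, depth_sensor, mirror, laser, focus_lens):
    while True:
        frame = camera.read_ir_frame()            # hypothetical camera interface
        points = find_feature_points(frame)       # 2D positions of bright markers
        if len(points) >= 3:                      # at least three points, per the claims
            distances = [depth_sensor.distance_at(x, y) for x, y in points]
            control_frame(points, distances, mirror, laser, focus_lens)
        time.sleep(1 / 60)                        # assumed nominal frame period
```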
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202480021053.3A (CN120917367A) | 2023-03-30 | 2024-02-19 | Display devices and display methods |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023055322 | 2023-03-30 | | |
| JP2023-055322 | 2023-03-30 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024202682A1 (en) | 2024-10-03 |
Family
ID=92904149
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/005653 (WO2024202682A1, pending) | Display device and display method | 2023-03-30 | 2024-02-19 |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN120917367A (en) |
| WO (1) | WO2024202682A1 (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008155720A (en) * | 2006-12-22 | 2008-07-10 | Nippon Seiki Co Ltd | Head-up display device |
| WO2009066446A1 (en) * | 2007-11-20 | 2009-05-28 | Panasonic Corporation | Beam scanned type display device, display method, and automobile |
| US20120281181A1 (en) * | 2011-05-05 | 2012-11-08 | Sony Computer Entertainment Inc. | Interface using eye tracking contact lenses |
| EP2768224A1 (en) * | 2013-02-14 | 2014-08-20 | BlackBerry Limited | Wearable display system with detached projector |
| US20190155031A1 (en) * | 2016-06-28 | 2019-05-23 | Hologram Industries Research Gmbh | Display apparatus for superimposing a virtual image into the field of vision of a user |
| JP2021102428A (en) * | 2019-12-25 | 2021-07-15 | Panasonic Ip Management Co., Ltd. | Display system |
- 2024-02-19 CN CN202480021053.3A patent/CN120917367A/en active Pending
- 2024-02-19 WO PCT/JP2024/005653 patent/WO2024202682A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| CN120917367A (en) | 2025-11-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11194167B2 (en) | Augmented reality head-mounted display with eye tracking for pupil steering | |
| KR102474236B1 (en) | Systems, devices and methods for integrating eye tracking and scanning laser projection in wearable heads-up displays | |
| US10409057B2 (en) | Systems, devices, and methods for laser eye tracking in wearable heads-up displays | |
| AU2016314630B2 (en) | Eye projection system and method | |
| WO2020131810A1 (en) | Holographic in-field illuminator | |
| US20140354514A1 (en) | Gaze tracking with projector | |
| CN117546073B (en) | Optical systems for eye tracking | |
| JP7302592B2 (en) | Information detection device, video projection device, information detection method, and video projection method | |
| CN103429139A (en) | Spectacle device with an adjustable field of view and method | |
| US12277266B2 (en) | Line-of-sight detection device, display device, and method for sensing eyeball | |
| WO2023086280A1 (en) | Multi-view eye tracking system with a holographic optical element combiner | |
| WO2024202682A1 (en) | Display device and display method | |
| US12105873B2 (en) | Light field based eye tracking | |
| US12153728B2 (en) | Optical system for a virtual retina display and a gesture detection of a user of the virtual retina display |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24778827; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 202480021053.3; Country of ref document: CN |
| | WWP | Wipo information: published in national office | Ref document number: 202480021053.3; Country of ref document: CN |