WO2006118057A1 - Image display device - Google Patents
- Publication number
- WO2006118057A1 (PCT/JP2006/308448)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- unit
- display device
- observer
- line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
- G02B27/022—Viewing apparatus
- G02B27/024—Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparencies
- G02B27/026—Viewing apparatus comprising a light source, e.g. for viewing photographic slides, X-ray transparencies and a display device, e.g. CRT, LCD, for adding markings or signs or to enhance the contrast of the viewed object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
- H04N5/7475—Constructional details of television projection apparatus
- H04N5/7491—Constructional details of television projection apparatus of head mounted projectors
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
Definitions
- the present invention relates to an image display device, and more particularly to an eyeball projection type image display device that displays a virtual image (including a video image) superimposed on a real image.
- a conventional display device such as an optical see-through display has several problems.
- a conventional display device has a narrow viewing angle at which a real image can be obtained.
- the brightness of the real image changes depending on where the observer (user) directs the line of sight (on the object being watched), whereas the brightness of the virtual image remains constant, so the observer feels a sense of incongruity.
- the distance to the point of gaze changes according to the observer's line-of-sight direction, whereas the virtual image related to the actually observed object (real image) always looks the same; the perspective between the real image and the virtual image is therefore lost, and a more realistic (verisimilar) mixed reality cannot be obtained.
- For example, according to a conventional display device proposed in Japanese Patent Laid-Open No. 8-313843 (Patent Document 1), a highly realistic image can be realized by combining and displaying a background image (wide-field image) displayed at low resolution and a high-resolution narrow-field image that moves to follow the line of sight.
- However, the above display device always displays the wide-field image without regard to the distances from the wide-field image and the narrow-field image to the observer (the eyeball), so the distance between an object included in the wide-field image and an object included in the narrow-field image cannot be accurately recognized. That is, according to the display device of Patent Document 1, it is not possible to express the perspective of an image that depends on differences in distance (depth) along the viewing direction.
- Patent Document 2 Japanese Patent Application Laid-Open No. 11-313843
- Patent Document 2 discloses an image display device that synthesizes and displays a first video with a wide field of view and a fine, high-resolution second image.
- In the above video display device, it is suggested to display a virtual image with a deep depth of field that is always seen clearly, without depending on the lens effect of the eyeball.
- However, the depth of field of the virtual image is arbitrarily set by the observer and is not compared with the distance to the target object of the real image obtained by actual observation.
- An object of the present invention is to provide an image display device that forms a composite image by appropriately fusing (superimposing) a virtual image with a real image and displays the composite image without giving the observer a sense of incongruity.
- An image display device according to the present invention includes a gaze direction detection unit that detects the gaze direction of an observer's eyeball, and an imaging unit that is disposed on the optical axis of the observer's eyeball and captures a real image in the gaze direction of the eyeball.
- It further includes a virtual image storage unit that stores a virtual image,
- and a display unit that displays to the observer a composite image in which the virtual image stored in the virtual image storage unit is superimposed on the real image captured by the imaging unit.
- According to this configuration, a realistic composite image in which a virtual image is superimposed on a real image without a sense of incongruity is displayed, so that a more real mixed reality can be achieved.
- FIG. 1 is a schematic diagram showing an image display apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing each component unit of the image display device in FIG. 1.
- FIG. 3 is a schematic view showing a modification of the display unit of FIG.
- FIG. 4 is a schematic view showing components of the display unit of FIG.
- FIG. 5 is a schematic diagram showing a process of calibrating the vertical center axis of the line-of-sight direction detection unit with respect to the line-of-sight direction in the image display device according to Embodiment 1.
- FIG. 6 is a schematic diagram showing a process of calibrating the vertical center axis of the line-of-sight direction detection unit with respect to the line-of-sight direction in the image display device according to Embodiment 1.
- FIG. 7 is a schematic view showing another modification of the display unit of FIG.
- FIG. 8 is a schematic view showing still another modification of the display unit of FIG.
- FIG. 9 is a schematic view showing still another modification of the display unit of FIG.
- FIG. 10 is a schematic view showing a modification of the drive unit in FIG. 1.
- FIG. 11 is a schematic diagram showing a process of removing the drive unit of FIG. 1 from the line-of-sight direction when a failure occurs in the image display apparatus according to Embodiment 1.
- FIG. 12 is a schematic diagram showing an image display apparatus according to Embodiment 2 of the present invention.
- FIG. 13 is a schematic diagram showing an image display apparatus according to Embodiment 3 of the present invention.
- FIG. 14 is a schematic diagram showing the arrangement of the eyeball, the real image, and the virtual image when an observer simultaneously observes a real image and a virtual image using the image display device of Embodiment 3.
- FIG. 15 is a schematic diagram showing an optical system that realizes Maxwellian vision using a point light source (directional light source) employed in an image display apparatus according to Embodiment 4 of the present invention.
- FIG. 16 is a schematic diagram showing an optical system using a normal surface light source (diffusive light source).
- 11: Housing, 12: LCD panel (display unit), 14: First mirror (optical component), 16: Eyeball photographing camera (line-of-sight direction detection unit), 18: Line-of-sight direction photographing camera (imaging unit), 20: Second mirror, 22: Light source, 23: Infrared light source, 24: Eyepiece, 25: Correction lens, 32: Optical shutter,
- 62: Control unit, 64: Non-volatile memory (virtual image storage unit), 66: Input unit,
- LS: Line-of-sight direction
- EB: Eyeball
- CB: Calibration signal (light beam)
- GP: Gazing point
- RI: Real image
- VI: Virtual image.
- FIG. 1 is a schematic diagram showing an image display device 1 according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing each component unit of the image display device 1 of FIG. 1. Note that this image display device is normally used attached to one of the observer's eyes (left eye or right eye). In the following embodiments, the image display device attached to the left eye will be described unless otherwise specified; the image display device mounted on the right eye is configured symmetrically with respect to the symmetry plane SS indicated by the one-dot chain line in FIG. 1.
- An image display device 1 includes a display unit 10, a drive unit 40 that moves the display unit 10, and an information processing unit 60 connected to the display unit 10 and the drive unit 40 so as to be controllable by wire or wirelessly.
- The display unit 10 includes a display unit 12 such as a liquid crystal display (LCD) panel disposed inside a housing 11 having a substantially hemispherical outer shape, an optical component 14 such as a semi-transparent mirror, and an eyeball photographing camera (line-of-sight direction detection unit) 16 for detecting the line-of-sight direction LS of the observer (user) indicated by the broken-line arrow in FIG. 1. The display unit 10 further includes a line-of-sight direction photographing camera (imaging unit) 18 for capturing the real image observed in the line-of-sight direction LS. In the configuration shown in FIG. 1, the imaging unit 18 is located outside the housing 11 and is disposed in the line-of-sight direction LS.
- The display unit 12, the optical component 14, the line-of-sight direction detection unit 16, and the imaging unit 18 constituting the display unit 10 are fixed with respect to each other in the housing 11, and move together with the housing 11 while maintaining their relative positional relationship.
- The image information obtained by these cameras is processed digitally by the information processing unit 60, and each camera is preferably configured using a digital image sensor such as a charge-coupled device (CCD) or a CMOS device.
- The drive unit 40 includes a first arm 46 connected at one end to the housing 11 of the display unit 10 via a first pivot portion (rotary drive portion) 42 and carrying a second pivot portion 44 at its other end, and a second arm 48 slidably attached to the second pivot portion 44.
- The first arm 46 (including the first and second pivot portions 42 and 44) and the second arm 48 can move the display unit 10 to an arbitrary position in response to a command from the information processing unit 60.
- The information processing unit 60 is a general information control terminal such as a personal computer or a portable computer, and has a control unit 62 such as a central processing unit (CPU), a virtual image storage unit 64 composed of non-volatile memory such as a hard disk or flash memory for storing virtual images, and an input unit 66 composed of a keyboard or a touch panel.
- The line-of-sight direction detection unit 16 detects, through the semi-transparent mirror 14, the pupil position of the observer's eyeball EB, that is, the observer's line-of-sight direction LS, and transmits the line-of-sight direction data to the control unit 62 of the information processing unit 60.
- the control unit 62 drives the drive unit 40 to move the display unit 10 so that the imaging unit 18 is arranged in the detected line-of-sight direction LS.
- Since the series of operations of the line-of-sight direction detection unit 16, the information processing unit 60, and the drive unit 40 is performed continuously while the display unit 10 is mounted on the observer, the imaging unit 18 always captures a real image of the actual object observed in the observer's line-of-sight direction LS. Real image data captured by the imaging unit 18 is sequentially transmitted to the control unit 62. Similarly, the virtual image data stored in the virtual image storage unit 64 is transmitted to the control unit 62, where it is digitally processed and superimposed on the real image captured by the imaging unit 18 to generate composite image data.
- The composite image data is transmitted to the display unit 10 and displayed on the display unit 12, and the light beam carrying it is reflected by the mirror 14 and projected onto the observer's eyeball EB.
- the control unit 62 adjusts the brightness of the real image and the virtual image to generate composite image data, so that a more natural composite image can be displayed to the observer.
- The line-of-sight direction detection unit 16 constantly monitors the observer's line-of-sight direction and transmits line-of-sight direction data to the control unit 62. Therefore, when the observer's line of sight moves, the control unit 62 issues a command to the drive unit 40, and the display unit 10 is moved so that the imaging unit 18 captures an image in the new line-of-sight direction LS. At this time, the drive unit 40 preferably moves the display unit 10 on a spherical surface centered on the observer's eyeball EB, so that the distance from the eyeball EB to the display unit 10 remains constant.
- In this way, the real image in the line-of-sight direction LS is always captured following the observer's line of sight, and a composite image with the virtual image superimposed on it can be displayed to the observer, so that a more natural mixed reality can be given to the observer.
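- The detect–aim–capture–superimpose–display cycle described above can be sketched as a single loop iteration. This is illustrative only: the five callables stand in for the hardware units and their interfaces are not defined by the patent.

```python
def gaze_follow_step(detect, aim, capture, superimpose, show):
    """One cycle of the continuously repeated sequence of Embodiment 1.

    detect      -- returns the current line-of-sight direction LS
    aim         -- drive unit 40 repositions display unit 10 onto LS
    capture     -- imaging unit 18 captures the real image along LS
    superimpose -- control unit 62 overlays the stored virtual image
    show        -- display unit 12 presents the composite to eyeball EB
    """
    ls = detect()              # gaze direction of the observer's eyeball
    aim(ls)                    # place the imaging unit on the line of sight
    real = capture()           # real image RI in direction LS
    frame = superimpose(real)  # composite of real image and virtual image
    show(frame)                # project the composite toward the eyeball
    return frame
```

In the device this cycle runs continuously while the display unit is mounted, so the composite always tracks the observer's gaze.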
- The imaging unit 18 of the display unit 10 shown in FIG. 1 is located outside the housing 11 and disposed in the line-of-sight direction LS, but the invention is not limited to this. In the image display device 1 shown in FIG. 3, the imaging unit 18 is disposed inside the housing 11. In addition to the first mirror 14 that reflects the image displayed on the display unit 12 toward the observer's eyeball EB, a second mirror 20 that projects the real image in the line-of-sight direction LS onto the imaging unit 18 is provided in the housing 11. The real image in the line-of-sight direction is thus picked up by the imaging unit 18 and superimposed on the virtual image by the control unit 62 as in Embodiment 1; the composite image is displayed on the display unit 12, reflected by the first mirror 14, and shown to the observer.
- The first mirror 14, described above as being semi-transparent, may instead be a polarization-separating mirror.
- the second mirror 20 may be a dichroic mirror that reflects visible light and transmits infrared light.
- The display unit 10 has a light source 22, a transmissive LCD panel (display unit) 12 disposed adjacent to the light source 22, a polarizing beam splitter (optical component) 14 that reflects the light flux from the LCD panel 12 toward the eyeball EB, and an eyepiece 24 that focuses the light beam onto the eyeball EB.
- the line-of-sight direction detection unit 16 and the imaging unit 18 are assembled so that their vertical center axes are arranged in a straight line.
- In the display unit 10 configured as described above, white light from the light source 22 passes through the transmissive LCD panel 12, and the light beam carrying the composite image displayed on the transmissive LCD panel 12 is reflected by the polarizing beam splitter 14 and projected onto the observer's eyeball EB. As a result, the observer can see the composite image displayed on the transmissive LCD panel 12.
- With the image display device 1, it is thus possible to follow the observer's line of sight, capture the real image that is always in the line-of-sight direction, and display it to the observer.
- Although the line-of-sight direction detection unit 16 and the imaging unit 18 can easily be assembled (fixed) so that their vertical central axes lie on a straight line, it is difficult to accurately align this vertical central axis with the line-of-sight direction LS. It is therefore necessary to perform a process (calibration) that reliably aligns the central axis with the line-of-sight direction LS.
- Specifically, an image of a predetermined shape, such as a circle, is displayed on the LCD panel 12 at a position corresponding to the center of the image sensor 13 (that is, the center of the imaging unit 18), and this calibration signal (light beam) CB is directed toward the semi-transparent mirror 14 (arrow 26a), reflected by the semi-transparent mirror 14, and irradiated onto the eyeball EB (arrow 26b).
- The calibration light beam CB is reflected in the line-of-sight direction LS by the eyeball EB, passes through the semi-transparent mirror 14, and is detected by the line-of-sight direction detection unit 16 (arrow 26c). If the center of the line-of-sight direction detection unit 16 (the calibration signal CB of the predetermined shape) and the line-of-sight direction LS (the intersection 28 of the one-dot chain lines in FIGS. 6(a) and 6(b)) coincide, the light beam (image) of the predetermined shape is detected at the intersection 28, as shown in FIG. 6(a). If the central axis of the line-of-sight direction detection unit 16 and the line-of-sight direction LS are not accurately aligned, the image of the predetermined shape deviates from the intersection 28, as shown in FIG. 6(b).
- In the case of FIG. 6(b), based on the image detected by the line-of-sight direction detection unit 16, the control unit 62 calculates the amount of deviation of the central axis of the imaging unit 18 (that is, of the calibration light beam CB) with respect to the intersection 28 of the one-dot chain lines (that is, the line-of-sight direction LS). The control unit 62 then moves the display unit 10 using the drive unit 40 in accordance with the amount of deviation of the central axes of the imaging unit 18 and the line-of-sight direction detection unit 16 with respect to the line-of-sight direction LS, as shown in FIG. 6(a).
- Such an image of the predetermined shape (calibration light beam CB) is preferably projected onto the eyeball EB, while the observer observes a predetermined gazing point, for an interval shorter than the observer can perceive.
- This makes it possible to calibrate the central axis of the gaze direction detection unit 16 (and the imaging unit 18) to the gaze direction LS without bothering the observer as much as possible.
- Preferably, such a calibration process is performed periodically so that the central axis of the line-of-sight direction detection unit 16 and the line-of-sight direction LS remain aligned at all times.
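- The deviation calculation performed by the control unit 62 can be illustrated as a centroid offset between the detected calibration spot CB and the sensor center. The 2-D intensity-list representation and the function name below are assumptions made for this sketch, not part of the patent.

```python
def calibration_offset(image, cx, cy):
    """Return (dx, dy): the intensity centroid of the detected
    calibration spot CB minus the sensor center (cx, cy).

    A nonzero result corresponds to the misalignment between the
    detector's central axis and the line-of-sight direction LS.
    `image` is a 2-D list of pixel intensities (illustrative stand-in
    for the output of the line-of-sight direction detection unit 16).
    """
    total = sx = sy = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            total += v
            sx += v * x
            sy += v * y
    if total == 0:
        raise ValueError("calibration spot not detected")
    return sx / total - cx, sy / total - cy
```

The control unit would then command the drive unit 40 to move the display unit 10 so as to null this offset.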
- As described above, the line-of-sight direction detection unit 16 constantly monitors the line-of-sight direction LS. Specifically, as shown in FIG. 7, the line-of-sight direction detection unit 16 has an infrared light source 23 such as an infrared LED lamp; infrared light from the infrared light source 23 passes through the semi-transparent mirror 14 and irradiates the observer's eyeball EB, and the pupil position of the eyeball EB, that is, the line-of-sight direction LS, is detected by detecting the reflected infrared light.
- Since the line-of-sight direction LS is detected using infrared light, which the observer cannot perceive, the observer's line-of-sight direction LS can be constantly monitored without giving the observer a sense of incongruity.
- The display unit 10 described above with reference to FIG. 7 includes both a line-of-sight direction photographing camera (imaging unit) 18 for capturing the real image observed in the line-of-sight direction LS and a line-of-sight direction detection unit 16, but the invention is not limited to this. As shown in FIG. 8, the imaging unit 18 and the line-of-sight direction detection unit 16 may be replaced by a single CCD or CMOS imaging device 30 that images both the real image and the pupil of the eyeball EB.
- The imaging device 30 is generally formed using a silicon substrate, and can be configured to pick up the real image with its front side facing the line-of-sight direction LS while detecting infrared light irradiated from its back side.
- Alternatively, an optical shutter 32 such as a liquid crystal shutter is arranged in front of the imaging unit 18; the optical shutter 32 is opened to pick up the real image and closed to detect the line of sight.
- the image display apparatus 1 can always display the real image in the line-of-sight direction LS to the observer following the line of sight of the observer.
- In addition, since the line-of-sight direction detection unit 16 and its peripheral circuitry can be shared (omitted), the number of parts can be reduced and the display unit 10 can be made smaller and lighter.
- A transmissive LCD panel has been employed as the display unit 12 in the configurations described so far, but the invention is not limited to this. As shown in FIG. 9, the display unit 10 may include a light source 22 and a display unit 12 such as a reflective LCD panel. White light from the light source 22 passes through the polarizing beam splitter 14, and the light beam formed into an image by the reflective LCD panel 12 is reflected by the polarizing beam splitter 14 and irradiated onto the observer's eyeball EB. The observer can thus observe the composite image displayed on the reflective LCD panel 12. Since a display unit 10 including a reflective LCD panel 12 can generally achieve higher-definition image quality than one including a transmissive LCD panel, the observer can use the reflective display unit 10 shown in FIG. 9 to observe a more precise image.
- The drive unit 40 has been described as having the first and second arms 46 and 48, but it may have other structures.
- For example, as shown in FIG. 10, the drive unit 40 may be roughly composed of a hollow guide housing 50 formed by a substantially hemispherical outer peripheral wall 56, an inner peripheral wall 57 of a transparent member, and an end wall 58, with the display unit 10 slidably disposed in the hollow guide housing 50.
- A plurality of electromagnets 52 are disposed in the hollow guide housing 50, while a plurality of permanent magnets 54 are disposed in the display unit 10.
- The control unit 62 adjusts the magnetic force of each electromagnet 52 in the hollow guide housing 50 (the amount of current flowing through each electromagnet 52) so that the imaging unit 18 is positioned in the detected line-of-sight direction LS, whereby the display unit 10 can be moved along the hollow guide housing 50 (arrow 56).
- When a failure occurs in the display unit 10 or the information processing unit 60 and an appropriate composite image can no longer be displayed to the observer, the control unit 62 retracts the display unit 10 from the observer's line of sight.
- The failure of the display unit 10 or the information processing unit 60 includes, but is not limited to, interruption of image data transmission from the line-of-sight direction detection unit 16 or the imaging unit 18 to the control unit 62, malfunction of the display unit 12, and power failure or malfunction of the control unit 62.
- This retraction can be realized using any means understood by those skilled in the art; for example, a spring member (not shown) that urges the display unit 10 toward the retracted position is released in response to the failure, moving the first arm 46 from the state of FIG. 11(a) to the state of FIG. 11(b).
- Embodiment 2 of the image display apparatus according to the present invention will be described below with reference to FIG.
- In Embodiment 2, the real image is directly observed by the observer, and only the virtual image is projected onto the observer's eyeball EB; otherwise the display unit 10 has the same configuration as that of the image display device 1 of Embodiment 1, so detailed description of the overlapping components is omitted. The same components as those in Embodiment 1 are denoted by the same reference numerals.
- The display unit 10 generally includes a light source 22 disposed inside the housing 11; a display unit 12 such as a liquid crystal display (LCD) panel; a dichroic mirror 15 that reflects infrared light and transmits visible light; a semi-transparent mirror 14; an eyepiece 24 and a correction lens 25 for converging the light flux from the line-of-sight direction LS onto the eyeball EB; a line-of-sight direction detection unit 16 for detecting the observer's line-of-sight direction LS; and an imaging unit 18 for measuring the brightness of the real image observed in the line-of-sight direction LS.
- The observer can directly see the object in the line-of-sight direction LS through the eyepiece 24, the semi-transparent mirror 14, and the correction lens 25. Part of the light flux from the line-of-sight direction LS is reflected by the semi-transparent mirror 14, and its light amount (the brightness of the real image observed in the line-of-sight direction LS) is detected by the imaging unit 18.
- The line-of-sight direction detection unit 16 has an infrared light source 23. Infrared light from the infrared light source 23 is reflected by the dichroic mirror 15 and the semi-transparent mirror 14 and projected onto the eyeball EB, and the infrared light reflected by the eyeball is detected by the line-of-sight direction detection unit 16. Thus, as in Embodiment 1, the line-of-sight direction detection unit 16 can detect the line-of-sight direction LS of the observer's eyeball EB.
- The virtual image stored in the non-volatile memory 64 of the information processing unit 60 is processed by the control unit 62 and then transmitted to the LCD panel 12 to form an image.
- the white light from the light source 22 passes through the LCD panel 12, and the light beam including the virtual image passes through the dichroic mirror 15, is reflected by the semitransparent mirror 14, and is projected onto the observer's eyeball EB.
- the observer can observe the virtual image while directly viewing the real image.
- The light amount of the real image (the brightness of the image actually observed in the line-of-sight direction LS) is detected by the imaging unit 18, and the light amount data is transmitted to the control unit 62, so that the brightness of the virtual image can be adjusted according to the brightness of the real image.
- the virtual image can be superimposed on the real image more naturally and displayed to the observer.
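- The brightness adjustment of Embodiment 2 can be sketched as a simple gain applied to the virtual image. The normalization (pixel values and measured luminance in 0–1 units, clipping at 1.0) is an assumption for illustration; the patent does not specify a scaling rule.

```python
def match_brightness(virtual_pixels, measured_luminance, reference=1.0):
    """Scale virtual-image pixel values so that the virtual image's
    apparent brightness tracks the light amount of the real image
    measured by the imaging unit 18.

    `reference` is the luminance at which the virtual image was
    authored (an assumed convention); values are clipped at 1.0.
    """
    gain = measured_luminance / reference
    return [min(1.0, p * gain) for p in virtual_pixels]
```

With such a gain, a virtual image projected while the observer looks at a dark scene is dimmed accordingly, avoiding the constant-brightness incongruity described in the background section.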
- The image display device 3 of Embodiment 3 has the same configuration as the image display devices 1 and 2 described above except that it has a distance sensor that measures the distance from the eyeball EB to the gazing point, so detailed description of the overlapping components is omitted. Components similar to those in Embodiment 1 are denoted by the same reference numerals.
- A gazing point is generally defined as the point where the object the observer is watching exists; when an object is observed with both eyes, the gazing point GP is also the intersection of the left and right lines of sight LS_L and LS_R.
- Let C be the length of the line segment joining the centers of the left and right eyeballs EB_L and EB_R, and let θ1 and θ2 be the angles that the lines of sight LS_L and LS_R of the respective eyeballs make with that segment; θ1 and θ2 can be calculated from the detected line-of-sight directions.
- The distance L from the eyeballs to the gazing point GP can then be expressed as a function of the distance C and the angles θ1 and θ2.
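- One standard way to obtain the gazing-point distance from the interocular distance C and the line-of-sight angles is plane triangulation. The sketch below assumes θ1 and θ2 are measured between each line of sight and the interocular segment; that convention is an assumption, since the patent does not fix it.

```python
import math

def gaze_distance(c, theta1, theta2):
    """Distance L from the interocular baseline of length C to the
    gazing point GP.

    theta1, theta2 -- angles (radians) each eye's line of sight makes
    with the baseline. The two lines of sight and the baseline form a
    triangle whose apex is GP; L is the height of that apex above the
    baseline: L = C * sin(theta1) * sin(theta2) / sin(theta1 + theta2).
    """
    apex = math.pi - theta1 - theta2
    if apex <= 0:
        raise ValueError("lines of sight do not converge")
    return c * math.sin(theta1) * math.sin(theta2) / math.sin(theta1 + theta2)
```

As the two lines of sight approach parallel (θ1 + θ2 → π from below is the diverging case; near-equal angles close to π/2 is the distant case), the computed L grows rapidly, matching the intuition that distant gazing points produce nearly parallel gaze.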
- The image display device 3 is configured to reproduce the natural perspective obtained when observing with the naked eye, as described above. That is, in FIG. 14, the observer is gazing at a chair (real image) RI at the gazing point GP, and the distance L from the eyeball EB to the gazing point is calculated as described above.
- At this time, when forming a composite image by superimposing a virtual image VI (the image of a pigeon in FIG. 14), the control unit 62 processes the real image data obtained from the imaging unit 18 and the virtual image data stored in the virtual image storage unit 64 so that the real image RI is displayed clearly while the virtual image VI is displayed in a blurred manner. A composite image is thus formed.
- Attribute information X relating to the distance at which the virtual image VI should appear (its virtual distance) is assigned to the virtual image VI.
- the virtual image VI is stored in the nonvolatile memory 64.
- The observer may input the virtual distance attribute information X using the input unit 66 of the information processing unit 60. That is, each virtual image can be given an arbitrary virtual distance (attribute information) X.
- The information processing unit 60 of Embodiment 3 measures the distance to the real image RI (the real distance L) and compares it with the distance at which the virtual image VI should appear (the virtual distance X). If the real distance L and the virtual distance X differ, that is, if the virtual image is farther from or nearer to the observer than the real image, the real image RI is displayed clearly and the virtual image VI is displayed unclearly.
- the information processing unit 60 compares the virtual distance X with the real distance, and blurs the virtual image VI that is not placed near the real image (gaze point GP) in the gaze direction LS.
- Real image Superimposes on RI to form a composite image. The synthesized image obtained in this way allows the observer to obtain a more realistic perspective.
- Alternatively, the information processing unit 60 may blur only the virtual image VI when the virtual distance X deviates from the real distance L by more than a predetermined distance d (that is, when X < L − d or X > L + d), superimpose it on the real image RI, and display the result on the LCD panel 12.
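The decision rule above is a simple band test around the measured real distance. A minimal sketch (names are illustrative, not from the patent):

```python
def should_blur_virtual(real_distance_l, virtual_distance_x, tolerance_d):
    """Blur the virtual image VI only when its virtual distance X falls
    outside the tolerance band [L - d, L + d] around the real distance L,
    i.e. when X < L - d or X > L + d."""
    return (virtual_distance_x < real_distance_l - tolerance_d or
            virtual_distance_x > real_distance_l + tolerance_d)

# A virtual image near the gaze point stays sharp; a distant one is blurred.
near = should_blur_virtual(2.0, 2.1, 0.3)  # within L ± d -> keep sharp
far = should_blur_virtual(2.0, 3.0, 0.3)   # beyond L + d -> blur
```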
- The distance from the gaze point GP to the eyeball EB may also be measured directly.
- In this way the perspective of the virtual image VI with respect to the real image RI can still be expressed; that is, the observer can enjoy a realistic image with only one eye, using a display unit 10 of simpler configuration.
- As described above, the measured real distance L is compared with the virtual distance X, and a virtual image VI that is not placed near the real image (gaze point GP) in the line-of-sight direction LS is blurred before being projected onto the eyeball EB, so that a realistic virtual image VI can be provided to the observer. [0044] Embodiment 4.
- Embodiment 4 of the image display device according to the present invention will be described below with reference to FIGS. 15 and 16.
- Whereas the light source of the display unit 10 in the image display devices described so far is a surface light source, the image display device 4 of Embodiment 4 generally has the same configuration as those devices, except that its display unit 10 includes a point light source. Detailed description of the overlapping components is therefore omitted, and components identical to those of the above embodiments are described using the same reference numerals.
- The light source 22 of the display unit 10 of Embodiment 4, shown in FIG. 15, is configured as a point light source (a light source that emits a directional light beam), and its size w in the direction orthogonal to the line-of-sight direction LS is made comparable to the pupil diameter a of the eyeball EB. More specifically, the light source 22 is sized so that w is at most 4 times, preferably at most 2 times, the pupil diameter a; since a typical pupil diameter a is 3 mm, the size w of the light source 22 is set to 12 mm or less, preferably 6 mm or less.
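The sizing rule is simple arithmetic on the pupil diameter; a sketch of the stated bounds (the function name is illustrative):

```python
def max_source_width_mm(pupil_diameter_mm, factor):
    """Upper bound on the point-light-source width w as a multiple of the
    pupil diameter a (the text specifies w <= 4a, preferably w <= 2a)."""
    return factor * pupil_diameter_mm

a = 3  # typical pupil diameter in mm, as stated in the text
upper_limit = max_source_width_mm(a, 4)      # w <= 12 mm
preferred_limit = max_source_width_mm(a, 2)  # w <= 6 mm
```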
- The light flux from the light source 22 is emitted from a limited area (point light source), as shown in FIG. 15, passes through the LCD panel 12, and is condensed by the condenser lens 34 so as to converge on the pupil P of the eyeball EB and form an image on the retina.
- As a result, the observer can always see a clear image without depending on the lens effect of the crystalline lens CL of the eyeball EB (Maxwellian view).
- In other words, the image displayed on the LCD panel 12 can be projected directly (with good reproducibility) onto the observer's eyeball EB.
- In contrast, with a normal surface light source (FIG. 16), the observer can perceive a clear image only when the crystalline lens CL provides a predetermined lens effect (lens power).
- When the optical system realizing Maxwellian view described in Embodiment 4 is applied to the image display device 1 of Embodiment 1, the composite image in which the virtual image VI is superimposed on the real image RI is displayed on the LCD panel 12, and the observer can see the displayed composite image clearly regardless of the real distance L and the virtual distance X.
- Likewise, when an optical system realizing Maxwellian view is used in the image display device 2 of Embodiment 2, the virtual image displayed on the LCD panel 12 can always be recognized clearly without depending on the lens effect of the crystalline lens CL of the eyeball EB, regardless of whether the actually observed object is far or near, that is, regardless of the real distance L (and the associated depth of focus) when viewing the real image with the naked eye.
- In this case, the observer may set an arbitrary value of the virtual distance X as attribute information of the virtual image VI using the input unit 66, and when the real distance L and the virtual distance X differ, the control unit 62 may display on the LCD panel 12 the real image RI clearly and the virtual image VI blurred.
Abstract
Description
Specification
Image display device
Technical field
[0001] The present invention relates to an image display device, and more particularly to an eyeball-projection type image display device that displays a virtual image (including video) superimposed on a real image.
Background art
[0002] In recent years, in the technical field of mixed reality, a technology that fuses the real world and the virtual world, display devices (image display apparatuses) that fuse (superimpose) real and virtual images without a sense of incongruity have become increasingly important.
[0003] However, conventional display devices such as optical see-through displays have several problems. For example, in a conventional display device the viewing angle over which a real image can be obtained is narrow. Also, while the brightness of the real image changes with the observer's (user's) line-of-sight direction (the object being gazed at), the brightness of the virtual image remains constant, so the observer feels a sense of incongruity. Furthermore, the virtual image always looks the same regardless of how the distance to the gaze point of the actually observed object (real image) changes with the observer's line-of-sight direction; no sense of perspective between the real image and the virtual image is therefore obtained, and a more realistic, verisimilar mixed reality cannot be achieved.
[0004] For example, the conventional display device proposed in Japanese Patent Laid-Open No. 8-313843 (Patent Document 1) can realize highly realistic images by displaying a low-resolution background image (wide-field image) combined with a high-resolution narrow-field image that moves to follow the line of sight. In this display device, however, the wide-field image is always displayed blurred, regardless of the distances from the wide-field and narrow-field images to the observer (the observer's eyeball), so the observer cannot accurately perceive the sense of distance between objects in the wide-field image and objects in the narrow-field image. That is, the display device of Patent Document 1 cannot express the perspective of an image according to differences in distance (depth of field) along the viewing direction.
[0005] Japanese Patent Laid-Open No. 11-313843 (Patent Document 2) discloses a video display apparatus that synthesizes and displays a wide-field first image and a high-definition second image. This video display apparatus, however, does not suggest displaying a virtual image of deep depth of field that is always shown clearly without depending on the lens effect of the eyeball. Nor does Patent Document 2 mention having the observer arbitrarily set the depth of field of a virtual image, comparing the distance to the object of the actually observed real image with that depth of field, and blurring the display of a virtual image that is not placed near the gaze point.
Disclosure of the invention
Problems to be solved by the invention
[0006] An object of the present invention is to provide an image display device that forms a composite image by appropriately fusing (superimposing) a virtual image with a real image and displays the composite image without giving the observer a sense of incongruity.
Means for solving the problems
[0007] An image display device according to one aspect of the present invention comprises: a line-of-sight direction detection unit that detects the line-of-sight direction of an observer's eyeball; an imaging unit disposed on the optical axis of the observer's eyeball, which captures a real image in the line-of-sight direction of the eyeball; a virtual image storage unit that stores a virtual image; a control unit that forms a composite image by superimposing the virtual image stored in the virtual image storage unit on the real image captured by the imaging unit; and a display unit that displays the composite image to the observer.
Effects of the invention
[0008] According to the image display device of one aspect of the present invention, a realistic composite image in which a virtual image is superimposed on a real image without a sense of incongruity can be displayed, thereby realizing a more authentic mixed reality.
Brief description of the drawings
[0009] [FIG. 1] A schematic diagram showing an image display device according to Embodiment 1 of the present invention.
[FIG. 2] A block diagram showing each component unit of the image display device of FIG. 1.
[FIG. 3] A schematic diagram showing a modification of the display unit of FIG. 1.
[FIG. 4] A schematic diagram showing the components of the display unit of FIG. 1.
[FIG. 5] A schematic diagram showing the process of calibrating the vertical center axis of the line-of-sight direction detection unit with respect to the line-of-sight direction in the image display device according to Embodiment 1.
[FIG. 6] A schematic diagram showing the process of calibrating the vertical center axis of the line-of-sight direction detection unit with respect to the line-of-sight direction in the image display device according to Embodiment 1.
[FIG. 7] A schematic diagram showing another modification of the display unit of FIG. 1.
[FIG. 8] A schematic diagram showing still another modification of the display unit of FIG. 1.
[FIG. 9] A schematic diagram showing still another modification of the display unit of FIG. 1.
[FIG. 10] A schematic diagram showing a modification of the drive unit of FIG. 1.
[FIG. 11] A schematic diagram showing the process by which the drive unit of FIG. 1 is withdrawn from the line-of-sight direction when a failure occurs in the image display device according to Embodiment 1.
[FIG. 12] A schematic diagram showing an image display device according to Embodiment 2 of the present invention.
[FIG. 13] A schematic diagram showing an image display device according to Embodiment 3 of the present invention.
[FIG. 14] A schematic diagram showing the arrangement of the eyeballs, the real image, and the virtual image when an observer observes a real image and a virtual image simultaneously using the image display device of Embodiment 3.
[FIG. 15] A schematic diagram showing an optical system that realizes Maxwellian view by means of the point light source (directional light source) employed in the image display device according to Embodiment 4 of the present invention.
[FIG. 16] A schematic diagram showing an optical system using a normal surface light source (diffusive light source).
Explanation of reference numerals
10: display unit
11: housing, 12: LCD panel (display section), 14: first mirror (optical component), 16: eyeball-photographing camera (line-of-sight direction detection unit), 18: line-of-sight direction photographing camera (imaging unit), 20: second mirror, 22: light source, 23: infrared light source, 24: eyepiece lens, 25: correction lens, 32: optical shutter,
40: drive unit
42: first pivot part (rotary drive part), 44: second pivot part, 46: first arm, 48: second arm, 50: hollow guide housing, 52: electromagnet, 54: permanent magnet, 56: outer peripheral wall, 57: inner peripheral wall, 58: end wall, 60: information processing unit
62: control unit, 64: nonvolatile memory (virtual image storage unit), 66: input unit,
LS: line-of-sight direction, EB: eyeball, CB: calibration signal (light beam), GP: gaze point, RI: real image, VI: virtual image.
BEST MODE FOR CARRYING OUT THE INVENTION
[0011] Embodiments of the image display device according to the present invention will be described below with reference to the accompanying drawings.
In the description of the embodiments, terms indicating directions (for example, "right" and "left") are used as appropriate to facilitate understanding; they are for explanation only and do not limit the present invention.
[0012] Embodiment 1.
Embodiment 1 of the image display device according to the present invention will be described below with reference to FIGS. 1 to 11. FIG. 1 is a schematic diagram showing an image display device 1 according to Embodiment 1 of the present invention, and FIG. 2 is a block diagram showing each component unit of the image display device 1 of FIG. 1. This image display device is normally used mounted on both of the observer's eyes (left and right); in the following embodiments, unless otherwise noted, only the image display device mounted on the left eye is described, and the device mounted on the right eye should be understood to be configured symmetrically with respect to the symmetry plane SS indicated by the dash-dot line in FIG. 1.
[0013] As schematically shown in FIGS. 1 and 2, the image display device 1 according to the present embodiment includes a display unit 10, a drive unit 40 that moves the display unit 10, and an information processing unit 60 connected to the display unit 10 and the drive unit 40 so as to be controllable by wire or wirelessly.
[0014] In FIG. 1, the display unit 10 includes a display section 12 such as a liquid crystal display (LCD) panel disposed inside a housing 11 having a substantially hemispherical outer shape, an optical component 14 such as a semitransparent mirror, and an eyeball-photographing camera (line-of-sight direction detection unit) 16 for detecting the line-of-sight direction LS of the observer (user) indicated by the broken-line arrow in FIG. 1. The display unit 10 further includes a line-of-sight direction photographing camera (imaging unit) 18 for capturing the actual image observed in the line-of-sight direction LS. In the configuration of FIG. 1, the imaging unit 18 is located outside the housing 11 and is disposed on the line-of-sight direction LS. The display section 12, the optical component 14, the line-of-sight direction detection unit 16, and the imaging unit 18 constituting the display unit 10 are fixed relative to one another within the housing 11, and move together with the housing 11 while maintaining their relative positional relationships.
Since the image information obtained by these cameras (the line-of-sight direction detection unit 16 and the imaging unit 18) is processed digitally by the information processing unit 60, they are preferably constructed using digital image sensors such as charge-coupled devices (CCD) or CMOS imaging devices.
[0015] Referring again to FIG. 1, the drive unit 40 includes a first arm 46, which has at one end a first pivot part (rotary drive part) 42 connected to the housing 11 of the display unit 10 and at the other end a second pivot part 44, and a second arm 48 slidably attached to the second pivot part 44. The first arm 46 (including the first and second pivot parts 42, 44) and the second arm 48 can move the display unit 10 to an arbitrary position in response to commands from the information processing unit 60.
[0016] The information processing unit 60 is a general information control terminal such as a personal computer or portable computer, and includes a control unit 62 such as a central processing unit (CPU), a virtual image storage unit 64 composed of nonvolatile memory such as a hard disk or flash memory for storing virtual images, and an input unit 66 composed of a keyboard, touch panel, or the like.
[0017] When the display unit 10 is mounted on the observer (user), as described above, the line-of-sight direction detection unit 16 detects, via the semitransparent mirror 14, the pupil position of the observer's eyeball EB, that is, the observer's line-of-sight direction LS, and transmits line-of-sight direction data to the control unit 62 of the information processing unit 60. The control unit 62 then drives the drive unit 40 to move the display unit 10 so that the imaging unit 18 is positioned on the detected line-of-sight direction LS. Since this sequence of operations by the line-of-sight direction detection unit 16, the information processing unit 60, and the drive unit 40 is executed continuously while the display unit 10 is worn by the observer, the imaging unit 18 always captures the real image of the actual object observed in the observer's line-of-sight direction LS. The real image data captured by the imaging unit 18 is transmitted sequentially to the control unit 62.
[0018] The virtual image data stored in the virtual image storage unit 64 is likewise transmitted to the control unit 62, where it is digitally processed and superimposed on the real image captured by the imaging unit 18 to generate composite image data. The composite image data is then transmitted to the display unit 10 and displayed on the display section 12, and the light flux passing through it is reflected by the mirror 14 and projected onto the observer's eyeball EB. Thus, according to Embodiment 1, a virtual image can be appropriately fused with a real image to provide the observer with a realistic composite image. At this time the control unit 62 mutually adjusts the brightness of the real image and the virtual image when generating the composite image data, so that a more natural composite image can be displayed to the observer.
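The superimposition with mutual brightness adjustment can be sketched as follows; the patent does not specify the adjustment method, so scaling the virtual image toward the real image's mean brightness is an illustrative stand-in, and all names are hypothetical:

```python
def composite(real, virtual, mask):
    """Superimpose virtual-image pixels onto the real image (grayscale pixel
    lists), scaling the virtual image's brightness toward the real image's
    mean so the two components roughly match (a simple stand-in for the
    control unit 62's brightness adjustment)."""
    def mean(img):
        return sum(img) / len(img)
    gain = mean(real) / mean(virtual) if mean(virtual) else 1.0
    return [min(255, round(v * gain)) if m else r
            for r, v, m in zip(real, virtual, mask)]

real = [100, 120, 110, 130]   # real-image pixels from imaging unit 18
virtual = [200, 210, 0, 0]    # virtual-image pixels (e.g. the pigeon sprite)
mask = [1, 1, 0, 0]           # where the virtual image is drawn
out = composite(real, virtual, mask)
```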
[0019] Since the line-of-sight direction detection unit 16 constantly monitors the observer's line-of-sight direction and transmits line-of-sight direction data to the control unit 62, when the observer's gaze moves, the control unit 62 commands the drive unit 40 to move the display unit 10 so that the imaging unit 18 captures the image in the new line-of-sight direction LS. Preferably, the drive unit 40 moves the display unit 10 on a spherical surface centered on the observer's eyeball EB so that the distance from the eyeball EB to the display unit 10 remains constant. As a result, the image display device 1 according to Embodiment 1 follows the observer's line-of-sight direction LS, always captures the real image in that direction, and displays to the observer a composite image in which a virtual image is superimposed on it, thereby giving the observer a more natural sense of mixed reality.
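The detect-move-capture cycle described above can be sketched as a simple closed loop; all class and method names here are illustrative stand-ins, not part of the patent:

```python
class GazeFollower:
    """Minimal stand-in for the loop formed by the line-of-sight direction
    detection unit 16, the drive unit 40, and the imaging unit 18."""

    def __init__(self, detect_gaze, move_display, capture_image):
        self.detect_gaze = detect_gaze      # returns the direction LS
        self.move_display = move_display    # drive unit aligns the display unit with LS
        self.capture_image = capture_image  # imaging unit captures along LS

    def step(self):
        """One iteration of the continuously repeated cycle."""
        ls = self.detect_gaze()
        self.move_display(ls)
        return ls, self.capture_image(ls)

# Example wiring with dummy callables standing in for the hardware:
follower = GazeFollower(
    detect_gaze=lambda: (0.1, -0.05),                   # (yaw, pitch) of LS
    move_display=lambda ls: None,                       # repositioning is a no-op here
    capture_image=lambda ls: f"real image along {ls}",  # placeholder frame
)
ls, frame = follower.step()
```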
[0020] The detailed configuration (including modifications) and operation of each component of Embodiment 1 are described below; other configurations readily understood by those skilled in the art should likewise be understood to be included in the present invention.
[0021] The imaging unit 18 of the display unit 10 shown in FIG. 1 is located outside the housing 11 and disposed on the line-of-sight direction LS, but the invention is not limited to this arrangement. In the image display device 1 shown in FIG. 3, the imaging unit 18 is disposed inside the housing 11. In addition to the first mirror 14, which reflects the image displayed on the display section 12 toward the observer's eyeball EB, a second mirror 20 is provided that projects the real image in the line-of-sight direction LS onto the imaging unit 18 inside the housing 11. The real image in the line-of-sight direction is thus captured by the imaging unit 18 and superimposed on the virtual image by the control unit 62, as in Embodiment 1. The composite image is then displayed on the display section 12, reflected by the first mirror 14, and presented to the observer.
[0022] The first mirror 14, described above as semitransparent, may instead be a polarization-separating mirror. The second mirror 20 may be a dichroic mirror that reflects visible light and transmits infrared light.
[0023] As shown more specifically in FIG. 4, the display unit 10 includes a light source 22, a transmissive LCD panel (display section) 12 disposed adjacent to it, a polarizing beam splitter (optical component) 14 that reflects the light flux from the LCD panel 12 toward the eyeball EB, and an eyepiece lens 24 that focuses the light flux onto the eyeball EB. The line-of-sight direction detection unit 16 and the imaging unit 18 are assembled so that their vertical center axes lie on a straight line.
In the display unit 10 thus configured, white light from the light source 22 passes through the transmissive LCD panel 12, and the light flux containing the composite image displayed on the panel is reflected by the polarizing beam splitter 14 and projected onto the observer's eyeball EB. As a result, the observer can see the composite image displayed on the transmissive LCD panel 12.
[0024] As described above, the image display device 1 according to Embodiment 1 can follow the observer's line of sight and always capture and display to the observer the real image in the line-of-sight direction. To achieve this, as shown in FIG. 1, the vertical center axes of the line-of-sight direction detection unit 16 and the imaging unit 18 must be accurately aligned with the line-of-sight direction LS. While it is easy to assemble (fix) the line-of-sight direction detection unit 16 and the imaging unit 18 so that their vertical center axes lie on a straight line, it is difficult to align the vertical center axis of the line-of-sight direction detection unit 16 (and the imaging unit 18) accurately with the line-of-sight direction LS, owing to factors such as how the display unit 10 is worn. It is therefore necessary to perform a calibration process that reliably aligns the vertical center axis of the line-of-sight direction detection unit 16 with the line-of-sight direction LS before observation (and also during observation).
[0025] To align the vertical center axis of the line-of-sight direction detection unit 16 accurately with the line-of-sight direction LS, while the observer is gazing at a predetermined gaze point, an image of a predetermined shape, for example a circle, is displayed at the center 13 of the LCD panel 12 (that is, the center of the imaging unit 18), as shown in FIG. 5, and a calibration signal (light beam) CB is projected toward the semitransparent mirror 14 (arrow 26a), reflected by the semitransparent mirror 14, and directed onto the eyeball EB (arrow 26b). The calibration light beam CB is then reflected by the eyeball EB in the line-of-sight direction LS, passes through the semitransparent mirror 14, and is detected by the line-of-sight direction detection unit 16 (arrow 26c). If the center of the line-of-sight direction detection unit 16 (the predetermined-shape calibration signal CB) coincides with the line-of-sight direction LS (the intersection 28 of the dash-dot lines in FIGS. 6(a) and 6(b)), the predetermined-shape calibration light beam (image) is detected at the intersection 28, as shown in FIG. 6(a); if the center axis of the line-of-sight direction detection unit 16 and the line-of-sight direction LS are not accurately aligned, the predetermined-shape image deviates from the intersection 28, as shown in FIG. 6(b).
[0026] When the predetermined-shape image (calibration light beam) CB deviates from the intersection 28 as shown in FIG. 6(b), the control unit 62 calculates, from the image detected by the line-of-sight direction detection unit 16, the amount of deviation of the center axis of the imaging unit 18 (that is, the calibration light beam CB) from the intersection 28 (that is, the line-of-sight direction LS). The control unit 62 then moves the display unit 10 using the drive unit 40 according to the amount of deviation of the center axes of the imaging unit 18 and the line-of-sight direction detection unit 16 from the line-of-sight direction LS, calibrating them to the aligned state shown in FIG. 6(a). The predetermined-shape image (calibration light beam CB) is preferably projected onto the eyeball EB, while the observer is gazing at a predetermined gaze point, for intervals shorter than the time the observer can perceive. This makes it possible to calibrate the center axis of the line-of-sight direction detection unit 16 (and the imaging unit 18) to the line-of-sight direction LS while disturbing the observer as little as possible. More preferably, this calibration process is performed periodically so that the center axis of the line-of-sight direction detection unit 16 and the line-of-sight direction LS are kept aligned at all times.
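The deviation calculation and the corrective move amount to measuring the offset of the detected calibration spot from the crosshair center and commanding the opposite displacement. A minimal sketch (coordinate conventions and names are assumptions, not from the patent):

```python
def calibration_offset(detected_xy, center_xy=(0.0, 0.0)):
    """Offset of the detected calibration spot CB from the crosshair
    intersection 28 in the detector's image plane."""
    return (detected_xy[0] - center_xy[0],
            detected_xy[1] - center_xy[1])

def correction_move(offset):
    """Displacement command for the drive unit 40 that cancels the
    measured offset, restoring the aligned state of FIG. 6(a)."""
    return (-offset[0], -offset[1])

off = calibration_offset((1.5, -0.8))  # spot appears off-center, as in FIG. 6(b)
move = correction_move(off)            # drive the display unit by the negation
```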
[0027] With the center axis of the line-of-sight direction detection unit 16 aligned with the line-of-sight direction LS, the line-of-sight direction detection unit 16 constantly monitors the line-of-sight direction LS. Specifically, as shown in Fig. 7, the line-of-sight direction detection unit 16 has an infrared light source 23, such as an infrared LED lamp; infrared light from the infrared light source 23 passes through the semi-transparent mirror 14 and irradiates the observer's eyeball EB, and the reflected infrared light is detected to determine the pupil position of the eyeball EB, that is, the line-of-sight direction LS. Thus, according to Embodiment 1, since the line-of-sight direction LS is detected using infrared light that is not perceived by the observer, the observer's line-of-sight direction LS can be constantly monitored without causing the observer any discomfort.
[0028] The display unit 10 described above with reference to Fig. 7 has a line-of-sight imaging camera (imaging unit) 18 for capturing the real image observed in the line-of-sight direction LS and an eyeball EB imaging camera (line-of-sight direction detection unit) 16 for detecting the observer's line-of-sight direction. However, the present invention is not limited to this; as shown in Fig. 8, the imaging unit 18 and the line-of-sight direction detection unit 16 may be integrated so that a single CCD or CMOS imaging device 30 captures both the real image and the pupil of the eyeball EB. The imaging device 30 is generally formed on a silicon substrate and can be configured to capture the real image with its front surface facing the line-of-sight direction LS while detecting infrared light irradiated from its back surface. In this case, as shown in Fig. 8, an optical shutter 32, such as a liquid crystal shutter, is arranged in front of the imaging unit 18; the optical shutter 32 is opened when capturing the real image and closed when detecting the line-of-sight direction. It is preferable to detect the line-of-sight direction LS periodically while keeping the time during which the optical shutter 32 is closed as short as possible.
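The time-multiplexing of scene capture and gaze sampling described in [0028] amounts to a simple frame schedule. A toy sketch, assuming one gaze sample every `gaze_every` frames (the function name and the string labels are illustrative, not from the patent):

```python
def frame_plan(total_frames, gaze_every=10):
    """Return a per-frame schedule: close the optical shutter only on
    every `gaze_every`-th frame ('gaze'), keeping it open for real-image
    capture ('scene') the rest of the time."""
    return ["gaze" if i % gaze_every == 0 else "scene"
            for i in range(total_frames)]
```

A larger `gaze_every` keeps the shutter open longer, which is exactly the trade-off the paragraph describes: minimal interruption of the real image versus gaze-tracking latency.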
As a result, the image display device 1 can follow the observer's line of sight and always display the real image in the line-of-sight direction LS to the observer.
By capturing both the real image and the eyeball EB with the single imaging device 30 in this way, the line-of-sight direction detection unit 16 and its peripheral circuits (image forming circuits) can be shared (or omitted), so the number of parts is reduced and the display unit 10 can be made smaller and lighter.
[0029] In Embodiment 1, a transmissive LCD panel was employed as the display unit 12, but the present invention is not limited to this. As shown in Fig. 9, the display unit 10 may include a light source 22 and a display unit 12 such as a reflective LCD panel. White light from the light source 22 passes through the polarizing beam splitter 14, and the light beam on which the reflective LCD panel 12 has formed an image is reflected by the polarizing beam splitter 14 and directed onto the observer's eyeball EB. The observer can thus observe the composite image displayed on the reflective LCD panel 12. Since a display unit 10 including a reflective LCD panel 12 can generally achieve higher-definition image quality than one including a transmissive LCD panel 12, the observer can observe a more detailed image using the reflective display unit 10 shown in Fig. 9.
[0030] Furthermore, although the drive unit 40 was described in Embodiment 1 as having the first and second arms 42 and 44, it may have other structures. For example, as shown in Fig. 10, the drive unit 40 may have a hollow guide housing 50 formed, roughly, by a substantially hemispherical outer peripheral wall 56 and inner peripheral wall 57 made of a transparent member, and an end wall 58; the display unit 10 is slidably disposed within the hollow guide housing 50. A plurality of electromagnets 52 are arranged near the end wall 58 of the hollow guide housing 50, while a plurality of permanent magnets 54 are arranged in the display unit 10. By adjusting the magnetic force of each electromagnet 52 in the hollow guide housing 50 (the amount of current flowing through the electromagnet 52), the control unit 62 can move the display unit 10 along the hollow guide housing 50 (arrow 56) so that the imaging unit 18 is positioned on the detected line-of-sight direction LS.
[0031] Furthermore, in the image display device 1 of Embodiment 1, when the control unit 62 detects that a failure has occurred in the display unit 10 or the information processing unit 60 and a proper composite image can no longer be displayed to the observer, it preferably drives the first arm 46 to remove the display unit 10 from the observer's line-of-sight direction LS, as shown in Figs. 11(a) and (b). This restores the observer's naked-eye field of view and ensures the observer's safety. Failures of the display unit 10 and the information processing unit 60 include, but are not limited to, interruption of image data transmission from the line-of-sight direction detection unit 16 and the imaging unit 18 to the control unit 62, malfunction of the display unit 12, and power failure or malfunction of the control unit 62. Removing the display unit 10 from the observer's line-of-sight direction LS can be realized by any means understood by those skilled in the art; for example, a spring member (not shown) biased so as to retract the display unit 10 is released in response to the failure, moving the first arm 46 from the state of Fig. 11(a) to the state of Fig. 11(b).
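The fail-safe behavior of [0031] is essentially a watchdog: if valid frames stop arriving, retract the display. A minimal sketch, assuming a `release_latch` callback stands in for the (unspecified) spring-release mechanism; the class and method names are hypothetical:

```python
import time


class FailSafe:
    """Watchdog sketch: if no valid frame arrives from the detection or
    imaging units within `timeout` seconds, invoke the latch-release
    callback so the display unit swings out of the line of sight."""

    def __init__(self, timeout=0.5, release_latch=None):
        self.timeout = timeout
        self.release_latch = release_latch or (lambda: None)
        self.last_frame = time.monotonic()
        self.retracted = False

    def frame_received(self):
        """Called by the image pipeline whenever a valid frame arrives."""
        self.last_frame = time.monotonic()

    def poll(self):
        """Called periodically; returns True once the unit is retracted."""
        if not self.retracted and time.monotonic() - self.last_frame > self.timeout:
            self.release_latch()  # one-shot: the spring does the rest
            self.retracted = True
        return self.retracted
```

A real implementation would also cover the other failure modes the text lists (display malfunction, control-unit power loss), which a software watchdog alone cannot detect.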
[0032] Embodiment 2.
Embodiment 2 of the image display device according to the present invention will be described below with reference to Fig. 12. The image display device 2 of Embodiment 2 has the same configuration as the image display device 1 of Embodiment 1, except that the observer observes the real image directly and the display unit 10 projects only the virtual image onto the observer's eyeball EB; detailed description of the overlapping components is therefore omitted. Components similar to those of Embodiment 1 are described using the same reference numerals.
[0033] The display unit 10 according to Embodiment 2 roughly comprises, inside a housing 11: a light source 22; a display unit 12 such as a liquid crystal display (LCD) panel; a dichroic mirror 15 that reflects infrared light and transmits visible light; a semi-transparent mirror 14 that reflects infrared light and transmits visible light; an eyepiece lens 24 and a correction lens 25 for focusing the light beam from the line-of-sight direction LS into the eyeball EB; a line-of-sight direction detection unit 16 for detecting the observer's line-of-sight direction LS; and an imaging unit 18 for measuring the brightness of the real image observed in the line-of-sight direction LS.
[0034] As described above, the observer can directly see the object in the line-of-sight direction LS through the eyepiece lens 25, the semi-transparent mirror 14, and the correction lens 24. A portion of the light beam from the line-of-sight direction LS is also reflected by the semi-transparent mirror 14, and its light amount (the brightness of the real image observed in the line-of-sight direction LS) is detected by the imaging unit 18.
The line-of-sight direction detection unit 16 has an infrared light source 23. Infrared light from the infrared light source 23 is reflected by the dichroic mirror 15 and the semi-transparent mirror 14 and projected onto the eyeball EB; the light reflected back from the eyeball is likewise reflected by the semi-transparent mirror 14 and the dichroic mirror 15 and detected by the line-of-sight direction detection unit 16. Thus, as in Embodiment 1, the line-of-sight direction detection unit 16 can detect the line-of-sight direction LS of the observer's eyeball EB.
[0035] Also, according to Embodiment 2, a virtual image stored in the non-volatile memory 64 of the information processing unit 60 is processed by the control unit 62 and then transmitted to the LCD panel 12, where the image is formed. White light from the light source 22 passes through the LCD panel 12, and the light beam containing the virtual image passes through the dichroic mirror 15, is reflected by the semi-transparent mirror 14, and is projected onto the observer's eyeball EB. The observer can thus observe the virtual image while directly viewing the real image. At this time, the imaging unit 18 detects the light amount of the real image (the brightness of the real image observed in the line-of-sight direction LS) and transmits the light amount data to the control unit 62, so the control unit 62 can adjust the brightness of the virtual image according to the brightness of the real image. The image display device of Embodiment 2 can thereby superimpose the virtual image on the real image more naturally for the observer.
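The brightness adjustment of [0035] can be sketched as a simple gain applied to the virtual image so that its mean level tracks the scene brightness reported by the imaging unit 18. This is an illustration only; `match_brightness` is a hypothetical name, and the patent does not specify the actual adjustment law.

```python
def match_brightness(virtual_pixels, real_mean, target_ratio=1.0):
    """Scale a virtual image (flat list of 0-255 values) so its mean
    brightness approaches target_ratio * real_mean, where real_mean is
    the mean scene brightness measured by the imaging unit 18."""
    v_mean = sum(virtual_pixels) / len(virtual_pixels)
    if v_mean == 0:
        return list(virtual_pixels)  # all-black image: nothing to scale
    gain = (target_ratio * real_mean) / v_mean
    # Clamp to the displayable range after scaling.
    return [min(255, round(p * gain)) for p in virtual_pixels]
```

`target_ratio` could be raised slightly above 1 to make the overlay stand out, or lowered for subtle annotations; the point is that the gain is driven by the measured real-image brightness.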
[0036] Embodiment 3.
Embodiment 3 of the image display device according to the present invention will be described below with reference to Figs. 13 and 14. The image display device 3 of Embodiment 3 has the same configuration as the image display devices 1 and 2 described above, except that it has a distance sensor that measures the distance from the eyeball EB to the gaze point; detailed description of the overlapping components is therefore omitted. Components similar to those of Embodiment 1 are described using the same reference numerals.
[0037] The gaze point is generally defined as the point where the object the observer is gazing at exists; as shown in Fig. 13, when the object is observed with both eyes, the gaze point GP is also the intersection of the lines of sight LS_L, LS_R of the left and right eyeballs. In Fig. 13, the distance L_0 of the segment C_L-C_R between the centers C_L, C_R of the left and right eyeballs EB_L, EB_R is a value specific to the observer. The angles between the straight lines extending from the centers C_L, C_R to the gaze point GP and the center-to-center segment C_L-C_R can be calculated from the line-of-sight direction LS of each eyeball EB; call them θ_1 and θ_2. The distances L_1, L_2 from the gaze point GP to the centers C_L, C_R of the left and right eyeballs EB_L, EB_R can then be expressed as functions of the distance L_0 and the angles θ_1, θ_2.
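The relation in [0037] follows from the law of sines in the triangle C_L-C_R-GP, whose interior angles at the eyes are θ_1 and θ_2 and whose apex angle at GP is π − θ_1 − θ_2. A sketch, assuming θ_1 and θ_2 are measured in radians between each sight line and the inter-pupillary segment (the function name is illustrative):

```python
import math


def gaze_distances(l0, theta1, theta2):
    """Triangulate the gaze point GP from the inter-pupillary distance
    l0 (segment C_L-C_R) and the angles theta1, theta2 that the left
    and right sight lines make with that segment.
    Returns (l1, l2): the distance from each eyeball center to GP.
    Law of sines: l1 / sin(theta2) = l0 / sin(pi - theta1 - theta2),
    and sin(pi - x) = sin(x)."""
    apex = theta1 + theta2
    if not 0.0 < apex < math.pi:
        raise ValueError("sight lines do not converge in front of the eyes")
    l1 = l0 * math.sin(theta2) / math.sin(apex)
    l2 = l0 * math.sin(theta1) / math.sin(apex)
    return l1, l2
```

For symmetric convergence (θ_1 = θ_2) the two distances are equal, and as the angles approach parallel gaze the computed distances grow without bound, which matches the intuition that vergence-based distance estimation is only reliable at close range.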
[0038] In general, when an observer gazes at (focuses on) a nearby object with the naked eye while simultaneously observing another, more distant object on the extension of that line of sight, the object gazed at (in focus) appears sharp, whereas the object farther away (or nearer) is perceived as blurred. This allows the observer to sense the perspective between two objects at different distances.
[0039] The image display device 3 according to Embodiment 3 is configured to reproduce the natural perspective obtained when observing with the naked eye as described above. That is, in Fig. 14, the observer is gazing at a chair (real image) RI at the gaze point GP, and the distance L from the eyeball EB to the gaze point is calculated as described above. When a virtual image VI assumed to be located farther than the gaze point GP (the image of a pigeon in Fig. 14) is superimposed on the real image RI of the chair to form a composite image, the control unit 62 processes the real image data obtained from the imaging unit 18 and the virtual image data stored in the virtual image storage unit 64 so that the composite image displays the real image RI sharply while displaying the virtual image VI blurred.
[0040] More specifically, in Embodiment 3, attribute information X concerning the distance (virtual distance) to the position where the virtual image VI should appear is assigned to the virtual image VI, and the virtual image VI is stored in the non-volatile memory 64. Alternatively, the observer may input the attribute information of the virtual distance X using the input unit 66 of the information processing unit 60. That is, each virtual image can have an arbitrary virtual distance (attribute information) X.
The information processing unit 60 of Embodiment 3 then measures the distance to the real image RI (real distance L), compares it with the distance to the position where the virtual image VI should appear (virtual distance X), and, when the real distance L and the virtual distance X differ (when the virtual image is farther from or nearer than the real image), displays the real image RI sharply and the virtual image VI blurred.
[0041] For example, suppose the virtual distance X to the position where the virtual image (pigeon) VI should appear is set to 10 m, and the distance L to the gaze point GP of the real image (chair) RI is measured to be 5 m. If the observer were to see the virtual image (pigeon) VI 10 m ahead and the real image (chair) RI 5 m ahead with the same sharpness, it would feel unnatural, since this differs from the ordinary sense of distance. Therefore, the information processing unit 60 according to Embodiment 3 compares the real distance L with the virtual distance X and forms a composite image in which the virtual image VI, not being located near the real image (gaze point GP) in the line-of-sight direction LS, is blurred and superimposed on the real image RI. The composite image thus obtained gives the observer a more realistic sense of perspective.
[0042] The information processing unit 60 may also blur only the virtual image VI and superimpose it on the real image RI on the LCD panel 12 when the virtual distance X deviates from the real distance L by more than a predetermined distance d (that is, when X < L − d or X > L + d).
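The distance test of [0042] and the blur of [0041] can be sketched together. This is a toy sketch: `should_blur` and `box_blur_row` are hypothetical names, and a one-dimensional box blur stands in for whatever defocus rendering the device actually applies.

```python
def should_blur(real_distance, virtual_distance, d):
    """Blur the virtual image only when its virtual distance X lies
    outside the band [L - d, L + d] around the real distance L."""
    return (virtual_distance < real_distance - d
            or virtual_distance > real_distance + d)


def box_blur_row(pixels, radius=1):
    """Tiny 1-D box blur used as a stand-in for defocus rendering:
    each output pixel is the mean of its neighborhood."""
    n = len(pixels)
    out = []
    for i in range(n):
        window = pixels[max(0, i - radius):min(n, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out
```

With the figures from [0041] (L = 5 m, X = 10 m) and, say, d = 1 m, the pigeon is blurred while the chair stays sharp; if X were set to 5 m it would be rendered unblurred.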
[0043] In the above description, the distances L_1, L_2 from the gaze point GP to the centers C_L, C_R of the left and right eyeballs EB_L, EB_R were obtained using the distance L_0 of the center-to-center segment C_L-C_R and the angles θ_1, θ_2 between each eyeball's line of sight and the segment C_L-C_R; however, an arbitrary distance sensor may instead be provided in the display unit 10 to measure the distance from the gaze point GP to the eyeball EB directly. In this case, the perspective of the virtual image VI relative to the real image RI can be expressed using only one of the display units 10 for the left and right eyeballs EB_L, EB_R. That is, the observer can enjoy a realistic image with even one eye, using a display unit 10 having a simpler configuration.
Similarly, in the image display device 2 according to Embodiment 2, in which only the virtual image VI is displayed on the display unit 10 and the real image RI is observed with the naked eye, a realistic virtual image VI can be provided to the observer by comparing the real distance L with the virtual distance X and projecting onto the eyeball EB a blurred version of any virtual image VI not located near the real image (gaze point GP) in the line-of-sight direction LS.

[0044] Embodiment 4.
Embodiment 4 of the image display device according to the present invention will be described below with reference to Figs. 15 and 16. Whereas the light source of the display unit 10 of the image display devices described so far is a surface light source, the image display device 4 of Embodiment 4 has the same configuration as those devices except that its display unit 10 includes a point light source; detailed description of the overlapping components is therefore omitted. Components similar to those of the above embodiments are described using the same reference numerals.
[0045] As described above, the light source 22 of the display unit 10 of Embodiment 4 shown in Fig. 15 is configured as a point light source (a light source that emits a directional light beam), and its size w in the direction orthogonal to the line-of-sight direction LS (its width relative to the line of sight) is comparable to the pupil diameter a of the eyeball EB. More specifically, the light source 22 is set so that its size w is at most 4 times, preferably at most 2 times, the pupil diameter a; assuming a typical pupil diameter a of 3 mm, the size w of the light source 22 is set to 12 mm or less, preferably 6 mm or less.
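The numeric condition in [0045] is simple enough to encode directly. A sketch of the bound (the function names are illustrative; the patent states only the w ≤ 4a limit and the preferred w ≤ 2a):

```python
def max_source_width(pupil_diameter_mm, factor=4):
    """Upper bound on the point-source width w for Maxwellian view:
    the text requires w <= 4a, preferably w <= 2a (factor=2)."""
    return factor * pupil_diameter_mm


def is_point_source(w_mm, pupil_diameter_mm=3.0, factor=4):
    """True if a source of width w_mm satisfies the size condition for
    a pupil of the given diameter (default 3 mm, as in the text)."""
    return w_mm <= max_source_width(pupil_diameter_mm, factor)
```

With the default 3 mm pupil this reproduces the stated limits: 12 mm for the required bound and 6 mm for the preferred one.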
[0046] When the light source 22 of the display unit 10 is set as described above, the light beam from the light source 22 diffuses uniformly from a limited region (a point source) as shown in Fig. 15, passes through the LCD panel 12, is condensed by the condenser lens 34 so as to come to a focus at the pupil P of the eyeball EB, and forms an image on the retina. The observer can then always see a sharp image, without depending on the lens effect of the crystalline lens CL of the eyeball EB (Maxwellian view). That is, even when the distance (focal depth) from the observer's eyeball EB to the gaze point changes, the image displayed on the LCD panel 12 can be projected onto the observer's eyeball EB as-is, with good reproducibility.
[0047] In contrast, when the light source 22 is configured as a surface light source adjacent to the LCD panel 12 (a light source that emits a diffuse light beam; see Fig. 16), the observer can perceive a sharp image only when the crystalline lens CL provides the appropriate lens effect (lens power).
[0048] Therefore, when the optical system realizing Maxwellian view described in Embodiment 4 is applied to the image display device 1 of Embodiment 1, a composite image in which the virtual image VI is superimposed on the real image RI is displayed on the LCD panel 12, and the observer can see that composite image sharply regardless of the real distance L and the virtual distance X. Similarly, when the optical system realizing Maxwellian view is adopted in the image display device 2 of Embodiment 2, the virtual image displayed on the LCD panel 12 can always be recognized clearly, independent of the real distance L (focal depth) at which the real image is viewed with the naked eye; that is, whether the actually observed object is far or near, recognition does not depend on the lens effect of the crystalline lens CL of the eyeball EB.
In other words, as in Embodiment 3, the observer may set an arbitrary value of the virtual distance X as attribute information of the virtual image VI using the input unit 66, and when the real distance L and the virtual distance X differ, the control unit 62 may use the LCD panel 12 to display the real image RI sharply and the virtual image VI blurred.
Claims
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2005-128203 | 2005-04-26 | ||
| JP2005128203A JP2006308674A (en) | 2005-04-26 | 2005-04-26 | Image display device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2006118057A1 true WO2006118057A1 (en) | 2006-11-09 |
Family
ID=37307857
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2006/308448 Ceased WO2006118057A1 (en) | 2005-04-26 | 2006-04-21 | Image display device |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2006308674A (en) |
| WO (1) | WO2006118057A1 (en) |
Cited By (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009122550A (en) * | 2007-11-16 | 2009-06-04 | Panasonic Electric Works Co Ltd | Retina projection display device |
| JP2011232669A (en) * | 2010-04-30 | 2011-11-17 | Casio Comput Co Ltd | Display device |
| EP2590002A1 (en) * | 2011-11-04 | 2013-05-08 | Honeywell International Inc. | Steerable near-to-eye display and steerable near-to-eye display system |
| JP2015005972A (en) * | 2013-05-22 | 2015-01-08 | 株式会社テレパシーホールディングス | Wearable device having privacy protection function of captured image, control method thereof, and image sharing system |
| JP2018173661A (en) * | 2018-07-23 | 2018-11-08 | 旭化成株式会社 | Optical device having spectacle lens, spectacles using the same, and spectacle type display device |
| CN109991746A (en) * | 2019-03-08 | 2019-07-09 | 成都理想境界科技有限公司 | Image source mould group and near-eye display system |
| CN110199324A (en) * | 2017-01-31 | 2019-09-03 | 株式会社和冠 | Display device and control method thereof |
| WO2021020069A1 (en) * | 2019-07-26 | 2021-02-04 | ソニー株式会社 | Display device, display method, and program |
| WO2021181797A1 (en) * | 2020-03-11 | 2021-09-16 | 国立大学法人福井大学 | Retina scanning display device and image display system |
| JPWO2022054740A1 (en) * | 2020-09-09 | 2022-03-17 |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010085786A (en) * | 2008-09-30 | 2010-04-15 | Brother Ind Ltd | Head-mounted display device |
| JP5295714B2 (en) * | 2008-10-27 | 2013-09-18 | 株式会社ソニー・コンピュータエンタテインメント | Display device, image processing method, and computer program |
| JP5420464B2 (en) * | 2010-04-02 | 2014-02-19 | オリンパス株式会社 | Display device, electronic device, portable electronic device, mobile phone, and imaging device |
| JP2015213226A (en) * | 2014-05-02 | 2015-11-26 | コニカミノルタ株式会社 | Wearable display and display control program therefor |
| KR101648021B1 (en) * | 2014-11-28 | 2016-08-23 | 현대자동차주식회사 | Vehicle having gaze recognizing function and controlling method thereof, and gaze recognizing system |
| US10591735B2 (en) * | 2015-01-15 | 2020-03-17 | Sony Interactive Entertainment Inc. | Head-mounted display device and image display system |
| JP7207954B2 (en) * | 2018-11-05 | 2023-01-18 | 京セラ株式会社 | 3D display device, head-up display system, moving object, and program |
| CN109856796A (en) * | 2018-11-20 | 2019-06-07 | 成都理想境界科技有限公司 | Image source mould group, waveguide, near-eye display system and its control method |
| JP2022049247A (en) * | 2020-09-16 | 2022-03-29 | 株式会社テックジェーピー | Head-mounted image display device |
| JP2024109394A (en) * | 2023-02-01 | 2024-08-14 | トヨタ自動車株式会社 | Terminal equipment |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11196351A (en) * | 1997-12-27 | 1999-07-21 | Mr System Kenkyusho:Kk | Display device |
| JP2000059666A (en) * | 1998-08-07 | 2000-02-25 | Victor Co Of Japan Ltd | Image pickup device |
| JP2002090688A (en) * | 2000-09-12 | 2002-03-27 | Masahiko Inami | Sight-line direction dependent type retina display device |
| JP2002271691A (en) * | 2001-03-13 | 2002-09-20 | Canon Inc | Image processing method, image processing device, storage medium, and program |
| JP2004191962A (en) * | 2002-11-29 | 2004-07-08 | Brother Ind Ltd | Image display device |
| JP2006039359A (en) * | 2004-07-29 | 2006-02-09 | Shimadzu Corp | Head-mounted display device |
- 2005
  - 2005-04-26 JP JP2005128203A patent/JP2006308674A/en active Pending
- 2006
  - 2006-04-21 WO PCT/JP2006/308448 patent/WO2006118057A1/en not_active Ceased
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH11196351A (en) * | 1997-12-27 | 1999-07-21 | Mr System Kenkyusho:Kk | Display device |
| JP2000059666A (en) * | 1998-08-07 | 2000-02-25 | Victor Co Of Japan Ltd | Image pickup device |
| JP2002090688A (en) * | 2000-09-12 | 2002-03-27 | Masahiko Inami | Sight-line direction dependent type retina display device |
| JP2002271691A (en) * | 2001-03-13 | 2002-09-20 | Canon Inc | Image processing method, image processing device, storage medium, and program |
| JP2004191962A (en) * | 2002-11-29 | 2004-07-08 | Brother Ind Ltd | Image display device |
| JP2006039359A (en) * | 2004-07-29 | 2006-02-09 | Shimadzu Corp | Head-mounted display device |
Cited By (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009122550A (en) * | 2007-11-16 | 2009-06-04 | Panasonic Electric Works Co Ltd | Retina projection display device |
| JP2011232669A (en) * | 2010-04-30 | 2011-11-17 | Casio Comput Co Ltd | Display device |
| EP2590002A1 (en) * | 2011-11-04 | 2013-05-08 | Honeywell International Inc. | Steerable near-to-eye display and steerable near-to-eye display system |
| US8681426B2 (en) | 2011-11-04 | 2014-03-25 | Honeywell International Inc. | Steerable near-to-eye display and steerable near-to-eye display system |
| JP2015005972A (en) * | 2013-05-22 | 2015-01-08 | Telepathy Holdings Co Ltd | Wearable device having privacy protection function of captured image, control method thereof, and image sharing system |
| CN110199324A (en) * | 2017-01-31 | 2019-09-03 | Wacom Co Ltd | Display device and control method thereof |
| CN110199324B (en) * | 2017-01-31 | 2023-12-29 | Wacom Co Ltd | Display device and control method thereof |
| JP2018173661A (en) * | 2018-07-23 | 2018-11-08 | Asahi Kasei Corp | Optical device having spectacle lens, spectacles using the same, and spectacle type display device |
| CN109991746A (en) * | 2019-03-08 | 2019-07-09 | Chengdu Idealsee Technology Co Ltd | Image source module and near-eye display system |
| WO2021020069A1 (en) * | 2019-07-26 | 2021-02-04 | Sony Corp | Display device, display method, and program |
| US11854444B2 (en) | 2019-07-26 | 2023-12-26 | Sony Group Corporation | Display device and display method |
| JP2021144124A (en) * | 2020-03-11 | 2021-09-24 | University of Fukui | Retina scanning display device and image display system |
| WO2021181797A1 (en) * | 2020-03-11 | 2021-09-16 | University of Fukui | Retina scanning display device and image display system |
| US12174381B2 (en) | 2020-03-11 | 2024-12-24 | University Of Fukui | Image display device using retinal scanning display unit and image display system |
| JPWO2022054740A1 (en) * | 2020-09-09 | | | |
| JP7123452B2 | 2020-09-09 | 2022-08-23 | QD Laser Inc | Image projection device |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2006308674A (en) | 2006-11-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20230379448A1 (en) | | Head-mounted augmented reality near eye display device |
| WO2006118057A1 (en) | | Image display device |
| US9711072B1 (en) | | Display apparatus and method of displaying using focus and context displays |
| JP5167545B2 (en) | | Viewpoint detection device |
| US9711114B1 (en) | | Display apparatus and method of displaying using projectors |
| HK1245897A1 (en) | | Display apparatus and method of displaying using the display apparatus |
| JPH08313843A (en) | | Wide visual field and high resolution video presentation device in line of sight followup system |
| JPH0759032A (en) | | Picture display device |
| WO2012077713A1 (en) | | Method for detecting point of gaze and device for detecting point of gaze |
| CN114503011A (en) | | Compact retinal scanning device that tracks the movement of the pupil of the eye and uses thereof |
| WO2005063114A1 (en) | | Sight-line detection method and device, and three-dimensional view-point measurement device |
| JP4500992B2 (en) | | 3D viewpoint measuring device |
| TW201843494A (en) | | Display system with video see-through |
| JP2023539962A (en) | | System and method for superimposing virtual images on real-time images |
| JP5484453B2 (en) | | Optical devices with multiple operating modes |
| JPH0449943A (en) | | Eye ball motion analyzer |
| JP2006053321A (en) | | Projection observation device |
| JP3976860B2 (en) | | Stereoscopic imaging device |
| JP2507913B2 (en) | | Eye movement tracking type visual presentation device by projection method |
| JP2008227834A (en) | | Head-mounted video presentation device, head-mounted imaging device, and head-mounted video device |
| JP5962062B2 (en) | | Automatic focusing method and apparatus |
| JP2011158644A (en) | | Display device |
| JPH0638142A (en) | | Sight line tracing type head mount display device |
| JPH06326946A (en) | | Eye-ball motion follow-up type vision indicating device |
| JPH0758992A (en) | | Virtual reality device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | NENP | Non-entry into the national phase | Ref country code: RU |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 06745579; Country of ref document: EP; Kind code of ref document: A1 |