
WO2013118328A1 - Display device, electronic apparatus, and display device program - Google Patents

Display device, electronic apparatus, and display device program

Info

Publication number
WO2013118328A1
WO2013118328A1 (WO 2013/118328 A1); application PCT/JP2012/070093 (JP2012070093W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
display device
observer
displayed
eye
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2012/070093
Other languages
English (en)
Japanese (ja)
Inventor
堀川 嘉明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Priority to JP2013526016A (JP5496425B2)
Publication of WO2013118328A1
Anticipated expiration
Current legal status: Ceased

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 3/00: Simple or compound lenses
    • G02B 3/0006: Arrays
    • G02B 3/0037: Arrays characterized by the distribution or form of lenses
    • G02B 3/0056: Arrays characterized by the distribution or form of lenses arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
    • G: PHYSICS
    • G02: OPTICS
    • G02F: OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F 1/00: Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
    • G02F 1/01: the above, for the control of the intensity, phase, polarisation or colour
    • G02F 1/13: the above, based on liquid crystals, e.g. single liquid crystal display cells
    • G02F 1/133: Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
    • G02F 1/1333: Constructional arrangements; Manufacturing methods
    • G02F 1/1335: Structural association of cells with optical devices, e.g. polarisers or reflectors
    • G02F 1/133526: Lenses, e.g. microlenses or Fresnel lenses

Definitions

  • the present invention relates to a display device for displaying an image to an observer, an electronic apparatus including the display device, and a display device program executable by the display device.
  • For example, when watching the monitor of a car navigation system, the observer may be the driver. It is dangerous for the driver to put on or take off reading glasses while driving; in practice it is impossible. In another scene, it is troublesome for the observer to put on reading glasses every time the liquid crystal screen of a personal computer (PC) is observed. Accordingly, there is a demand for an electronic device whose monitor can be viewed without resorting to reading glasses.
  • Patent Document 1 shows a configuration example in which a Fresnel lens or a microlens is attached in front of an FPD serving as the monitor of a digital camera, and the FPD is viewed as through a loupe.
  • Patent Document 1 also shows a configuration in which a microlens array, with one microlens per pixel or per group of pixels, is used instead of the Fresnel lens.
  • With a microlens array, however, each microlens magnifies its pixel or group of pixels, so adjacent pixels are observed overlapping one another and a practical image cannot be obtained.
  • The present invention has been made in view of such circumstances, and an object of the present invention is to provide a practical, thin display device that can easily be brought into focus, and an electronic apparatus including the display device.
  • A touch panel is often superimposed on a flat panel display, such as a liquid crystal panel, to allow input through various displayed objects such as icons and switches.
  • On a flat panel display that displays a real image, the position of a visually recognized object does not change with the observation position. Since the position of each object in the displayed image is known, a touch on an object can be detected from the coordinates of the touch position detected by the touch panel.
  • In contrast, the position of an observed virtual image moves depending on the observation position (direction).
  • Although the size of the virtual image does not change with the observation distance, the apparent size of the touch panel does.
  • In other words, the relative position and relative size between the surface of the display device, on which the touch panel can be installed, and the displayed virtual image change.
  • Therefore, even if a touch panel is provided on the display device and the coordinates of the touch position are detected, the virtual image moves depending on the observation position, and the touched object cannot be identified on the touch panel.
  • Here, a display device that displays a virtual image is briefly explained.
  • Such a display device performs display by emitting light beams that behave as if they came from far behind its surface (for example, parallel beams from infinity).
  • The display therefore appears not on the surface of the display device but as a virtual image formed behind the display device.
  • To achieve this, a display device according to the present invention includes a lens array having a plurality of lenses and a plurality of display areas respectively corresponding to the lenses, and the same image is displayed in each of the plurality of display areas.
  • Another display device according to the present invention includes a display unit capable of displaying a virtual image farther away than the display surface as seen from the observer, and a touch panel that is provided on the surface of the display unit and outputs position information based on the touch position.
  • By directing light beams from infinity into the observer's pupil, the display device enables a person who cannot normally focus at the near point where the display device sits to view an in-focus display. For example, a presbyopic person can view a focused display without wearing (or removing) reading glasses.
  • In the display device according to the present invention, the optics pair each lens of the lens array with a display area, and because both the lens and the display area are minute, the optical path length between the display area and the lens can be made short. The display device can therefore be made thin.
  • the display device includes a display unit that can display a virtual image.
  • Further, the position of the observer's eyes relative to the display unit is detected based on the output of an imaging unit, and the position information output by the touch panel is related to the displayed content based on the detected eye position.
  • Thus, even a presbyopic person can operate the electronic device through the touch panel without wearing (or removing) reading glasses.
  • A sectional view of the display device according to the first embodiment.
  • A front view of the display device according to the second embodiment, and a diagram showing the arrangement of its lens array.
  • A diagram for explaining the rotation error between the display device according to the second embodiment and the lens array, and a diagram showing the control configuration for display on the display device according to the embodiment.
  • A perspective view of a digital camera (electronic device) having the display device according to the embodiment.
  • A flowchart of the automatic light-beam-direction adjustment process according to the third embodiment.
  • A flowchart of the display correction process according to the fourth embodiment, an illustration explaining how the appearance of the observed virtual image changes with the observation position, and a diagram showing the structure of the display device according to the fifth embodiment.
  • FIG. 1 is a perspective view showing a basic configuration of the first embodiment.
  • The display apparatus of this embodiment includes a lens array 1 and a display device 2 (a display panel).
  • A display 4 (the letter "A" in the figure) is presented in a display area, outlined by a broken line, at the position corresponding to each lens 3 of the lens array 1.
  • The same display is presented in the same way for each of the other lenses 3.
  • This display content is what the observer ultimately observes.
  • The broken line indicating one display area on the display device 2 is drawn only to aid understanding of the display area and does not actually exist.
  • The display 4 is projected to infinity as parallel light, indicated by arrow 5 in the figure, by the corresponding lens 3.
  • As many copies of the display as there are lenses 3 in the lens array 1 are shown on the display device 2.
  • The gaps between the lenses 3 of the lens array 1 are covered by a shield 8 to avoid stray light. Thanks to this shield 8, only the light flux passing through each lens 3 is emitted toward the observer, and the observer can observe a clear image.
  • Reference numeral 6 denotes the lens of the observer's eye.
  • Reference numeral 7 denotes the image formed on the retina of the eye.
  • The eye lens 6 is easily focused at infinity, so the observer can observe an in-focus image when the parallel light of arrow 5 is condensed onto the retina.
  • By using the lens array 1, in which the lenses 3 are arranged in a plane, the light beam emitted from the display apparatus is effectively widened. As a result, the range from which the display can be observed is expanded.
  • Since the focal length of the lenses 3 that project the display 4 can be short, the display apparatus can be made thin.
  • FIG. 2 shows a cross-sectional view of the display device of FIG. 1 as viewed from the side.
  • In practice, the lens array 1 is formed by arranging a very large number of lenses 3 in the vertical direction and in the direction perpendicular to the page.
  • The image height from the center to the edge of the display 4 shown in each display area is indicated by an arrow.
  • The display 4 shown in each display area is projected to infinity by the lens 3 corresponding to that area; that is, the light beam emitted from each lens 3 is converted into parallel light 5.
  • Since every display 4 is exactly the same, the beams emitted from the individual lenses 3 combine into one large parallel light beam 9 or 10 for the image.
  • The luminous flux from the pixel at the center of the display is 9, and the luminous flux from the uppermost pixel is 10.
  • In effect, the entire lens array 1 serves as the pupil of the display optical system of the display apparatus.
  • A microscope, by contrast, is used by looking through its eyepiece. The pupil of its optical system is formed only in the vicinity of the eyepiece, so to observe the subject the eye lens must be aligned with the pupil position of the optical system; the observer must peer into the eyepiece. Since the light beam emitted from the eyepiece can be adjusted to parallel light, equivalent to light coming from infinity, a presbyopic person can easily observe a focused microscopic image.
  • In the present display apparatus, one large parallel light beam 9 or 10 is formed so that the entire lens array 1 becomes the pupil of the optical system, so the observer does not need the peering operation that a microscope requires.
  • The display can therefore be seen from a position away from the display apparatus.
  • FIG. 3 is also a cross-sectional view seen from the side for simplicity.
  • Each display 4 is projected at infinity by the corresponding lens 3.
  • This figure shows a state in which a light beam 11 from four of the lenses is incident on the lens 6 of the observer's eye.
  • This light beam 11 is parallel light, and can be focused by the lens 6 of the observer's eye that is focused only at infinity.
  • the display 4 displayed in each display area is projected onto the retina of the observer as one focused image 7.
  • If each lens 3 is too small, the light beam emitted from it spreads by diffraction, and parallel light cannot be obtained. Presbyopic people cannot focus on near points because the accommodation ability of the eye is reduced. If, however, a presbyopic person is assumed to be able to focus from 2 m away, then, taking the diameter of the eye's pupil as 3 mm, that person can tolerate a light beam with a divergence angle of 3 mm / 2 m [rad].
  • Accordingly, the diameter of each lens 3 is preferably 0.3 mm or more (a rough check of this figure follows below).
  • Note also that the angular resolution of a person with a visual acuity of 0.5 is about 2 arcminutes.
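  • As a rough plausibility check (our assumption, taking a visible wavelength of about 0.5 µm and the simple diffraction estimate that an aperture D spreads a beam by roughly λ/D), the tolerable divergence and the resulting minimum lens diameter work out as:

```latex
\theta_{\max} = \frac{3\,\mathrm{mm}}{2\,\mathrm{m}} = 1.5\times10^{-3}\,\mathrm{rad},
\qquad
D \gtrsim \frac{\lambda}{\theta_{\max}}
  \approx \frac{0.5\,\mu\mathrm{m}}{1.5\times10^{-3}}
  \approx 0.33\,\mathrm{mm},
```

  • which is consistent with the stated preference for a lens diameter of 0.3 mm or more.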
  • FIG. 4 is a perspective view showing a basic configuration of a display device according to the second embodiment.
  • The display apparatus of the present embodiment includes a lens array 1 and a display device 2.
  • A display 4 is presented in a display area, outlined by a broken line, at the position corresponding to each lens 3 of the lens array 1.
  • The same display 4 is likewise presented for each of the other lenses 3.
  • This display content is what the observer ultimately observes.
  • The broken line indicating one display area on the display device 2 is drawn only to aid understanding and does not actually exist.
  • The display 4 is projected to infinity as parallel light by the lens 3, as indicated by arrow 5 in the figure.
  • As many copies of the display as there are lenses 3 in the lens array 1 are shown on the display device 2.
  • The display apparatus of this embodiment differs from that of the first embodiment in the shape of the lenses 3 of the lens array 1.
  • The shape of each lens 3 in the present embodiment matches that of the display area; that is, viewed from the observer side, each lens has substantially the same rectangular shape as its display area.
  • The lens array 1 is configured by arranging the lenses 3 side by side in the vertical and horizontal directions. In the second embodiment, this configuration eliminates the need for the light shielding used in the first embodiment to remove stray light, and realizes a bright display.
  • Reference numeral 6 denotes the lens of an observer's eye.
  • Reference numeral 7 denotes the image formed on the retina of the eye.
  • The eye lens 6 focused at infinity, like the eye of a presbyopic or hyperopic person, can condense the parallel light of arrow 5 onto the retina; that is, an in-focus image can be observed.
  • It is desirable that the display 4 be projected straight ahead by the lens 3, that is, in the direction perpendicular to the display device. For this purpose, the display 4 should be positioned directly beneath its lens 3.
  • the operation of the second embodiment is the same as that of the first embodiment described with reference to FIGS.
  • FIG. 5 shows a front view of the display device 2 used in the display device of the present embodiment.
  • As the display device 2, a so-called flat panel display device such as a well-known organic EL, liquid crystal, or electronic paper device can be used.
  • Here, a display device with 4000 horizontal pixels and 2000 vertical pixels, capable of displaying 8 million pixels (a so-called 4K×2K display), is used.
  • Its size is 60 mm × 30 mm, a standard size for the monitor of a mobile phone or digital camera.
  • The pixel pitch is 15 µm, and the pixel size is approximately 15 µm × 15 µm.
  • The display areas of the present embodiment are obtained by dividing the display range of this single display device 2 into a plurality of parts. As shown in the figure, the same display 4 ("ABC" in the figure) is displayed repeatedly. The broken line indicating one display area on the display device 2 is drawn only to aid understanding and does not actually exist; it exists only in the data, as the boundary between the identical displays supplied to the display device 2.
  • The size of one display area is 3 mm × 1.5 mm; that is, there are 20 × 20 display areas horizontally and vertically. (In the figure they are drawn as 10 × 10 for legibility.) The number of pixels in one display area is 200 × 100; these figures are checked in the sketch below.
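  • The numbers above are mutually consistent, as this minimal sketch checks (Python; the variable names are ours, the values come from the text):

```python
# Geometry of the second embodiment, taken from the text above.
panel_w_mm, panel_h_mm = 60.0, 30.0   # display device 2
pitch_um = 15.0                       # pixel pitch
area_w_mm, area_h_mm = 3.0, 1.5       # one display area (= one lens 3)

areas_x = panel_w_mm / area_w_mm      # 20.0 -> 20 x 20 display areas
areas_y = panel_h_mm / area_h_mm      # 20.0
px_x = area_w_mm * 1000 / pitch_um    # 200.0 -> 200 x 100 pixels per area
px_y = area_h_mm * 1000 / pitch_um    # 100.0

# Perceived pixel size: one display area is seen spread over the whole panel.
print(panel_w_mm / px_x)              # 0.3 (mm), as stated for FIG. 7
```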
  • FIG. 6 shows the correspondence between the lens array 1 and the display device 2.
  • FIG. 7 shows a display device of the present invention in which the lens array 1 and the display device 2 are overlapped.
  • The observed display is the image "ABC" of the display 4 shown in one display area, with 200 × 100 pixels. Since the observer sees the display 4 as if it were spread over the entire display device 2, the perceived pixel size is 0.3 mm.
  • Translating the lens array 1 and the display device 2 relative to each other horizontally or vertically has no significant effect on performance.
  • The rotation error θ in FIG. 8 is the rotation angle of the display device 2 or the lens array 1 with respect to their ideal arrangement, for example the angle formed between the long side of the display device 2 and the long side of the lens array 1.
  • It is preferable to suppress the rotation error θ so that the resulting displacement is no more than one display area, that is, θ [rad] ≤ (vertical length a of the display area) / (horizontal length B of the display range of the display device 2). This prevents a significant shift in the correspondence between the display areas of the display device 2 and the lenses 3 of the lens array 1. A worked example follows.
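  • Plugging in the dimensions of this embodiment (assuming a is the 1.5 mm display-area height and B the 60 mm panel width given above) gives a feel for the required alignment tolerance:

```latex
\theta \;\le\; \frac{a}{B} \;=\; \frac{1.5\,\mathrm{mm}}{60\,\mathrm{mm}}
       \;=\; 0.025\,\mathrm{rad} \;\approx\; 1.4^{\circ}.
```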
  • As the display device 2, an organic EL display device with 6000 horizontal pixels and 8000 vertical pixels (48 million pixels in total) can also be used.
  • Super Hi-Vision technology with 32 million pixels, the so-called 8K×4K display, has been developed and has become practical.
  • Here the dimensions are set to 120 mm wide by 160 mm long, for use in electronic devices such as electronic book readers.
  • The pixel pitch is 20 µm, and the pixel dimensions are approximately 20 µm × 20 µm.
  • The size of one display area is 6 mm × 8 mm.
  • The number of pixels in one display area is 300 × 400.
  • The size of each lens 3 is likewise 6 mm × 8 mm, and the lens array 1 consists of 20 × 20 lenses 3.
  • The observed display has 300 × 400 pixels.
  • The pixel size perceived by the observer is 0.4 mm.
  • FIG. 9 shows a control configuration when displaying on the display device 2 for each of the embodiments described above.
  • The data to be displayed on the display device 2 is formed by an image processing unit that duplicates the image or data to be displayed.
  • That is, the input original data ("ABC") is duplicated into a single image in which the required number of copies are arranged vertically and horizontally, and this is displayed on the display device 2, as sketched below.
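  • The duplication step amounts to tiling one display-area image; a minimal sketch in Python with NumPy (the function name and shapes are illustrative, not from the patent):

```python
import numpy as np

def tile_for_lens_array(area_image: np.ndarray, n_rows: int, n_cols: int) -> np.ndarray:
    """Replicate one display-area image n_rows x n_cols times, producing
    the full frame sent to the display device 2."""
    reps = (n_rows, n_cols, 1) if area_image.ndim == 3 else (n_rows, n_cols)
    return np.tile(area_image, reps)

# Second-embodiment numbers: a 200 x 100-pixel area, 20 x 20 areas -> 4000 x 2000 frame.
area = np.zeros((100, 200, 3), dtype=np.uint8)  # one "ABC" image (H x W x RGB)
frame = tile_for_lens_array(area, 20, 20)
print(frame.shape)  # (2000, 4000, 3)
```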
  • FIG. 10 is a perspective view showing a digital camera (electronic device) 12 employing the display device according to the embodiment of the present invention.
  • As the monitor 13 of the digital camera 12, a display device having the lens array 1 and the display device 2 of the embodiments described above is used.
  • Even a presbyopic person can therefore operate the digital camera 12 precisely, pressing the control button 15 or the like while viewing the GUI (graphical user interface) displayed on the monitor 13.
  • The display device of the present embodiment can be used not only in such a digital camera but also in electronic devices such as mobile phones and electronic book readers, so that even a presbyopic person can use the full functionality of the device.
  • FIG. 11 is a diagram for explaining image missing due to an observation position in the display device according to the above-described embodiment.
  • When the observer is directly in front of the display device, the light beam 9 from the center pixel of the display, the light beam 10 from an end pixel, and the light beam 20 from the pixel at the opposite end all enter the lens of the observer's eye. The observer can therefore see the entire display content.
  • When the observer is at position 18, slightly off the front of the display device, the light flux 9 from the center pixel and the light flux 20 from one end pixel still enter the observer's eye,
  • but the luminous flux 10 from the pixel at the opposite end does not enter the eye lens. The observer therefore cannot see the part of the displayed image formed by the light beam 10.
  • When the observer's eye is at position 19,
  • neither the light beam 9 from the center pixel nor the light beam 10 from the end pixel enters the eye lens, and only the display near the pixels formed by the light beam 20 can be seen. That is, the entire display content cannot be viewed except from the front of the display device; the observer must move the face to bring the eyes in front of it.
  • In other words, the display device according to the present invention has a narrower observation range than a normal display device.
  • FIG. 12 shows the solution.
  • Each display 4 shown on the display device 2 is projected to infinity by its corresponding lens 3; that is, the light beam emitted from each lens 3 is parallel light 5.
  • Since the display 4 in every display area is the same, the beams emitted from the individual lenses combine into one large parallel light beam 9 or 10.
  • The luminous flux from the center pixel of the display 4 is 9, and the luminous flux from the uppermost pixel is 10.
  • Compared with FIG. 2, the display 4 here is shifted slightly downward.
  • As a result, the light beam 9 from the center pixel of the display 4 is emitted obliquely upward rather than straight ahead of the display device,
  • and the light beam 10 is likewise shifted upward. That is, by shifting the display 4, the emission direction of the parallel light leaving the lens array 1 can be changed.
  • FIG. 13 shows the emission direction changed upward from the state of FIG. 11 using this emission-direction changing function.
  • Even when the observer is at position 18, slightly off the front of the display device, all of the luminous fluxes 9, 10, and 20 can then enter the lens of the observer's eye, which makes it possible to observe the entire display content.
  • When the observer's eye is at position 19, the light beam 10 does not enter the eye, but all the beams can be made to enter by further increasing the shift amount of the display 4.
  • The shift of the display 4 can be adjusted based on input from the observer:
  • using input means connected to the display device, the observer adjusts the direction of the luminous flux until the entire image can be observed.
  • Alternatively, the direction of the light beam may be adjusted automatically according to the location of the observer, that is, the positional relationship between the display device and the observer.
  • The position of the observer's face or eyes is detected with a sensor, and the direction is adjusted automatically so that the light beam is directed toward the observer.
  • Various sensors, such as an infrared sensor or an ultrasonic sensor, can be used.
  • In a device such as a mobile phone, a camera provided for videophone use can also serve this purpose.
  • The digital camera described with reference to FIG. 10 can realize this by providing a detection camera 14 beside the monitor 13.
  • FIG. 14 is a flowchart showing an automatic adjustment process of the light beam direction.
  • The observer's face is photographed with the camera (S101), and the direction of the face relative to the display device is determined by detecting the face in the captured image (S102).
  • Next, the display shift amount needed to emit the light beam in that direction is calculated (S103), and the display is actually shifted (S104).
  • The display is shifted left, right, up, or down by the necessary amount according to the position of the face, as sketched below.
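  • A minimal sketch of the geometry behind S101-S104 (Python; the focal length and helper names are our illustrative assumptions, since the patent gives no focal length): shifting the repeated display by d within each display area tilts the emitted parallel beam by roughly atan(d / f) for a lens of focal length f.

```python
import math

PIXEL_PITCH_MM = 0.015  # 15 um pixel pitch (second embodiment)
LENS_FOCAL_MM = 3.0     # assumed microlens focal length (not given in the text)

def shift_pixels(face_angle_x_rad: float, face_angle_y_rad: float) -> tuple[int, int]:
    """Per-display-area shift (in pixels) that steers the emitted parallel
    light toward the detected face: shift = f * tan(angle) / pixel_pitch."""
    dx = LENS_FOCAL_MM * math.tan(face_angle_x_rad) / PIXEL_PITCH_MM
    dy = LENS_FOCAL_MM * math.tan(face_angle_y_rad) / PIXEL_PITCH_MM
    return round(dx), round(dy)

# Face detected 5 degrees to the right of and 2 degrees above the display normal:
print(shift_pixels(math.radians(5), math.radians(-2)))  # (17, -7)
```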
  • An example in which the display of the display device 2 is shifted is shown in FIG.
  • the display shown in FIG. 5 is shifted to the lower right.
  • a broken line indicating one display area in the display device 2 is a description for helping understanding of the display area, and does not actually exist.
  • In this way, the emission direction of the parallel light is changed by changing the positional relationship between the display 4 shown on the display device 2 and the lenses 3.
  • The direction of the parallel light can therefore be changed not only by shifting the image displayed on the display device 2 but also by physically shifting the display device 2 or the lenses 3.
  • In that case, the emission direction of the parallel light is changed by physically moving the display device 2 or the lens array 1.
  • a touch panel is mounted on the display device according to the present embodiment.
  • The touch panel 29 is provided, for example, on the display device of this embodiment shown in the figure.
  • The difference from a normal FPD is that a normal FPD displays information on its surface, whereas the display device according to the present invention has a lens array on its surface and displays the information as a virtual image behind it.
  • When a switch or icon is displayed on the surface of a normal FPD, the touched location on the touch panel corresponds directly to the displayed switch or icon.
  • The touch panel detects the coordinates (position data) of the touched location; by matching these against the coordinates of the display (display position data), the touched position and the displayed image can be related one-to-one.
  • However, the image displayed in the present embodiment is a virtual image and is not fixed on the surface of the display device.
  • The appearance of the observed virtual image changes with the distance from the display device 2 to the observer, so the correspondence between positions in the displayed virtual image and the coordinates of the touch panel changes.
  • Moreover, part of the observed virtual image may be cut off. This is described in detail with reference to FIGS. 16 and 17.
  • FIG. 16 shows a change in incident light flux due to a change in observation distance.
  • When the observer is near, the size of the virtual image is bounded by the light beam 32 (part of the light beam 10 in FIG. 2) and the light beam 33; the displayed virtual image is therefore observed at the size of the display surface of the display device (FIG. 17A).
  • When the observer is farther away, the size of the virtual image is determined by the light beams 35 and 36, and the image is observed smaller than the display surface of the display device (FIG. 17B).
  • The size of the visible virtual image itself does not change; the display surface merely looks different because the viewing distance differs. Further, when the display device 30 is viewed from position 37, the luminous fluxes 38 and 39 do not reach the eye, so only part of the displayed virtual image is observed (FIG. 17C). As described above, where the virtual image appears behind the lens array, relative to the touch panel, differs depending on the position of the observer, specifically the distance from the display device to the observation position; the position to be touched therefore cannot be determined.
  • As in the embodiment described above, the display 4 is shifted after identifying the direction of the observer.
  • The coordinates of the display can thereby be matched with the coordinates of the touch panel 29, so switches and icons can be displayed in registration with the coordinates to be touched on the touch panel 29. Alternatively, by identifying the relative position between the touch panel 29 and the virtual image seen behind it, it can be determined which place in the display was touched. Furthermore, when part of the display is cut off, as in FIG. 17C, the entire display can be observed by reducing its size (FIG. 17D).
  • FIG. 18 shows a flow showing the display correction process.
  • The observer's face is photographed with the camera (S201); the face is detected in the captured image, and the distance and direction from the camera (that is, from the display device) to the face are determined from the face's position and size in the image (S202).
  • Next, the display shift amount needed to emit the light beam in the determined direction and the display size corresponding to the distance from the display device to the face are calculated (S203).
  • The display is then shifted by the calculated amount, and the image is converted to the calculated size (S204).
  • By transforming the coordinate system of the touch panel 29 in accordance with this display correction processing, consistency can be achieved between the displayed virtual image and the input position (contact detection position) on the touch panel 29. A sketch follows.
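  • A minimal sketch of S201-S204 and the matching coordinate transform (Python; the shrink law, reference distance, and focal length are illustrative assumptions, not from the patent):

```python
def display_correction(face_dir_xy, face_dist_mm, f_mm=3.0, ref_dist_mm=500.0):
    """Return (shift_mm, scale). The shift steers the emitted beams toward the
    face (S203); the scale shrinks the display for distant observers so the
    virtual image is not cut off (FIG. 17D)."""
    shift_mm = (f_mm * face_dir_xy[0], f_mm * face_dir_xy[1])  # small angles: tan(a) ~ a
    scale = min(1.0, ref_dist_mm / face_dist_mm)               # assumed shrink law
    return shift_mm, scale

def touch_to_image(touch_xy_mm, shift_mm, scale):
    """Invert the display transform so a touch-panel coordinate can be compared
    with object positions in the original (unshifted, unscaled) image."""
    return ((touch_xy_mm[0] - shift_mm[0]) / scale,
            (touch_xy_mm[1] - shift_mm[1]) / scale)
```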
  • Knowing the distance to the observer, the appearance of the observed virtual image, that is, the location and size of the virtual image with respect to the display surface (touch panel 29), can be calculated. It is therefore possible to determine which part of the display was touched on the touch panel 29, and input using objects such as icons and switches displayed as a virtual image becomes possible.
  • In this embodiment, by maintaining consistency between the objects displayed in the observed virtual image and the contact detection position, input can be performed by pointing at a switch or icon displayed in the virtual image.
  • In other words, by aligning the observed virtual image with the coordinates of the touch panel, touch input can be performed on a virtual image whose relative position and relative size change with the observation position.
  • Alternatively, the position and size of the displayed virtual image may be left unchanged, and the touch position based on the position information output from the touch panel may be related to the relative position of the displayed virtual image.
  • Note that the display device of the present embodiment may take any form that can display a virtual image farther away than the display surface as seen from the observer; it is not limited to the lens-array forms described in the first to third embodiments.
  • When the relative position between the observer and the display device changes, the relative position and relative size of the image seen in the opening of the display device change. Therefore, even if the coordinate position touched by the observer is detected by the touch panel, it cannot be determined where in the image the touch landed: even if an object such as an icon shown in the image is touched, the coordinate position detected by the touch panel does not necessarily match the coordinate position of the displayed object. Without such correction, a touch panel cannot be used with a display device that displays a virtual image.
  • This is described in detail with reference to FIG. 19.
  • In FIG. 19A, the virtual image 142 displayed in the distance is seen at the center of the display device 141 (143).
  • As shown in FIG. 19B, when the observer's eye 140 moves to the left, the distant virtual image 142 stays in front of the eye 140, so the visible image (143) appears to move toward the left side of the display device 141. That is, within the observation frame of the display device 141, the image (143) observed by the observer moves as the observer's eye 140 moves.
  • When the formed virtual image is located 20 cm or more behind the display surface of the display unit (at most, at infinity), that is, far from the observer, the problem described above becomes conspicuous.
  • The present embodiment is therefore preferably applied to a display device having such a display unit.
  • In the present embodiment, the relative position and relative size between the opening of the virtual-image display device 101, that is, the effective area of the touch panel, and the image displayed as a virtual image are identified.
  • The dominant eye of the observer can be set in advance through the input unit of the display device, or determined by the detected-eye determination process described later.
  • FIG. 20 shows a control configuration of the display device of the present embodiment.
  • the display device of the present embodiment includes a touch panel, a detection camera (imaging unit) that detects an observer, and a display unit that displays a virtual image (not shown).
  • a control unit including a face recognition circuit, an eye position detection circuit, a touch position specifying circuit, an electronic device control circuit, and the like is provided.
  • Although the control unit of the present embodiment is configured from a plurality of circuits, these circuits may also be implemented as appropriate by a CPU or the like.
  • This embodiment can also be provided as a display device program executable by the control unit.
  • the detection camera (imaging unit) is provided in the housing of the display device at a position where the face of the observer who observes the display unit can be imaged.
  • the face recognition circuit detects the face of the observer in the image information captured by the detection camera.
  • the eye position detection circuit obtains the position (direction and distance) of the observer's eyes relative to the display device from the observer's face. Note that the distance to the observer can also be specified using the focal position of the detection camera.
  • the touch panel outputs touch position information to the touch position specifying circuit based on an operation (touch input) by an observer.
  • The touch position specifying circuit calculates the relative position between the virtual image observed by the observer and the touch panel, based on the eye position detected by the eye position detection circuit, and determines which part of the display (image) was touched.
  • The specified touch position information is input to the electronic device control circuit, and the various processes designated by that position information are executed.
  • In S404, the relationship between the image and the coordinates of the touch panel is constructed, and the part of the image corresponding to the position touched on the touch panel is identified (touch position detection).
  • Through the detection process of S401 and S402 and the identification process of S403 and S404, consistency between the displayed image and the touch position intentionally input by the observer can be ensured even when the observation position of the observer fluctuates.
  • Both eyes may be detected and their intermediate position used as the eye position.
  • However, since the opening of the display unit is small, as described above, the observer often observes with either the right or the left dominant eye. It is therefore preferable to use the dominant eye in the detection process and the identification process.
  • The dominant-eye setting can be entered from the input unit provided on the display device.
  • FIG. 22 shows detected eye determination processing executed by the control unit of the display device.
  • This detected eye determination process is a process executed by the control unit as a setting function of the display device.
  • First, the observer is photographed with the camera, and the positions of the left and right eyes (their relative position, that is, distance and direction with respect to the display device) are obtained (S501, S502).
  • Next, the relative position and size relationship between the image visible to the observer and the opening (touch panel) of the display device is determined (S503).
  • A predetermined mark for dominant-eye detection is then displayed on the display unit (S504). The mark may be about the size of an icon or numeric-keypad key displayed as a switch in ordinary use, but the smaller the mark, the more accurately the dominant eye can be detected.
  • The contact position at which the observer touches the mark is detected (S505).
  • The mark display positions calculated from the positions of the left and right eyes are compared with the position information output from the touch panel (S506). It is determined which of the mark positions observed by the left and right eyes matches the touch panel output, and the matching eye is set as the dominant eye in the touch position specifying circuit (S507).
  • In this way, the observer can set the dominant eye on the display device without being aware of which eye it is, and position information can then be corrected according to the dominant eye. A sketch of this comparison follows.
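  • A minimal sketch of the comparison in S506-S507 (Python; the projection helper and tolerance are our illustrative assumptions: each eye sees the virtual-image mark projected onto the touch-panel plane along its own line of sight):

```python
def project(eye_xyz, mark_xyz):
    """Intersect the line from the eye (z > 0, in front of the panel) to the
    virtual-image mark (z < 0, behind the panel) with the panel plane z = 0."""
    ex, ey, ez = eye_xyz
    mx, my, mz = mark_xyz
    t = ez / (ez - mz)            # ray parameter where z crosses 0
    return (ex + t * (mx - ex), ey + t * (my - ey))

def dominant_eye(left_eye, right_eye, mark, touch_xy, tol_mm=3.0):
    """Compare the touch with the mark position as seen by each eye (S506)
    and return the matching eye (S507)."""
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
    d_left = dist(project(left_eye, mark), touch_xy)
    d_right = dist(project(right_eye, mark), touch_xy)
    if min(d_left, d_right) > tol_mm:
        return None               # neither eye explains the touch
    return "left" if d_left < d_right else "right"
```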
  • In the embodiment described above, the position information output from the touch panel is interpreted without changing the position or size of the displayed virtual image.
  • Alternatively, the position and size of the displayed virtual image may be changed so that the entire virtual image, or a predetermined area within it, is visible to the observer.
  • In that case, the control unit of the display device executes a visual-field adjustment process that changes at least one of the position and size of the image shown on the display unit, based on the detection process (S401, S402, and so on in FIG. 21) that detects the position of the observer's eyes.
  • In this way, the input position (touch position) on the touch panel is optimized. The display unit used in these embodiments is not limited to the lens-array type described in the first to third embodiments; any display unit capable of displaying a virtual image is applicable.
  • In the following, further forms of display unit capable of displaying a virtual image are proposed.
  • FIG. 23 shows an example (sixth embodiment) of a virtual image display type display unit using pupil synthesis.
  • Think of reading a newspaper with a loupe: you are looking at a virtual image magnified by the loupe.
  • However, such a system is bulky, and a flat panel display (hereinafter, FPD) cannot be built this way as it is. Consider instead enlarging the image shown on a small display 120, a few millimeters square, with a small lens 121.
  • In that case the aperture of the lens 121 is small, and one would have to peer through it with the eye pressed close; it could not be viewed from a distance like a normal FPD. Therefore, the light beam emitted from the lens 121 is made incident on an optical waveguide 118, through which it travels by total internal reflection.
  • The hologram 119 diffracts part of the light traveling through the optical waveguide 118 by total internal reflection and emits it outside the waveguide. Since the display 120 is at the focal position of the lens 121, the light emitted from the center 122 of the image shown on the display 120 enters the waveguide 118 as a parallel beam; it is then totally reflected within the waveguide, and part of it is emitted from the waveguide as a parallel beam 123 by the hologram 119.
  • The light not diffracted by the hologram 119 continues to be totally reflected inside the waveguide 118, and at each pass part of it is emitted out of the waveguide as a parallel beam by the hologram 119. As a result, a thick parallel beam is emitted in the direction 125 from almost the entire surface 124 of the waveguide.
  • Similarly, the light emitted from the end 126 of the image shown on the display becomes a parallel beam and enters the waveguide 118.
  • It then travels through the waveguide 118 by total reflection, and part of it is emitted as a parallel beam by the hologram 119.
  • Its direction is 127.
  • The light beam from the opposite end of the display is likewise emitted from the waveguide 118 as a parallel beam in the direction 128. That is, copies of the small pupil of the lens 121 are synthesized across the surface 124 of the waveguide 118, and a large parallel beam is emitted. The observer can thus see the image shown on the display 120 from a position away from the waveguide 118, and because the light is a parallel beam, even a presbyopic person can see the displayed image.
  • FIG. 23 shows a unidirectional waveguide; by itself such a waveguide forms an elongated display portion. By combining two or more optical waveguides with different directions, a display portion with a normal aspect ratio can be formed. This is shown in FIG. 24.
  • The optical waveguide 118 guides light in the direction 130 and couples it into the optical waveguide 129.
  • The optical waveguide 129, located below it, guides the light in the direction 131.
  • In this way, the small pupil of the lens 121 can be synthesized into a large pupil: a parallel beam is emitted in the direction 133 from the entire surface 132 of the waveguide 129. As with a loupe, the observer is viewing a virtual image.
  • Moreover, the system is only as thick as a very thin optical waveguide.
  • FIGS. 25 and 26 show a virtual-image display type display unit according to the seventh embodiment.
  • The display unit of the present embodiment is an example applied to a digital camera (electronic device) whose monitor 151 (display) is viewed through a loupe 152.
  • An optical path through which the monitor 151 of the digital camera 150 is viewed with the loupe 152 is formed by two mirrors 153 and 154.
  • FIG. 26A shows an optical path when the digital camera 150 is viewed from the side during use.
  • the light emitted from the monitor 151 is reflected by the mirror 154, further reflected by the mirror 153, converted into a parallel light beam by the loupe 152, and enters the observer's eyes.
  • FIG. 26B shows the state in which the loupe 152 and the mirrors 153 and 154 are stowed, making the camera compact to carry.
  • a touch panel 158 is provided on the surface of the loupe 152.
  • The observer's eye is photographed by the detection camera 155, and the eye position, that is, its direction and distance, is detected. This makes it possible to determine at what position and size the virtual image is observed relative to the opening of the loupe 152, and hence to grasp its relative position and size with respect to the coordinate axes of the touch panel 158 provided on the surface of the loupe 152.
  • The observer can touch the touch panel 158 while viewing the image displayed as a virtual image. That is, the digital camera 150 can be operated through a GUI (graphical user interface) displayed as a virtual image: a desired shooting mode can be selected and the shutter 156 pressed to take a picture.
  • When at least one of the position and size of the virtual image is changed based on the position of the observer, the observer can see the entire displayed virtual image. Specifically, when the observer is roughly in front of the loupe 152, a virtual image is observed at the center of the loupe 152. Even when the observer moves away from the front of the loupe 152, the virtual image can still be observed near the center of the loupe 152 by changing at least one of the position and size of the image to follow the position of the observer's eyes.
  • A method is also conceivable in which the position of the user's finger is detected and the object pointed at among the objects displayed on the display device is identified. That is, in place of a touch panel, the finger position near the display surface of the display device is determined in a non-contact manner by detecting the position and movement of the finger near the display opening. With this method, the displayed object corresponding to the position on the display surface pointed at by the user can be identified.
  • For this purpose, a camera that images the vicinity of the display opening of the display device is provided.
  • This camera may double as the camera, described above, that detects the position of the observer's eye.
  • The user can thus operate the intended object even when the observation position changes.
  • Note that the touch panel in the present invention includes input means that detects a position on the display surface of the display device in a non-contact manner as described above, and the touch position in the present invention includes a position on the display surface specified in such a non-contact manner.
  • Reference numerals: 1 ... lens array; 2 ... display device; 3 ... lens; 4 ... display (display area); 5 ... parallel light; 6 ... lens of the observer's eye; 7 ... image formed on the retina; 8 ... shield; 29 ... touch panel; 118 ... optical waveguide; 119 ... hologram; 120 ... display; 121 ... lens; 122 ... center of the displayed image; 123 ... parallel light beam; 124 ... surface of the optical waveguide; 125, 127, 128 ... emission directions of parallel light beams; 126 ... image end; 129 ... optical waveguide; 130, 131 ... waveguide directions; 132 ... surface of the optical waveguide; 133 ... emission direction of the parallel light beam; 140 ... observer's eye; 141 ... display device; 142 ... virtual image; 143 ... image; 150 ... digital camera; 151 ... monitor (display); 152 ... loupe; 153, 154 ... mirrors; 155 ... detection camera; 156 ... shutter; 157 ... emitted light beam; 158 ... touch panel.

Landscapes

  • Physics & Mathematics (AREA)
  • Nonlinear Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Mathematical Physics (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
  • Liquid Crystal (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
PCT/JP2012/070093 2012-02-07 2012-08-07 Display device, electronic apparatus, and display device program Ceased WO2013118328A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013526016A JP5496425B2 (ja) 2012-02-07 2012-08-07 表示装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012024025 2012-02-07
JP2012-024025 2012-02-07

Publications (1)

Publication Number Publication Date
WO2013118328A1 true WO2013118328A1 (fr) 2013-08-15

Family

ID=48947123

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/070093 Ceased WO2013118328A1 (fr) Display device, electronic apparatus, and display device program 2012-02-07 2012-08-07

Country Status (2)

Country Link
JP (2) JP5496425B2 (fr)
WO (1) WO2013118328A1 (fr)

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3900578B2 (ja) * 1997-03-14 2007-04-04 ソニー株式会社 追従型虚像視ディスプレイシステム
JPH10319342A (ja) * 1997-05-15 1998-12-04 Olympus Optical Co Ltd 眼球投影型映像表示装置
JP2004168230A (ja) * 2002-11-21 2004-06-17 Nissan Motor Co Ltd 車両用表示装置
JP2005138755A (ja) * 2003-11-07 2005-06-02 Denso Corp 虚像表示装置およびプログラム
JP4367212B2 (ja) * 2004-04-15 2009-11-18 株式会社デンソー 虚像表示装置およびプログラム
JP4370997B2 (ja) * 2004-07-29 2009-11-25 株式会社島津製作所 頭部装着型表示装置
JP2006126280A (ja) * 2004-10-26 2006-05-18 Konica Minolta Photo Imaging Inc 透過式映像表示装置及び透過式映像表示装置注文システム
JP5145710B2 (ja) * 2006-12-18 2013-02-20 日本精機株式会社 ヘッドアップディスプレイ装置
JP5145832B2 (ja) * 2007-09-12 2013-02-20 株式会社島津製作所 頭部装着型表示装置及び頭部装着型表示装置システム
JP4657331B2 (ja) * 2008-08-27 2011-03-23 富士フイルム株式会社 3次元表示時における指示位置設定装置および方法並びにプログラム
JP4702437B2 (ja) * 2008-11-25 2011-06-15 トヨタ自動車株式会社 車両用表示装置
JP5117418B2 (ja) * 2009-01-28 2013-01-16 株式会社東芝 情報処理装置及び情報処理方法
JP2010262232A (ja) * 2009-05-11 2010-11-18 Konica Minolta Opto Inc 映像表示装置およびヘッドマウントディスプレイ
JP5476910B2 (ja) * 2009-10-07 2014-04-23 株式会社ニコン 画像生成装置、画像生成方法、および、プログラム
JP5558094B2 (ja) * 2009-10-07 2014-07-23 オリンパス株式会社 表示方法、表示装置、光学ユニット、表示装置の製造方法、及び電子機器
JP5749444B2 (ja) * 2010-03-16 2015-07-15 オリンパス株式会社 表示装置、電子機器、携帯用電子機器、携帯電話、及び撮像装置
US9164621B2 (en) * 2010-03-18 2015-10-20 Fujifilm Corporation Stereoscopic display apparatus and stereoscopic shooting apparatus, dominant eye judging method and dominant eye judging program for use therein, and recording medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08272009A (ja) * 1995-03-29 1996-10-18 Terumo Corp 立体画像表示装置
JP2002277610A (ja) * 2001-03-21 2002-09-25 Ricoh Co Ltd 遮光部付きマイクロレンズ基板の作製方法
JP2006203668A (ja) * 2005-01-21 2006-08-03 Konica Minolta Photo Imaging Inc 画像生成システム及び画像生成方法
JP2007140500A (ja) * 2005-10-18 2007-06-07 Yamazaki Emiko 無限遠視表示装置及び無限遠視プラネタリウム装置
JP2009225064A (ja) * 2008-03-14 2009-10-01 Ricoh Co Ltd 画像入力装置、認証装置、およびそれらを搭載した電子機器
JP2010032818A (ja) * 2008-07-29 2010-02-12 National Institute Of Information & Communication Technology レンズ機能部材アレイおよびホログラム生成装置ならびにホログラム生成プログラム
JP2011028011A (ja) * 2009-07-27 2011-02-10 Sharp Corp 映像表示装置
JP2011170277A (ja) * 2010-02-22 2011-09-01 Olympus Corp 表示方法、表示装置、光学ユニット、及び電子機器
JP2012003175A (ja) * 2010-06-21 2012-01-05 Noriji Ooishi 立体像表示装置
EP2413609A1 (fr) * 2010-07-29 2012-02-01 Pantech Co., Ltd. Appareil d'affichage de type actif et procédé de commande pour fournir une image stéréographique

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014041281A (ja) * 2012-08-23 2014-03-06 Canon Inc 画像表示装置
WO2015146073A1 (fr) * 2014-03-26 2015-10-01 セイコーエプソン株式会社 Projecteur
CN106104377A (zh) * 2014-03-26 2016-11-09 精工爱普生株式会社 投影仪
WO2015170497A1 (fr) * 2014-05-09 2015-11-12 オリンパス株式会社 Procédé d'affichage et dispositif d'affichage
JP2015215464A (ja) * 2014-05-09 2015-12-03 オリンパス株式会社 表示方法及び表示装置
US9946080B2 (en) 2014-05-09 2018-04-17 Olympus Corporation Display method and display apparatus
JP2021073780A (ja) * 2014-10-24 2021-05-13 イメージン コーポレイション マイクロディスプレイベースの没入型ヘッドセット
JP7033678B2 (ja) 2014-10-24 2022-03-10 イメージン コーポレイション マイクロディスプレイベースの没入型ヘッドセット
US11256102B2 (en) 2014-10-24 2022-02-22 Emagin Corporation Microdisplay based immersive headset
JP6994362B2 (ja) 2014-10-24 2022-01-14 イメージン コーポレイション マイクロディスプレイベースの没入型ヘッドセット
JP2018067929A (ja) * 2014-10-24 2018-04-26 イメージン コーポレイション マイクロディスプレイベースの没入型ヘッドセット
WO2016072518A1 (fr) * 2014-11-07 2016-05-12 ソニー株式会社 Dispositif d'affichage et procédé de commande d'affichage
CN107148590A (zh) * 2014-11-07 2017-09-08 索尼公司 显示设备和显示控制方法
JPWO2016072518A1 (ja) * 2014-11-07 2017-08-31 ソニー株式会社 表示装置及び表示制御方法
JPWO2017122427A1 (ja) * 2016-01-12 2018-11-08 ソニー株式会社 表示システムおよび電子機器
WO2017122427A1 (fr) * 2016-01-12 2017-07-20 ソニー株式会社 Système d'affichage et dispositif électronique
CN112882240A (zh) * 2021-03-16 2021-06-01 拾斛科技(南京)有限公司 显示装置以及显示方法

Also Published As

Publication number Publication date
JP5857082B2 (ja) 2016-02-10
JP5496425B2 (ja) 2014-05-21
JP2014171220A (ja) 2014-09-18
JPWO2013118328A1 (ja) 2015-05-11

Legal Events

ENP: Entry into the national phase (Ref document number: 2013526016; Country of ref document: JP; Kind code of ref document: A)

121: Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 12867970; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)

122: Ep: PCT application non-entry in European phase (Ref document number: 12867970; Country of ref document: EP; Kind code of ref document: A1)