WO2025115757A1 - Video display device, method for controlling video display device, and associated program - Google Patents
Video display device, method for controlling video display device, and associated program
- Publication number
- WO2025115757A1 PCT/JP2024/041335
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image display
- display device
- state
- image
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/346—Analysis of electrocardiograms
- A61B5/347—Detecting the frequency distribution of signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/346—Analysis of electrocardiograms
- A61B5/349—Detecting specific parameters of the electrocardiograph cycle
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/398—Electrooculography [EOG], e.g. detecting nystagmus; Electroretinography [ERG]
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B26/00—Optical devices or arrangements for the control of light using movable or deformable optical elements
- G02B26/08—Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/64—Constructional details of receivers, e.g. cabinets or dust covers
Definitions
- The present invention relates to technology for adjusting visibility in image display devices.
- Known image display devices include head-mounted displays that users wear on their heads and image display devices that users wear like glasses.
- In such devices, a display unit is placed near the user's eyes, and parallax images are processed and displayed for each of the user's left and right eyes.
- When the user views the displayed parallax images, a three-dimensional impression is obtained for the objects displayed in them.
- The user's line of sight converges on the position of the 3D image generated by the images displayed for each eye, while the focal position is adjusted to each of the left and right images on the display screen. This creates an unnatural state that would not occur when viewing a real object.
- The user's convergence angle changes according to the position of the displayed object.
- The focal length of the crystalline lens changes according to the magnitude of the convergence angle, based on experience in real space.
- As a result, the focal length of the crystalline lens and the diopter of the image display device may not match; the displayed parallax image then goes out of focus, becomes difficult to see clearly, and good visibility cannot be ensured.
- It is therefore desirable to change the diopter according to the depth information in the parallax image that determines the user's convergence angle, for example by driving a lens within the device.
- In Patent Document 1, the diopter is adjusted to match the depth information at the position of the user's gaze point while watching an image, taking into account individual differences and the usage state of the image display device, thereby making it possible to reduce discomfort during stereoscopic viewing.
- The present invention aims to provide an image display device that can present users with stereoscopic images using parallax while reducing the burden on users who view images for long periods of time.
- The image display device that achieves the above object comprises: a first image display unit that displays a first image to the right eye of a user; a second image display unit that displays a second image to the left eye of the user; a first display optical element corresponding to the first image display unit; a second display optical element corresponding to the second image display unit; and an actuator for changing a position of the display optical elements.
- The image display device further includes a state detection unit that detects a physiological state or psychological state of the user while the image display device is in use, and, by driving the actuator based on a detection result of the state detection unit, changes at least one of the position of the first display optical element relative to the first image display unit and the position of the second display optical element relative to the second image display unit.
- The image display device of the present invention can reduce the burden that stereoscopic display using parallax places on the user, according to the user's state of motion sickness or fatigue when watching images for a long period of time.
- FIG. 1 is an explanatory diagram showing a schematic configuration of an image display device according to a first embodiment.
- FIG. 2A is an explanatory diagram showing a state when a lens position is changed in the image display device.
- FIG. 2B is an explanatory diagram showing a state when a lens position is changed in the image display device.
- FIG. 3A is an explanatory diagram showing an example of displaying a stereoscopic image using parallax.
- FIG. 3B is an explanatory diagram showing an example of displaying a stereoscopic image using parallax.
- FIG. 3C is an explanatory diagram showing an example of displaying a stereoscopic image using parallax.
- FIG. 4A is an explanatory diagram showing the convergence angle and diopter of the eyeballs when gazing at an object.
- FIG. 4B is an explanatory diagram showing the convergence angle and diopter of the eyeballs when gazing at an object.
- FIG. 4C is an explanatory diagram showing the convergence angle and diopter of the eyeballs when gazing at an object.
- FIG. 5 is a flowchart illustrating an operation of the image display device according to the first embodiment.
- FIG. 6 is an explanatory diagram showing changes in the physiological or psychological state and the lens position in the first embodiment.
- FIG. 7 is an explanatory diagram showing a modification of the schematic configuration of the image display device according to the first embodiment.
- FIG. 8 is an explanatory diagram showing the physiological or psychological state and the change in lens position in a modification of the image display device of the first embodiment.
- FIG. 9 is a flowchart illustrating an operation of the image display device according to a second embodiment.
- FIG. 10 is an explanatory diagram showing the change in the physiological or psychological state and the maximum drive speed in the second embodiment.
- FIG. 11 is a flowchart illustrating the operation of a modification of the image display device of the second embodiment.
- FIG. 12 is an explanatory diagram showing the change in the physiological or psychological state and the maximum drive speed in a modification of the image display device of the second embodiment.
- In the following, a head-mounted display will be shown and described as an example of an image display device in which a pair of images having parallax is displayed by a plurality of image display units and stereoscopic display can be performed via a plurality of display optical elements.
- The image display device of the present invention comprises first and second image display units that respectively display first and second images to the right and left eyes of a user. It further comprises first and second display optical elements corresponding to the first and second image display units, respectively, and an actuator that changes the position of the display optical elements.
- The image display device changes at least one of the position of the first display optical element relative to the first image display unit and the position of the second display optical element relative to the second image display unit by driving the actuator based on the detection result of the physiological state or psychological state of the user while using the image display device.
- the image display device, the control method for the image display device, and the program thereof can be configured in any desired combination.
- FIG. 1 is a diagram showing a schematic configuration of an image display device 100 according to a first embodiment of the present invention.
- the image display device 100 comprises an image acquisition unit 101 and a display processing unit 102.
- The image acquisition unit 101 acquires an image to be displayed on the image display units 103 from an external device or a network (not shown).
- the display processing unit 102 performs processing such as adjusting the display magnification of the acquired image.
- the processed image is sent to the image display units 103a and 103b and displayed.
- Here, separate images are displayed on the image display units 103a and 103b, but the present invention is not limited to this configuration; a single screen may be divided into two regions, with the image corresponding to each display unit shown in its region.
- the image display device 100 has a first and a second display optical element corresponding to the left eye 201a and the right eye 201b, respectively.
- The first display optical element can have a lens 104a, and the second display optical element can have a lens 104b.
- the images displayed on the image display units 103a and 103b are presented to the left eye 201a and the right eye 201b, respectively, through the corresponding lenses 104a and 104b.
- the lenses can be convex lenses as shown in FIG. 1.
- The lenses 104a and 104b are driven by the actuators 105a and 105b and can move in the direction along the lens optical axes (arrows 107a and 107b).
- Here, the optical axes 202a and 202b of the lenses 104a and 104b are described as passing through the centers of the image display units 103a and 103b and through the left eye 201a and right eye 201b, respectively, but the configuration is not limited to this.
- FIGS. 2A and 2B are explanatory diagrams showing the states when the lenses 104a and 104b are moved by the actuators 105a and 105b.
- FIG. 2A shows the state when the lenses 104a and 104b are moved close to the image display units 103a and 103b, respectively.
- FIG. 2B shows the state when the lenses 104a and 104b are moved close to the left eye 201a and the right eye 201b, respectively.
- When a user of the image display device 100 views the image display units 103a and 103b with the left eye 201a and right eye 201b through the lenses 104a and 104b, virtual images 203a and 203b are seen.
- The positions of the virtual images 203a and 203b along the optical axes 202a and 202b, referenced to the positions of the left eye 201a and right eye 201b, are defined as the virtual image formation position X.
- In the state of FIG. 2A, the virtual image formation position X approaches the left eye 201a and the right eye 201b.
- In the state of FIG. 2B, the virtual image formation position X moves away from the left eye 201a and the right eye 201b. In this way, the diopter can be changed by moving the lenses 104a and 104b with the actuators 105a and 105b.
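- As a rough illustration of the relationship between lens position and diopter, the following Python sketch uses a thin-lens model; the focal length, the distances, and the simplification that the eye-to-lens distance stays fixed while the lens moves are all assumptions for illustration, not values from this disclosure.

```python
def virtual_image_diopter(f_mm, display_to_lens_mm, eye_to_lens_mm):
    # Thin-lens equation (real-is-positive convention): 1/f = 1/do + 1/di.
    # With the display inside the focal length (do < f), di < 0, i.e. an
    # enlarged virtual image forms on the display side of the lens.
    do = display_to_lens_mm
    di = 1.0 / (1.0 / f_mm - 1.0 / do)
    if di >= 0:
        raise ValueError("display must lie inside the focal length")
    x_mm = abs(di) + eye_to_lens_mm   # virtual image formation position X
    return 1000.0 / x_mm              # diopter = 1 / (distance in meters)

# Moving the lenses toward the displays (FIG. 2A) pulls X toward the eyes:
print(virtual_image_diopter(40.0, 35.0, 15.0))  # X ~ 0.30 m -> ~3.4 D
print(virtual_image_diopter(40.0, 38.0, 15.0))  # X ~ 0.78 m -> ~1.3 D
```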
- In this embodiment, the virtual image formation position X approaches the left eye 201a and the right eye 201b when the lenses 104a and 104b approach the image display units 103a and 103b, but this is not limiting.
- Depending on the optical design, the virtual image formation position X may instead move away from the left eye 201a and the right eye 201b when the lenses 104a and 104b approach the image display units 103a and 103b.
- Here, the actuators 105a and 105b move the lenses 104a and 104b to change the diopter, but the diopter change is not limited to the movement of optical members.
- For example, a liquid lens may be used, and the position of the boundary surface formed by water and oil may be changed by an electrical signal applied from the actuators 105a and 105b.
- As the actuator that changes the diopter, it is preferable to use a highly quiet vibration-type actuator such as an ultrasonic motor, or an electromagnetic motor such as a voice coil motor.
- Although the lenses 104a and 104b are each represented as a single convex lens in FIGS. 1, 2A, and 2B, each may be composed of multiple lens elements, and the position of a specific element may be moved when the diopter is changed.
- Next, the parallax image 300 displayed on the image display units 103a and 103b will be described with reference to FIGS. 3A to 3C.
- FIGS. 3A to 3C are explanatory diagrams showing examples of the parallax image 300.
- the parallax image 300 is composed of, for example, a left eye image 301a displayed on the image display unit 103a corresponding to the left eye 201a, and a right eye image 301b displayed on the image display unit 103b corresponding to the right eye 201b.
- the parallax image 300 is composed of an image 301a for the left eye and an image 301b for the right eye, but the present invention is not limited to this.
- a configuration in which the display processing unit 102 processes the image based on three-dimensional data and generates parallax images to be displayed on the two image display units 103a and 103b may be used.
- an image captured by a camera mounted on the image display device 100 may be treated as at least a part of the parallax image 300.
- This makes it possible for the image display device 100 to display an image in an augmented reality presentation format.
- In this case, the parallax image 300 is created by superimposing an image captured by a camera mounted on the image display device 100 on an image based on three-dimensional data created in advance, and the composite image is displayed.
- In FIGS. 3A to 3C there is one image 301a and one image 301b, but the present invention is not limited to this.
- The image captured by the camera may be configured to display multiple objects.
- The device is equipped with gaze detection means 106a and 106b, as shown in FIG. 1, that can detect the gaze directions of the user's left eye 201a and right eye 201b, and a process is performed to determine the gaze point, within the images 301a and 301b, at which the user's gaze is directed.
- The gaze detection means can be, for example, an infrared illumination unit and a gaze detection camera.
- The diopter adjustment amount calculation unit 111 calculates the diopter adjustment amount based on the depth information at the position of the gaze point determined from the detection results of the gaze detection means 106a and 106b. Based on this calculated diopter adjustment amount, the control unit 108 outputs a drive command to drive the actuators 105a and 105b and adjust the diopter.
- The diopter is adjusted by changing the respective distances between the image display units 103a and 103b and the lenses 104a and 104b.
- Because this operation is performed each time the gaze point changes, the gaze detection means 106a and 106b allow the diopter to be adjusted for the user in real time.
- The control unit 108 is a so-called microcomputer, and may be composed of electrical components such as a central processing unit (CPU), a memory for storing programs, and a memory serving as a working area into which the programs are loaded. The control unit 108 is responsible for generating signals carrying the information for controlling the drive of the actuators 105.
- FIGS. 4A to 4C are diagrams explaining the convergence angle and diopter of the eyes when gazing at an object.
- FIG. 4A shows the state when gazing in the real world.
- FIG. 4B shows the state when gazing through the image display device 100.
- the virtual image position 402 is the virtual image position when the user views the image display units 103a and 103b through the lenses 104a and 104b (not shown), respectively.
- the virtual image position 402 is a position that is a distance B away from the left eye 201a and right eye 201b.
- When gazing through the image display device 100, the situation differs in that the distance A corresponding to the convergence angle θ does not match the distance B corresponding to the accommodation. This is known as the vergence-accommodation conflict, and if this state continues, the user increasingly experiences motion sickness and fatigue.
- The virtual image position 402 can be changed by changing the distance between the image display units 103a and 103b and the lenses 104a and 104b (not shown). For example, if the gaze detection means detect that the user is gazing at the object 401, the distance between the display units and the lenses is changed so that the virtual image position 402, currently at distance B, moves to distance A. The distance corresponding to the convergence angle then matches the distance corresponding to the accommodation, as shown in FIG. 4A, creating a more natural state that can reduce motion sickness and fatigue.
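- The convergence angle itself follows from simple geometry. The sketch below computes it from the gaze distance and an assumed interpupillary distance (64 mm is a typical figure, not a value from this disclosure), making the mismatch between the distances A and B concrete.

```python
import math

def convergence_angle_deg(gaze_distance_m, ipd_m=0.064):
    # Both eyes rotate inward to fixate a point at gaze_distance_m:
    # theta = 2 * atan((IPD / 2) / distance).
    return math.degrees(2.0 * math.atan(ipd_m / (2.0 * gaze_distance_m)))

# Vergence-accommodation conflict: the eyes converge on the object at
# distance A while the crystalline lens focuses at virtual image distance B.
print(convergence_angle_deg(0.5))  # object at A = 0.5 m     -> ~7.3 degrees
print(convergence_angle_deg(2.0))  # virtual image at B = 2 m -> ~1.8 degrees
```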
- The state detection unit 109 in FIG. 1 detects the physiological or psychological state of the user, for example the user's heart rate, electrocardiogram, breathing, eye potential, skin potential, and center of gravity.
- When the heart rate is used, the user's heart rate is measured, the average instantaneous heart rate, the respiratory component of the heart rate fluctuation, and the magnitude of the Mayer-wave component of the heart rate fluctuation are determined, and at least one of these is used as the detection value.
- When the electrocardiogram is used, the user's electrocardiogram is measured, the magnitudes of specific frequency components of its baseline fluctuation, namely the high-frequency component (HF) of the heart rate fluctuation, the low-frequency component (LF), or the ratio of HF to LF, are determined, and at least one of these is used as the detection value.
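- As one plausible way of obtaining such a detection value (a sketch only: the patent specifies neither the algorithm nor the band edges, which are taken here from heart-rate-variability convention), the LF/HF ratio can be computed from a series of heartbeat (RR) intervals:

```python
import numpy as np

def lf_hf_ratio(rr_intervals_s, fs=4.0):
    rr = np.asarray(rr_intervals_s, dtype=float)
    t = np.cumsum(rr)                         # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)   # uniform resampling grid
    hr = np.interp(grid, t, 1.0 / rr)         # instantaneous heart rate
    hr -= hr.mean()                           # drop the DC component
    psd = np.abs(np.fft.rfft(hr)) ** 2        # crude periodogram
    freqs = np.fft.rfftfreq(len(hr), d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()  # Mayer-wave band
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()  # respiratory band
    return lf / hf
```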
- When breathing is used, the user's breathing frequency, breathing volume, and breathing irregularity are measured, and at least one of these is used as the detection value.
- When electrooculography is used, the user's number of blinks, its cumulative count and rate of change, and the blink interval are measured, and at least one of these is used as the detection value.
- The detection value detected by the state detection unit 109 is output to the state determination unit 110, where it is compared with a predetermined normal value recorded in advance to determine whether the user is experiencing motion sickness or fatigue.
- The physiological or psychological state may be determined using not only one of the heart rate, electrocardiogram, respiration, electrooculogram, skin potential, and center of gravity, but also a combination of several of them.
- If the skin potential is used, the judgment should be based on the magnitude of the resistance component; if the center of gravity is used, the judgment should be based on the fluctuation of the center of gravity.
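- A minimal sketch of this judgment in the state determination unit 110 (the margin factor and the two-modality agreement rule are assumptions; the text only says the detection value is compared with a pre-recorded normal value):

```python
def is_sick_or_fatigued(detected, normal, margin=1.2):
    # Flag motion sickness or fatigue when the detection value exceeds
    # the pre-recorded normal value by an assumed margin.
    return detected >= normal * margin

def combined_judgment(hr, hr_normal, blinks, blinks_normal):
    # Combining modalities: require both heart rate and blink count to
    # deviate before the state is flagged.
    return (is_sick_or_fatigued(hr, hr_normal)
            and is_sick_or_fatigued(blinks, blinks_normal))
```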
- When it is determined that the user is experiencing motion sickness or fatigue, the control unit 108 drives the actuators 105a and 105b to move the lenses 104a and 104b toward the image display units, thereby changing the diopter.
- The state detection unit 109 and state determination unit 110 do not necessarily have to form part of the image display unit, and can also be configured as separate units that exchange signals wirelessly. Conversely, by adopting a configuration integrated with the image display unit, the user of the head-mounted display can use the state detection function immediately, without the need for remote communication.
- A program corresponding to the flowchart in FIG. 5 is stored in a memory unit in the arithmetic processing unit.
- The control unit 108 comprises a CPU, memory, and the like, and each process shown in the flowchart in FIG. 5 is realized by the CPU loading and executing a predetermined program stored in the memory.
- The control unit 108 can be either integrated with the image display device or separate from it.
- First, the state detection unit 109 detects the physiological state or psychological state of the user.
- Next, the user's state of motion sickness or fatigue is judged based on the detected value. If the detected value is equal to or greater than the desired value, the amount of movement for moving the lenses 104a and 104b in a direction away from the image display units 103a and 103b is calculated in S503, and the control unit 108 drives the actuators 105a and 105b in S504.
- This amount of movement may be determined according to the detected value; for example, it may be increased as the detected value increases and decreased as the detected value decreases. In this way, the displayed parallax image is deliberately left out of focus and not clearly visible, and the amount of information entering the left eye 201a and right eye 201b is reduced so that the muscles of the eyeballs relax, thereby reducing motion sickness and fatigue.
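- A sketch of this movement calculation in S503; the gain and clamp values are hypothetical, since the text only requires that the movement grow with the detected value:

```python
def defocus_movement_mm(detected, threshold, gain_mm=0.05, max_mm=2.0):
    # No deliberate defocus while the user's state is normal.
    if detected < threshold:
        return 0.0
    # Move the lenses farther as the detection value rises, capped so the
    # image blurs without exceeding the mechanism's travel range.
    return min(gain_mm * (detected - threshold), max_mm)
```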
- In S505, the gaze detection means 106a and 106b are used to detect the gaze of the user's left eye 201a and right eye 201b.
- The control unit 108 then performs processing to determine the gaze point from the gaze of the left eye 201a and the gaze of the right eye 201b.
- For example, the average of the gazes of the left eye 201a and the right eye 201b can be used.
- The diopter adjustment amount calculation unit 111 then calculates the diopter adjustment amount at the position of the gaze point in the image displayed on the image display units 103a and 103b. Specifically, the parallax between the image at the left-eye gaze point position and the image at the right-eye gaze point position is calculated, and the diopter adjustment amount is derived from the depth information corresponding to this parallax. For example, by providing table data that associates the parallax with the diopter adjustment amount, the diopter adjustment amount corresponding to the parallax can be calculated. Alternatively, the diopter adjustment amount calculation unit 111 may calculate the diopter adjustment amount from the parallax using a formula that expresses the relationship between the parallax and the diopter adjustment amount.
- The actuators 105a and 105b are driven by the control unit 108 based on the diopter adjustment amount calculated by the diopter adjustment amount calculation unit 111. This changes the distance between the image display units 103a and 103b and the lenses 104a and 104b.
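- The table lookup described above can be sketched as follows; the parallax values and diopter adjustment amounts are hypothetical placeholders, since the text proposes such a table without giving numbers:

```python
import numpy as np

# Hypothetical table data associating gaze-point parallax (in pixels)
# with a diopter adjustment amount.
PARALLAX_PX = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
DIOPTER_ADJ = np.array([0.0, 0.3, 0.6, 1.1, 1.8])

def diopter_adjustment(parallax_px):
    # Linear interpolation between table entries; a fitted formula, as the
    # text also suggests, would replace this lookup.
    return float(np.interp(parallax_px, PARALLAX_PX, DIOPTER_ADJ))
```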
- FIG. 6 shows the detected values of the physiological or psychological state of the user and the lens position in the configuration of FIG. 1.
- While the detected value is below the predetermined value, the gaze point is calculated from the gaze detected by the gaze detection means, and the diopter is adjusted in real time in the directions of the arrows 107a and 107b according to the depth information of the gaze point.
- When the detected value exceeds the predetermined value, the actuators 105a and 105b move the lenses 104a and 104b closer to the user and away from the image display units 103a and 103b.
- When the detected value falls below the predetermined value again, the gaze point is once more calculated from the detected gaze, and the diopter is adjusted in real time in the directions of the arrows 107a and 107b according to the depth information of the gaze point.
- In this embodiment, the lens 104a corresponding to the left eye 201a and the lens 104b corresponding to the right eye 201b can be driven by independent actuators 105a and 105b, so users whose left and right eyes have different visual acuity can be accommodated.
- a configuration can be adopted in which the position of at least one of the first and second display optical elements can be changed.
- As described above, the diopter is adjusted according to the depth information of the gaze point, and the actuators are driven to move the lenses 104a and 104b in a direction away from the image display units 103a and 103b according to the user's state of motion sickness or fatigue.
- This causes the image displayed on the image display units to appear blurred; by reducing the amount of information the user obtains from the image, motion sickness and fatigue accumulated through long-term use can be reduced.
- Here, the actuators are driven so that the image display units and the lenses move away from each other, but this is not limiting; the actuators may instead be driven so that the image display units and the lenses move closer to each other, with the same effect.
- In that case, when the value detected by the state detection unit 109 exceeds a predetermined value, the lenses 104a and 104b are moved by the actuators 105a and 105b so that they approach the image display units 103a and 103b.
- In the first embodiment, the diopter adjustment is performed based on the depth information of the gaze point detected by the gaze detection means 106a and 106b.
- Alternatively, the diopter adjustment may be performed manually, in a configuration as shown in FIG. 7.
- FIG. 8 shows the detection value of the physiological state or psychological state of the user and the lens position in the configuration of FIG. 7.
- The user manually adjusts the diopter to a position that matches the image being viewed at time t0, and this position is set as the reference position of the lens.
- When the value detected by the state detection unit exceeds a predetermined value at time t1, the actuator is driven so as to move the lens closer to the user and away from the display unit. If the detected value then falls below the desired value at time t2, the actuator is driven so as to return the lens to the reference position that was manually set at time t0.
- A program corresponding to the flowchart in FIG. 9 is stored in a storage unit in the arithmetic processing unit.
- The control unit 108 comprises a CPU, memory, and the like, and each process shown in the flowchart in FIG. 9 is realized by the CPU loading and executing a predetermined program stored in the memory.
- First, the state detection unit 109 detects the physiological state or psychological state of the user.
- Next, the user's state of motion sickness or fatigue is judged based on the detected value. If the detected value is below the desired value, the control unit 108 sets the maximum drive speed of the actuators 105a and 105b to A (for example, 100 mm/s) in S1003. If the detected value is equal to or greater than the desired value, the control unit 108 sets the maximum drive speed of the actuators 105a and 105b to B (for example, 50 mm/s), which is smaller than A, in S1004.
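- A sketch of S1003/S1004 together with the way such a speed cap would act on each control step (the speeds A and B are the example values above; the per-step clamping is an assumed implementation):

```python
def max_drive_speed_mm_s(detected, threshold, speed_a=100.0, speed_b=50.0):
    # S1003/S1004: a sick or fatigued user gets the slower cap B.
    return speed_b if detected >= threshold else speed_a

def limited_step_mm(target_mm, current_mm, v_max_mm_s, dt_s):
    # Clamp the per-period move so the lens never exceeds the speed cap.
    limit = v_max_mm_s * dt_s
    step = max(-limit, min(limit, target_mm - current_mm))
    return current_mm + step
```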
- gaze detection is performed on the left eye 201a and right eye 201b of the user using gaze detection means 106a and 106b.
- control unit 108 performs processing to determine the gaze point from the gaze of the left eye 201a and the gaze of the right eye 201b.
- the average value of the gazes of the left eye 201a and the right eye 201b can be used.
- The diopter adjustment amount calculation unit 111 then calculates the diopter adjustment amount at the position of the gaze point in the image displayed on the image display units 103a and 103b. Specifically, the parallax between the image at the left-eye gaze point position and the image at the right-eye gaze point position is calculated, and the diopter adjustment amount is derived from the depth information corresponding to this parallax. For example, by providing table data that associates the parallax with the diopter adjustment amount, the diopter adjustment amount corresponding to the parallax can be calculated. Alternatively, the diopter adjustment amount calculation unit 111 may calculate the diopter adjustment amount from the parallax using a formula that expresses the relationship between the parallax and the diopter adjustment amount.
- The actuators 105a and 105b are driven by the control unit 108 based on the diopter adjustment amount calculated by the diopter adjustment amount calculation unit 111. This changes the distance between the image display units 103a and 103b and the lenses 104a and 104b. Since the above operation is performed every time the gaze point changes, the diopter can be adjusted in real time.
- FIG. 10 shows the detected values of the physiological or psychological state of the user and the maximum drive speed of the actuators 105a and 105b.
- While the detected value is below the predetermined value, the maximum drive speed of the actuators 105a and 105b is set to A.
- When the detected value exceeds the predetermined value, the maximum drive speed is set to B, which is smaller than the maximum drive speed A.
- When the detected value falls below the predetermined value again, the maximum drive speed is set back to A.
- The control gain is the ratio of the control output to the amount by which the actuator is to be changed; by reducing this ratio, the responsiveness is damped, the rate of change of the focal state of the image seen by the user is suppressed, and the eyes are less forced to follow sudden changes in focus. This reduces the burden on the user even when watching videos for a long period of time, and can reduce the growth and accumulation of motion sickness and fatigue.
- The control period is the time interval at which commands are issued when controlling the actuator; by widening this interval, the update frequency is reduced, the rate of change of the focal state of the image seen by the user is suppressed, and the eyes are less forced to follow sudden changes in focus. This likewise reduces the burden on the user during long viewing and can reduce the growth and accumulation of motion sickness and fatigue.
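- How gain and period shape the responsiveness can be sketched with a simple proportional update (the control law and all numbers here are illustrative assumptions; the text does not specify them):

```python
def drive_positions(targets, gain=0.5):
    # One proportional update per control period; a smaller gain (or a
    # longer period, i.e. fewer updates) slows the pursuit of the target.
    pos, out = 0.0, []
    for target in targets:
        pos += gain * (target - pos)
        out.append(round(pos, 3))
    return out

print(drive_positions([1.0] * 5, gain=0.5))  # [0.5, 0.75, 0.875, 0.938, 0.969]
print(drive_positions([1.0] * 5, gain=0.2))  # [0.2, 0.36, 0.488, 0.59, 0.672]
```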
- When the value detected by the state detection unit 109 is equal to or greater than the predetermined value, several of the maximum drive speed, control gain, and control period may be changed simultaneously.
- In the above, the maximum drive speed, control gain, and control period are switched depending on whether the detection value of the state detection unit 109 exceeds a predetermined value.
- Alternatively, the maximum drive speed, control gain, and control period may be varied according to the detection value itself, as shown in the flowchart of FIG. 11.
- First, the state detection unit 109 is used to detect the physiological state or psychological state of the user.
- The control unit 108 then sets the maximum drive speed of the actuators 105a and 105b based on the detected value.
- For example, the maximum drive speed can be calculated according to the detection value of the physiological state or psychological state.
- Alternatively, the maximum drive speed may be calculated using a formula that expresses the relationship between the detection value of the physiological state or psychological state and the maximum drive speed.
- FIG. 12 shows the detected values of the physiological or psychological state of the user and the maximum drive speed of the actuators 105a and 105b. As shown in FIG. 12, the maximum drive speed of the actuators 105a and 105b is set in stages according to the value detected by the state detection unit 109.
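- A sketch of such a staged setting (the stage boundaries and per-stage speeds are hypothetical; FIG. 12 shows stages without numeric values):

```python
import bisect

DETECTION_STEPS = [20.0, 40.0, 60.0]           # stage boundaries (assumed)
STAGE_SPEEDS_MM_S = [100.0, 75.0, 50.0, 25.0]  # speed cap per stage (assumed)

def staged_max_speed(detected):
    # Higher detection values fall into later stages with lower caps.
    return STAGE_SPEEDS_MM_S[bisect.bisect_right(DETECTION_STEPS, detected)]
```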
- Although the maximum drive speed is changed here, the control gain and control period may be changed instead to obtain the same effect.
- The control gain and control period may also be changed simultaneously.
- The present invention can also be realized by supplying a program that implements one or more of the functions of the above-described embodiments to a system or device via a network or storage medium, and having one or more processors in the computer of that system or device read and execute the program. It can also be realized by a circuit (e.g., an ASIC) that implements one or more of the functions.
- 100 Image display device
- 101 Image acquisition unit
- 102 Display processing unit
- 103 Image display unit
- 104 Lens
- 105 Actuator
- 106 Gaze detection means
- 107 Driving direction
- 108 Control unit
- 109 State detection unit
- 110 State determination unit
- 111 Diopter adjustment amount calculation unit
- 201a Left eye
- 201b Right eye
- 202 Optical axis
- 203 Virtual image
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Cardiology (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Surgery (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Pathology (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Ophthalmology & Optometry (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Educational Technology (AREA)
- Developmental Disabilities (AREA)
- Child & Adolescent Psychology (AREA)
- Hospice & Palliative Care (AREA)
- Psychiatry (AREA)
- Psychology (AREA)
- Social Psychology (AREA)
- Human Computer Interaction (AREA)
- Mechanical Light Control Or Optical Switches (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Measurement And Recording Of Electrical Phenomena And Electrical Characteristics Of The Living Body (AREA)
- Eye Examination Apparatus (AREA)
Abstract
A video display device according to the present application comprises: first and second video display units that respectively display first and second videos to the right eye and the left eye of a user; first and second display optical elements corresponding respectively to the first and second video display units; and an actuator that changes the position of the display optical element. The video display device is characterized in that the position of at least one of the first and second display optical elements is changed by driving the actuator on the basis of the result of detecting the physiological state or psychological state of the user using the video display device.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023-201980 | 2023-11-29 | ||
| JP2023201980A JP2025087377A (ja) | 2023-11-29 | 2023-11-29 | Video display device, method for driving video display device, and program therefor |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025115757A1 (fr) | 2025-06-05 |
Family
ID=95897859
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/041335 Pending WO2025115757A1 (fr) | Video display device, method for controlling video display device, and associated program | 2023-11-29 | 2024-11-21 |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2025087377A (fr) |
| WO (1) | WO2025115757A1 (fr) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06235885A (ja) * | 1993-02-08 | 1994-08-23 | Nippon Hoso Kyokai <Nhk> | Stereoscopic video display device |
| JP2010212899A (ja) * | 2009-03-09 | 2010-09-24 | Brother Ind Ltd | Head-mounted display |
| JP2018137760A (ja) * | 2012-05-09 | 2018-08-30 | Nokia Technologies Oy | Method and apparatus for determining a representation of display information based on focus distance |
| JP2018163662A (ja) * | 2018-04-16 | 2018-10-18 | Toshiba Corp | Electronic device, support system, and support method |
| JP2020507797A (ja) * | 2016-12-29 | 2020-03-12 | Magic Leap, Inc. | Automatic control of a wearable display device based on external conditions |
| JP2022132311A (ja) * | 2016-06-20 | 2022-09-08 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
| JP2022171579A (ja) * | 2021-04-30 | 2022-11-11 | Semiconductor Energy Lab Co Ltd | Electronic device |
| JP2023145446A (ja) * | 2020-10-01 | 2023-10-11 | Toshiba Corp | Electronic device and display method |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2025087377A (ja) | 2025-06-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11669160B2 (en) | Predictive eye tracking systems and methods for foveated rendering for electronic displays | |
| US11061240B2 (en) | Head-mountable apparatus and methods | |
| US10871825B1 (en) | Predictive eye tracking systems and methods for variable focus electronic displays | |
| JP6870080B2 (ja) | Image generation device, image display system, and image generation method | |
| US20200142480A1 (en) | Immersive displays | |
| US20210373657A1 (en) | Gaze tracking apparatus and systems | |
| US20180004286A1 (en) | Augmenting Virtual Reality Content With Real World Content | |
| CN106484116B (zh) | Method and device for processing media files | |
| US11216907B2 (en) | Image generating apparatus, method, and program for displaying image resolution with motion in head-mounted display (HMD) | |
| JP2013077013A (ja) | Display device and display method | |
| US11579690B2 (en) | Gaze tracking apparatus and systems | |
| US11983310B2 (en) | Gaze tracking apparatus and systems | |
| JP2020137128A (ja) | Computer-readable non-transitory storage medium, web server, and interpupillary calibration method | |
| JP6097919B2 (ja) | Image display system and three-dimensional display device for stereoscopic video | |
| JP5904246B2 (ja) | Head-mounted display device and display method | |
| EP3945401B1 (fr) | Système et procédé de suivi de regard | |
| US12393027B2 (en) | Head-mountable display apparatus and methods | |
| GB2595909A (en) | Gaze tracking apparatus and systems | |
| WO2025115757A1 (fr) | Video display device, method for controlling video display device, and associated program | |
| JP6806062B2 (ja) | Information processing device, information processing method, and program | |
| JP2023032278A (ja) | Video display device, control method therefor, and program | |
| Rogmans et al. | Biological-aware stereoscopic rendering in free viewpoint technology using gpu computing | |
| JP2025078023A (ja) | Image processing method and system | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24897443; Country of ref document: EP; Kind code of ref document: A1 |