WO2023276156A1 - Spatial image display device, spatial image display method, and program - Google Patents
Spatial image display device, spatial image display method, and program
- Publication number
- WO2023276156A1 (PCT/JP2021/025200)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- visual effect
- unit
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/346—Image reproducers using prisms or semi-transparent mirrors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/388—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
- H04N13/395—Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume with depth sampling, i.e. the volume being constructed from a stack or sequence of 2D image planes
Definitions
- the present disclosure relates to a spatial image display device, a spatial image display method, and a program.
- It is also known to display a spatial image composed of a plurality of virtual images by displaying the plurality of virtual images at different distances in the same direction from the observer (Patent Document 1). This allows the observer to observe the spatial image with a higher stereoscopic effect. Also, by designing the distances between the plurality of virtual images to be short, the three-dimensional effect can be further enhanced.
- a stereoscopic image can be provided by displaying a plurality of images with different luminances at different distances in the same direction from the observer (Patent Document 2).
- the plurality of virtual images are virtual images formed on two-dimensional virtual image planes. Therefore, in a configuration in which a plurality of virtual image planes are positioned at different distances in the same direction as viewed from the observer, if an object (for example, a ball used in a sports game) moves in a direction corresponding to the observation direction of the observer, the virtual image of the object may be displayed on one virtual image plane at one time and on another virtual image plane at another time. In such a case, the observer sees the virtual image of the object instantaneously moving from one virtual image plane to another, which looks unnatural compared to the movement of the object in real space. Therefore, the observer may not be able to obtain a sufficient sense of realism.
- When the technology described in Patent Document 2 is applied to an image showing a moving object so as to display a virtual image of the object on each of a plurality of virtual image planes, parallax between the plurality of virtual images may occur depending on the position of the observer, and a sufficient sense of presence may not be obtained.
- An object of the present disclosure, which has been made in view of such circumstances, is to provide a spatial image display device, a spatial image display method, and a program capable of giving the observer a sufficient sense of realism when displaying virtual images of objects on a plurality of virtual image planes.
- a spatial image display device according to the present disclosure includes: a plurality of display units that display images; a plurality of optical elements that respectively reflect image light emitted from the images displayed on the plurality of display units and are arranged such that the reflected image light reaches the observer's eyes, whereby virtual images of the images are displayed at different distances in the same direction from the observer; a determination unit that determines a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object; and a display control unit that controls the display units to display the image based on the object image and the visual effect.
- a spatial image display method according to the present disclosure is a spatial image display method for a spatial image display device comprising a plurality of display units that display images and a plurality of optical elements that respectively reflect image light emitted from the images displayed on the plurality of display units and are arranged such that the reflected image light reaches an observer's eyes, whereby virtual images of the images are displayed at different distances in the same direction from the observer, the method comprising: determining a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object; and controlling the display units to display the image based on the object image and the visual effect.
- a program according to the present disclosure causes a computer to function as the spatial image display device described above.
- According to the spatial image display device, the spatial image display method, and the program of the present disclosure, it is possible to give the observer a sufficient sense of realism when displaying virtual images of objects on a plurality of virtual image planes.
- FIG. 1 is a schematic diagram of a spatial image display device according to a first embodiment
- FIG. 2 is a schematic diagram of a virtual image display unit shown in FIG. 1
- FIG. 3 is a diagram for explaining a virtual image displayed by the virtual image display unit shown in FIG. 2
- FIG. 4 is a diagram showing the virtual images as seen from the observer
- FIG. 5 is a diagram showing an example of the visual effects stored in the visual effect storage unit shown in FIG. 1
- FIG. 6A is a diagram showing an example of the position of an object in real space
- FIG. 6B is a diagram showing an example of a virtual image displayed based on the position of the object shown in FIG. 6A
- FIG. 7A is a diagram showing another example of the position of an object in real space
- FIG. 7B is a diagram showing an example of a virtual image displayed based on the position of the object shown in FIG. 7A
- FIG. 8A is a diagram showing still another example of the position of an object in real space
- FIG. 8B is a diagram showing an example of a virtual image displayed based on the position of the object shown in FIG. 8A
- FIG. 9 is a flow chart showing an example of the operation of the spatial image display device shown in FIG. 1
- FIG. 10A is a diagram showing a first modification of the visual effects
- FIG. 10B is a diagram showing an example of a virtual image displayed based on the visual effects shown in FIG. 10A
- FIG. 10C is a diagram showing another example of a virtual image displayed based on the visual effects shown in FIG. 10A
- FIG. 11A is a diagram showing a second modification of the visual effects
- FIG. 11B is a diagram showing an example of a virtual image displayed based on the visual effects shown in FIG. 11A
- FIG. 11C is a diagram showing another example of a virtual image displayed based on the visual effects shown in FIG. 11A
- FIG. 12A is a diagram showing a third modification of the visual effects
- FIG. 12B is a diagram showing an example of a virtual image displayed based on the visual effects shown in FIG. 12A
- FIG. 12C is a diagram showing another example of a virtual image displayed based on the visual effects shown in FIG. 12A
- FIG. 13 is a schematic diagram of a spatial image display device according to a second embodiment
- FIG. 14 is a sequence diagram showing an example of the operation of the spatial image display device shown in FIG. 13
- FIG. 15 is a schematic diagram of a spatial image display device according to a third embodiment
- FIG. 16 is a sequence diagram showing an example of the operation of the spatial image display device shown in FIG. 15
- FIG. 17 is a hardware block diagram of the spatial image display device
- FIG. 1 is a schematic diagram of a spatial image display device 1 according to the first embodiment.
- the spatial image display device 1 includes a virtual image display section 2 and a visual effect determination section 3.
- the virtual image display unit 2 includes a plurality of display units 21 and a plurality of optical elements 22.
- the display unit 21 is configured by an organic EL (Electro Luminescence) display, a liquid crystal panel, or the like.
- the organic EL, liquid crystal panel, etc. may be attached to a device such as a tablet, VR goggles, or the like.
- the optical element 22 is composed of, for example, a half mirror that transmits part of the incident light and reflects the remaining part of the light.
- In the present embodiment, the virtual image display unit 2 includes two display units 211 and 212 and two optical elements 221 and 222, but is not limited to this; three or more display units 21 and three or more optical elements 22 may be provided.
- the display unit 21 displays images. Specifically, the display unit 21 displays an image based on the object image under the control of the visual effect determination unit 3 .
- An object image is a part of an image generated by an imaging device such as a camera and includes an image of an object OB.
- the object OB is any object, but can be, for example, an object contained in an image that can be noticed by the observer.
- the objects OB are, for example, a first player PL1, a second player PL2 who is an opponent of the first player PL1, and a ball B used by the first player PL1 and the second player PL2 during the game.
- the imaging device is arranged so that the first player PL1 is positioned on the front side and the second player PL2 is positioned on the back side with respect to the imaging surface of the imaging device.
- a video based on the object video includes a video in which the visual effect determined by the visual effect determination unit 3 is added to the object video.
- the image based on the object image may include an object image to which the visual effect determined by the visual effect determining unit 3 has not been added, that is, the object image itself. That is, the display unit 21 displays an image obtained by adding the visual effect determined by the visual effect determination unit 3 to the object image. Further, the display unit 21 may display an image to which no visual effect is added to the object image.
- the image obtained by adding a visual effect to the object image is an image obtained by processing the object image. The image is generated by processing the object image according to the type of visual effect, as will be described later in detail.
- the display unit 212 displays an object image IM2 including an image of the second player PL2, which is one of the objects OB. Further, the display unit 212 may display an object image IM3 including an image of the ball B, which is one of the objects OB, under the control of the visual effect determination unit 3.
- At this time, the display unit 211 does not have to display the object image IM3. Alternatively, the display unit 211 may display the object image IM3, in which case the display unit 212 does not have to display the object image IM3.
- the optical elements 221 and 222 reflect image light emitted from the images displayed on the plurality of display units 211 and 212, respectively.
- the optical elements 221 and 222 are arranged such that the reflected image light reaches the observer's eye, thereby displaying a virtual image VI of the image at different distances in the same direction from the observer.
- the distance from the observer to the virtual image plane VF1, which is one virtual image plane VF, and the distance from the observer to the virtual image plane VF2, which is the other virtual image plane VF, are different from each other.
- the direction from the observer to the virtual image plane VF1 may be the same as the direction from the observer to the virtual image plane VF2.
- the position of the virtual image plane VF1 can be calculated by a known method based on the positions of the display unit 211 and the optical element 221.
- the virtual image plane VF1 is located on a plane symmetrical to the display surface of the display unit 211 with the optical element 221 interposed therebetween.
- Therefore, the display unit 211 and the optical element 221 may be arranged so that the display surface of the display unit 211 is symmetrical, with respect to the optical element 221, to the desired position where the virtual image plane VF1 is to be displayed.
- Similarly, the position of the virtual image plane VF2 can be calculated by a known method based on the positions of the display unit 212 and the optical element 222, and the display unit 212 and the optical element 222 may be arranged so that the display surface of the display unit 212 is symmetrical, with respect to the optical element 222, to the desired position where the virtual image plane VF2 is to be displayed.
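- As an illustration of the symmetric placement described above, the following sketch (not part of the disclosure; the mirror pose and the display position are assumed values) reflects a point on a display surface across the plane of a half mirror to obtain the corresponding point on the virtual image plane:

```python
import numpy as np

def reflect_across_plane(point, plane_point, plane_normal):
    """Mirror a 3D point across the plane of the half mirror.

    The virtual image of a flat display formed by a half mirror lies on the
    mirror image of the display surface, so reflecting points on the display
    surface locates the virtual image plane.
    """
    p = np.asarray(point, dtype=float)
    q = np.asarray(plane_point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return p - 2.0 * np.dot(p - q, n) * n

# Assumed example: display unit lying face-up below a 45-degree half mirror.
display_center = np.array([0.0, -0.3, 0.0])   # point on the display surface
mirror_point   = np.array([0.0,  0.0, 0.0])   # point on the half mirror
mirror_normal  = np.array([0.0,  1.0, -1.0])  # 45-degree mirror orientation

print(reflect_across_plane(display_center, mirror_point, mirror_normal))
# -> [0. 0. -0.3]: the virtual image plane sits 0.3 behind the mirror as seen by the observer.
```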
- the optical elements 221 and 222 are arranged so that the virtual image planes VF1 and VF2 are positioned on the actual court.
- the optical element 221 is arranged so that the virtual image plane VF1 is positioned closer to the observer than the actual net on the actual court.
- the optical element 222 is arranged so that the virtual image plane VF2 is positioned farther from the observer than the actual net on the actual court.
- the observer observes the virtual image VI1 displayed on the virtual image plane VF1 on the near side, and observes the virtual image VI2 displayed on the virtual image plane VF2 on the far side.
- the virtual image VIb shown in FIG. 4 is displayed on either the virtual image plane VF1 or the virtual image plane VF2.
- the virtual image plane VF on which the virtual image VIb is displayed will be described later in detail. Therefore, the observer observes a spatial image composed of the virtual image VI1, the virtual image VI2, and the virtual image VIb.
- Such a virtual image is generally called a Pepper's ghost or the like, and to the observer it looks as if the display units 211 and 212 are located on the virtual image planes VF1 and VF2, respectively.
- the visual effect determination unit 3 includes a communication unit 31, a visual effect storage unit 32, a determination unit 33, and a display control unit 34.
- the communication unit 31 is configured by a communication interface. Standards such as Ethernet (registered trademark), FDDI (Fiber Distributed Data Interface), and Wi-Fi (registered trademark) may be used for the communication interface.
- the visual effect storage unit 32 is configured by a memory such as an HDD (Hard Disk Drive), an SSD (Solid State Drive), an EEPROM (Electrically Erasable Programmable Read-Only Memory), a ROM (Read-Only Memory), or a RAM (Random Access Memory).
- the determination unit 33 and the display control unit 34 constitute a control unit (controller).
- the control unit may be composed of dedicated hardware such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array), may be composed of a processor, or may be composed of both.
- the communication unit 31 receives object video information indicating the object video and object position information indicating the position of the object OB from an external device.
- An object image is an image including an image of an object OB, which is a specific subject.
- the external device may be a video processing device that generates object video information and object position information, or may be any device that acquires object video information and object position information generated by such a video processing device.
- the position of the object OB may be indicated by the distance in a predetermined direction from the reference plane, which is a plane in real space, corresponding to the virtual image plane VF on which the virtual image is displayed, to the object OB.
- the predetermined direction is the normal direction of the imaging surface of the camera that images the object. Note that the positional relationship of the specific subject viewed from the camera in a predetermined direction is substantially the same as the positional relationship of the virtual image of the specific subject viewed from the observer.
- the predetermined direction is, for example, the direction from a predetermined position (for example, the center position) of the range in which the first player PL1, located on the near side as viewed from the imaging device, exists toward a predetermined position (for example, the center position) of the range in which the second player PL2 exists.
- the visual effect storage unit 32 associates and stores the position of the object OB in the real space with the visual effect.
- the visual effect is a visual effect given to the object image when the virtual image display unit 2 displays the virtual image VI of the object image indicated by the object image information.
- the position is indicated by the distance from the reference plane to the object OB in real space, and the visual effect storage unit 32 stores the distance and the visual effect in association with each other.
- the determination unit 33 determines a visual effect to be added to the object image including the image of the object OB, based on the position of the object OB, which is a specific subject whose image is included in the image.
- the position of the object OB may be the position of the object OB in the real space or the position of the image of the object OB in the video space.
- Specifically, the determination unit 33 determines the visual effect to be added to the object video indicated by the object video information, based on the position indicated by the object position information received by the communication unit 31.
- the determination unit 33 determines the visual effect based on the distance from the reference plane to the object OB in real space. Specifically, the determination unit 33 extracts the visual effect stored in the visual effect storage unit 32 corresponding to the distance. Then, the determination unit 33 determines the visual effect to be added to the object video as the extracted visual effect.
- Further, the determination unit 33 may determine, from among the plurality of display units 21, the display unit 21 on which to display the object image, based on the distance.
- For example, the determination unit 33 determines, as the display unit 21 that displays the object OB, the display unit 21 that displays the virtual image VI on the virtual image plane VF corresponding to the reference plane at the shortest distance to the object OB among the plurality of reference planes.
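- This selection logic can be summarized by the following sketch. It is only an illustration: the reference-plane positions mirror the examples of FIGS. 6A to 8A, but the distance thresholds of the effect table are assumed, since the concrete values of FIG. 5 are not given in the text.

```python
from dataclasses import dataclass

@dataclass
class ReferencePlane:
    name: str            # e.g. "S1" (near side) or "S2" (far side)
    display_unit: str    # display unit whose virtual image plane corresponds to it
    position: float      # position along the predetermined direction

# Distance from the chosen reference plane -> visual effect (thresholds assumed).
EFFECT_TABLE = [
    (1.0, "none"),
    (3.0, "enlargement/reduction"),
    (float("inf"), "flash light"),
]

def choose_display_and_effect(object_position, planes):
    # The display unit whose reference plane is closest to the object shows the object image.
    nearest = min(planes, key=lambda p: abs(object_position - p.position))
    distance = abs(object_position - nearest.position)
    effect = next(e for limit, e in EFFECT_TABLE if distance <= limit)
    return nearest.display_unit, effect

planes = [ReferencePlane("S1", "display_unit_211", 0.0),
          ReferencePlane("S2", "display_unit_212", 9.0)]
print(choose_display_and_effect(8.2, planes))  # ('display_unit_212', 'none'), cf. FIG. 6A
print(choose_display_and_effect(7.0, planes))  # ('display_unit_212', 'enlargement/reduction'), cf. FIG. 7A
print(choose_display_and_effect(5.0, planes))  # ('display_unit_212', 'flash light'), cf. FIG. 8A
```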
- the display control unit 34 controls the plurality of display units 21 to display images based on the object images.
- the virtual image display unit 2 controlled by the display control unit 34 may be variable, and may be, for example, a tablet, and may be replaced with VR goggles later. That is, the display control unit 34 may be compatible with multiple devices.
- the display control unit 34 controls the display unit 21 to display an image obtained by adding a visual effect to the image based on the object image.
- Specifically, the display control unit 34 controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object video of the object OB indicated by the object video information received by the communication unit 31.
- Further, the display control unit 34 controls the display unit 21 determined by the determination unit 33 to display the image obtained by adding the visual effect to the object video.
- the display control unit 34 controls the display unit 21 determined by the determination unit 33 to display an image obtained by adding a visual effect to the object image IM3 including the image of the ball B that is the object OB. do.
- the display control unit 34 may control the predetermined display unit 21 to display the object video of the predetermined object OB indicated by the object video information.
- the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is the predetermined object OB.
- the display control unit 34 also controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is the predetermined object OB.
- the object video includes images of the first player PL1, the second player PL2, and the ball B. Also, the distance between the first reference plane S1 and the second reference plane S2 is 9 in real space. In real space, the ball B, which is one of the objects, is positioned between the first reference plane S1 and the second reference plane S2, and the distance between the first reference plane S1 and the ball B is 8.2.
- the determination unit 33 determines the display unit 21 that displays the object OB as the display unit 21 that displays the virtual image VI on the virtual image plane VF corresponding to the reference plane with the shortest distance to the object OB.
- in the example shown in FIG. 6A, the distance from the first reference plane S1 to the ball B is 8.2, and the distance from the second reference plane S2 to the ball B is 0.8.
- Therefore, the determination unit 33 determines, as the display unit 21 that displays the virtual image VIb of the ball B, the display unit 212 that displays the virtual image on the virtual image plane VF2 (see FIG. 3) corresponding to the second reference plane S2, which is at the shortest distance to the ball B.
- Further, the determination unit 33 determines the visual effect “none”, which is stored in the visual effect storage unit 32 in association with the distance 0.8 from the second reference plane S2 to the ball B, which is the object OB, as the visual effect to be added to the object OB.
- the display control unit 34 controls the display unit 212 to display the object image IM3 including the image of the ball B, which is the object OB, without visual effects. Furthermore, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is the predetermined object OB. The display control unit 34 also controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is the predetermined object OB. As a result, as shown in FIG. 6B, the observer observes the virtual image VI1 of the first player PL1 on the near side, and observes the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side.
- Therefore, the observer can observe the virtual image VI1 of the first player PL1, the virtual image VI2 of the second player PL2, and the virtual image VIb of the ball B in a positional relationship similar to that of the respective objects OB in real space.
- the object video includes images of the first player PL1, the second player PL2, and the ball B, similar to the example shown in FIG. 6A. Also, the distance between the first reference plane S1 and the second reference plane S2 is 9 in real space. Further, in real space, the ball B, which is one of the objects, is positioned between the first reference plane S1 and the second reference plane S2, and unlike the example shown in FIG. 6A, the distance between the first reference plane S1 and the ball B is 7.
- In this case, the determination unit 33 determines the display unit 212, which displays the virtual image VI on the virtual image plane VF2 corresponding to the second reference plane S2 at the shortest distance to the ball B, as the display unit 21 that displays the object image including the image of the ball B.
- Further, the determination unit 33 determines the visual effect “enlargement/reduction”, which is stored in the visual effect storage unit 32 in association with the distance 2 from the second reference plane S2 to the ball B, which is the object OB, as the visual effect to be added to the object OB.
- Specifically, the determination unit 33 determines to add a visual effect such that the object image including the image of the ball B is reduced as the distance between the ball B and the second reference plane S2 becomes shorter, and is enlarged as the distance becomes longer.
- Note that, when the determination unit 33 determines the display unit 211, which displays the virtual image on the virtual image plane corresponding to the first reference plane S1, as the display unit 21 that displays the object image including the image of the ball B, and the distance between the ball B and the first reference plane S1 is within a predetermined range, the determination unit 33 may determine to add a visual effect such that the object image is enlarged as the distance between the ball B and the first reference plane S1 becomes shorter, and is reduced as the distance becomes longer.
- the display control unit 34 controls the display unit 212 to display an image to which a visual effect of enlarging or reducing the image of the ball B is added.
- the visual effect of enlarging or reducing can be, for example, a visual effect such that the image of the ball B is enlarged as the ball B approaches the first player PL1, and is reduced as the ball B moves away from the first player PL1.
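- One plausible way to derive such a distance-dependent scale factor is sketched below; the linear mapping and the scale range are assumptions made for illustration, not values taken from the disclosure.

```python
def scale_factor(distance_to_far_plane, plane_separation, min_scale=1.0, max_scale=1.8):
    """Enlarge the ball image as it moves away from the far reference plane S2
    (i.e., toward the observer) and shrink it back as it approaches S2."""
    t = max(0.0, min(1.0, distance_to_far_plane / plane_separation))
    return min_scale + t * (max_scale - min_scale)

print(scale_factor(0.8, 9.0))  # ball almost on S2 (FIG. 6A): ~1.07, nearly unscaled
print(scale_factor(2.0, 9.0))  # ball 2 away from S2 (FIG. 7A): ~1.18, noticeably enlarged
```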
- Further, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB. As a result, the observer observes the virtual image VI1 of the first player PL1 on the near side, and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side, as shown in FIG. 7B. Therefore, the observer can observe the virtual images of the first player PL1, the second player PL2, and the ball B in a positional relationship similar to their respective positional relationships in real space. Furthermore, since the virtual image display unit 2 reduces or enlarges the image of the ball B according to the change in distance, the observer can perceive the movement of the ball in real space with a more realistic feeling.
- the object video includes images of the first player PL1, the second player PL2, and the ball B, similar to the examples shown in FIGS. 6A and 7A.
- the distance between the first reference plane S1 and the second reference plane S2 is 9 in the real space.
- the ball B, which is one of the objects, is positioned between the first reference plane S1 and the second reference plane S2, and unlike the examples shown in FIGS. 6A and 7A, the distance between the first reference plane S1 and the ball B is 5.
- In this case, the determination unit 33 determines the display unit 212, which displays the virtual image on the virtual image plane VF2 corresponding to the second reference plane S2 at the shortest distance to the ball B, which is the object OB, as the display unit 21 that displays the object image including the image of the ball B.
- Further, the determination unit 33 determines the visual effect “flash light”, which is stored in the visual effect storage unit 32 in association with the distance 4 from the second reference plane S2 to the ball B, as the visual effect to be added to the object OB.
- the display control unit 34 controls the display unit 212 to display the object image including the image of the ball B with a visual effect added (the shaded portion in FIG. 8B) that makes the image of the ball B look as bright as flash light. Further, the display control unit 34 controls the display unit 211 to display the object image IM1 including the image of the first player PL1, which is a predetermined object OB, and controls the display unit 212 to display the object image IM2 including the image of the second player PL2, which is a predetermined object OB.
- As a result, the observer observes the virtual image VI1 of the first player PL1 on the near side, and the virtual image VI2 of the second player PL2 and the virtual image VIb of the ball B on the far side, as shown in FIG. 8B. Therefore, the observer can observe the virtual images of the first player PL1, the second player PL2, and the ball B in a positional relationship similar to their respective positional relationships in real space. Furthermore, the observer can more strongly recognize that the ball B is approaching by observing the image of the ball B to which the visual effect of the flash light is added.
- FIG. 9 is a flow chart showing an example of the operation of the spatial image display device 1 according to the first embodiment.
- the operation of the spatial image display device 1 described with reference to FIG. 9 corresponds to an example of the spatial image display method of the spatial image display device 1 according to the first embodiment.
- step S11 the communication unit 31 receives object video information and object position information from an external device via the communication network.
- step S12 the determining unit 33 determines a visual effect to be added to the object image indicated by the object image information, based on the position of the object OB, which is a specific subject whose image is included in the image.
- step S13 the display control unit 34 determines, from among the plurality of display units 21, the display unit 21 on which to display the object image including the image of the object OB.
- step S14 the display control unit 34 controls the display unit 21 to display the image based on the object image and the visual effect. Specifically, the display control unit 34 controls the display unit 21 determined in step S13 to display the image obtained by adding the visual effect determined by the determination unit 33 to the object image of the object OB indicated by the object image information. Furthermore, the display control unit 34 may control the predetermined display unit 21 to display the object image of the predetermined object OB indicated by the object image information.
- step S15 the display unit 21 displays the object image under the control of the display control unit 34. Specifically, the display unit 21 displays an image obtained by adding the visual effect determined by the determination unit 33 to the object image of the object OB indicated by the object image information. At this time, the display unit 21 may display an object image of a predetermined object.
- As described above, the spatial image display apparatus 1 determines a visual effect to be added to an object image including an image of an object OB, based on the position of the object OB, which is a specific subject whose image is included in the image, and controls the display unit 21 to display an image based on the object image and the visual effect. Therefore, the spatial image display device 1 can display virtual images of objects on a plurality of virtual image planes VF and can provide a sufficient sense of realism to an observer observing a spatial image formed by virtual images on the plurality of virtual image planes VF.
- the spatial image display device 1 receives object video information indicating an object video and object position information indicating the position of the object OB, and based on the position indicated by the object position information, A visual effect to be added to the object video indicated by the object video information is determined. Therefore, the spatial image display device 1 can display a virtual image obtained by adding a visual effect to an object image included in the image, at a remote location where the object OB is imaged.
- the spatial image display device 1 determines a visual effect such as changing the brightness of each object image based on the distance. Therefore, the spatial image display device 1 can display the virtual image VI of the moving object OB on a plurality of virtual image planes VF so that the observer can intuitively recognize the position of the object OB, and can give a more realistic feeling to the observer who observes the stereoscopic image formed by the virtual images.
- the spatial image display device 1 determines, from among the plurality of display units 21, the display unit 21 on which the image based on the object image is to be displayed, based on the distance.
- the display unit 21 thus determined is controlled to display an image obtained by adding the visual effect to the object image. Therefore, the spatial image display device 1 can display the virtual image VI of the moving object OB on a plurality of virtual image planes VF so that the observer can intuitively recognize the position of the object OB, and can give a more realistic feeling to the observer who observes the stereoscopic image formed by the virtual images.
- Although the spatial image display device 1 includes the virtual image display unit 2 in the first embodiment described above, this is not the only option.
- the spatial image display device 1 may not include the virtual image display unit 2 and may control an external virtual image display device to display the object image. In such a configuration, the spatial image display device 1 does not execute step S15 of the flowchart described above.
- the position of the object OB is indicated by the distance from the reference plane to the object OB in real space, but the present invention is not limited to this.
- the position of object OB may be indicated by the height from a predetermined horizontal plane (for example, ground plane) to object OB in real space.
- the visual effect storage unit 32 stores heights and visual effects in association with each other, as shown in FIG. 10A.
- a determination unit 33 determines a visual effect based on the height.
- the display control unit 34 controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image.
- the determining unit 33 determines that the visual effect is "none" (see FIG. 10A). Then, the display control unit 34 controls the display unit 21 to display the object image without visual effects. This allows the observer to observe a spatial image formed by virtual images VI1, VI2, and VIb as shown in FIG. 10B.
- In another case, the determination unit 33 determines the visual effect “50% transparency” (see FIG. 10A), and the display control unit 34 controls the display unit 21 to display the object image with its transparency changed to 50%. As a result, the observer can observe a spatial image composed of the virtual images VI1 and VI2 as shown in FIG. 10C and the virtual image VIb of the object image to which the visual effect of 50% transparency is added.
- In yet another case, the determination unit 33 determines the visual effect “80% transparency” (see FIG. 10A), and the display control unit 34 controls the display unit 21 to display the object image with its transparency changed to 80%.
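- As a sketch of how the display control unit could render such a transparency effect, the snippet below alpha-blends the object image into the displayed frame; the array layout and the object mask are assumptions made for illustration.

```python
import numpy as np

def composite_with_transparency(frame, object_image, object_mask, transparency):
    """Blend the object image into the display frame.

    transparency: 0.0 shows the object fully opaque; 0.5 and 0.8 correspond to
    the "50% transparency" and "80% transparency" effects of FIG. 10A.
    frame, object_image: H x W x 3 float arrays; object_mask: H x W boolean array.
    """
    alpha = 1.0 - transparency
    out = frame.astype(float).copy()
    out[object_mask] = (alpha * object_image[object_mask]
                        + (1.0 - alpha) * out[object_mask])
    return out
```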
- In the modification described above, the determination unit 33 determines the visual effect based on the position of the object OB; however, the visual effect may be determined based on the change rate of the position of the object OB.
- the change rate of the position can be, for example, the change in the position of the image of the object OB over a few unit frames forming the video.
- the visual effect storage unit 32 stores the rate of change in position and the visual effect in association with each other, as shown in FIG. 11A.
- the display control unit 34 also controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image.
- the determination unit 33 determines the visual effect to be "none" (see FIG. 11A). Then, the display control unit 34 controls the display unit 21 to display the object image without visual effects. This allows the observer to observe a spatial image formed by virtual images VI1, VI2, and VIb as shown in FIG. 11B.
- the determining unit 33 determines the visual effect to be "overlapping multiple frames" (see FIG. 11A). Then, the display control unit 34 controls the display unit 21 to display an image with a visual effect of superimposing and displaying a plurality of frames of object images. As a result, the observer can observe a spatial image composed of the virtual images VI1 and VI2 as shown in FIG. 11C and the virtual image VIb of the image displayed by overlapping the object images of a plurality of frames.
- the determination unit 33 determines the visual effect as "50% transparency”. Then, the display control unit 34 controls the display unit 21 to change the transparency of the object image to 50% and display it.
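- A minimal sketch of estimating the change rate from a few consecutive frames and mapping it to the effects of FIG. 11A is shown below; the frame window and the rate thresholds are assumed values, not taken from the disclosure.

```python
def position_change_rate(positions):
    """Average displacement per frame of the object's image position (x, y)
    over the last few unit frames of the video."""
    if len(positions) < 2:
        return 0.0
    steps = [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
             for (x1, y1), (x2, y2) in zip(positions, positions[1:])]
    return sum(steps) / len(steps)

def effect_for_rate(rate, slow=2.0, fast=10.0):
    if rate < slow:
        return "none"                        # slow motion: no effect
    if rate < fast:
        return "overlapping multiple frames" # medium motion: motion-trail effect
    return "50% transparency"                # fast motion
```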
- the determination unit 33 may determine the visual effect to be added to the object image based on whether the position of the object OB is within a predetermined range.
- the visual effect storage unit 32 stores visual effects in association with whether the position of the object OB is within a predetermined range.
- In the example shown in FIG. 12A, positions inside and on the lines of a court used in a tennis match are within the predetermined range (IN), and positions outside the lines are outside the predetermined range (OUT).
- the display control unit 34 also controls the display unit 21 to display an image obtained by adding the visual effect determined by the determination unit 33 to the object image.
- the determining unit 33 determines "no" visual effect. Then, the display control unit 34 controls the virtual image display unit 2 to display the object image without visual effects. This allows the observer to observe a spatial image formed by virtual images VI1, VI2, and VIb as shown in FIG. 12B.
- the determination unit 33 determines the visual effect "flash light". Then, the display control unit 34 controls the virtual image display unit 2 to display the object OB by adding a visual effect such that the image of the object OB looks bright like flash light. As a result, the observer can observe a spatial image composed of the virtual images VI1 and VI2 as shown in FIG. 12C and the virtual image VIb of the image added with the visual effect of making it look bright like flash light. .
- the determination unit 33 determines the visual effect as "50% transparency". Then, the display control unit 34 controls the virtual image display unit 2 so that the transparency of the object image is changed to 50% and displayed.
- In this example, the determination unit 33 determines the visual effect based on whether the position of the object OB in real space is within a predetermined range; however, the visual effect may be determined based on whether the image of the object OB in the video is within a predetermined range.
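- The IN/OUT decision of FIG. 12A can be sketched as a simple containment test; the rectangular court model and the coordinate convention below are assumptions (a real implementation could use any region test in real space or in the video).

```python
# Court modeled as an axis-aligned rectangle; dimensions are illustrative.
COURT = {"x_min": 0.0, "x_max": 23.77, "y_min": 0.0, "y_max": 10.97}

def within_court(x, y):
    return (COURT["x_min"] <= x <= COURT["x_max"]
            and COURT["y_min"] <= y <= COURT["y_max"])

def effect_for_court_position(x, y):
    # Inside or on the lines (IN): no effect; outside the lines (OUT):
    # emphasize the ball with the "flash light" effect so the observer notices.
    return "none" if within_court(x, y) else "flash light"
```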
- the determination unit 33 determines the display unit 21 to display the object image based on the position of the object OB, but the present invention is not limited to this.
- the determination unit 33 may determine to display the object image on each of the multiple display units 21 .
- the position is indicated by the distance in a predetermined direction from the reference plane, which is a plane in real space, to the object, corresponding to the virtual image plane on which the image is displayed.
- In such a configuration, the determination unit 33 determines the visual effect of changing, for each display unit 21, the luminance of the object image displayed on that display unit 21 according to the position of the object OB, so that the observer can observe the spatial image with a higher stereoscopic effect.
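- The following sketch shows one common way such a per-display luminance change could be computed, by splitting the brightness between the two display units according to the object's depth between the reference planes; the linear weighting is an assumption, not a rule stated in the disclosure.

```python
def luminance_weights(distance_from_near_plane, plane_separation):
    """Return (near_display_weight, far_display_weight), summing to 1.

    Showing the object image on both display units with these weights makes the
    fused virtual image appear at a depth between the two virtual image planes.
    """
    t = max(0.0, min(1.0, distance_from_near_plane / plane_separation))
    return 1.0 - t, t

near_w, far_w = luminance_weights(8.2, 9.0)  # ball position from FIG. 6A
# display unit 211 would show the ball at ~9% brightness and display unit 212 at ~91%,
# so the fused image appears close to the far virtual image plane VF2.
```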
- the determination unit 33 may further determine irradiation light to be applied to a member that defines the space in which the virtual image is displayed.
- Members that define the space in which the virtual image is displayed can be, for example, floors, walls, and pillars.
- the irradiation light is visible light.
- the determination unit 33 may determine the color, intensity, and the like of the visible light, or may determine the projection image formed by the visible light according to the irradiation position.
- the display control unit 34 controls the irradiation device to emit the irradiation light determined by the determination unit 33 .
- the illumination device may be a lighting device, a projection device, or the like.
- For example, when the ball B is positioned outside the line on the court, the determination unit 33 determines to irradiate the real court with white light having a higher intensity than before. Then, the display control unit 34 controls the irradiation device to irradiate the court with white light.
- As a result, even when the observer cannot clearly perceive whether the ball B is positioned within, on, or outside the line on the court, for example because of the fast movement speed of the actual ball B, the observer can instantly recognize the range on the court where the ball B is positioned by the white light illumination. Therefore, the observer can grasp the contents of the game without missing the timing.
- FIG. 13 is a schematic diagram of a spatial image display device 1-1 according to the second embodiment.
- functional units that are the same as those in the first embodiment are denoted by the same reference numerals, and descriptions thereof are omitted.
- the spatial image display device 1-1 includes a virtual image display section 2, a visual effect determination section 3-1, and a video processing section 4-1.
- the video processing unit 4-1 includes an input unit 41 and an object extraction unit 42.
- the input unit 41 is configured by an input interface that receives input of information.
- the object extraction unit 42 constitutes a control unit.
- the input unit 41 accepts input of image information indicating an image generated by the imaging device.
- the object extraction unit 42 extracts the image of the object OB from the image information whose input is accepted by the input unit 41 . Any method may be used for the object extraction unit 42 to extract the object video.
- the visual effect determination unit 3-1 includes a communication unit 31-1, a visual effect storage unit 32, a determination unit 33-1, and a display control unit 34.
- the communication unit 31-1 receives object position information. Specifically, it receives object position information from an external device via a communication network.
- the determination unit 33-1 determines the visual effect to be added to the object video including the image of the object OB based on the position of the object OB. Specifically, the determination unit 33-1 determines the visual effect to be added to the object video extracted by the object extraction unit 42 based on the position of the object OB indicated by the object position information received by the communication unit 31-1. decide.
- a specific method for determining the visual effect by the determining unit 33-1 is the same as the specific method for determining the visual effect by the determining unit 33 in the first embodiment described above.
- FIG. 14 is a flow chart showing an example of the operation of the spatial image display device 1-1 according to the second embodiment.
- the operation of the spatial image display device 1-1 described with reference to FIG. 14 corresponds to an example of the spatial image display method of the spatial image display device 1-1 according to the second embodiment.
- step S21 the input unit 41 accepts input of video information indicating the video generated by the imaging device.
- step S22 the object extraction unit 42 extracts object video information indicating an object video including the image of the object OB from the video information.
- step S23 the communication unit 31-1 receives object position information from an external device via the communication network.
- the spatial image display device 1-1 executes the processing from step S24 to step S27.
- the processing from step S24 to step S27 is the same as the processing from step S12 to step S15 in the first embodiment.
- the visual effect determination unit 3-1 and the video processing unit 4-1 may be configured separately.
- In that case, the video processing unit 4-1 has a communication unit configured by a communication interface, and this communication unit transmits the object video information indicating the object video extracted by the object extraction unit 42 to the communication unit 31-1 of the visual effect determination unit 3-1.
- FIG. 15 is a schematic diagram of a spatial image display device 1-2 according to the third embodiment.
- the same reference numerals are given to the same functional units as in the second embodiment, and the description thereof is omitted.
- the spatial image display device 1-2 includes a virtual image display section 2, a visual effect determination section 3-2, and an image processing section 4-2.
- the video processing unit 4-2 includes an input unit 41, an object extraction unit 42, and an object position estimation unit 43.
- the object position estimation unit 43 estimates the position of the object OB. Any method may be used by the object position estimation unit 43 to estimate the position of the object OB. For example, the object position estimation unit 43 may estimate, using deep learning, the distance from the reference plane in real space to the object OB, which indicates the position of the object OB, based on the video information whose input is accepted by the input unit 41.
- the visual effect determination unit 3-2 includes a visual effect storage unit 32, a determination unit 33-2, and a display control unit 34.
- the determination unit 33-2 determines a visual effect to be added to the object image including the image of the object OB, based on the position of the object OB. Specifically, the determination unit 33 - 2 determines the visual effect to be added to the object image extracted by the object extraction unit 42 based on the position estimated by the object position estimation unit 43 .
- a specific method for determining the visual effect by the determining unit 33-2 is the same as the specific method for determining the visual effect by the determining unit 33 in the first embodiment described above.
- FIG. 16 is a sequence diagram showing an example of the operation of the spatial image display device 1-2 according to the third embodiment.
- the operation of the spatial image display device 1-2 described with reference to FIG. 16 corresponds to an example of the spatial image display method of the spatial image display device 1-2 according to the third embodiment.
- step S31 the input unit 41 accepts input of image information indicating an image generated by an imaging device such as a camera.
- step S32 the object extraction unit 42 extracts object video information indicating the video of the object OB from the video information.
- step S33 the object position estimation unit 43 estimates the object position indicating the position of the object OB.
- Subsequently, the spatial image display device 1-2 executes the processing from step S34 to step S37.
- the processing from step S34 to step S37 is the same as the processing from step S12 to step S15 in the first embodiment.
- the object position estimation unit 43 estimates the height from a predetermined horizontal plane to the object OB in real space, which indicates the position of the object OB.
- the object position estimation unit 43 estimates the change rate of the position of the image of the object OB in the video.
- the object position estimation unit 43 estimates whether or not the position of the object OB in real space is within a predetermined range.
- FIG. 17 is a block diagram showing a schematic configuration of the computer 100 functioning as the determination units 33, 33-1, 33-2, and the display control unit 34, respectively.
- the computer 100 may be a general-purpose computer, a dedicated computer, a workstation, a PC (Personal Computer), an electronic notepad, or the like.
- Program instructions may be program code, code segments, etc. for performing the required tasks.
- the computer 100 includes a processor 110, a ROM (Read Only Memory) 120, a RAM (Random Access Memory) 130, a storage 140, an input unit 150, a display unit 160, and a communication interface ( I/F) 170.
- the processor 110 is specifically a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an SoC (System on a Chip), or the like, and may be configured by a plurality of processors of the same type or different types.
- the processor 110 controls each configuration and executes various arithmetic processing. That is, processor 110 reads a program from ROM 120 or storage 140 and executes the program using RAM 130 as a work area. The processor 110 performs control of each configuration and various arithmetic processing according to programs stored in the ROM 120 or the storage 140 .
- the ROM 120 or storage 140 stores the program according to the present disclosure.
- the program may be stored in a storage medium readable by the computer 100.
- a program can be installed in the computer 100 by using such a storage medium.
- the storage medium storing the program may be a non-transitory storage medium.
- the non-temporary storage medium is not particularly limited, but may be, for example, a CD-ROM, a DVD-ROM, a USB (Universal Serial Bus) memory, or the like.
- this program may be downloaded from an external device via a network.
- the ROM 120 stores various programs and various data.
- RAM 130 temporarily stores programs or data as a work area.
- the storage 140 is configured by a HDD (Hard Disk Drive) or SSD (Solid State Drive) and stores various programs including an operating system and various data.
- the input unit 150 includes one or more input interfaces that receive user's input operations and acquire information based on the user's operations.
- the input unit 150 is a pointing device, keyboard, mouse, etc., but is not limited to these.
- the display unit 160 includes one or more output interfaces that output information.
- the display unit 160 is a display that outputs information as video or a speaker that outputs information as audio, but is not limited to these.
- the display unit 160 also functions as the input unit 150 when it is a touch panel type display.
- a communication interface (I/F) 170 is an interface for communicating with an external device.
- (Appendix 1) A spatial image display device comprising: a plurality of display units that display images; a plurality of optical elements that respectively reflect image light emitted from the images displayed on the plurality of display units and are arranged such that the reflected image light reaches an observer's eyes, whereby virtual images of the images are displayed at different distances in the same direction from the observer; and a control unit, wherein the control unit determines a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object, and controls the display units to display the image based on the object image and the visual effect.
- (Appendix 2) The spatial image display device according to Appendix 1, further comprising a communication unit that receives object video information indicating the object video and object position information indicating the position of the object, wherein the control unit determines the visual effect to be added to the object video indicated by the object video information based on the position indicated by the object position information.
- (Appendix 3) The spatial image display device according to Appendix 1 or 2, wherein the control unit estimates the position of the object and determines the visual effect based on the estimated position.
- (Appendix 4) The spatial image display device according to any one of Appendices 1 to 3, wherein the position is indicated by a distance from a reference plane, which is a plane in real space corresponding to the virtual image plane on which the virtual image is displayed, to the object, and the control unit determines, from among the plurality of display units, the display unit on which to display the object image based on the distance, and controls the determined display unit to display the image obtained by adding the visual effect to the object image.
- (Appendix 5) The spatial image display device according to any one of Appendices 1 to 4, wherein the position is indicated by a height from a predetermined horizontal plane to the object in real space, and the control unit determines the visual effect based on the height.
- (Appendix 6) The spatial image display device according to any one of Appendices 1 to 5, wherein the control unit determines the visual effect based on a change rate of the position.
- (Appendix 7) The spatial image display device according to any one of Appendices 1 to 6, wherein the control unit determines the visual effect based on whether the position is within a predetermined range.
- (Appendix 8) The spatial image display device according to Appendix 3, wherein the position is indicated by a distance in a predetermined direction from a reference plane, which is a plane in real space corresponding to the virtual image plane on which the virtual image is displayed, to the object, and the control unit determines the visual effect of changing the brightness of each of the object images displayed on the plurality of display units, based on the distance.
- (Appendix 9) The spatial image display device according to any one of Appendices 1 to 8, wherein the control unit determines irradiation light to be applied to a member that defines the space in which the virtual image is displayed.
- (Appendix 10)
- (Appendix 11) A spatial image display method for a spatial image display device comprising a plurality of display units that display images and a plurality of optical elements that respectively reflect image light emitted from the images displayed on the plurality of display units and are arranged such that the reflected image light reaches an observer's eyes, whereby virtual images of the images are displayed at different distances in the same direction from the observer, the spatial image display method comprising: determining a visual effect to be added to an object image including an image of an object, which is a specific subject whose image is included in the image, based on the position of the object; and controlling the plurality of display units to display the image based on the object image and the visual effect.
- (Appendix 12) A non-transitory storage medium storing a program executable by a computer, the program causing the computer to function as the determination unit and the display control unit of the spatial image display device according to any one of Appendices 1 to 10.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Processing Or Creating Images (AREA)
Abstract
A spatial image display device (1) according to the present invention comprises: a plurality of display units (21) that display an image; a plurality of optical elements (22) that respectively reflect image light emitted from the image displayed on the plurality of display units (21) and that are arranged such that the reflected image light reaches the eye of an observer, thereby displaying a virtual image (VI) of the image at different distances in the same direction from the observer; a determination unit (33) that determines a visual effect to be added to an object image containing an image of an object (OB), based on the position of the object (OB), which is a specific subject whose image is included in the image; and a display control unit (34) that controls the display units (21) such that the image is displayed based on the object image and the visual effect.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023531331A JP7598061B2 (ja) | 2021-07-02 | 2021-07-02 | 空間像表示装置、空間像表示方法、及びプログラム |
| PCT/JP2021/025200 WO2023276156A1 (fr) | 2021-07-02 | 2021-07-02 | Dispositif d'affichage d'image spatiale, procédé d'affichage d'image spatiale et programme |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2021/025200 WO2023276156A1 (fr) | 2021-07-02 | 2021-07-02 | Dispositif d'affichage d'image spatiale, procédé d'affichage d'image spatiale et programme |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023276156A1 true WO2023276156A1 (fr) | 2023-01-05 |
Family
ID=84691058
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/025200 Ceased WO2023276156A1 (fr) | 2021-07-02 | 2021-07-02 | Dispositif d'affichage d'image spatiale, procédé d'affichage d'image spatiale et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7598061B2 (fr) |
| WO (1) | WO2023276156A1 (fr) |
Citations (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2000261832A (ja) * | 1999-03-08 | 2000-09-22 | Nippon Telegr & Teleph Corp <Ntt> | 三次元表示方法およびヘッドマウントディスプレイ装置 |
| JP2000333211A (ja) * | 1999-05-18 | 2000-11-30 | Nippon Telegr & Teleph Corp <Ntt> | 三次元表示方法およびヘッドマウントディスプレイ装置 |
| JP2003058912A (ja) * | 2001-05-18 | 2003-02-28 | Sony Computer Entertainment Inc | 表示装置及び画像処理方法 |
| JP2004163644A (ja) * | 2002-11-13 | 2004-06-10 | Nippon Telegr & Teleph Corp <Ntt> | 三次元表示方法 |
| JP2004240090A (ja) * | 2003-02-05 | 2004-08-26 | Pioneer Electronic Corp | 表示装置及び方法 |
| JP2009267557A (ja) * | 2008-04-23 | 2009-11-12 | Seiko Epson Corp | 映像表示装置、映像表示方法 |
| JP2016018560A (ja) * | 2014-07-08 | 2016-02-01 | 三星電子株式会社Samsung Electronics Co.,Ltd. | 視覚効果を有するオブジェクトを表示する装置及び方法 |
| JP2018073172A (ja) * | 2016-10-31 | 2018-05-10 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および画像生成方法 |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR101231232B1 (ko) * | 2011-05-25 | 2013-02-08 | 엠앤서비스 주식회사 | 3차원 영상 제공 장치 |
- 2021
- 2021-07-02 JP JP2023531331A patent/JP7598061B2/ja active Active
- 2021-07-02 WO PCT/JP2021/025200 patent/WO2023276156A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| JP7598061B2 (ja) | 2024-12-11 |
| JPWO2023276156A1 (fr) | 2023-01-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7407869B2 (ja) | カラー仮想コンテンツワーピングを伴う複合現実システムおよびそれを使用して仮想コンテンツ生成する方法 | |
| JP7304934B2 (ja) | 仮想コンテンツワーピングを伴う複合現実システムおよびそれを使用して仮想コンテンツを生成する方法 | |
| JP7095189B2 (ja) | 多ソース仮想コンテンツ合成を伴う複合現実システムおよびそれを使用して仮想コンテンツを生成する方法 | |
| JP6747504B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
| US10204452B2 (en) | Apparatus and method for providing augmented reality-based realistic experience | |
| CA3054619C (fr) | Systeme de realite mixte a deformation de contenu virtuel et procede de generation de contenu virtuel l'utilisant | |
| US10715791B2 (en) | Virtual eyeglass set for viewing actual scene that corrects for different location of lenses than eyes | |
| EP3855290A1 (fr) | Rendu à distance pour images virtuelles | |
| JP2023520765A (ja) | 仮想および拡張現実のためのシステムおよび方法 | |
| JP7166484B1 (ja) | 前の目線からのレンダリングされたコンテンツおよびレンダリングされなかったコンテンツを使用した新しいフレームの生成 | |
| CN108780578A (zh) | 用于增强现实系统的直接光补偿技术 | |
| JP2012253690A (ja) | プログラム、情報記憶媒体及び画像生成システム | |
| US11709370B2 (en) | Presentation of an enriched view of a physical setting | |
| JP2023505235A (ja) | 仮想、拡張、および複合現実システムおよび方法 | |
| CN113875230B (zh) | 混合模式三维显示方法 | |
| JP7598061B2 (ja) | 空間像表示装置、空間像表示方法、及びプログラム | |
| EP3185103A1 (fr) | Un module de détermination d'un identifiant d'un objet virtuel regardé, un système pour mettre en oeuvre une translucidité du regard dans une scène virtuelle, et la méthode associée | |
| US20250069209A1 (en) | Image processing device | |
| US20240046584A1 (en) | Information processing apparatus | |
| WO2023181634A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21948460 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2023531331 Country of ref document: JP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 21948460 Country of ref document: EP Kind code of ref document: A1 |