WO2019171802A1 - Information processing device, information processing method, and program - Google Patents
Information processing device, information processing method, and program
- Publication number
- WO2019171802A1 (PCT/JP2019/002072)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimming
- user
- information processing
- display
- processing apparatus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/02—Viewing or reading apparatus
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/38—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
Definitions
- This disclosure relates to an information processing apparatus, an information processing method, and a program.
- VR: virtual reality
- AR: augmented reality
- Patent Document 1 discloses a technique for switching between a state in which a user views an external image displayed on a display and a state in which the user directly sees the external environment in front of the eyes.
- The present disclosure has been made in view of the above, and provides a new and improved information processing apparatus, information processing method, and program capable of more flexibly performing dimming control in a user's visual field region.
- According to the present disclosure, there is provided an information processing apparatus including: a dimming control unit that, by performing dimming to limit the amount of light incident from a real object, generates, in the user's visual field region, a dimming region where the dimming is performed and a non-dimming region where the dimming is not performed relative to the dimming region; and a display control unit that controls a display device so that a first image is displayed in the dimming region as seen from the user.
- According to the present disclosure, there is also provided an information processing method executed by a computer, including: generating, by performing dimming to limit the amount of light incident from a real object, a dimming region where the dimming is performed in the user's visual field region and a non-dimming region where the dimming is not performed relative to the dimming region; and controlling a display device so that a first image is displayed in the dimming region as seen from the user.
- Further, according to the present disclosure, there is provided a program for causing a computer to generate, by performing dimming to limit the amount of light incident from a real object, a dimming region where the dimming is performed in the user's visual field region and a non-dimming region where the dimming is not performed relative to the dimming region, and to control a display device so that a first image is displayed in the dimming region as seen from the user.
- FIG. 1 is a diagram illustrating an overview of an information processing apparatus 100 according to the present embodiment.
- the information processing apparatus 100 is realized by, for example, a head-mounted display (HMD: Head Mounted Display) that is worn on the user's head, but is not limited thereto.
- the head mounted display may be a glasses type.
- The information processing apparatus 100 may be at least one processor that controls display on a head-mounted display. Alternatively, it may be a module including a processor and an IMU (Inertial Measurement Unit) built into the head-mounted display.
- The information processing apparatus 100 is provided with an outward camera 131 that, when worn, captures the user's line-of-sight direction, that is, the outward direction. Further, although not shown in FIG. 1, the information processing apparatus 100 is provided with various sensors, such as an inward camera that captures the user's eyes when worn and a microphone. A plurality of outward cameras 131 and inward cameras may be provided. A depth map (distance image) can be obtained by the outward camera 131, so that the surrounding environment can be sensed.
- the shape of the information processing apparatus 100 is not limited to the example shown in FIG.
- For example, the information processing apparatus 100 may be a headband-type HMD (worn with a band that goes around the entire circumference of the head, or with a band that also passes over the top of the head as well as the temporal region) or a helmet-type HMD (in which the visor part of the helmet corresponds to the display).
- The information processing apparatus 100 adjusts the amount of light (incident light) entering a part of the user's visual field region by using, for example, a dimming element described later, thereby shielding an object in the real world (a real object). Then, the information processing apparatus 100 performs display control so that an arbitrary image (a first image) is shown in the visual field area where the shielding is performed.
- the adjustment of the amount of light may be simply referred to as dimming.
- shielding may be regarded as preventing the user from visually recognizing a real object other than by displaying an image.
- performing shielding may be simply referred to as “shielding”.
- a general AR display device can display information on a real object in a superimposed manner.
- However, a general AR display device cannot partially switch the visual field region to the virtual world.
- the information processing apparatus 100 can partially switch the visual field region to the virtual world by the above processing.
- The information processing apparatus 100 includes a light control unit 110 and an AR display unit 120, and each component is arranged so that light from an observation target (a real object) is incident in the order of the light control unit 110, the AR display unit 120, and the user's eyeball.
- Note that “light incident from a real object” includes light spontaneously emitted by the real object and external light reflected by the real object. That is, “light incident from a real object” may be regarded as light by which the user recognizes the real object.
- the AR display unit 120 may be regarded as constituting at least a part of the display device according to the present disclosure.
- The dimming unit 110 includes a dimming element that can partially control dimming (hereinafter referred to as a “partial dimming element”), and is a member capable of dimming the light incident on the AR display unit 120 and the eyeball. More specifically, as shown in FIG. 2, the dimming unit 110 controls the partial dimming element on the surface on which the light from the observation object is incident, thereby generating a region where dimming is performed (hereinafter referred to as the “dimming region 10”) and a region where dimming is not performed (hereinafter referred to as the “non-dimming region 11”). Note that the non-dimming region 11 may be regarded as a region where dimming is not performed relative to the dimming region 10.
- Note that the non-dimming region 11 may include a region that incompletely transmits a predetermined amount of incident light, so that the visibility of a virtual object is enhanced while the real space remains visible.
- FIG. 2 shows a case where the light control unit 110 can control the dimming of each region arranged in a lattice shape, but this is merely an example, and the shape of each region where dimming is controlled is arbitrary.
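As a concrete illustration of such lattice-shaped control, the following is a minimal sketch, assuming the partial dimming element can be addressed as a 2-D grid of per-cell transmittance values in [0, 1]; the grid size, function names, and drive interface are assumptions for illustration, not part of the disclosure.

```python
import numpy as np

# Assumed resolution of the partial dimming element's cell grid.
GRID_H, GRID_W = 48, 64

def make_dimming_mask(region_cells, dim_level=0.05, base_level=1.0):
    """Build a transmittance mask: cells in the dimming region 10 pass only
    `dim_level` of the incident light, while non-dimming cells (region 11)
    pass `base_level`."""
    mask = np.full((GRID_H, GRID_W), base_level, dtype=np.float32)
    for row, col in region_cells:          # cells covering the real object
        mask[row, col] = dim_level
    return mask

# Example: dim a rectangular block of cells in front of an observation object.
cells = [(r, c) for r in range(10, 20) for c in range(24, 40)]
mask = make_dimming_mask(cells)
# A driver would then translate `mask` into, e.g., per-cell drive voltages.
```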
- a user can visually recognize an observation object by making light from the observation object enter the eyeball and forming an image on the retina.
- the light incident on the dimming area 10 cannot pass through the dimming area 10 and cannot form an image on the retina as shown in FIG. 2 (see the dotted line portion in FIG. 2).
- the user cannot clearly see the observation object.
- If the light control region 10 is formed so that no light from the observation object can form an image on the retina, the user cannot see the observation object at all.
- the light control unit 110 can effectively shield the observation object.
- In addition, since the display by the AR display unit 120 described later appears relatively brighter than the dimmed surroundings, the light control unit 110 can shield the observation target even more effectively.
- Note that the light control unit 110 may effectively shield the observation object by dimming the entire visual field area as necessary and then further adjusting the degree of dimming in the area where the light from the observation object is incident.
- the shape of the dimming unit 110 is arbitrary as long as the light incident on the AR display unit 120 and the eyeball can be dimmed.
- The AR display unit 120 is a display capable of projecting an arbitrary image in an area corresponding to the dimming region 10. For example, by displaying an object that is completely different from the observation target in the display area corresponding to the dimming region 10, the AR display unit 120 can give the user the impression that the observation target has been replaced with another object.
- the information processing apparatus 100 may perform display control of the image in the light control region 10 so that the image recognized by the user from the light control region 10 to the non-light control region 11 is substantially continuous in the user's visual field region.
- the AR display unit 120 can give the user an impression that the real world has been reproduced by displaying an object that is the same as or similar to the observation object.
- Further, the AR display unit 120 can interpolate the display area corresponding to the dimming region 10 based on information related to the non-dimming region 11 (such as the luminance values of the display area corresponding to the non-dimming region 11), thereby giving the user an impression as if the observation object did not exist (in other words, an impression that the observation object has disappeared).
- the information regarding the non-dimming region 11 may be regarded as including information regarding the background of the real object.
- That is, an image corresponding to the background is displayed at the position where the user would recognize the observation object if dimming were not performed.
- the image corresponding to the background may be a complementary image generated by a general image editing technique.
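As one example of such a general image editing technique, the sketch below uses off-the-shelf inpainting to synthesize a background-like image for the shielded region; the use of OpenCV's inpainting here is an assumption for illustration, not a technique specified by the disclosure.

```python
import numpy as np
import cv2  # OpenCV, assumed available for this illustration

def erase_object(frame_bgr, object_mask):
    """Fill the shielded object's region from surrounding pixels, producing
    a background-like image to display in the dimming region."""
    # object_mask: uint8 image, 255 where the real object is, 0 elsewhere.
    return cv2.inpaint(frame_bgr, object_mask, inpaintRadius=5,
                       flags=cv2.INPAINT_TELEA)

# Toy example: a synthetic frame with a red square "object" to make disappear.
frame = np.full((120, 160, 3), 80, dtype=np.uint8)
frame[40:80, 60:100] = (0, 0, 255)
mask = np.zeros((120, 160), dtype=np.uint8)
mask[40:80, 60:100] = 255
background = erase_object(frame, mask)  # square filled from surrounding gray
```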
- Note that when a part of the real object is present in the non-dimming area as seen from the user, the information processing apparatus 100 may control the AR display unit 120 to display an image (a second image) in the non-dimming area adjacent to the dimming area.
- For example, the AR display unit 120 can also generate an image that assists in shielding the observation object based on information about the non-dimming region 11 (such as the luminance values of the display area corresponding to the non-dimming region 11), and display that image in the display area corresponding to the dimming region 10.
- the display content of the AR display unit 120 can be changed as appropriate.
- the information processing apparatus 100 can perform switching between the real world and the virtual world in the user's visual field region gradually or stepwise.
- In the state of FIG. 3A, the real world is shown in the entire visual field region.
- The information processing apparatus 100 gradually switches from the real world to the virtual world, transitioning to the state of FIG. 3B, in which the real world and the virtual world are mixed in the visual field region, and then to the state shown in FIG. 3C. That is, the information processing apparatus 100 may be regarded as gradually increasing the area of the dimming region 10 and increasing the area of the image in the dimming region 10 according to the movement of the user's line of sight. Conversely, the information processing apparatus 100 may perform stepwise switching from the virtual world to the real world.
- FIG. 3 shows an example in which the AR display unit 120 displays an image corresponding to a person and an object in the real world.
- the present invention is not limited to this.
- Further, the information processing apparatus 100 may perform stepwise switching based on the user's line-of-sight direction. More specifically, the information processing apparatus 100 can estimate the user's line-of-sight direction using an inward camera that captures the user's eyes when worn. Then, the information processing apparatus 100 may sequentially switch between the real world and the virtual world starting from an area on which the user's line of sight is not focused, or from an area away from the area on which the line of sight is focused. Thereby, the information processing apparatus 100 can make it difficult for the user to perceive the switching between the real world and the virtual world. In other words, the information processing apparatus 100 can give the user the impression that the real world and the virtual world change gradually over time.
- Further, the information processing apparatus 100 may switch between the real world and the virtual world starting from an area where the depth map has been created, or an area where the likelihood of the depth map is high. More specifically, the information processing apparatus 100 uses a depth map created based on the captured image of the outward camera 131 when switching between the real world and the virtual world. The information processing apparatus 100 may then perform the switching starting from the area where the depth map has been created, or the area where a depth map with a likelihood higher than a certain value has been created. Accordingly, the switching can proceed sequentially without waiting until a depth map, or a high-likelihood depth map, has been created for the entire visual field region. A sketch of one possible ordering follows.
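The following is a minimal sketch of such an ordering policy, assuming per-cell depth-map likelihoods and a gaze point are available; the eligibility threshold and the farthest-from-gaze ordering are illustrative assumptions combining the two strategies above, not values from the disclosure.

```python
import numpy as np

def switching_order(gaze_rc, depth_likelihood, min_likelihood=0.6):
    """Order the dimming-element cells for gradual real-to-virtual switching:
    only cells whose depth-map likelihood is high enough are eligible, and
    cells farther from the gaze point are switched first."""
    h, w = depth_likelihood.shape
    rows, cols = np.mgrid[0:h, 0:w]
    dist = np.hypot(rows - gaze_rc[0], cols - gaze_rc[1])
    eligible = np.argwhere(depth_likelihood >= min_likelihood)
    order = np.argsort(-dist[eligible[:, 0], eligible[:, 1]])  # far first
    return [tuple(c) for c in eligible[order]]

likelihood = np.random.rand(48, 64)    # stand-in for per-cell depth confidence
plan = switching_order(gaze_rc=(24, 32), depth_likelihood=likelihood)
```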
- the above processing is merely an example, and can be changed as appropriate.
- the information processing apparatus 100 can further reduce the sense of discomfort given to the user by performing coordinate conversion processing, color conversion processing, or delay correction processing when switching between the real world and the virtual world.
- In the coordinate conversion process, the information processing apparatus 100 converts from a coordinate system with the viewpoint of the outward camera 131 as the origin to a coordinate system with the viewpoint of the user as the origin. Accordingly, the information processing apparatus 100 can display a virtual object (image) so as to correspond to the position, orientation, and size of the real object as viewed from the user. For this reason, the discomfort given to the user can be further reduced.
- Further, in accordance with the display control of the virtual object, the information processing apparatus 100 may change at least one of the position, shape, and size of the dimming region 10 so as to correspond to the position, orientation, and size of the real object as viewed from the user, so that the virtual object does not protrude from the dimming region. Thereby, the information processing apparatus 100 can shield the real object more reliably.
- the specific method of the coordinate conversion process is not particularly limited.
- For example, the information processing apparatus 100 may realize the coordinate conversion process by grasping the positional relationship between the outward camera 131 and the eyeball using various sensors such as the inward camera.
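Given such a calibrated camera-to-eye relationship, the conversion itself can be pictured as a single rigid transform. A minimal sketch, assuming the transform T_eye_from_cam has already been estimated (the numbers below are placeholders, not calibration data from the disclosure):

```python
import numpy as np

def to_eye_coords(p_cam, T_eye_from_cam):
    """Map a 3-D point from the outward-camera-origin coordinate system to
    the user-viewpoint (eye-origin) coordinate system."""
    p = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous point
    return (T_eye_from_cam @ p)[:3]

# Example: assume the camera sits 2 cm above and 3 cm in front of the eye.
T = np.eye(4)
T[:3, 3] = [0.0, -0.02, -0.03]       # translation eye <- camera (meters)
p_eye = to_eye_coords([0.1, 0.0, 1.0], T)
```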
- the information processing apparatus 100 may interpolate the non-included area based on the surrounding luminance values.
- the information processing apparatus 100 may grasp the positional relationship between the outward camera 131 and the eyeball again and reflect it in the coordinate conversion process.
- the information processing apparatus 100 may not perform the coordinate conversion process.
- the color conversion process will be described more specifically.
- In the color conversion process, based on a prediction of the change in the color of the real object according to the change in the state of the real object, the information processing apparatus 100 adds, to the display image of the dimming region 10, a color corresponding to the color of the real object that the user would perceive if dimming were not performed. Thereby, the information processing apparatus 100 can further reduce the uncomfortable feeling given to the user.
- the specific content of the color conversion process is not particularly limited.
- The information processing apparatus 100 may perform any conversion process on any parameter relating to the color perceived by the user (lightness, saturation, luminance, white balance, etc.).
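As one way to picture such a parameter adjustment, the sketch below applies a crude per-channel gain so that the rendered color of the object approaches a predicted perceived color; the gain-based correction and all values are illustrative assumptions, not the disclosure's method.

```python
import numpy as np

def color_convert(display_img, predicted_rgb, rendered_rgb):
    """Scale the display image channel-wise so that the rendered color of
    the real object approaches the color the user is predicted to perceive
    without dimming (a white-balance-style correction)."""
    gains = np.asarray(predicted_rgb, float) / np.maximum(
        np.asarray(rendered_rgb, float), 1e-6)
    out = display_img.astype(np.float32) * gains   # per-channel gain
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.full((64, 64, 3), 128, dtype=np.uint8)
corrected = color_convert(img, predicted_rgb=(140, 120, 110),
                          rendered_rgb=(128, 128, 128))
```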
- the delay correction process will be described more specifically.
- In the delay correction process, the information processing apparatus 100 grasps the state of a person or an object in the real world, or of the user wearing the apparatus, based on the outputs of various sensors including the outward camera 131, and predicts these states (position, orientation, shape, movement, color, etc.) after a predetermined time has elapsed. For example, the information processing apparatus 100 predicts the movement of the user wearing the apparatus based on the output of the outward camera 131, or of an acceleration sensor or a gyro sensor mounted on the apparatus.
- The information processing apparatus 100 controls the dimming by the dimming unit 110 and the display by the AR display unit 120 based on the prediction, so that shielding and image display can be performed appropriately even if the state of a person or object in the real world, or of the user, changes. For example, even while they are moving, the real object can be appropriately shielded without the dimming region 10 being displaced from the real object.
- Further, the information processing apparatus 100 may temporarily enlarge the dimming region 10 so that the dimming region 10 does not deviate from the real object.
- More specifically, the information processing apparatus 100 may make the dimming region 10 larger when the real object has a first momentum than when the real object has a second momentum smaller than the first momentum.
- In this case, the AR display unit 120 can reduce the uncomfortable feeling given to the user by interpolating the display area corresponding to the temporarily enlarged dimming region 10 based on the surrounding luminance values.
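A minimal sketch of these two ideas together, assuming a constant-velocity motion model and a pixel mask for the dimming region; the latency value, the linear growth of the margin with speed, and the use of morphological dilation are illustrative assumptions, not the disclosure's method.

```python
import numpy as np
import cv2  # used here only for mask dilation; assumed available

def predict_position(pos, vel, latency_s):
    """Constant-velocity prediction of a real object's position after the
    system latency has elapsed (a deliberately simple motion model)."""
    return pos + vel * latency_s

def enlarge_mask(mask, speed_px_s, latency_s):
    """Temporarily grow the dimming mask in proportion to the object's
    momentum so the mask does not lag behind a fast-moving object."""
    margin = int(np.ceil(speed_px_s * latency_s))
    if margin <= 0:
        return mask
    kernel = np.ones((2 * margin + 1, 2 * margin + 1), np.uint8)
    return cv2.dilate(mask, kernel)

mask = np.zeros((120, 160), np.uint8)
mask[50:70, 70:90] = 255
pred = predict_position(np.array([80.0, 60.0]), np.array([200.0, 0.0]), 0.02)
grown = enlarge_mask(mask, speed_px_s=200.0, latency_s=0.02)
```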
- the information processing apparatus 100 may appropriately omit the conversion process or the delay correction process.
- For example, the information processing apparatus 100 may omit the coordinate conversion process and display an image in the coordinate system with the viewpoint of the outward camera 131 as the origin (also called “camera-through display”).
- the information processing apparatus 100 can reduce the processing load and improve the processing speed.
- When the visual field area of the outward camera 131 cannot include the entire visual field area of the user, the information processing apparatus 100 performs interpolation or the like on the non-included area based on the surrounding luminance values.
- the information processing apparatus 100 can reduce the processing load required for the interpolation and the like by omitting the coordinate conversion process.
- In addition, to ensure safety, the information processing apparatus 100 may cancel the switching from the real world to the virtual world and release the dimming (or reduce the degree of dimming) so that the user can visually recognize the real world. Further, even when the position of the information processing apparatus 100 is shifted, the information processing apparatus 100 may similarly allow the user to visually recognize the real world.
- the technology of the present disclosure can be applied to various applications (services).
- a case where the technology of the present disclosure is applied to a VR call application will be described in “4. Application to VR Call Application”.
- An application to which the technology is applied is not particularly limited.
- FIG. 4 is a block diagram illustrating an example of a functional configuration of the information processing apparatus 100.
- As shown in FIG. 4, the information processing apparatus 100 includes a dimming unit 110, an AR display unit 120, a sensor unit 130, a control unit 140, an operation input unit 150, a speaker 160, a communication unit 170, and a storage unit 180.
- the light control unit 110 performs partial light control using a partial light control element.
- The dimming element used as the partial dimming element is a member whose light transmittance can be controlled by, for example, the application of a voltage.
- a general light control device uniformly controls the transmittance of the entire region.
- the transmittance of an arbitrary pixel can be controlled by adopting a configuration similar to that of a liquid crystal display.
- The light control unit 110 includes a partial dimming element using this principle, thereby enabling partial dimming. Note that the light control unit 110 may perform not only partial dimming but also dimming of the entire region. Moreover, the light control unit 110 may realize the dimming by a configuration other than the one described above.
- The dimming unit 110 may further include a member that can control the transmission of light in a specific wavelength band or of a specific polarization (a color filter, a polarizing filter, etc.), so that only light in a specific wavelength band or of a specific polarization is shielded or transmitted toward the AR display unit 120.
- the light control unit 110 can shield the real object more strictly by shielding specific polarized light in units of pixels for light incident from the real object.
- Note that, as described above, “light incident from a real object” includes light spontaneously emitted from the real object and external light reflected by the real object; that is, “light incident from a real object” may be regarded as light by which the user recognizes the real object.
- the AR display unit 120 displays an image on the display so as to cooperate with the light control unit 110.
- the AR display unit 120 is realized by, for example, a lens unit that performs display using a hologram optical technique, a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, and the like. Further, the AR display unit 120 will be described assuming that it is a transmissive or transflective member, but the present invention is not limited to this.
- Control unit 140 functions as an arithmetic processing unit and a control unit, and controls overall operations in the information processing apparatus 100 according to various programs.
- For example, the control unit 140 recognizes a real object in the visual field of the user based on the output of the sensor unit 130 including the outward camera 131, and determines the area, timing, and contents for switching between the real world and the virtual world based on various factors such as the recognition result, the creation status of the depth map, the user's line of sight, and the contents of the application.
- control unit 140 controls the overall processing related to the VR call application (hereinafter may be referred to as “VR call”). Details of the VR call will be described in “4. Application to VR Call Application”.
- control unit 140 includes a recognition engine 141, a dimming control unit 142, and a display control unit 143, as shown in FIG.
- The recognition engine 141 has a function of recognizing various situations of the user or the surroundings using various sensor information sensed by the sensor unit 130. More specifically, the recognition engine 141 includes a depth recognition engine 141a, a head posture recognition engine 141b, a SLAM (Simultaneous Localization and Mapping) recognition engine 141c, a gaze recognition engine 141d, a speech recognition engine 141e, a position recognition engine 141f, and an action recognition engine 141g. These recognition engines shown in FIG. 4 are examples, and the present embodiment is not limited thereto.
- the depth recognition engine 141a recognizes the depth information in the space around the user by using various sensor information sensed by the sensor unit 130.
- For example, the depth recognition engine 141a can analyze the surrounding captured images acquired by the outward camera 131 and recognize the distance information of objects in the surrounding space and the planar positions of the objects.
- a generally known algorithm may be used as the depth recognition algorithm, and is not particularly limited in the present embodiment.
- the head posture recognition engine 141b recognizes the posture of the user's head (including the orientation or inclination of the face with respect to the body) using various sensor information sensed by the sensor unit 130.
- For example, the head posture recognition engine 141b can recognize the orientation of the user's head by analyzing at least one of the surrounding captured images captured by the outward camera 131, the gyro information acquired by the gyro sensor 134, the acceleration information acquired by the acceleration sensor 135, and the orientation information acquired by the orientation sensor 136.
- the head posture recognition algorithm may be a generally known algorithm, and is not particularly limited in the present embodiment.
- the SLAM recognition engine 141c can simultaneously estimate its own position and create a map of the surrounding space using various sensor information sensed by the sensor unit 130, and identify its own position in the surrounding space. For example, the SLAM recognition engine 141c can analyze the surrounding captured image acquired by the outward camera 131 and identify the self-position of the information processing apparatus 100.
- the SLAM recognition algorithm may be a generally known algorithm, and is not particularly limited in the present embodiment.
- the recognition engine 141 can perform space recognition (space grasping) based on the recognition result of the depth recognition engine 141a and the recognition result of the SLAM recognition engine 141c. Specifically, the recognition engine 141 can recognize the position of the information processing apparatus 100 in the surrounding three-dimensional space.
- the line-of-sight recognition engine 141d detects the line of sight of the user using various sensor information sensed by the sensor unit 130.
- the line-of-sight recognition engine 141d analyzes the captured image of the user's eye acquired by the inward camera 132 and recognizes the direction of the user's line of sight.
- the line-of-sight detection algorithm is not particularly limited.
- For example, the line-of-sight direction of the user can be recognized based on the positional relationship between the inner corner of the eye and the iris, or the positional relationship between a corneal reflection (such as a Purkinje image) and the pupil.
- the line-of-sight recognition engine 141d may regard the front of the information processing apparatus 100 that is the head-mounted display as the line-of-sight direction.
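For the pupil-corneal-reflection variant, a minimal sketch follows, assuming pupil and glint centers have already been extracted from the inward-camera image; the linear mapping and calibration gains are illustrative assumptions requiring a per-user calibration, not parameters from the disclosure.

```python
import numpy as np

def gaze_from_pupil_glint(pupil_xy, glint_xy, kx=0.004, ky=0.004):
    """Crude pupil-center / corneal-reflection gaze estimate: the vector from
    the glint (Purkinje image) to the pupil center, scaled by per-user
    calibration gains, approximates the gaze angles in radians."""
    dx, dy = np.subtract(pupil_xy, glint_xy)
    return np.array([kx * dx, ky * dy])

# Example with assumed pixel coordinates and calibration gains.
angles = gaze_from_pupil_glint(pupil_xy=(312, 240), glint_xy=(300, 238))
```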
- the voice recognition engine 141e recognizes a user or environmental sound using various sensor information sensed by the sensor unit 130.
- the speech recognition engine 141e may perform noise removal, sound source separation, and the like on the collected sound information acquired by the microphone 133, and perform speech recognition, morphological analysis, sound source recognition, noise level recognition, and the like.
- The position recognition engine 141f recognizes the absolute position of the information processing apparatus 100 using various sensor information sensed by the sensor unit 130. For example, the position recognition engine 141f recognizes the location of the information processing apparatus 100 (e.g., a station, school, house, company, train, or theme park) based on the position information measured by the positioning unit 137 and map information acquired in advance.
- the behavior recognition engine 141g recognizes the user's behavior using various sensor information sensed by the sensor unit 130.
- For example, the behavior recognition engine 141g recognizes the user's action situation (an example of an activity state) using at least one of the captured images of the outward camera 131, the sound collected by the microphone 133, the angular velocity information of the gyro sensor 134, the acceleration information of the acceleration sensor 135, the orientation information of the orientation sensor 136, and the absolute position information of the positioning unit 137. Examples of the user's behavior include a resting state, a walking state (slow walking, jogging), a running state (dashing, high-speed running), a sitting state, a standing state, a sleeping state, and a bicycle-riding state.
- In addition, the behavior recognition engine 141g may recognize the activity state according to the amount of activity measured based on the angular velocity information and the acceleration information.
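A toy sketch of such amount-of-activity classification, assuming batches of accelerometer samples (in g) and gyro samples (in deg/s); the motion-energy measure and the thresholds are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def activity_state(accel_xyz_g, gyro_dps):
    """Classify a coarse activity state from IMU motion energy: the standard
    deviation of the acceleration magnitude plus a small gyro term."""
    accel_mag = np.linalg.norm(accel_xyz_g, axis=1)   # per-sample |a| in g
    motion = np.std(accel_mag) + 0.01 * np.mean(np.abs(gyro_dps))
    if motion < 0.05:
        return "resting"
    elif motion < 0.5:
        return "walking"
    return "running"

accel = np.random.normal(1.0, 0.02, (100, 3))   # ~1 g with little variation
gyro = np.random.normal(0.0, 1.0, (100, 3))
state = activity_state(accel, gyro)             # -> "resting"
```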
- the various sensor information according to the present embodiment can be said to be an example of information related to the activity state of the user.
- The dimming control unit 142 controls the dimming process performed by the dimming unit 110. More specifically, the dimming control unit 142 provides a control signal to the dimming unit 110 based on the content of the switching between the real world and the virtual world determined by the control unit 140, thereby controlling the dimming process by the dimming unit 110. In this way, the dimming control unit 142 can partially control the dimming of the dimming unit 110, and can generate the dimming region 10 and the non-dimming region 11.
- The display control unit 143 controls the display processing by the AR display unit 120. More specifically, the display control unit 143 provides a control signal to the AR display unit 120 based on the content of the switching between the real world and the virtual world determined by the control unit 140, thereby controlling the display processing by the AR display unit 120.
- For example, the display control unit 143 can display an image in the display area corresponding to the dimming region 10 in cooperation with the dimming control by the dimming control unit 142. As described above, the shielding of the real object may be assisted by displaying an interpolative image in the vicinity of the display area corresponding to the dimming region 10.
- the sensor unit 130 has a function of acquiring various information related to the user or the surrounding environment.
- the sensor unit 130 includes an outward camera 131, an inward camera 132, a microphone 133, a gyro sensor 134, an acceleration sensor 135, an orientation sensor 136, and a positioning unit 137.
- the specific example of the sensor unit 130 given here is an example, and the present embodiment is not limited to this. There may be a plurality of sensors.
- The outward camera 131 and the inward camera 132 each include a lens system including an imaging lens, an aperture, a zoom lens, a focus lens, and the like; a drive system that causes the lens system to perform focus and zoom operations; and a solid-state image sensor array that photoelectrically converts the imaging light obtained by the lens system to generate an image signal.
- the solid-state imaging device array may be realized by a CCD (Charge Coupled Device) sensor array or a CMOS (Complementary Metal Oxide Semiconductor) sensor array, for example.
- the microphone 133 collects the user's voice and the surrounding environmental sound and outputs it to the control unit 140 as voice data.
- the gyro sensor 134 is realized by, for example, a three-axis gyro sensor and detects an angular velocity (rotational speed).
- the acceleration sensor 135 is realized by, for example, a three-axis acceleration sensor (also referred to as a G sensor), and detects acceleration during movement.
- the direction sensor 136 is realized by, for example, a triaxial geomagnetic sensor (compass), and detects an absolute direction (direction).
- the positioning unit 137 has a function of detecting the current position of the information processing apparatus 100 based on an externally acquired signal.
- For example, the positioning unit 137 is realized by a GPS (Global Positioning System) positioning unit, receives radio waves from GPS satellites, detects the position where the information processing apparatus 100 exists, and outputs the detected position information to the control unit 140.
- The positioning unit 137 may detect the position by, for example, Wi-Fi (registered trademark), Bluetooth (registered trademark), transmission/reception with a mobile phone, PHS, or smartphone, or short-range communication.
- the operation input unit 150 is realized by an operation member having a physical structure such as a switch, a button, or a lever.
- the speaker 160 reproduces an audio signal according to the control of the control unit 140.
- the communication unit 170 is a communication module for transmitting / receiving data to / from other devices by wire / wireless.
- For example, the communication unit 170 communicates with other devices directly or via a network access point, using a system such as a wired LAN (Local Area Network), wireless LAN, Wi-Fi (Wireless Fidelity, registered trademark), infrared communication, Bluetooth (registered trademark), or short-range/non-contact communication.
- the storage unit 180 stores programs, parameters, image data, and the like for the above-described control unit 140 to execute each function.
- the storage unit 180 stores image data related to a person or an object displayed by the AR display unit 120.
- the configuration of the information processing apparatus 100 according to the present embodiment has been specifically described above, but the configuration of the information processing apparatus 100 according to the present embodiment is not limited to the example illustrated in FIG.
- at least a part of the processing of the control unit 140 of the information processing apparatus 100 may be performed by a server on a cloud connected via the communication unit 170.
- FIG. 5 is a flowchart illustrating an example of an operation in which the information processing apparatus 100 performs switching from the real world to the virtual world.
- First, the control unit 140 recognizes a real object in the user's visual field region based on the output of the sensor unit 130 including the outward camera 131, and determines the area, timing, and contents for switching from the real world to the virtual world based on various factors such as the recognition result, the creation status of the depth map, the user's line of sight, and the contents of the application.
- the display control unit 143 provides a control signal to the AR display unit 120 based on this determination.
- the AR display unit 120 performs rendering processing of an image to be displayed on the display from the viewpoint of the user wearing the information processing apparatus 100 based on the control signal.
- In step S1008, the AR display unit 120 displays the rendered virtual object on the display.
- Further, the dimming control unit 142 provides a control signal to the dimming unit 110, so that in step S1016, the dimming unit 110 partially dims the light in conjunction with the display of the virtual object by the AR display unit 120, thereby gradually shielding the real object.
- When the shielding of the entire visual field region (or of a visual field region wider than a certain area) is completed in step S1020, the display control unit 143 provides the AR display unit 120 with a control signal for switching from the user-viewpoint display to the camera-through display in step S1024.
- the AR display unit 120 that has received this control signal switches the display converted to the user viewpoint to the camera through display (in other words, various conversion processes such as coordinate conversion are omitted).
- step S1024 and step S1028 may not be performed.
- the operation of each step may be appropriately replaced or performed in parallel.
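To summarize the flow of FIG. 5, the following is a minimal runnable simulation under stated assumptions: regions are switched one by one, display and dimming proceed together, and the display mode changes once the shielded fraction passes a threshold. All names are illustrative, not part of the disclosure.

```python
def run_switch(num_regions=16, threshold=1.0):
    """Simulate the real-to-virtual switching flow of FIG. 5."""
    mode = "user_viewpoint"
    shielded = 0
    for region in range(num_regions):
        # Step S1008: display the rendered virtual object for this region.
        # Step S1016: partially dim the same region, linked to the display.
        shielded += 1
        if mode == "user_viewpoint" and shielded / num_regions >= threshold:
            # Steps S1020/S1024: shielding complete -> camera-through display.
            mode = "camera_through"
    return mode

assert run_switch() == "camera_through"
```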
- the VR call refers to an application that realizes bidirectional communication by allowing a plurality of users to transmit voice and video to each other.
- In a VR call to which the present disclosure is applied, when the call partner is remote (assuming a separation distance that prevents direct conversation), the user makes a call through the application in a state where the avatar of the call partner is displayed on the AR display unit 120. As a result, the VR call can give the user the feeling of having a direct conversation with the other party. Then, when the remote call partner approaches the user (assuming a separation distance that allows direct conversation), the information processing apparatus 100 controls the dimming unit 110 and the AR display unit 120 so that the user can visually recognize the real call partner instead of the avatar, stops the call function, and lets the users talk to each other directly. As a result, the VR call can smoothly transition from a state in which the users talk through the VR call to a state in which they talk directly.
- the avatar is assumed to be a 3D model that reproduces the appearance of the other party, but is not limited to this.
- the avatar may be a fictitious character having attributes similar to the attributes of the other party (gender, age, nationality, etc.).
- The avatar may also be one in which a photographed image of the other party's expression is acquired by a predetermined method and the other party's expression is reproduced on the avatar's face based on the photographed image.
- the avatar may be positioned so as to face the user, or may be positioned so as to be lined up with the user (that is, the profile of the avatar may be seen by the user).
- When the AR display unit 120 displays an avatar, various processing is performed on the avatar based on the various sensing data acquired by the sensor unit 130 so as not to cause a contradiction with the real world.
- For example, the information processing apparatus 100 may analyze the image captured by the outward camera 131 to grasp the intensity or irradiation direction of the external light, and process the surface of the avatar so that the avatar appears to be illuminated by that external light.
- FIG. 6 shows the visual field area of user A.
- Thereby, the information processing apparatus 100 can give user A the feeling of having a direct conversation with user B. At this time, even if user B enters the visual field area of user A, user B is shielded from user A's view by the cooperation of the dimming unit 110 and the AR display unit 120.
- Then, the information processing apparatus 100 may control the dimming unit 110 and the AR display unit 120 so that user A visually recognizes user B instead of the avatar, as in FIG. 6C.
- the information processing apparatus 100 stops the call function and allows the users to talk directly.
- Thereby, the information processing apparatus 100 can smoothly transition from a state in which user A and user B talk via the VR call to a state in which they talk directly. That is, the information processing apparatus 100 can give user A the feeling that the conversation partner changes from the avatar to the real-world user B. Conversely, the information processing apparatus 100 may shift from a state in which user A and user B have a direct conversation to a state in which they make a call via the VR call.
- Note that the viewing angle of user A is different from the viewing angle of the outward camera 131; here, the viewing angle of the outward camera 131 is larger than the viewing angle of user A.
- First, the information processing apparatus 100 analyzes the captured image of the outward camera 131 and recognizes user B. Then, when user B enters the viewing angle of user A as in state 2, the information processing apparatus 100 controls the dimming unit 110 and the AR display unit 120 to shield user B from user A's view.
- Thereafter, the information processing apparatus 100 controls the dimming unit 110 and the AR display unit 120 so that user A can see the real-world user B instead of the avatar.
- Since the information processing apparatus 100 can prevent user A from seeing the avatar and the real-world user B at the same time, it can reduce the uncomfortable feeling given to user A.
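A minimal sketch of this distance-driven transition, assuming a measured separation distance to the call partner; the threshold, the hysteresis band, and all names are illustrative assumptions rather than values from the disclosure.

```python
DIRECT_TALK_DIST_M = 2.0   # assumed distance allowing direct conversation

def vr_call_mode(distance_m, current="avatar", hysteresis_m=0.5):
    """Decide which representation of the call partner the user should see.
    The hysteresis band avoids flickering between modes near the threshold."""
    if current == "avatar" and distance_m <= DIRECT_TALK_DIST_M:
        return "real_person"   # unshield the partner, stop the call function
    if current == "real_person" and distance_m > DIRECT_TALK_DIST_M + hysteresis_m:
        return "avatar"        # shield the partner, show avatar, resume call
    return current

mode = vr_call_mode(distance_m=1.5)   # -> "real_person"
```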
- FIG. 8 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus 100.
- the information processing apparatus 100 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904.
- the information processing apparatus 100 includes a bridge 905, an external bus 906, an interface 907, an input device 908, an output device 909, a storage device (HDD) 910, a drive 911, and a communication device 912.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation of the information processing apparatus 100 according to various programs. Further, the CPU 901 may be a microprocessor.
- the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
- the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like. These are connected to each other by a host bus 904 including a CPU bus.
- the function of the control unit 140 is realized by the cooperation of the CPU 901, the ROM 902, and the RAM 903.
- the host bus 904 is connected to an external bus 906 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 905.
- the host bus 904, the bridge 905, and the external bus 906 are not necessarily configured separately, and these functions may be mounted on one bus.
- The input device 908 includes input means for a user to input information, such as a mouse, keyboard, touch panel, buttons, microphone, switches, and levers; input means for various sensors such as the outward camera 131 to input sensing data; and an input control circuit that generates an input signal based on the input and outputs it to the CPU 901. By operating the input device 908, the user can input various data and instruct processing operations to each device.
- the function of the sensor unit 130 or the operation input unit 150 is realized by the input device 908.
- the output device 909 includes display devices such as a CRT (Cathode Ray Tube) display device, a liquid crystal display (LCD) device, an OLED (Organic Light Emitting Diode) device, and a lamp.
- the output device 909 includes a partial dimming element that can partially control dimming.
- the output device 909 includes an audio output device such as a speaker and headphones.
- the output device 909 outputs the played content, for example.
- the display device displays various information such as reproduced video data as an image or text.
- the audio output device converts reproduced audio data or the like into audio and outputs it. With the output device 909, the functions of the light control unit 110, the AR display unit 120, or the speaker 160 are realized.
- the storage device 910 is a data storage device configured as an example of the storage unit 180.
- the storage device 910 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 910 is composed of, for example, an HDD (Hard Disk Drive).
- the storage device 910 drives a hard disk and stores programs executed by the CPU 901 and various data.
- the drive 911 is a storage medium reader / writer, and is built in or externally attached to the information processing apparatus 100.
- the drive 911 reads information recorded in a removable storage medium 913 such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903.
- the drive 911 can also write information to the removable storage medium 913.
- the communication device 912 is a communication interface configured by a communication device for connecting to the communication network 914, for example.
- the function of the communication unit 170 is realized by the communication device 912.
- As described above, the information processing apparatus 100 shields an object in the real world by dimming the light incident on a part of the user's visual field, and performs display control so that an arbitrary image is shown in the shielded area. Thereby, the information processing apparatus 100 can partially switch between the real world and the virtual world in the user's visual field region.
- the following configurations also belong to the technical scope of the present disclosure.
- (1) An information processing apparatus including: a dimming control unit that, by performing dimming to limit the amount of incident light from a real object, generates, in a user's visual field region, a dimming region where the dimming is performed and a non-dimming region where the dimming is not performed relative to the dimming region; and a display control unit that controls a display device to display a first image in the dimming region as seen from the user.
- the dimming control unit generates the dimming area so as to prevent the user from visually recognizing the real object.
- the display control unit performs display control of the first image so that an image recognized by the user from the light control region to the non-light control region in the visual field region is substantially continuous.
- the first image includes an image corresponding to the real object;
- the display control unit controls the display device to display an image corresponding to the real object so that an image of the real object recognized by the user is substantially continuous.
- the first image includes an image corresponding to a background of the real object;
- the display control unit controls the display device to display an image corresponding to the background so that the image of the background recognized by the user is substantially continuous;
- The display control unit controls the display device to display the second image in the non-dimming region adjacent to the dimming region when a part of the real object is present in the non-dimming region as viewed from the user. The information processing apparatus according to any one of (3) to (5).
- The display control unit controls the display device to display the first image subjected to a conversion process according to changes in the position, orientation, and size of the real object as viewed from the user, based on a coordinate system having the user's viewpoint as the origin. The information processing apparatus according to any one of (3) to (6).
- (8) The dimming control unit changes at least one of the position, shape, and size of the dimming region so as to correspond to the position, orientation, and size of the real object as viewed from the user, so that the first image does not protrude from the dimming region in accordance with the display control of the first image subjected to the conversion process. The information processing apparatus according to (7). (9) The display control unit omits the conversion process when the dimming region is larger than a predetermined area. The information processing apparatus according to (7) or (8). (10) The display control unit adds, to the first image, a color corresponding to the color of the real object that the user would perceive when the dimming is not performed, based on a prediction of the color change of the real object according to the state change of the real object.
- The dimming control unit enlarges the dimming region more when the real object has a first momentum than when the real object has a second momentum smaller than the first momentum.
- the dimming control unit gradually increases or decreases the area of the dimming region.
- The dimming control unit performs the dimming starting from a region on which the user's line of sight is not focused, or a region separated by a certain distance from the region on which the line of sight is focused. The information processing apparatus according to (12).
- The dimming control unit performs the dimming starting from an area where a depth map has been created, or an area where the likelihood of the depth map is a certain value or more.
- the dimming control unit gradually increases the area of the dimming region and increases the area of the first image according to the movement of the user's line of sight.
- the dimming control unit restricts light in a specific wavelength band or specific polarization from being transmitted through the display device in the dimming.
- the display device is a head mounted display.
- the display control unit and the dimming control unit constitute at least one processor,
- the information processing apparatus further includes the head mounted display.
- A program for causing a computer to realize: generating, by performing dimming to limit the amount of incident light from a real object, a dimming region where the dimming is performed and a non-dimming region where the dimming is not performed relative to the dimming region in a user's visual field region; and controlling a display device to display a first image in the dimming region as seen from the user.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
In order to perform more flexible dimming control in a user's visual field, an information processing device according to the present invention comprises: a dimming control unit that performs dimming control limiting the amount of incident light received from a real object and thereby generates, in the user's visual field, a dimming region where dimming is performed and a non-dimming region where less dimming is performed relative to the dimming region; and a display control unit that controls a display device so as to display a first image in said dimming region as seen from the user.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2018-038837 | 2018-03-05 | ||
| JP2018038837A JP2019152794A (ja) | 2018-03-05 | 2018-03-05 | 情報処理装置、情報処理方法およびプログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019171802A1 true WO2019171802A1 (fr) | 2019-09-12 |
Family
ID=67845989
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2019/002072 Ceased WO2019171802A1 (fr) | 2018-03-05 | 2019-01-23 | Dispositif de traitement d'informations, procédé de traitement d'informations, et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP2019152794A (fr) |
| WO (1) | WO2019171802A1 (fr) |
Families Citing this family (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11082685B2 (en) * | 2019-11-05 | 2021-08-03 | Universal City Studios Llc | Head-mounted device for displaying projected images |
| US11948483B2 (en) | 2020-03-17 | 2024-04-02 | Sony Interactive Entertainment Inc. | Image generation apparatus and image generation method |
| WO2022091398A1 (fr) * | 2020-11-01 | 2022-05-05 | 正典 伊原 | Dispositif d'affichage comprenant unité de commande de transmittance |
| US11461986B2 (en) * | 2021-01-27 | 2022-10-04 | Qualcomm Incorporated | Context-aware extended reality systems |
2018
- 2018-03-05 JP JP2018038837A patent/JP2019152794A/ja active Pending
2019
- 2019-01-23 WO PCT/JP2019/002072 patent/WO2019171802A1/fr not_active Ceased
Patent Citations (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08292394A (ja) * | 1995-01-24 | 1996-11-05 | Matsushita Electric Ind Co Ltd | 頭部搭載型画像表示装置 |
| JP2008185609A (ja) * | 2007-01-26 | 2008-08-14 | Sony Corp | 表示装置、表示方法 |
| JP2008242134A (ja) * | 2007-03-28 | 2008-10-09 | Nikon Corp | 表示装置 |
| US20120062444A1 (en) * | 2010-09-09 | 2012-03-15 | Cok Ronald S | Switchable head-mounted display transition |
| JP2014155207A (ja) * | 2013-02-14 | 2014-08-25 | Seiko Epson Corp | 頭部装着型表示装置および頭部装着型表示装置の制御方法 |
| JP2015104014A (ja) * | 2013-11-26 | 2015-06-04 | 圭祐 戸田 | 表示装置および表示方法 |
| JP2016192137A (ja) * | 2015-03-31 | 2016-11-10 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
| JP2016218306A (ja) * | 2015-05-22 | 2016-12-22 | セイコーエプソン株式会社 | 頭部装着型表示装置、表示システム、頭部装着型表示装置の制御方法、および、コンピュータープログラム |
| WO2016203792A1 (fr) * | 2015-06-15 | 2016-12-22 | ソニー株式会社 | Dispositif de traitement d'informations, procédé de traitement d'informations et programme |
| JP2017203952A (ja) * | 2016-05-13 | 2017-11-16 | 株式会社リコー | 画像表示装置、眼鏡型画像表示装置及び両眼型画像表示装置 |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112153319A (zh) * | 2020-09-02 | 2020-12-29 | 芋头科技(杭州)有限公司 | 基于视频通信技术的ar信息显示方法和装置 |
| US11521297B2 (en) | 2020-09-02 | 2022-12-06 | Sichuan Smart Kids Technology Co., Ltd. | Method and device for presenting AR information based on video communication technology |
| CN112153319B (zh) * | 2020-09-02 | 2023-02-24 | 芋头科技(杭州)有限公司 | 基于视频通信技术的ar信息显示方法和装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2019152794A (ja) | 2019-09-12 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19763759; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19763759; Country of ref document: EP; Kind code of ref document: A1 |