WO2024166715A1 - Information processing system and program
- Publication number: WO2024166715A1 (PCT/JP2024/002463)
- Authority: WO (WIPO/PCT)
- Prior art keywords: user, virtual, space, virtual space, control unit
- Prior art date
- Legal status: Ceased (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/216—Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/33—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers using wide area network [WAN] connections
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/837—Shooting of targets
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- the present invention relates to an information processing system and a program.
- Technologies that provide users with mixed reality (MR) spaces and virtual reality (VR) spaces have been known for some time (see, for example, Patent Document 1 and Patent Document 2).
- the present invention was made in consideration of the above circumstances, and aims to broaden the range of uses for virtual space.
- According to one aspect of the present invention, there is provided an information processing system including: a first control means for providing a first user with a first view, the first view being a view of a virtual space; an input receiving means for receiving, as an input from a second user, an output of a detection means for detecting a movement of the second user; a position information receiving means for receiving position information indicating a current position of the second user in real space; and a correspondence relationship storage means for storing information indicating a correspondence relationship between positions in the real space and positions in the virtual space, wherein the first control means places an avatar of the second user at a position in the virtual space corresponding to the position of the second user indicated by the position information, and reflects the movement of the second user detected by the detection means in the avatar of the second user.
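- To make the claimed arrangement concrete, the following sketch illustrates how a stored correspondence relationship could convert the second user's real-space position into a virtual-space position, place the avatar there, and reflect detected movement in that avatar. It is a minimal illustration under assumed names (CorrespondenceStore, FirstControl, Pose); none of these identifiers come from the patent.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    position: tuple      # (x, y, z)
    orientation: tuple   # e.g., (pitch, yaw, roll) in radians


class CorrespondenceStore:
    """Stores the correspondence between real-space and virtual-space
    positions (illustrative: a simple offset-and-scale transform)."""
    def __init__(self, origin_offset=(0.0, 0.0, 0.0), scale=1.0):
        self.origin_offset = origin_offset
        self.scale = scale

    def real_to_virtual(self, real_pos):
        ox, oy, oz = self.origin_offset
        x, y, z = real_pos
        return ((x - ox) * self.scale, (y - oy) * self.scale, (z - oz) * self.scale)


class FirstControl:
    """Hypothetical 'first control means': places and updates the second
    user's avatar in the virtual space shown to the first user."""
    def __init__(self, store):
        self.store = store
        self.avatar_pose = None

    def on_position_info(self, real_pos):
        # Place the avatar at the virtual-space position that corresponds
        # to the second user's current real-space position.
        virtual_pos = self.store.real_to_virtual(real_pos)
        self.avatar_pose = Pose(position=virtual_pos, orientation=(0.0, 0.0, 0.0))

    def on_detected_movement(self, orientation):
        # Reflect the movement detected by the detection means in the avatar.
        if self.avatar_pose is not None:
            self.avatar_pose.orientation = orientation
```

- For example, calling on_position_info((2.0, 0.0, 3.0)) and then on_detected_movement((0.1, 0.0, 0.0)) would first place the avatar at the mapped virtual position and then apply the detected head movement to it.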
- the present invention makes it possible to broaden the range of uses for virtual space.
- FIG. 1 is a diagram illustrating a schematic configuration of an information processing system.
- FIG. 2 is a diagram showing a schematic configuration of a VR system.
- FIG. 3 is a diagram showing a schematic configuration of an MR system.
- FIG. 4 is a diagram conceptually illustrating a virtual space.
- FIG. 5 is a diagram showing a YZ cross section of a field of view in a virtual space as viewed from the X direction.
- FIG. 6 is a diagram showing an XZ cross section of a field of view in a virtual space as viewed from the Y direction.
- FIG. 7 is a diagram illustrating a schematic configuration of a controller.
- FIG. 8 is a diagram illustrating a mixed reality space and a virtual space.
- FIG. 9 is a diagram illustrating a functional configuration of the information processing system.
- FIG. 10 is a diagram illustrating a real object and a virtual object corresponding to the real object.
- FIG. 11 is a flowchart illustrating an example of a process executed by the VR system.
- FIG. 12 is a flowchart illustrating an example of a process executed by the MR system.
- FIG. 13 is a flowchart showing an example of a process for reflecting the behavior of an MR user in a virtual space.
- FIG. 14 is a flowchart illustrating an example of a process for reflecting the behavior of a VR user in a mixed reality space.
- FIG. 15 is a diagram for explaining how an event in a virtual space is reflected in a real space.
- FIG. 16 is a diagram for explaining how an event in a real space is reflected in a virtual space.
- the information processing system 100 of the present embodiment includes a VR (Virtual Reality) system 200, an MR (Mixed Reality) system 400, a server 600, and an external device 700.
- the VR system 200 is configured to be able to communicate with the server 600, the MR system 400, and the external device 700 via a network 2.
- the MR system 400 is configured to be able to communicate with the server 600, the VR system 200, and the external device 700 via the network 2.
- the number of MR systems 400 constituting the information processing system 100 is not limited to one, and may be multiple.
- the number of VR systems 200 constituting the information processing system 100 is not limited to one, and may be multiple.
- the network 2 may be composed of, for example, the Internet, a mobile communication system (e.g., 3G, 4G, 5G, LTE (Long Term Evolution), etc.), Wi-Fi (registered trademark), Bluetooth (registered trademark), other communication lines, or a combination of these.
- the VR system 200 includes a VR device 210, a computer 300, a detection device 260, a display 270, and a controller 280.
- the VR device 210 includes a display 211, a gaze sensor 212, a first camera 213, a second camera 214, a microphone 215, a speaker 216, and a sensor 217.
- a user who uses the VR device 210 is referred to as a VR user.
- the computer 300 can be connected to the Internet or other network 2, and can communicate with, for example, a server 600, a computer of the MR system 400, and other computers connected to the network 2. Examples of other computers include computers of other VR systems 200 and external devices 700.
- the VR device 210 is worn on the head of a VR user and can provide the VR user with a virtual space during operation.
- the VR device 210 can be, for example, a so-called head-mounted display equipped with a display, or a head-mounted device equipped with a smartphone or other terminal having a display.
- the VR device 210 displays, for example, an image for the right eye and an image for the left eye on the display 211. When each eye of the VR user views the respective image, the VR user can recognize the image as a three-dimensional image based on the parallax between the two eyes.
- the display 211 is realized, for example, as a non-transparent display device.
- the display 211 is disposed on the main body of the VR device 210 so as to be located, for example, in front of both eyes of the VR user. Therefore, when the VR user visually recognizes the three-dimensional image displayed on the display 211, he or she can become immersed in the virtual space.
- the display 211 may also be realized by a display provided on a so-called smartphone or other terminal.
- the detection device 260 detects the movement of a VR user who uses the VR device 210.
- the detection device 260 may have a position tracking function for detecting the movement of the VR device 210, thereby detecting the movement of the VR user.
- the detection device 260 may have, for example, a sensor that reads light (e.g., infrared light) from the VR device 210 as a sensor for detecting the movement of the VR user, and detect the position, inclination, etc. of the VR device 210 in real space.
- the VR device 210 may have multiple light sources not shown.
- each light source may be realized, for example, by an LED (Light Emitting Diode) that emits infrared light.
- the detection device 260 may also be realized by, for example, a camera.
- the detection device 260 may have an image sensor (for example, an image sensor that acquires an RGB image, an image sensor that acquires a black and white image, or a depth sensor, etc.) as a sensor for detecting the movement of the VR user.
- the movement of the VR user may be detected by a camera.
- the detection device 260 may be, for example, a depth camera equipped with a device that emits a predetermined light such as infrared light or a predetermined pattern of light, and an image sensor (for example, a depth sensor), and may detect the reflected light of the light emitted from the device by the image sensor and detect the position, posture, etc. of the VR user based on the output from the image sensor.
- the detection device 260 may also be equipped with such a depth camera and a camera that can acquire an RGB image, and may detect the position, posture, etc. of the VR user based on the output from these cameras.
- the detection device 260 may also be, for example, a stereo camera equipped with multiple image sensors, and may detect the position, posture, etc. of the VR user based on the output from the multiple image sensors. Note that detecting the position, posture, etc. of the VR user may be detecting the position, posture, etc. of the VR user's body, or may be detecting the position, tilt, etc. of the VR device 210.
- the VR system 200 may have one or more types of detection devices as the detection device 260, and may have a plurality of each type of detection device.
- the VR system 200 may not have the detection device 260.
- a part of the detection device 260 (for example, the part that performs analysis of the output from the image sensor, such as image recognition) may be configured by the computer 300 or the like.
- the detection device 260 may have a sensor capable of detecting the position, inclination, etc. of the detection device 260 itself.
- the detection device 260 may have an angular velocity sensor (e.g., a three-axis angular velocity sensor), an acceleration sensor (e.g., a three-axis acceleration sensor), or a geomagnetic sensor (e.g., a three-axis geomagnetic sensor).
- the output from these sensors may be sent to the computer 300, etc., and may be used, for example, when a predetermined process is performed based on the output from an image sensor provided in the detection device 260.
- the VR device 210 may also include a sensor 217 instead of or in addition to the detection device 260 as a detection means for detecting the movement of the VR user.
- the VR device 210 may use the sensor 217 to detect the position and inclination of the VR device 210 itself.
- the sensor 217 may be, for example, an angular velocity sensor (e.g., a three-axis angular velocity sensor), an acceleration sensor (e.g., a three-axis acceleration sensor), or a geomagnetic sensor (e.g., a three-axis geomagnetic sensor).
- the VR device 210 may also have one or more types of sensors as the sensor 217, and may have multiple sensors of each type.
- the VR device 210 can detect the angular velocity around the three axes of the VR device 210 in real space over time. The VR device 210 can then calculate the change over time in the angles around the three axes of the VR device 210 based on each angular velocity, and can further calculate the inclination of the VR device 210 based on the change over time in the angles.
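- As a rough illustration of the calculation just described, angular-velocity samples around the three axes can be numerically integrated over time to obtain the change in the angles and hence the inclination. The helper below is a simplified sketch (assuming a fixed sample interval and ignoring drift), not code from the patent.

```python
def integrate_tilt(angular_velocities, dt, initial_angles=(0.0, 0.0, 0.0)):
    """Accumulate the angles around the three axes from angular-velocity samples.

    angular_velocities: iterable of (wx, wy, wz) in rad/s; dt: sample interval in s.
    Returns the change over time of the three angles, from which the
    inclination of the device can be estimated.
    """
    ax, ay, az = initial_angles
    history = []
    for wx, wy, wz in angular_velocities:
        ax += wx * dt
        ay += wy * dt
        az += wz * dt
        history.append((ax, ay, az))
    return history
```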
- the sensor 217 may be, for example, an image sensor.
- the position, posture, etc. of the VR user may be detected based on the output from the image sensor. In other words, the position, posture, etc. of the VR user may be detected based on information from a camera that is provided in the VR device 210 and captures the surroundings of the VR device 210. In other words, tracking of the VR device 210 may be performed by an outside-in method or an inside-out method.
- the gaze sensor 212 detects the direction in which the VR user's right and left eyes are looking. That is, the gaze sensor 212 detects the VR user's gaze (in other words, the movement of the eyes). The detection of the gaze direction is achieved, for example, by a known eye tracking function.
- the gaze sensor 212 is achieved by a sensor having the eye tracking function.
- the gaze sensor 212 may include a sensor for the right eye and a sensor for the left eye.
- the gaze sensor 212 may be, for example, a sensor that irradiates the VR user's right and left eyes with infrared light and detects the rotation angle of each eyeball by receiving reflected light from the cornea and iris in response to the irradiated light. In this case, the gaze sensor 212 can detect the VR user's gaze based on each detected rotation angle.
- the first camera 213 photographs the lower part of the VR user's face. More specifically, the first camera 213 photographs the VR user's nose, mouth, etc.
- the second camera 214 photographs the VR user's eyes, eyebrows, etc.
- when the VR user side of the housing of the VR device 210 is defined as the inside of the VR device 210 and the side of the housing opposite the VR user is defined as the outside of the VR device 210, the first camera 213 may be disposed outside the VR device 210 and the second camera 214 may be disposed inside the VR device 210.
- the images photographed by the first camera 213 and the second camera 214 are input to the computer 300.
- the first camera 213 and the second camera 214 may be realized as a single camera, and the face of the VR user may be photographed by this single camera.
- the speaker 216, which serves as a sound output means, converts the sound signal into sound and outputs it to the VR user. Note that the VR device 210 may include earphones instead of the speaker 216 as a sound output means.
- Display 270 displays an image similar to the image displayed on display 211. This allows users other than the VR user wearing VR device 210 to view the same image (in other words, a virtual space) as the VR user.
- the image displayed on display 270 does not need to be a three-dimensional image, and may be, for example, an image for the right eye or an image for the left eye displayed on VR device 210.
- Examples of display 270 include a liquid crystal display and an organic EL display.
- the controller 280 is connected to the computer 300 by wire or wirelessly.
- the controller 280 accepts input operations related to instructions from the VR user to the computer 300.
- the controller 280 also accepts input operations by the VR user to control the position and movement of a virtual object placed in a virtual space.
- the controller 280 may be configured to be held by the VR user, for example.
- the controller 280 may also be configured to be wearable, for example, on the body or a part of the clothing of the VR user.
- the controller 280 may be, for example, in the form of a glove.
- the controller 280 may also be configured to be able to output at least one of vibration, sound, and light based on a signal transmitted from the computer 300.
- the controller 280 may also have multiple light sources. Each light source may be realized, for example, by an LED that emits infrared light.
- the detection device 260 may then read the infrared light from the controller 280 and detect the position and inclination of the controller 280 in real space. In other words, the detection device 260 may have a position tracking function that detects the movement of the controller 280, thereby detecting the movement of the VR user.
- the VR system 200 does not necessarily have to include the controller 280.
- the information processing system 100 may also have a sensor 286 that detects the movement of the VR user.
- the sensor 286 may be, for example, an angular velocity sensor, an acceleration sensor, or a geomagnetic sensor.
- the sensor 286 may also be provided in the controller 280, for example (see FIG. 7).
- the information processing system 100 may have one or more types of sensors as the sensor 286, and may have multiple sensors of each type.
- the device including the sensor 286 (for example, the controller 280 or a specific camera) and the computer 300 may be connected to each other, for example, wirelessly.
- the information acquired by the sensor 286 may also be transmitted to the computer 300, for example, by wireless communication.
- the position and movement of the VR user may also be detected by the VR device 210 being equipped with a device for performing a specific wireless communication (for example, short-range wireless communication such as Wi-Fi communication, Bluetooth communication, or UWB (Ultra Wide Band) communication) and having the device perform wireless communication with surrounding devices to obtain position information of the VR device 210.
- the position and movement of the VR user may also be detected using an external device 700 (and a device that tracks the movement of the VR user in cooperation with the device via short-range wireless communication or the like) that can track the movement of the VR user, such as a device worn by the VR user (for example, a wearable device such as a watch-type, wristband-type, ring-type, or clothing-type device, or an implantable device, etc.).
- These external devices 700 may also be used as the controller 280.
- the position and movement of the VR user may also be detected by a GPS sensor, etc.
- the computer 300 includes, as its main components, a processor 301, a memory 302, a storage 303, an input/output interface 304, and a communication interface 305.
- the components are connected to each other via a bus.
- the processor 301 controls the operation of the VR device 210.
- the processor 301 reads a program from the storage 303 and expands it in the memory 302.
- the processor 301 executes the expanded program.
- the processor 301 may be configured to include, for example, one or more of the following: a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an MPU (Micro Processor Unit), and an FPGA (Field-Programmable Gate Array).
- Memory 302 is a main storage device. Memory 302 is composed of storage devices such as ROM (Read Only Memory) and RAM (Random Access Memory). Memory 302 provides a working area for processor 301 by temporarily storing programs and various data that processor 301 reads from storage 303. Memory 302 also temporarily stores various data generated while processor 301 is operating according to a program, various data input to computer 300, etc.
- Storage 303 is an auxiliary storage device.
- Storage 303 is composed of a storage device such as a flash memory or a HDD (Hard Disk Drive).
- Storage 303 stores programs for providing services in information processing system 100.
- Storage 303 also stores various data for providing services in information processing system 100.
- the data stored in storage 303 includes data for defining a virtual space and data related to virtual objects.
- storage 303 may be implemented as a removable storage device, such as a memory card. Also, instead of storage 303 built into computer 300, programs and data stored in an external storage device may be used.
- the input/output interface 304 is an interface through which the computer 300 accepts data input, and is also an interface through which the computer 300 outputs data.
- the input/output interface 304 can transmit or receive data between the VR device 210, the detection device 260, and the display 270. It may also transmit or receive data between the display 211, the gaze sensor 212, the first camera 213, the second camera 214, the microphone 215, the speaker 216, and the sensor 217 included in the VR device 210.
- the input/output interface 304 may transmit or receive data to and from the controller 280.
- the input/output interface 304 may receive input of signals output from the controller 280 and the sensor 286.
- the input/output interface 304 may also transmit commands output from the processor 301 to the controller 280.
- the commands may instruct the controller 280 to vibrate, output sound, emit light, or the like.
- upon receiving the command, the controller 280 vibrates, outputs sound, emits light, or the like in accordance with the command.
- each of the VR device 210, the detection device 260, the display 270, the controller 280, and the sensor 286 may be connected to the computer 300 via a wired or wireless connection.
- the communication interface 305 controls the sending and receiving of various data with other computers (e.g., the server 600, the computer 500, or other computers 300, etc.) via the network 2.
- the processor 301 accesses the storage 303, loads a program stored in the storage 303 into the memory 302, and executes a series of instructions contained in the program.
- the processor 301 also sends a signal to the VR device 210 via the input/output interface 304 to provide a virtual space.
- the computer 300 may be provided outside the VR device 210, or a part or the whole of the computer 300 may be built into the VR device 210.
- the computer 300 may also be configured by a portable terminal (e.g., a smartphone).
- the computer 300 may be provided for each VR device 210, or may be shared by the multiple VR devices 210.
- a real coordinate system which is a coordinate system in real space, is set in advance in the VR system 200.
- the real coordinate system has three reference directions (axes) that are parallel to the vertical direction in real space, the horizontal direction perpendicular to the vertical direction, and the front-rear direction perpendicular to both the vertical and horizontal directions.
- the position and tilt of the VR device 210 in real space can be detected by the detection device 260 and the sensor 217.
- the detected tilt of the VR device 210 corresponds to, for example, each tilt around the three axes of the VR device 210 in the real coordinate system.
- the computer 300 sets a uvw field of view coordinate system for the VR device 210 based on the tilt of the VR device 210 in the real coordinate system (see Figure 4).
- the uvw field of view coordinate system set for the VR device 210 corresponds to the viewpoint coordinate system when a VR user wearing the VR device 210 views an object in virtual space.
- the computer 300 sets a three-dimensional uvw field of view coordinate system with the head of the VR user wearing the VR device 210 as its center (origin). More specifically, the computer 300 tilts the horizontal, vertical, and front-to-rear directions that define the real coordinate system around each axis by the tilt of the VR device 210 around each axis in the real coordinate system, and sets the three newly obtained directions as the pitch axis (u-axis), yaw axis (v-axis), and roll axis (w-axis) of the uvw field of view coordinate system in the VR device 210.
- the processor 301 sets a field of view coordinate system in the VR device 210 that is parallel to the real coordinate system.
- the horizontal direction, vertical direction, and front-to-back directions in the real coordinate system coincide with the pitch axis (u-axis), yaw axis (v-axis), and roll axis (w-axis) of the field of view coordinate system in the VR device 210.
- the detection device 260 or the sensor 217 can detect the tilt of the VR device 210 in the set uvw field of view coordinate system based on the movement of the VR device 210.
- the detection device 260 or the sensor 217 detects the pitch angle, yaw angle, and roll angle of the VR device 210 in the uvw field of view coordinate system as the tilt of the VR device 210.
- the pitch angle represents the tilt angle of the VR device 210 around the pitch axis in the uvw field of view coordinate system.
- the yaw angle represents the tilt angle of the VR device 210 around the yaw axis in the uvw field of view coordinate system.
- the roll angle represents the tilt angle of the VR device 210 around the roll axis in the uvw field of view coordinate system.
- based on the detected inclination of the VR device 210, the computer 300 sets, in the VR device 210, the uvw field of view coordinate system corresponding to the VR device 210 after it has moved.
- the relationship between the VR device 210 and the uvw field of view coordinate system of the VR device 210 is always constant, regardless of the position and inclination of the VR device 210.
- when the position and inclination of the VR device 210 change, the position and inclination of the uvw field of view coordinate system of the VR device 210 in the real coordinate system change in conjunction with that change.
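- The construction of the uvw field of view coordinate system can be sketched as follows: the reference directions of the real coordinate system are rotated by the detected pitch, yaw, and roll of the VR device 210 to yield the u (pitch), v (yaw), and w (roll) axes. The rotation order used below is an assumption chosen for the example; the patent does not specify one.

```python
import numpy as np


def rotation_from_pitch_yaw_roll(pitch, yaw, roll):
    """Build a rotation matrix from tilts around the pitch (u), yaw (v),
    and roll (w) axes (angles in radians; example rotation order)."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cr, sr = np.cos(roll), np.sin(roll)
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # around the horizontal axis
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # around the vertical axis
    rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # around the front-rear axis
    return ry @ rx @ rz


def uvw_axes(pitch, yaw, roll):
    """Tilt the real-coordinate reference directions by the device's tilt
    to obtain the u (pitch), v (yaw), and w (roll) axes."""
    r = rotation_from_pitch_yaw_roll(pitch, yaw, roll)
    u = r @ np.array([1.0, 0.0, 0.0])  # horizontal reference direction
    v = r @ np.array([0.0, 1.0, 0.0])  # vertical reference direction
    w = r @ np.array([0.0, 0.0, 1.0])  # front-rear reference direction
    return u, v, w
```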
- the detection device 260 may identify the position of the VR device 210 in real space as a relative position with respect to the detection device 260.
- the processor 301 may also determine the origin of the uvw field of view coordinate system of the VR device 210 in real space (actual coordinate system) based on the identified relative position.
- FIG. 4 is a diagram conceptually showing one mode of expressing the virtual space 11 according to an embodiment.
- the virtual space 11 has a spherical structure that covers the entire 360-degree range around the center 12.
- in FIG. 4, in order to avoid complicating the description, only the upper half of the celestial sphere of the virtual space 11 is illustrated.
- Each mesh is defined in the virtual space 11.
- the position of each mesh is defined in advance as a coordinate value in the XYZ coordinate system, which is a global coordinate system defined in the virtual space 11.
- the computer 300 associates each partial image constituting the panoramic image 13 (still image, video, etc.) that can be deployed in the virtual space 11 with each corresponding mesh in the virtual space 11.
- in the virtual space 11, an XYZ coordinate system is defined with the center 12 as the origin.
- the XYZ coordinate system is, for example, parallel to the real coordinate system.
- the horizontal direction, vertical direction (up-down direction), and front-to-back direction in the XYZ coordinate system are defined as the X-axis, Y-axis, and Z-axis, respectively. Therefore, the X-axis of the XYZ coordinate system is parallel to the horizontal direction of the real coordinate system, the Y-axis (vertical direction) of the XYZ coordinate system is parallel to the vertical direction of the real coordinate system, and the Z-axis (front-to-back direction) of the XYZ coordinate system is parallel to the front-to-back direction of the real coordinate system.
- when the VR device 210 is started up, i.e., in the initial state of the VR device 210, the virtual camera 14 is placed at a predetermined position (e.g., the center) in the virtual space 11.
- the processor 301 also displays an image captured by the virtual camera 14 on the display 211 of the VR device 210.
- the virtual camera 14 moves within the virtual space 11 in conjunction with the movement of the VR device 210 in the real space. This allows changes in the tilt and position of the VR device 210 in the real space to be reproduced in the virtual space 11 in the same way.
- a uvw field of view coordinate system is defined for the virtual camera 14, as in the case of the VR device 210.
- the uvw field of view coordinate system of the virtual camera 14 in the virtual space 11 is defined so as to be linked to the uvw field of view coordinate system of the VR device 210 in real space (actual coordinate system). Therefore, when the inclination of the VR device 210 changes, the inclination of the virtual camera 14 also changes accordingly.
- the virtual camera 14 may move in the virtual space 11 in conjunction with the movement of the VR user in real space, but in this embodiment, it does not move in the virtual space 11 even if the VR user moves in real space.
- the processor 301 of the computer 300 determines the viewing area 15 in the virtual space 11 based on the position and inclination of the virtual camera 14.
- the viewing area 15 corresponds to the area of the virtual space 11 that is viewed by a VR user wearing the VR device 210.
- the position of the virtual camera 14 can be said to be the viewpoint of the VR user in the virtual space 11.
- the uvw field of view coordinate system of the VR device 210 is equal to the viewpoint coordinate system when the VR user views the display 211.
- the uvw field of view coordinate system of the virtual camera 14 is linked to the uvw field of view coordinate system of the VR device 210. Therefore, the line of sight of the VR user detected by the gaze sensor 212 can be regarded as the line of sight of the VR user in the uvw field of view coordinate system of the virtual camera 14.
- Fig. 5 is a diagram showing a YZ cross section of the field of view 15 in the virtual space 11 as viewed from the X direction.
- Fig. 6 is a diagram showing an XZ cross section of the field of view 15 in the virtual space 11 as viewed from the Y direction.
- the field of view 15 in the YZ cross section includes an area 18.
- the area 18 is defined by the position of the virtual camera 14, the reference line of sight 16, and the YZ cross section of the virtual space 11.
- the processor 301 defines the range including the polar angle α centered on the reference line of sight 16 in the virtual space 11 as the area 18.
- the field of view 15 in the XZ cross section includes area 19.
- Area 19 is defined by the position of virtual camera 14, reference line of sight 16, and the XZ cross section of virtual space 11.
- Processor 301 defines a range including the azimuth angle β centered on reference line of sight 16 in virtual space 11 as area 19.
- The polar angle α and the azimuth angle β are determined according to the position of virtual camera 14 and the inclination (direction) of virtual camera 14.
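- As an illustrative check of whether a point falls inside the viewing area, the direction from the virtual camera 14 to the point can be compared with the reference line of sight 16, requiring the vertical deviation to stay within the polar angle α and the horizontal deviation within the azimuth angle β. The function below is a simplified sketch; the symbol names and the exact test are assumptions, not the patent's algorithm.

```python
import numpy as np


def in_view_region(camera_pos, reference_dir, point, alpha, beta):
    """Return True if `point` lies within the region spanned by the polar
    angle alpha (vertical, YZ cross section) and the azimuth angle beta
    (horizontal, XZ cross section) around the reference line of sight."""
    d = np.asarray(point, dtype=float) - np.asarray(camera_pos, dtype=float)
    d = d / np.linalg.norm(d)
    ref = np.asarray(reference_dir, dtype=float)
    ref = ref / np.linalg.norm(ref)
    # Vertical deviation: angle between the directions projected onto the YZ plane.
    vert = np.arctan2(d[1], d[2]) - np.arctan2(ref[1], ref[2])
    # Horizontal deviation: angle between the directions projected onto the XZ plane.
    horiz = np.arctan2(d[0], d[2]) - np.arctan2(ref[0], ref[2])
    return abs(vert) <= alpha / 2 and abs(horiz) <= beta / 2
```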
- the VR system 200 provides the VR user with a field of view in the virtual space 11 by displaying a field of view image 17 on the display 211 based on a signal from the computer 300 (see FIG. 4).
- the field of view image 17 is an image that corresponds to the portion of the panoramic image 13 that corresponds to the field of view area 15.
- when the VR user moves the VR device 210 worn on their head, the virtual camera 14 also moves in conjunction with that movement. As a result, the position of the field of view area 15 in the virtual space 11 changes.
- the field of view image 17 displayed on the display 211 is updated to an image of the panoramic image 13 that is superimposed on the field of view area 15 in the direction in which the VR user is facing in the virtual space 11.
- the VR user can view the desired direction in the virtual space 11.
- while wearing the VR device 210, the VR user views only the panoramic image 13 deployed in the virtual space 11, without being able to see the real world. Therefore, the information processing system 100 can give the VR user a highly immersive feeling in the virtual space 11.
- the virtual camera 14 may include two virtual cameras, i.e., a virtual camera for providing an image for the right eye and a virtual camera for providing an image for the left eye. In this case, an appropriate parallax is set for the two virtual cameras so that the VR user can recognize the three-dimensional virtual space 11.
- the virtual camera 14 may also be realized by a single virtual camera. In this case, an image for the right eye and an image for the left eye may be generated from an image obtained by the single virtual camera.
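- One common way to realize the two virtual cameras is to offset a single camera position by half of an assumed interpupillary distance along the camera's horizontal (u) axis for each eye. The snippet below is a generic illustration of that idea and is not taken from the patent.

```python
import numpy as np


def stereo_camera_positions(center_pos, u_axis, ipd=0.064):
    """Return (left_eye_pos, right_eye_pos) offset along the camera's
    horizontal axis by half the interpupillary distance (in meters)."""
    u = np.asarray(u_axis, dtype=float)
    u = u / np.linalg.norm(u)
    c = np.asarray(center_pos, dtype=float)
    return c - u * (ipd / 2), c + u * (ipd / 2)
```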
- an example of the controller 280 will now be described with reference to FIG. 7.
- the controller 280 may include a right controller 280R and a left controller (not shown).
- the right controller 280R is operated by the right hand of the VR user.
- the left controller is operated by the left hand of the VR user.
- the right controller 280R and the left controller are configured as separate devices. Therefore, the VR user can freely move both the right hand holding the right controller 280R and the left hand holding the left controller.
- the controller 280 may be an integrated controller that accepts operation from both hands. The right controller 280R will be described below.
- the right controller 280R includes a grip 281, a frame 282, a top surface 283, buttons 284 and 285, a sensor (e.g., a motion sensor) 286, an infrared LED 287, buttons 288 and 289, and an analog stick 290.
- the grip 281 is configured to be held by the right hand of a VR user.
- the grip 281 can be held by the palm and three fingers (middle finger, ring finger, and little finger) of the VR user's right hand.
- Button 284 is located on the side of grip 281 and is operable by the middle finger of the right hand.
- Button 285 is located on the front of grip 281 and is operable by the index finger of the right hand. Buttons 284 and 285 are also equipped with switches that detect the movement of the user pressing buttons 284 and 285. Buttons 284 and 285 may also be configured as trigger-type buttons.
- the sensor 286 is built into the housing of the grip 281.
- the sensor 286 detects the movement of the VR user. Specifically, the sensor 286 detects the movement of the VR user's hand. For example, the sensor 286 detects the rotation speed and number of rotations of the hand. Note that the controller 280 does not necessarily have to include the sensor 286.
- a number of infrared LEDs 287 are arranged along the circumferential direction of the frame 282.
- the infrared LEDs 287 emit infrared light in accordance with the progress of the program.
- the infrared light emitted from the infrared LEDs 287 can be used to detect the positions and attitudes (tilt, direction) of the right controller 280R and the left controller.
- the top surface 283 is equipped with buttons 288, 289 and an analog stick 290.
- the buttons 288, 289 are operated by the thumb of the VR user's right hand.
- the buttons 288, 289 are also equipped with switches that detect the movement of the user pressing the buttons 288, 289.
- the analog stick 290 accepts operations in any direction through 360 degrees from the initial position (neutral position). Such operations include, for example, operations for moving an object placed in the virtual space 11.
- the analog stick 290 is also equipped with a sensor that detects the movement of the user operating the analog stick 290.
- the controller 280 defines the yaw, roll, and pitch directions for, for example, the right hand of a VR user. Specifically, for example, when a VR user extends his or her thumb and index finger, the direction in which the thumb extends is defined as the yaw direction, the direction in which the index finger extends is defined as the roll direction, and the direction perpendicular to the plane defined by the axis of the yaw direction and the axis of the roll direction is defined as the pitch direction.
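- The pitch direction described above, being perpendicular to the plane defined by the yaw and roll axes, can be computed with a cross product. The helper below is an illustrative sketch under assumed names, not the patent's code.

```python
import numpy as np


def hand_axes(thumb_dir, index_dir):
    """Return unit yaw, roll, and pitch axes for a hand, where the pitch
    axis is perpendicular to the plane spanned by the yaw and roll axes."""
    yaw = np.asarray(thumb_dir, dtype=float)
    yaw = yaw / np.linalg.norm(yaw)
    roll = np.asarray(index_dir, dtype=float)
    roll = roll / np.linalg.norm(roll)
    pitch = np.cross(yaw, roll)
    pitch = pitch / np.linalg.norm(pitch)
    return yaw, roll, pitch
```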
- the MR system 400 includes an MR device 410, a computer 500, a detection device 460, a display 470, and a controller 480.
- the MR device 410 includes a display 411, a gaze sensor 412, a camera 413, a microphone 415, a speaker 416, and a sensor 417.
- a user who uses the MR device 410 is referred to as an MR user 6.
- the computer 500 can be connected to the Internet or other network 2, and can communicate with, for example, a server 600, a computer of the VR system 200, and other computers connected to the network 2. Examples of other computers include computers of other MR systems 400 and external devices 700.
- the MR device 410 is worn on the head of the MR user 6 and can provide a mixed reality space to the MR user 6 during operation.
- the MR device 410 may be, for example, a glasses-type device equipped with a display (for example, so-called MR glasses).
- the MR device 410 may also be a contact lens-type device, etc.
- the display 411 is realized as, for example, a transmissive display device.
- the transmissive display 411 may temporarily function as a non-transmissive display device by adjusting its transmittance.
- the display 411 is disposed on the main body of the MR device 410 so as to be located, for example, in front of both eyes of the MR user 6. Therefore, the MR user 6 is provided with a display in which a virtual object displayed by the display 411 is superimposed on the real space seen through the transmissive display 411.
- the MR device 410 provides the MR user 6 with a view of a mixed reality space in which a virtual object is disposed in the real space.
- the MR device 410 allows the MR user 6 to simultaneously see a virtual object and an object existing in the real space (hereinafter referred to as a "real object").
- the detection device 460 detects the movement of the MR user 6 using the MR device 410.
- the detection device 460 may have a position tracking function for detecting the movement of the MR device 410, thereby detecting the movement of the MR user 6.
- the detection device 460 may have, for example, a sensor that reads light (e.g., infrared light) from the MR device 410 as a sensor for detecting the movement of the MR user 6, and detect the position, inclination, etc. of the MR device 410 in the mixed reality space.
- the MR device 410 may have multiple light sources not shown.
- each light source may be realized, for example, by an LED (Light Emitting Diode) that emits infrared light.
- the detection device 460 may also be realized by, for example, a camera.
- the detection device 460 may have an image sensor (e.g., an image sensor that acquires an RGB image, an image sensor that acquires a black and white image, or a depth sensor, etc.) as a sensor for detecting the movement of the MR user 6.
- the movement of the MR user 6 may be detected by a camera.
- the detection device 460 may be, for example, a depth camera equipped with a device that emits a predetermined light such as infrared light or a predetermined pattern of light, and an image sensor (e.g., a depth sensor), and may detect the reflected light of the light emitted from the device by the image sensor, and detect the position, posture, etc. of the MR user 6 based on the output from the image sensor.
- the detection device 460 may also be equipped with such a depth camera and a camera capable of acquiring an RGB image, and may detect the position, posture, etc. of the MR user 6 based on the output from these cameras.
- the detection device 460 may be, for example, a stereo camera equipped with multiple image sensors, and may detect the position, posture, etc. of the MR user 6 based on the output from the multiple image sensors.
- detection of the position, posture, etc. of the MR user 6 may be detection of the body position, posture, etc. of the MR user 6, or detection of the position, tilt, etc. of the MR device 410.
- the MR system 400 may have one or more types of detection devices as the detection device 460, and may have multiple detection devices of each type.
- the MR system 400 may have multiple cameras (in other words, multiple sensors), and the multiple cameras may function as the detection device 460.
- the MR system 400 is described as having three detection devices 460, and each detection device 460 is described as having a depth camera and a camera capable of acquiring RGB images.
- a part of the detection device 460 (for example, the part that performs analysis of the output from the image sensor, such as image recognition) may be configured by the computer 500 or the like.
- the MR system 400 may not have the detection device 460.
- the detection device 460 may have a sensor capable of detecting the position, inclination, etc. of the detection device 460 itself.
- the detection device 460 may have an angular velocity sensor (e.g., a three-axis angular velocity sensor), an acceleration sensor (e.g., a three-axis acceleration sensor), or a geomagnetic sensor (e.g., a three-axis geomagnetic sensor).
- the output from these sensors may be sent to the computer 500 or the like, and may be used, for example, when a predetermined process is performed based on the output from an image sensor provided in the detection device 460.
- the MR device 410 of another MR user 6 may be used as the detection device 460.
- the MR device 410 may also include a sensor 417 instead of or in addition to the detection device 460 as a detection means for detecting the movement of the MR user 6.
- the MR device 410 may use the sensor 417 to detect the position and inclination of the MR device 410 itself.
- the sensor 417 may be, for example, an angular velocity sensor (e.g., a three-axis angular velocity sensor), an acceleration sensor (e.g., a three-axis acceleration sensor), or a geomagnetic sensor (e.g., a three-axis geomagnetic sensor).
- the MR device 410 may also include one or more types of sensors as the sensor 417, and may include multiple sensors of each type.
- the gaze sensor 412 detects the direction in which the right and left eyes of the MR user 6 are directed. That is, the gaze sensor 412 detects the gaze of the MR user 6 (in other words, the movement of the eyes).
- the detection of the gaze direction is realized, for example, by a known eye tracking function.
- the gaze sensor 412 is realized by a sensor having the eye tracking function.
- the gaze sensor 412 may include a sensor for the right eye and a sensor for the left eye.
- the gaze sensor 412 may be, for example, a sensor that irradiates the right and left eyes of the MR user 6 with infrared light and detects the rotation angle of each eyeball by receiving reflected light from the cornea and iris in response to the irradiated light. In this case, the gaze sensor 412 can detect the gaze of the MR user 6 based on each detected rotation angle.
- the camera 413 captures the surroundings (e.g., in front) of the MR user 6.
- the MR device 410 may be equipped with, for example, a depth camera and a camera capable of acquiring RGB images as the camera 413.
- based on the output from the camera 413, the computer 500 may detect the shape of objects around the MR device 410 (in other words, around the MR user 6) and the relative distance between the MR device 410 and the surrounding objects.
- the position and inclination of the MR device 410 may also be detected based on the output from the camera 413. In other words, tracking of the MR device 410 may be performed by an outside-in method or an inside-out method.
- the microphone 415, serving as a sound input means, converts the voice of the MR user 6 into a sound signal (in other words, an electrical signal) and outputs it to the computer 500.
- the speaker 416, serving as a sound output means, converts the sound signal into sound and outputs it to the MR user 6.
- the MR device 410 may include earphones instead of the speaker 416 as a sound output means.
- the display 470 displays, for example, an image similar to the image displayed on the display 411. This allows users other than the MR user 6 wearing the MR device 410 to view the same image as the MR user 6.
- the image displayed on the display 470 does not need to be a three-dimensional image, and may be, for example, an image for the right eye or an image for the left eye displayed by the MR device 410.
- Examples of the display 470 include a liquid crystal display and an organic EL display.
- the display 470 may display an image showing a view of the mixed reality space provided to the MR user 6 by the MR device 410.
- the controller 480 is connected to the computer 500 by wire or wirelessly.
- the controller 480 accepts input operations related to instructions from the MR user 6 to the computer 500.
- the controller 480 also accepts input operations by the MR user 6 to control the position and movement of a virtual object placed in the mixed reality space.
- the controller 480 may be configured to be held by the MR user 6, for example.
- the controller 480 may also be configured to be worn on the body or a part of the clothing of the MR user 6, for example.
- the controller 480 may also be configured to be able to output at least one of vibration, sound, and light based on a signal transmitted from the computer 500.
- the controller 480 may also have multiple light sources. Each light source may be realized, for example, by an LED that emits infrared light.
- the detection device 460 may then read the infrared light from the controller 480 and detect the position and inclination of the controller 480 within the mixed reality space. In other words, the detection device 460 may have a position tracking function that detects the movement of the controller 480, thereby detecting the movement of the MR user 6.
- the MR system 400 does not necessarily have to have the controller 480.
- the controller 480 may also have, for example, some or all of the components of the controller 280.
- the information processing system 100 may have a sensor 486 that detects the movement of the MR user 6.
- the sensor 486 may be, for example, an angular velocity sensor, an acceleration sensor, or a geomagnetic sensor.
- the sensor 486 may also be provided in the controller 480, for example.
- the information processing system 100 may have one or more types of sensors as the sensor 486, and may have multiple sensors of each type.
- the device including the sensor 486 (for example, the controller 480 or a specific camera) and the computer 500 may be connected to each other, for example, wirelessly.
- the information acquired by the sensor 486 may be transmitted to the computer 500, for example, by wireless communication.
- the position and movement of the MR user 6 may also be detected by the MR device 410 being equipped with a device that performs predetermined wireless communication (e.g., short-range wireless communication such as Wi-Fi communication, Bluetooth communication, or UWB communication) and having the device perform wireless communication with surrounding devices to obtain position information of the MR device 410.
- the position and movement of the MR user 6 may also be detected using an external device 700 (and a device that tracks the movement of the MR user 6 in cooperation with the device via short-range wireless communication or the like) that can track the movement of the MR user 6, such as a device worn by the MR user 6 (for example, a wearable device such as a watch-type, wristband-type, ring-type, or clothing-type device, or an implantable device, etc.).
- These external devices 700 may also be used as the controller 480.
- the position and movement of the MR user 6 may also be detected by a GPS sensor or the like.
- the computer 500 includes, as its main components, a processor 501, a memory 502, a storage 503, an input/output interface 504, and a communication interface 505.
- the components are connected to each other via a bus.
- the processor 501 controls the operation of the MR device 410.
- the processor 501 reads a program from the storage 503 and expands it in the memory 502.
- the processor 501 executes the expanded program.
- the processor 501 may be configured to include, for example, one or more of the following: a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an MPU (Micro Processor Unit), and an FPGA (Field-Programmable Gate Array).
- Memory 502 is a main storage device. Memory 502 is composed of storage devices such as ROM (Read Only Memory) and RAM (Random Access Memory). Memory 502 provides a working area for processor 501 by temporarily storing programs and various data that processor 501 reads from storage 503. Memory 502 also temporarily stores various data generated while processor 501 is operating according to a program, various data input to computer 500, etc.
- Storage 503 is an auxiliary storage device.
- Storage 503 is composed of a storage device such as a flash memory or a HDD (Hard Disk Drive).
- Storage 503 stores programs for providing services in information processing system 100.
- Storage 503 also stores various data for providing services in information processing system 100.
- Data stored in storage 503 includes data for defining a mixed reality space, data related to virtual objects, etc.
- storage 503 may be implemented as a removable storage device, such as a memory card. Also, instead of storage 503 built into computer 500, programs and data stored in an external storage device may be used.
- the input/output interface 504 is an interface through which the computer 500 accepts data input, and is also an interface through which the computer 500 outputs data.
- the input/output interface 504 can transmit or receive data between the MR device 410, the detection device 460, and the display 470.
- the input/output interface 504 may also transmit or receive data between the display 411, the gaze sensor 412, the camera 413, the microphone 415, the speaker 416, and the sensor 417 included in the MR device 410.
- the input/output interface 504 may transmit or receive data to and from the controller 480.
- the input/output interface 504 may receive input of signals output from the controller 480 and the sensor 486.
- the input/output interface 504 may also transmit commands output from the processor 501 to the controller 480.
- the commands may instruct the controller 480 to vibrate, output sound, emit light, or the like.
- upon receiving the command, the controller 480 vibrates, outputs sound, emits light, or the like in accordance with the command.
- the MR device 410, the detection device 460, the display 470, the controller 480, and the sensor 486 may each be connected to the computer 500 via a wired or wireless connection.
- the communication interface 505 controls the sending and receiving of various data with other computers (e.g., the server 600, the computer 300, or other computers 500, etc.) via the network 2.
- the processor 501 accesses the storage 503, loads a program stored in the storage 503 into the memory 502, and executes a series of instructions contained in the program.
- the processor 501 also sends a signal to the MR device 410 via the input/output interface 504 to provide a mixed reality space.
- the computer 500 may be provided outside the MR device 410, or a part or the whole of the computer 500 may be built into the MR device 410. Furthermore, when there are multiple MR devices 410, the computer 500 may be provided for each MR device 410, or may be used commonly for the multiple MR devices 410. Furthermore, in this case, a part of the computer 500 may be built into each of the multiple MR devices 410. In this embodiment, an example will be described in which multiple MR devices 410 are connected to the computer 500.
- the server 600 may transmit a program to the computer 300.
- the server 600 may also transmit a program to the computer 500.
- the server 600 also enables communication between the computer 300 and the computer 500.
- the server 600 also enables communication between the computer 300 and other computers 300.
- each computer 300 of the VR user may communicate with the other computers 300 via the server 600, thereby enabling multiple VR users to share experiences in the same virtual space.
- each computer 300 may communicate with the other computers 300 without going through the server 600.
- the server 600 includes a processor 601, a memory 602, a storage 603, an input/output interface 604, and a communication interface 605. Each component is connected to each other via a bus.
- Processor 601 controls the operation of the entire server 600.
- Processor 601 reads programs from storage 603 and expands them into memory 602.
- Processor 601 executes the expanded programs.
- Processor 601 may be configured to include one or more of the following, for example: a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an MPU (Micro Processor Unit), and an FPGA (Field-Programmable Gate Array).
- Memory 602 is a main storage device. Memory 602 is composed of storage devices such as ROM (Read Only Memory) and RAM (Random Access Memory). Memory 602 provides a working area for processor 601 by temporarily storing programs and various data that processor 601 reads from storage 603. Memory 602 also temporarily stores various data generated while processor 601 is operating according to a program, various data input to server 600, etc.
- Storage 603 is an auxiliary storage device.
- Storage 603 is composed of a storage device such as a flash memory or a HDD (Hard Disk Drive).
- Storage 603 stores programs for providing services in information processing system 100.
- Storage 603 also stores various data for providing services in information processing system 100.
- Data stored in storage 603 includes data for defining a virtual space, data for defining a mixed reality space, data related to virtual objects, etc.
- storage 603 may be implemented as a removable storage device, such as a memory card. Also, instead of storage 603 built into server 600, programs and data stored in an external storage device may be used.
- the input/output interface 604 is an interface through which the server 600 accepts data input, and is also an interface through which the server 600 outputs data.
- the input/output interface 604 can transmit or receive data between, for example, input devices such as a mouse or keyboard, and output devices such as a display.
- the communication interface 605 controls the sending and receiving of various data with other computers (e.g., computer 300, computer 500, etc.) via network 2.
- the external device 700 may be any device capable of communicating with the computer 300, the computer 500, or the server 600.
- the external device 700 may be, for example, a device capable of communicating with the computer 300 via the network 2, or a device capable of communicating with the computer 300 via short-range wireless communication or a wired connection.
- the external device 700 may be, for example, a device capable of communicating with the computer 500 via the network 2, or a device capable of communicating with the computer 500 via short-range wireless communication or a wired connection.
- the external device 700 may be, for example, a device capable of communicating with the server 600 via the network 2. Examples of the external device 700 include, but are not limited to, smart devices, PCs (Personal Computers), and peripheral devices of the computers 300 and 500.
- the MR device 410 provides the MR user 6 with a view of a mixed reality space in which a virtual object is arranged in a real space, similar to a known MR device.
- the VR device 210 provides the VR user with a view of a virtual space, similar to a known VR device.
- the MR user 6 is capable of operating a virtual object in the mixed reality space.
- the VR user is capable of operating a virtual object in the virtual space.
- the movement of the VR user using the VR system 200 is reflected in a virtual object included in the view of the mixed reality space provided by the MR system 400.
- the movement of the MR user 6 using the MR system 400 is reflected in a virtual object included in the view of the virtual space provided by the VR system 200.
- the motion of the VR user and the like are reflected in the virtual objects in the virtual space and the mixed reality space in real time.
- the motion of the MR user 6 and the like are reflected in the virtual objects in the virtual space and the mixed reality space in real time.
- the configuration according to this embodiment is applied to a service (in other words, an application) that enables communication between an MR user 6 and a VR user.
- the configuration according to this embodiment is applied to a service that enables an MR user 6 in a conference room as a specific real space and a VR user in a location away from the conference room to hold a conference while simultaneously viewing an object such as a product mockup as a virtual object.
- the configuration according to this embodiment can also be applied to a system that enables a student to take school classes from home.
- the configuration according to this embodiment can also be applied to a system that enables a student to participate in a home party held at a specific home from another location.
- the application of the configuration according to this embodiment is not limited to these services.
- the configuration according to this embodiment is applied to a conference room in which a plurality of detection devices 460 are installed in advance at specific positions in the conference room as shown in FIG. 8.
- the avatar object of the VR user is referred to as the VR avatar 25.
- the avatar object of the MR user 6 is referred to as the MR avatar 26.
- the avatar object of the MR user 6A is referred to as the MR avatar 26A
- the avatar object of the MR user 6B is referred to as the MR avatar 26B
- the avatar object of the MR user 6C is referred to as the MR avatar 26C.
- FIG. 8 is a schematic diagram showing the state of the real conference room 21 and the state of the virtual space 11 when MR users 6A, 6B, 6C and a VR user are using a service.
- the upper part of FIG. 8 shows the state of the real conference room 21, and the lower part of FIG. 8 shows the state of the virtual space 11.
- note that in FIG. 8, the VR avatar 25 is shown wearing the VR device 210, but this is for ease of explanation; in reality, the VR avatar 25 does not need to wear a (virtual) VR device 210. Also, in FIG. 8, the MR avatar 26 is not wearing a (virtual) MR device 410, but the MR avatar 26 may be wearing one.
- a desk 8 and the like are arranged in the conference room 21 as real objects.
- Three detection devices 460 are also arranged in the conference room 21.
- Three MR users 6A, 6B, and 6C are also present in the conference room 21.
- no VR users are present in the conference room 21.
- a VR avatar 25 and a mock-up virtual object 30 (hereinafter also referred to as a "virtual model 30") are displayed as virtual objects on the display 411 of the MR device 410 used by each of the MR users 6A, 6B, and 6C. That is, the MR device 410 displays the VR avatar 25 and the virtual model 30 in the real conference room 21 visible through the transparent display 411. Therefore, the MR users 6A, 6B, and 6C are provided with a view as if the VR avatar 25 and the virtual model 30, which do not actually exist in the conference room 21, exist in the conference room 21.
- the virtual space 11 that the VR user sees through the VR device 210 is modeled after the conference room 21, and includes a virtual object 31 of a desk 8 that exists in the real conference room 21.
- the virtual space 11 also contains MR avatars 26A, 26B, 26C of the MR users 6A, 6B, 6C, a virtual model 30, and the like. This provides the VR user with a view as if he or she were in the conference room 21 where the MR users 6A, 6B, 6C are located.
- the MR user 6 can see other MR users 6 and real objects that are actually present in the same place through the transparent display 411.
- in addition, the VR avatar 25 of the VR user is visible to the MR user 6 as a virtual object. Therefore, the MR user 6 can communicate with the VR user and other MR users 6 without losing the sense of being in the real world.
- the VR user is in a different place from the MR user 6 in the real world, and in such a case the shape of the room where the VR user is located often differs from that of the place where the MR user 6 is located.
- if the VR user were also to use an MR device 410 or the like to display the MR avatar 26 in his or her own real space, there is a risk that the display would be unnatural.
- in this embodiment, however, the VR user can enter the virtual space 11 and communicate with the MR user 6 there, so such an unnatural impression can be prevented.
- since the virtual space 11 provided to the VR user mimics the location of the MR user 6, when the VR user or the MR user 6 moves around or moves a virtual object in the virtual space 11 or the real space, respectively, the movements of the avatar or the virtual model 30 displayed to the other party appear natural.
- in addition, when the VR user or the MR user 6 operates a virtual object such as the virtual model 30, it is easy for them to imagine how the operation will look to the other party, enabling smooth communication.
- each of the VR system 200, the MR system 400, and the server 600 may have at least some of the functions of the other devices.
- some or all of the functional blocks of the computer 300, the computer 500, and the server 600 in this embodiment may be provided by the computer 300, the computer 500, the server 600, or other devices.
- each of the devices such as the computer 300, the computer 500, and the server 600 does not have to be realized by an integrated device, and may be realized by, for example, multiple devices connected via a network, etc.
- the processor 301, processor 501, or processor 601 will be described as executing each process described below by executing a program stored in the information processing system 100.
- at least a part of the process described below and performed by the processor 301 may be executed by a processor other than the processor 301.
- at least a part of the process described below and performed by the processor 501 may be executed by a processor other than the processor 501.
- at least a part of the process described below and performed by the processor 601 may be executed by a processor other than the processor 601.
- the computer that executes the program in this embodiment may be any computer including the computer 300, the computer 500, and the server 600, or may be realized by a combination of multiple devices.
- FIG. 9 is a block diagram showing the functional configuration of the information processing system 100.
- the computer 300 (in other words, the VR system 200) functions as a control unit 310 and a storage unit 311 through cooperation between the processor 301, memory 302, storage 303, input/output interface 304, and communication interface 305.
- the computer 500 (in other words, the MR system 400) functions as a control unit 510 and a storage unit 511 through cooperation between the processor 501, memory 502, storage 503, input/output interface 504, and communication interface 505.
- the server 600 functions as a control unit 610 and a storage unit 611 through cooperation between the processor 601, memory 602, storage 603, input/output interface 604, and communication interface 605.
- the control unit 510 of the MR system 400 includes a virtual space generation unit 810, a coordinate definition unit 812, an MR side input reception unit 816, a user information acquisition unit 818, an MR side object control unit 820, a display control unit 840, a sound control unit 845, and a communication control unit 850.
- the virtual space generating unit 810 generates a virtual space (in other words, virtual space data representing a virtual space) based on a predetermined real space.
- the virtual space generating unit 810 generates a virtual space that imitates a predetermined real space.
- the predetermined real space is a predetermined place that exists in the real world, and may be, for example, a specific room such as a conference room or a school classroom.
- the predetermined real space may also be, for example, a school or a specific building such as a house.
- the predetermined real space may also be, for example, a specific town. In other words, the predetermined real space does not have to be a space separated by walls or the like.
- simulating a predetermined real space means that the generated virtual space has a structure similar to the basic structure of the real space.
- a virtual space that imitates a real-world conference room can be one in which the shape of the virtual room as a virtual space (for example, the shape of the walls, floor, etc.) is approximately the same as the shape of the conference room in the real world, and virtual objects corresponding to real objects such as desks and chairs that exist in the conference room in the real world are arranged.
- virtual objects such as desks, chairs, walls, and floors do not need to be exact copies of the shapes of real objects, and may be simplified in shape, pattern, color, etc.
- real objects such as posters on the wall or small items placed on a desk or floor may exist that do not appear in the virtual world (real objects that do not have corresponding virtual objects).
- the specified real space is a specific conference room 21, and a virtual space 11 that mimics this specific conference room 21 is generated.
- a publicly known method can be used to generate a virtual space that imitates a real space, and the method is not particularly limited, but may be, for example, as follows. That is, the virtual space generation unit 810 generates the virtual space 11 based on information from a sensor that can detect the shapes and positions of real objects that constitute the real space, such as walls, floors, desks, and chairs. Specifically, the virtual space generation unit 810 may obtain three-dimensional information of the real space based on information from an image sensor (e.g., an image sensor that obtains RGB images, an image sensor that obtains black and white images, or a depth sensor) provided in the detection device 460 or the MR device 410, and generate the virtual space. In addition, the virtual space generation unit 810 may use information from, for example, an acceleration sensor, an angular velocity sensor, or a geomagnetic sensor to generate the virtual space 11.
- the virtual space generation unit 810 stores data representing the generated virtual space 11 (hereinafter referred to as "virtual space data"), for example, in the storage unit 611, which serves as a virtual space data storage unit.
- the virtual space generation unit 810 may generate virtual space data representing the virtual space 11 in advance before the VR user or MR user 6 starts using the service. Also, the virtual space generation unit 810 may generate virtual space data representing the virtual space 11 in real time, for example, while the VR user or MR user 6 is using the service.
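- As a non-limiting illustration of the processing described above, the following Python sketch shows one way virtual space data imitating a scanned real space could be assembled from sensor output; the class and function names (DetectedPlane, build_virtual_space, etc.) are hypothetical and do not appear in this disclosure.

```python
# Illustrative sketch only: builds simplified virtual space data from detected
# planes (walls, floor, desks). Names are hypothetical, not from the disclosure.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class DetectedPlane:
    label: str                        # e.g. "wall", "floor", "desk"
    vertices: List[Tuple[float, float, float]]  # corner points from a depth sensor

@dataclass
class VirtualObjectData:
    label: str
    vertices: List[Tuple[float, float, float]]
    simplified: bool = True           # shape/pattern/colour may be simplified

@dataclass
class VirtualSpaceData:
    objects: List[VirtualObjectData] = field(default_factory=list)

def build_virtual_space(planes: List[DetectedPlane]) -> VirtualSpaceData:
    """Create virtual space data that imitates the scanned real space."""
    space = VirtualSpaceData()
    for plane in planes:
        # Each real object detected by the sensor gets a corresponding,
        # possibly simplified, virtual object in the virtual space 11.
        space.objects.append(VirtualObjectData(plane.label, plane.vertices))
    return space
```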
- the coordinate definition unit 812 defines the coordinates of the real space (in other words, the coordinates of the mixed reality space 21, hereinafter referred to as "mixed reality coordinates").
- the coordinate definition unit 812 also defines the coordinates of the virtual space 11 (hereinafter referred to as "virtual space coordinates").
- the mixed reality coordinates have three mutually orthogonal axes, for example, the x-axis, y-axis, and z-axis.
- the virtual space coordinates have three mutually orthogonal axes, for example, the x-axis, y-axis, and z-axis.
- specifically, the coordinate definition unit 812 associates (in other words, defines) a coordinate for each point of the virtual space 11 (in other words, the virtual conference room) that corresponds to (in other words, indicates the same point as) each point of the real space (in other words, the real conference room 21), so that the mixed reality coordinates indicating each point of the real space and the virtual space coordinates indicating each point of the virtual space correspond one-to-one.
- the coordinate definition unit 812 defines the virtual space coordinates associated with the mixed reality coordinates.
- the coordinate definition unit 812 may associate the mixed reality coordinates with the virtual space coordinates based on, for example, a feature point of the conference room detected by the detection device 460 or the MR device 410.
- the coordinate definition unit 812 may associate the mixed reality coordinates with the virtual space coordinates, for example, based on the position of the detection device 460 or the position of a marker or the like that has been installed in advance in the conference room.
- the coordinate definition unit 812 also stores information about the correspondence between mixed reality coordinates and virtual space coordinates (in other words, the correspondence between each point in the real space and each point in the virtual space) in the storage unit 611, which functions as a correspondence storage unit.
- the method for defining mixed reality coordinates and virtual space coordinates can be any known method, and is not particularly limited.
- the timing for defining the mixed reality coordinates and the virtual space coordinates may be, for example, when the virtual space generating unit 810 generates a virtual space that imitates the real space, or when the MR device 410 starts displaying a virtual object.
- the generation of the virtual space and the definition of each coordinate may be performed in advance, for example, by a dedicated device serving as external device 700 capable of scanning real space with high accuracy, and the generated data may be stored in storage unit 611.
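- As a non-limiting illustration, the sketch below shows one way the one-to-one correspondence between mixed reality coordinates and virtual space coordinates could be held, here as a rigid transform (rotation plus translation) assumed to have been estimated from feature points or markers; all names are hypothetical.

```python
# Illustrative sketch: a rigid transform mapping mixed reality coordinates to
# virtual space coordinates and back. The transform itself would be estimated
# from feature points or pre-installed markers; here it is assumed given.
import numpy as np

class CoordinateCorrespondence:
    def __init__(self, rotation: np.ndarray, translation: np.ndarray):
        self.R = rotation            # 3x3 rotation matrix
        self.t = translation         # 3-vector translation

    def mr_to_vr(self, p_mr: np.ndarray) -> np.ndarray:
        """Mixed reality coordinates -> virtual space coordinates."""
        return self.R @ p_mr + self.t

    def vr_to_mr(self, p_vr: np.ndarray) -> np.ndarray:
        """Virtual space coordinates -> mixed reality coordinates."""
        return self.R.T @ (p_vr - self.t)

# Example: if the two coordinate systems are defined to coincide, the
# correspondence is simply the identity transform.
identity = CoordinateCorrespondence(np.eye(3), np.zeros(3))
assert np.allclose(identity.vr_to_mr(identity.mr_to_vr(np.array([1.0, 2.0, 0.5]))),
                   [1.0, 2.0, 0.5])
```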
- the MR side input receiving unit 816 receives input from the MR user 6. In other words, the MR side input receiving unit 816 acquires input information from the MR user 6. Specifically, the MR side input receiving unit 816 receives the output of the detection device 460, the controller 480, the camera 413, the device worn by the MR user, the gaze sensor 412, the sensor 417, the sensor 486, and the image sensors provided in the detection device 460 and the MR device 410 as input from the MR user 6.
- the MR side input receiving unit 816 receives the output of the detection means (e.g., the detection device 460, the controller 480, the camera 413, the gaze sensor 412, the sensor 417, the sensor 486, and the image sensors provided in the detection device 460 and the MR device 410) that detects the movement of the MR user 6 as input from the MR user 6.
- these devices and sensors are intended to obtain information about the movements of the MR user 6, which is used to control virtual objects in the virtual space 11 and the mixed reality space 21, and the MR user 6 can input information to the MR system 400 to move the virtual objects, etc., by his or her own movements.
- the MR side input receiving unit 816 may receive, for example, output from the camera or camera 413 serving as the detection device 460 as input from the MR user 6. More specifically, the MR side input receiving unit 816 may receive, for example, position information indicating the position of the MR user 6 in the mixed reality space 21, detected by image recognition from an image captured by an image sensor provided in each of these cameras, as input from the MR user 6. In other words, the MR side input receiving unit 816 may receive, for example, output from a position detection means that detects the position of the MR user 6 in the mixed reality space 21 as input from the MR user 6.
- the position detection means may detect the position of the MR user 6 and acquire the position information based on, for example, positioning based on wireless communication (e.g., Wi-Fi communication, Bluetooth communication, or UWB communication) between the MR device 410 and a specific device of the MR system 400 (e.g., the detection device 460 or a beacon (not shown)), or positioning using a GPS (Global Positioning System) sensor provided in the MR device 410.
- the position information indicating the position of the MR user 6 can also be considered information regarding the movement of the MR user 6.
- the MR side input receiving unit 816 may also receive, as input from the MR user 6, information regarding the movement of the MR user 6 detected by image recognition from an image captured by a camera serving as the detection device 460 or an image sensor provided in the camera 413.
- the MR side input receiving unit 816 may receive, as input from the MR user 6, information regarding the hand movement of the MR user 6 detected by image recognition, or information regarding the inclination or facing direction of the MR user 6.
- the MR side input receiving unit 816 may also receive, as input from the MR user 6, information regarding the movements of the MR user 6 obtained using, for example, a device worn by the MR user 6 (for example, a wearable device such as a watch-type, wristband-type, ring-type, or clothing-type device, or an implantable device) capable of tracking the movements of the MR user 6 (and a device that tracks the movements of the MR user 6 in cooperation with the device via short-range wireless communication, etc.).
- the MR side input receiving unit 816 may also receive, for example, information indicating the operation of the MR user 6 detected by the controller 480 (in other words, information regarding the movement of the MR user 6) as input from the MR user 6.
- the MR side input receiving unit 816 may also receive output data (in other words, information regarding the movement of the MR user 6) regarding the inclination or direction of the MR device 410 from a sensor (e.g., an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, etc.) provided in the detection device 460 or the MR device 410 as input from the MR user 6.
- the user information acquisition unit 818 acquires information about the MR user 6 present in the mixed reality space 21. Specifically, for example, the user information acquisition unit 818 acquires identification information that enables each MR user 6A, 6B, 6C in the mixed reality space 21 to be identified (in other words, identification information that enables each MR device 410 to be identified). For example, the user information acquisition unit 818 may communicate with the MR device 410 worn by each user and acquire information about the user using the MR device 410 from the MR device 410 (for example, information about the account logged in to the MR device 410, etc.) as the identification information.
- the user information acquisition unit 818 may communicate with the MR device 410 worn by each user and acquire information specific to each MR device 410 that enables each MR device 410 to be identified from the MR device 410 as the identification information. Also, for example, the user information acquisition unit 818 may acquire the identification information by identifying the MR user 6 in the mixed reality space 21 by image recognition from an image captured by an image sensor provided in the detection device 460 or the MR device 410. The user information acquisition unit 818 also associates the information acquired by the MR side input acceptance unit 816 with the identification information. In other words, the user information acquisition unit 818 controls so that it is possible to know which MR user 6 the information belongs to when information about the movement of the MR user 6 is used in controlling a virtual object, which will be described later.
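- The association between acquired identification information and input information could, purely for illustration, be kept in a simple registry such as the hypothetical sketch below; none of these names are part of the disclosure.

```python
# Illustrative sketch: associating input received by the MR side input
# receiving unit with the identification information of the MR user who
# produced it. Field and function names are hypothetical.
from dataclasses import dataclass
from typing import Any, Dict, List, Tuple

@dataclass
class MrInput:
    device_id: str      # identifies the MR device 410 that produced the data
    payload: Any        # e.g. position, hand pose, gaze direction

class UserInfoRegistry:
    def __init__(self) -> None:
        self._device_to_user: Dict[str, str] = {}

    def register(self, device_id: str, user_id: str) -> None:
        # e.g. user_id obtained from the account logged in to the MR device 410
        self._device_to_user[device_id] = user_id

    def tag(self, inputs: List[MrInput]) -> List[Tuple[str, Any]]:
        # Attach the user identification to each piece of movement information
        # so later object control knows which MR user 6 it belongs to.
        return [(self._device_to_user.get(i.device_id, "unknown"), i.payload)
                for i in inputs]
```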
- the MR side object control unit 820 controls virtual objects in the mixed reality space 21.
- the MR-side object control unit 820 places virtual objects in the mixed reality space 21. Specifically, the MR-side object control unit 820 places, for example, the VR avatar 25 of the VR user, and objects that are given predetermined changes by at least one of the VR user and the MR user 6. In this embodiment, as shown in FIG. 8, the MR-side object control unit 820 places the VR avatar 25 and a mock-up virtual object 30 (hereinafter also referred to as the "virtual model 30") that can be moved by the VR user and the MR user 6 in the mixed reality space 21.
- for example, the MR-side object control unit 820 places a virtual object corresponding to a virtual object in the virtual space 11 (i.e., the counterpart in the mixed reality space 21 of that virtual object in the virtual space 11) at the position in the mixed reality space 21 that corresponds to the position where that virtual object is placed in the virtual space 11.
- a virtual model 30 having the same form as the virtual model 30 in the virtual space 11 is placed in the mixed reality space 21.
- a VR avatar 25 having the same form as the VR avatar 25 in the virtual space 11 is placed in the mixed reality space 21.
- the placement of the VR avatar 25 in the mixed reality space 21 can also be described as follows. That is, the MR-side object control unit 820 can also be said to place the VR avatar 25 at a position in the mixed reality space 21 that corresponds to the position of the VR user in the virtual space 11.
- the position of the VR user in the virtual space 11 is, for example, the position of the VR avatar 25 in the virtual space 11.
- note that in content in which the VR user enters the virtual space 11 from a first-person perspective, there may be cases in which the VR user (the VR avatar 25) cannot be seen (specifically, cases in which only a part of the body, such as the hands, is visible, or in which no part is visible at all). Even in such cases, the VR user in the virtual space 11 as recognized by the computer 300 can be regarded as the VR avatar 25, and the position of the VR user in the virtual space 11 recognized by the computer 300 can be regarded as the position of the VR avatar 25 in the virtual space.
- the appearance (in other words, the form) of the VR avatar 25 may imitate the appearance of the VR user, but it does not have to imitate the appearance.
- the appearance of the VR avatar 25 may imitate the appearance of a specific animal.
- the VR avatar 25 placed in the virtual space 11 and the VR avatar 25 placed in the mixed reality space 21 may have different appearances. With this configuration, the VR user can display his/her avatar in the mixed reality space 21 while hiding the form of his/her avatar in the virtual space 11.
- the appearance of the VR avatar 25 placed in the virtual space 11 may be an appearance that is visible to the VR user operating the VR avatar 25, or an appearance that is visible to other VR users operating other VR avatars 25 in the virtual space 11 in which the VR avatar 25 exists.
- the MR-side object control unit 820 may also place a virtual object in the mixed reality space 21, for example, as follows:
- the MR-side object control unit 820 determines where in the mixed reality space 21 to place the VR avatar 25 based on position information indicating the position of the VR user (in other words, the VR avatar 25) in the virtual space 11. Specifically, for example, when the VR-side object control unit 920 described later places the VR avatar 25 in the virtual space 11, it transmits information regarding the position of the VR avatar 25 in the virtual space 11 (for example, the coordinate value of the virtual space coordinates) to the MR-side object control unit 820 via the server 600.
- the MR-side object control unit 820 places the VR avatar 25 at a position in the mixed reality space 21 corresponding to the position of the VR user in the virtual space 11 (for example, the position of the coordinate value of the mixed reality coordinates corresponding to the coordinate value of the virtual space coordinates).
- the VR avatar 25 is placed at the same position in the conference room in both the virtual space 11 and the mixed reality space 21.
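- As an illustration of this placement, the sketch below converts the reported virtual space pose of the VR avatar 25 into a mixed reality pose and stores it in the MR-side scene; the names and the identity coordinate mapping are assumptions for illustration only.

```python
# Illustrative sketch: placing the VR avatar 25 in the mixed reality space 21
# at the position corresponding to its position in the virtual space 11.
# vr_to_mr() stands in for the stored coordinate correspondence; all names
# are hypothetical.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw: float = 0.0

def vr_to_mr(pose: Pose) -> Pose:
    # Assumed here: the two coordinate systems coincide, so the mapping is
    # the identity. In general this would apply the stored correspondence.
    return Pose(pose.x, pose.y, pose.z, pose.yaw)

def on_vr_avatar_update(vr_pose: Pose, mr_scene: dict) -> None:
    """Called when the VR side reports the avatar's virtual space pose."""
    mr_scene["vr_avatar_25"] = vr_to_mr(vr_pose)   # same seat in both rooms

mr_scene: dict = {}
on_vr_avatar_update(Pose(1.2, 0.0, 3.4), mr_scene)
```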
- the MR side object control unit 820 may determine the position in the mixed reality space 21 to place the virtual object based on position information (e.g., position information transmitted by the VR side object control unit 920) indicating the position of the virtual object (e.g., the VR avatar 25 or the virtual model 30, etc.) in the virtual space corresponding to the virtual object.
- the MR-side object control unit 820 may determine at what position in the mixed reality space 21 to place the virtual object based on information detected by a predetermined sensor of the MR system 400. Specifically, for example, a predetermined marker may be set up in advance in the real space, and the MR-side object control unit 820 may place the virtual model 30 as a virtual object at the position of the marker photographed by the image sensor provided in the detection device 460 or the MR device 410. Also, the MR-side object control unit 820 may place the virtual model 30 as a virtual object at a position indicated by the MR user 6.
- the instruction may be made by a predetermined gesture (for example, a gesture of pointing at the location where the virtual model 30 is to be placed), or may be made to place the marker, etc.
- the MR-side object control unit 820 may detect a flat surface in the mixed reality space 21 from an image captured by an image sensor provided in the detection device 460 or the MR device 410, and place the virtual model 30 on the flat surface.
- when arranging a virtual object in the mixed reality space 21, the MR-side object control unit 820 receives object data indicating the form of the virtual object, and arranges the virtual object based on the object data.
- the MR-side object control unit 820 may receive the object data, for example, from the control unit 610 of the server 600, or from the VR-side object control unit 920 of the VR system 200.
- the MR-side object control unit 820 may receive object data indicating the form of the VR avatar 25 from the control unit 610, and arrange the VR avatar 25 in the form indicated by the object data in the mixed reality space 21.
- the MR-side object control unit 820 may receive object data indicating the form of the VR avatar 25 from the VR-side object control unit 920, and arrange the VR avatar 25 in the form indicated by the object data in the mixed reality space 21.
- the display control unit 840 controls the image display on the display 411 of the MR device 410.
- the display control unit 840 generates an image for displaying a virtual object in the mixed reality space 21 placed by the MR side object control unit 820 at the position placed by the MR side object control unit 820.
- the display control unit 840 also causes the image to be displayed on the display 411. This provides the MR user 6 with a view of the mixed reality space in which the virtual object is placed at the desired position in the real space.
- the display control unit 840 also controls the image display on the display 470.
- the display control unit 840 causes the display 470 to display an image similar to the image displayed on the display 411.
- the display control unit 840 may also cause the display 470 to display an image showing a view of the mixed reality space provided by the MR device 410 to the MR user 6.
- when the sound control unit 845 detects speech from the MR user 6 via the microphone 415, it acquires sound data corresponding to the speech.
- the sound control unit 845 also transmits the acquired sound data to the computer 300, etc. via the network 2.
- when the sound control unit 845 receives sound data from the computer 300 via the network 2, it outputs sound (speech) corresponding to the sound data from the speaker 416.
- the MR side input receiving unit 816 may also receive sound data related to the speech of the MR user 6 detected by the microphone 415 as input from the MR user 6.
- the communication control unit 850 can communicate with the server 600, the computer 300, and other information and communication devices via the network 2.
- the communication control unit 850 transmits, for example, information used by the server 600 or the computer 300 to the server 600 or the computer 300.
- the communication control unit 850 also receives, for example, information used by the computer 500 from the server 600 or the computer 300.
- the control unit 310 of the VR system 200 includes a VR-side input receiving unit 916, a VR-side object control unit 920, a virtual camera control unit 930, a display control unit 940, a sound control unit 945, and a communication control unit 950.
- the VR side input receiving unit 916 receives input from the VR user. In other words, the VR side input receiving unit 916 acquires input information from the VR user. Specifically, the VR side input receiving unit 916 receives outputs from the detection device 260, the controller 280, the first camera 213, the second camera 214, a wearable device worn by the VR user, the gaze sensor 212, the sensor 217, the sensor 286, and image sensors provided in the detection device 260 and the VR device 210 as input from the VR user.
- the VR-side input receiving unit 916 receives the output of detection means for detecting the movements of the VR user (e.g., the detection device 260, the controller 280, the first camera 213, the second camera 214, a wearable device worn by the VR user, the gaze sensor 212, the sensor 217, the sensor 286, and the image sensors provided in the detection device 260 and the VR device 210) as input from the VR user.
- these devices and sensors are for obtaining information about the movements of the VR user that is used to control virtual objects in the virtual space 11 and the mixed reality space 21, and the VR user can input information for moving the virtual objects, etc., to the VR system 200 through his or her own movements.
- the VR side input receiving unit 916 may receive, for example, output from a camera serving as the detection device 260 or a camera provided in the VR device 210 that captures the surroundings of the VR device 210 as input from the VR user. More specifically, the VR side input receiving unit 916 may receive, as input from the VR user, position information indicating the position of the VR user in real space, detected by image recognition from images captured by the image sensors provided in each of these cameras. In other words, the VR side input receiving unit 916 may receive, as input from the VR user, output from a position detection means that detects the position of the VR user in real space.
- the position detection means may detect the position of the VR user and acquire the position information based on, for example, positioning based on wireless communication (e.g., Wi-Fi communication, Bluetooth communication, or UWB communication) between the VR device 210 and a specific device of the VR system 200 (e.g., detection device 260 or a beacon (not shown)), or positioning using a GPS (Global Positioning System) sensor provided in the VR device 210.
- the position information indicating the position of the VR user can also be considered information regarding the movement of the VR user.
- the VR-side input receiving unit 916 may also receive, as input from the VR user, information about the VR user's movements detected by image recognition from images captured by an image sensor provided in a camera serving as the detection device 260 or a camera provided in the VR device 210 that captures the surroundings of the VR device 210.
- the VR-side input receiving unit 916 may receive, as input from the VR user, information about the VR user's hand movements detected by image recognition, or information about the VR user's inclination or facing direction.
- the VR side input receiving unit 916 may also receive, as input from a VR user, information regarding the user's movements obtained using, for example, a device worn by the VR user (for example, a wearable device such as a watch-type, wristband-type, ring-type, or clothing-type device, or an implantable device) that can track the movements of the VR user (and a device that tracks the movements of the VR user in cooperation with such a device via short-range wireless communication, etc.).
- the VR-side input receiving unit 916 may also receive, for example, information indicating the VR user's operation detected by the controller 280 (in other words, information regarding the VR user's movements) as input from the VR user.
- the VR-side input receiving unit 916 may also receive output data (in other words, information regarding the VR user's movements) regarding the inclination or direction of the VR device 210 from a sensor (e.g., an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, etc.) provided in the detection device 260 or the VR device 210 as input from the VR user.
- the control unit 310 controls the virtual space 11.
- the control unit 310 acquires virtual space data representing the virtual space 11 stored in the storage unit 611, and defines the virtual space 11 to be provided to the VR user based on the virtual space data.
- the VR side object control unit 920 controls the virtual objects in the virtual space 11.
- the VR-side object control unit 920 places a virtual object in the virtual space 11 indicated by the virtual space data.
- the object data indicating the virtual object is stored, for example, in the storage unit 611 or the storage unit 311 as an object data storage unit.
- the VR-side object control unit 920 then uses the object data to place the virtual object in the virtual space 11.
- the VR-side object control unit 920 places, for example, in a virtual conference room, MR avatars 26A, 26B, 26C of MR users 6A, 6B, 6C, and virtual objects that are given predetermined changes by at least one of the VR user and MR user 6.
- the VR-side object control unit 920 places, in the virtual space 11, the VR avatar 25, the MR avatar 26, and a mock-up virtual object 30 (hereinafter also referred to as the "virtual model 30") that can be moved by the VR user and MR user 6.
- a virtual object 31 corresponding to a desk 8 as a real object is also placed in the virtual space 11.
- the VR-side object control unit 920 may place a VR avatar 25 of the other VR user in the virtual space 11.
- the VR avatar 25 of the other VR user can be operated by the other VR user.
- the VR-side object control unit 920 places the MR avatar 26 of the MR user 6 at a position in the virtual space 11 that corresponds to the position of the MR user 6 in the mixed reality space 21.
- the VR-side object control unit 920 places the MR avatars 26A, 26B, and 26C so that the positions of the MR users 6A, 6B, and 6C in the real conference room 21 are the same as the positions of the MR avatars 26A, 26B, and 26C in the virtual conference room.
- the appearance (in other words, the form) of the MR avatar 26 may imitate the appearance of the MR user 6, but it does not have to imitate it.
- the appearance of the MR avatar 26 may imitate the appearance of a specific animal.
- the VR-side object control unit 920 receives object data indicating the form of the virtual object, and places the virtual object based on the object data.
- the VR-side object control unit 920 may receive the object data from the control unit 610 of the server 600, or may receive it from the MR-side object control unit 820 of the MR system 400.
- the VR-side object control unit 920 may receive object data indicating the form of the MR avatar 26 from the control unit 610 or the MR-side object control unit 820, and place the MR avatar 26 in the form indicated by the object data in the virtual space 11.
- the appearance of the MR avatar 26 may be preset by the MR user 6, and when the MR avatar 26 is displayed on the VR device 210, it may be displayed with the appearance set by the MR user 6.
- the VR-side object control unit 920 may place the MR avatar 26 in the virtual space 11 (e.g., generated by the control unit 510) based on an image of the MR user 6 acquired by a specific camera such as the camera 413 or the camera serving as the detection device 460.
- the virtual camera control unit 930 places the virtual camera 14 in the virtual space 11.
- the virtual camera control unit 930 also controls the position of the virtual camera 14 in the virtual space 11 and the tilt (direction) of the virtual camera 14.
- the virtual camera control unit 930 places the virtual camera 14 at the eye position of the VR avatar 25 in the virtual space 11.
- the virtual camera control unit 930 links the position of the virtual camera 14 to the position of the VR avatar 25, and moves the virtual camera 14 in the virtual space 11 when the VR avatar 25 moves in the virtual space 11.
- the position of the virtual camera 14 does not have to be linked to the position of the VR avatar 25. For example, it may be possible to move the VR avatar 25 in the virtual space 11 while keeping the virtual camera 14 fixed in a predetermined position.
- the display control unit 940 controls the image display on the display 211 of the VR device 210.
- the display control unit 940 defines the field of view 15 according to the position and tilt of the virtual camera 14 (in other words, the tilt of the head of the VR user wearing the VR device 210).
- the display control unit 940 also generates a field of view image 17 to be displayed on the display 211 based on the defined field of view 15.
- the field of view image 17 generated by the display control unit 940 is output to the VR device 210.
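- For illustration only, the following sketch shows how a field of view could be derived from the position and tilt of the virtual camera 14; the camera structure, the rendering placeholder, and the 90-degree field of view are assumptions, not details of this disclosure.

```python
# Illustrative sketch: deriving the field of view 15 from the position and
# tilt of the virtual camera 14 (i.e. the tilt of the VR user's head).
# Names and the rendering call are hypothetical placeholders.
import math
from dataclasses import dataclass
from typing import Tuple

@dataclass
class VirtualCamera:
    position: Tuple[float, float, float]  # in virtual space coordinates
    yaw: float                            # horizontal head rotation, radians
    pitch: float                          # vertical head rotation, radians
    fov_deg: float = 90.0                 # angular extent of the field of view 15

def view_direction(cam: VirtualCamera) -> Tuple[float, float, float]:
    """Unit vector the virtual camera 14 is facing."""
    return (math.cos(cam.pitch) * math.sin(cam.yaw),
            math.sin(cam.pitch),
            math.cos(cam.pitch) * math.cos(cam.yaw))

def render_field_of_view_image(cam: VirtualCamera, scene: dict) -> bytes:
    # Placeholder: a real implementation would rasterize the virtual objects
    # in `scene` that fall inside the field of view 15 and output the
    # field of view image 17.
    direction = view_direction(cam)
    return f"image looking toward {direction}".encode()
```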
- the display control unit 940 also controls the image display on the display 270.
- the display control unit 940 causes the display 270 to display an image similar to the image displayed on the display 211.
- when the sound control unit 945 detects speech from the VR user via the microphone 215 of the VR device 210, it acquires sound data corresponding to the speech. The sound control unit 945 also transmits the acquired sound data to the computer 500 or to other VR users' computers 300 via the network 2. When the sound control unit 945 receives sound data from the computer 500 or from other users' computers 300 via the network 2, it outputs sound (speech) corresponding to the sound data from the speaker 216. This allows the VR user to communicate with the MR user 6 by voice, for example, as in a phone call.
- the VR side input receiving unit 916 may also accept sound data related to the VR user's speech detected by the microphone 215 as input from the VR user.
- the communication control unit 950 can communicate with the server 600, the computer 500, other VR users' computers 300, and other information and communication devices via the network 2.
- the communication control unit 950 transmits, for example, information used by the server 600, the computer 500, or other VR users' computers 300 to the server 600, the computer 500, or other VR users' computers 300.
- the communication control unit 950 also receives, for example, information used by the computer 300 from the server 600, the computer 500, or other VR users' computers 300.
- the virtual objects displayed by the MR device 410 and the virtual objects displayed by the VR device 210 are controlled based on input from the MR user 6 acquired by the MR side input receiving unit 816.
- the MR-side object control unit 820 applies a predetermined change to a virtual object in the mixed reality space 21, specifically the virtual model 30, based on input from the MR user 6 received by the MR-side input receiving unit 816.
- the predetermined change may be a change that moves the virtual object in the mixed reality space 21, such as changing the position of the virtual object, changing its inclination or direction, or changing the relative positional relationship between its parts.
- the predetermined change may be a change to the color of the virtual object, a change to information displayed by the virtual object, or a change to the form of the virtual object.
- the MR-side object control unit 820 moves the virtual model 30 in the mixed reality space 21 based on, for example, information on the movement of the MR user 6 received by the MR-side input receiving unit 816. Specifically, for example, when a gesture of the MR user 6 carrying the virtual model 30 is detected by a detection means (for example, the detection device 460 or the camera 413, etc.) and the detection means outputs information related to the gesture, the MR-side input receiving unit 816 receives the output as an input from the MR user 6. Also, based on the input received by the MR-side input receiving unit 816, the MR-side object control unit 820 moves the virtual model 30 in the mixed reality space 21 according to the instruction from the MR user 6 by the gesture.
- the gesture of the MR user 6 that causes a predetermined change to the virtual model 30 as a virtual object does not have to be one that touches the virtual model 30 in the mixed reality space 21.
- the MR-side object control unit 820 may also make a predetermined change to a virtual object based on, for example, information about the operation of the MR user 6 on the controller 480 (in other words, information about the movement of the MR user 6) received by the MR-side input receiving unit 816.
- the MR-side object control unit 820 may also make a predetermined change to a virtual object based on, for example, sound data about the speech of the MR user 6 received by the MR-side input receiving unit 816.
- the VR-side object control unit 920 also applies predetermined changes to virtual objects in the virtual space 11 based on input from the MR user 6 received by the MR-side input receiving unit 816.
- the VR-side object control unit 920 applies changes to the virtual model 30 in the virtual space 11 similar to the changes applied to the virtual model 30 in the mixed reality space 21 based on, for example, input from the MR user 6 received by the MR-side input receiving unit 816. That is, for example, as described above, when the virtual model 30 in the mixed reality space 21 moves due to a gesture of the MR user 6 to carry the virtual model 30, the VR-side object control unit 920 moves the virtual model 30 in the virtual space 11 in the same manner.
- the method of imparting a change to the virtual model 30 in the virtual space similar to the change imparted to the virtual model 30 in the mixed reality space 21 is not particularly limited, but may be, for example, as follows. That is, for example, when the MR-side object control unit 820 moves the virtual model 30 (in other words, the virtual object) in the mixed reality space 21 based on an input from the MR user 6 (in other words, at a predetermined trigger), it may send information indicating the position and tilt of the virtual model 30 after the movement (in other words, information regarding the state of the virtual model 30) to the VR-side object control unit 920.
- the VR-side object control unit 920 may place the virtual model 30 at the position indicated by the information in the virtual space 11, or place the virtual model 30 at the tilt indicated by the information.
- the communication control unit 850 may send information based on the input from the MR user 6 (for example, information on the movement amount, movement trajectory, inclination, etc. of the virtual model 30 calculated (in other words, acquired) by the control unit 510 based on the input) to the communication control unit 950, and the VR-side object control unit 920 may move a virtual object such as the virtual model 30 based on the information.
- as described above, the VR-side object control unit 920 makes a predetermined change to the virtual object in the virtual space 11 based on the input from the MR user 6 accepted by the MR-side input receiving unit 816; however, "based on the input from the MR user 6" is not limited to the case where the VR-side object control unit 920 itself receives the input and controls the virtual object, and it is sufficient that the virtual object in the virtual space 11 ultimately undergoes a predetermined change in response to the input.
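- The exchange described above could, for example, be realized with a simple message relayed via the server 600; the JSON message format and function names in the following sketch are assumptions for illustration.

```python
# Illustrative sketch: when the MR side moves the virtual model 30, it sends
# the resulting position and tilt to the VR side (via the server 600), and the
# VR side re-places its copy of the model. Message format and function names
# are assumptions for illustration.
import json

def mr_side_on_model_moved(position, tilt, send):
    """MR-side object control: report the new state of the virtual model 30."""
    message = {"object": "virtual_model_30",
               "position": position,          # mixed reality coordinates
               "tilt": tilt}
    send(json.dumps(message))                 # relayed through server 600

def vr_side_on_message(raw: str, virtual_space: dict) -> None:
    """VR-side object control: apply the same change in the virtual space 11."""
    message = json.loads(raw)
    # Convert to virtual space coordinates; assumed 1:1 here for brevity.
    virtual_space[message["object"]] = {"position": message["position"],
                                        "tilt": message["tilt"]}

virtual_space: dict = {}
mr_side_on_model_moved([0.5, 0.8, 1.0], [0, 0, 0],
                       lambda raw: vr_side_on_message(raw, virtual_space))
```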
- the VR-side object control unit 920 moves the MR avatar 26 in the virtual space 11 based on, for example, input from the MR user 6 received by the MR-side input receiving unit 816. Specifically, the VR-side object control unit 920 moves the MR avatar 26 based on, for example, information on the movement of the MR user 6 received by the MR-side input receiving unit 816. For example, the VR-side object control unit 920 moves the MR avatar 26 based on position information indicating the position of the MR user 6 in the mixed reality space 21 received by the MR-side input receiving unit 816 as input from the MR user 6.
- the VR-side object control unit 920 places the MR avatar 26 at a position in the virtual space 11 corresponding to the position of the MR user 6 in the mixed reality space 21 based on the position information, and when the MR user 6 moves in the mixed reality space 21, places the MR avatar 26 at a position in the virtual space 11 corresponding to the position after the movement.
- the VR-side object control unit 920 also moves the MR avatar 26 based on, for example, information on the movements of each part of the MR user 6's body (e.g., hand movements, head movements, eye movements, changes in facial expression, etc.) and information on the inclination and facing direction of the MR user 6, which are received by the MR-side input receiving unit 816 as input from the MR user 6. That is, the VR-side object control unit 920 reflects the movements of each part of the MR user 6's body (in other words, posture) and the inclination and facing direction of the MR user 6 in the MR avatar 26 in the virtual space 11 based on each of these pieces of information.
- the VR-side object control unit 920 controls the movement of the MR avatar 26 so that the movement of the MR avatar 26 in the virtual space 11 imitates the movement of the MR user 6 detected by a detection means for detecting the movement of the MR user 6.
- the VR-side object control unit 920 controls the movement of the MR avatar 26 so that the movement of at least part of the body (e.g., hands, feet, head, eyes, mouth, etc.) of the MR avatar 26 is linked to the movement of the MR user 6.
- the movement of the MR avatar 26 may be realized by the MR user 6 operating the controller 480 (e.g., operating an analog stick, button, etc.). That is, for example, the VR-side object control unit 920 may move a part of the body of the MR avatar 26 based on information indicating the MR user 6's operation on the controller 480 as input from the MR user 6.
- At least one of the position and movement of the MR avatar 26 in the virtual space 11 does not have to be linked to the position or movement of the MR user 6 in the mixed reality space 21. That is, for example, even if the MR user 6 moves in the mixed reality space 21, the MR avatar 26 in the virtual space 11 does not have to move in response to this. Also, for example, even if the MR user 6 moves his/her hand in the mixed reality space 21, the MR avatar 26 in the virtual space 11 does not have to make the same movement.
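- When the linkage described above is enabled, it could be realized along the lines of the following hypothetical sketch, which copies tracked body-part positions of the MR user 6 onto the MR avatar 26; the joint names and the 1:1 coordinate mapping are assumptions.

```python
# Illustrative sketch: reflecting the tracked movement of the MR user 6 in the
# MR avatar 26 in the virtual space 11, when such linkage is enabled.
# Joint names and the update function are hypothetical.
from typing import Callable, Dict, Tuple

Joint = Tuple[float, float, float]

def update_mr_avatar(avatar: Dict[str, Joint],
                     tracked_joints: Dict[str, Joint],
                     mr_to_vr: Callable[[Joint], Joint]) -> None:
    """Copy each tracked body-part position of MR user 6 onto MR avatar 26."""
    for name, position_mr in tracked_joints.items():
        avatar[name] = mr_to_vr(position_mr)    # e.g. "head", "right_hand"

# Usage: a 1:1 coordinate correspondence is assumed for brevity.
avatar: Dict[str, Joint] = {}
update_mr_avatar(avatar,
                 {"head": (0.0, 1.6, 0.0), "right_hand": (0.3, 1.2, 0.4)},
                 lambda p: p)
```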
- the virtual objects displayed by the VR device 210 and the virtual objects displayed by the MR device 410 are controlled based on input from the VR user acquired by the VR-side input receiving unit 916.
- the VR-side object control unit 920 applies a predetermined change to a virtual object in the virtual space 11, specifically the virtual model 30, based on the input from the VR user received by the VR-side input receiving unit 916.
- the VR-side object control unit 920 makes a predetermined change to the virtual model 30 in the virtual space 11 based on, for example, information indicating the VR user's operation on the controller 280 (in other words, information regarding the VR user's movements) received by the VR-side input receiving unit 916. Specifically, for example, the VR-side object control unit 920 moves the virtual model 30 in the virtual space 11 according to instructions from the VR user via operations on the controller 280.
- the VR-side object control unit 920 may also make a predetermined change to a virtual object based on, for example, a gesture of the VR user detected by a detection means.
- the VR-side object control unit 920 may also make a predetermined change to a virtual object based on, for example, sound data related to the speech of the VR user received by the VR-side input receiving unit 916.
- the MR-side object control unit 820 also applies predetermined changes to virtual objects in the mixed reality space 21 based on input from the VR user received by the VR-side input receiving unit 916.
- the MR-side object control unit 820 applies changes to the virtual model 30 in the mixed reality space 21 similar to the changes applied to the virtual model 30 in the virtual space 11 based on, for example, input from the VR user received by the VR-side input receiving unit 916. That is, for example, as described above, when the virtual model 30 in the virtual space 11 moves due to the VR user's operation on the controller 280, the MR-side object control unit 820 moves the virtual model 30 in the mixed reality space 21 in the same manner.
- the method of imparting a change to the virtual model 30 in the mixed reality space 21 similar to the change imparted to the virtual model 30 in the virtual space 11 is not particularly limited, but may be, for example, as follows. That is, for example, when the VR-side object control unit 920 moves the virtual model 30 (in other words, the virtual object) in the virtual space 11 based on input from the VR user (in other words, at a predetermined trigger), it may send information indicating the position and tilt of the virtual model 30 after the movement (in other words, information regarding the state of the virtual model 30) to the MR-side object control unit 820.
- the MR-side object control unit 820 may place the virtual model 30 at the position indicated by the information in the mixed reality space 21, or place the virtual model 30 at the tilt indicated by the information.
- the communication control unit 950 may send information based on the input from the VR user (for example, information on the movement amount, movement trajectory, inclination, etc. of the virtual model 30 calculated (in other words, acquired) by the control unit 310 based on the input) to the communication control unit 850, and the MR-side object control unit 820 may move a virtual object such as the virtual model 30 based on the information.
- the MR-side object control unit 820 makes a predetermined change to a virtual object in the mixed reality space 21 based on the input from the VR user received by the VR-side input receiving unit 916, but "based on the input from the VR user" does not necessarily mean that the MR-side object control unit 820 itself receives the input and controls the virtual object, but may mean that the virtual object in the mixed reality space 21 ultimately makes a predetermined change in response to the input.
- the VR-side object control unit 920 also makes a predetermined change to the VR avatar 25 in the virtual space 11 based on input from the VR user received by the VR-side input receiving unit 916. Specifically, the VR-side object control unit 920 makes a predetermined change to the VR avatar 25 in the virtual space 11 based on, for example, information indicating the VR user's operation on the controller 280 (in other words, information regarding the VR user's movements) received by the VR-side input receiving unit 916. Specifically, for example, the VR-side object control unit 920 moves the VR avatar 25 in the virtual space 11 in accordance with instructions from the VR user via operation on the controller 280.
- the VR-side object control unit 920 also moves the VR avatar 25 based on, for example, information on the movements of each part of the VR user's body (e.g., hand movements, head movements, eye movements, changes in facial expression, etc.) and information on the VR user's inclination and facing direction, which the VR-side input receiving unit 916 receives as input from the VR user. That is, based on each of these pieces of information, the VR-side object control unit 920 reflects the movements of each part of the VR user's body (in other words, posture) and the VR user's inclination and facing direction in the VR avatar 25 in the virtual space 11.
- the VR-side object control unit 920 controls the movement of the VR avatar 25 so that the movement of the VR avatar 25 in the virtual space 11 imitates the movement of the VR user detected by a detection means for detecting the movement of the VR user.
- the VR-side object control unit 920 controls the movement of the VR avatar 25 so that the movement of at least part of the body (e.g., hands, feet, head, eyes, mouth, etc.) of the VR avatar 25 is linked to the movement of the VR user.
- the MR-side object control unit 820 also imparts to the VR avatar 25 in the mixed reality space 21 a change similar to the change imparted to the VR avatar 25 in the virtual space 11 based on, for example, input from the VR user received by the VR-side input receiving unit 916. That is, for example, as described above, when the VR avatar 25 in the virtual space 11 moves as a result of the VR user operating the controller 280, the MR-side object control unit 820 moves the VR avatar 25 in the mixed reality space 21 in the same manner.
- the method of imparting to the VR avatar 25 in the mixed reality space 21 a change similar to the change imparted to the VR avatar 25 in the virtual space 11 is not particularly limited, but may be the same as in the case of the virtual model 30 described above, for example.
- the position of the VR avatar 25 in the virtual space 11 and the position in the mixed reality space 21 may move in conjunction with the movement of the VR user in real space. That is, for example, the VR-side object control unit 920 may move the VR avatar 25 based on position information indicating the position of the VR user in real space as an input from the VR user.
- the movement of a part of the body of the VR avatar 25 in the virtual space 11 and in the mixed reality space 21 may be realized by an operation on the controller 280. That is, for example, the VR-side object control unit 920 and the MR-side object control unit 820 may move a part of the body of the VR avatar 25 based on information indicating the VR user's operation on the controller 280 as an input from the VR user.
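- As a non-limiting illustration of the avatar control described above, the following Python sketch shows one way the same input-driven change could be applied on both sides; the class and method names (for example, SideObjectControl and apply_input) are assumptions introduced only for this sketch and are not part of the disclosed configuration.

```python
# Illustrative sketch only (not part of the disclosure): SideObjectControl stands in
# for the VR-side object control unit 920 and the MR-side object control unit 820,
# each holding its own copy of the VR avatar 25 and applying the same change to it.
from dataclasses import dataclass, field

@dataclass
class AvatarState:
    position: tuple = (0.0, 0.0, 0.0)            # position in the space's own coordinates
    joints: dict = field(default_factory=dict)   # e.g. {"head": (...), "hand_r": (...)}

class SideObjectControl:
    def __init__(self) -> None:
        self.avatar = AvatarState()

    def apply_input(self, user_input: dict) -> None:
        # Move the avatar according to a controller operation.
        if "move" in user_input:
            self.avatar.position = tuple(
                p + d for p, d in zip(self.avatar.position, user_input["move"])
            )
        # Reflect per-part movements (hands, head, eyes, etc.) in the avatar.
        for part, pose in user_input.get("body_parts", {}).items():
            self.avatar.joints[part] = pose

vr_side = SideObjectControl()  # VR avatar 25 in the virtual space 11
mr_side = SideObjectControl()  # VR avatar 25 in the mixed reality space 21

def on_vr_user_input(user_input: dict) -> None:
    """Input accepted by the VR-side input receiving unit 916 is applied to both spaces."""
    vr_side.apply_input(user_input)   # change in the virtual space 11
    mr_side.apply_input(user_input)   # the same change in the mixed reality space 21

on_vr_user_input({"move": (0.1, 0.0, 0.0), "body_parts": {"hand_r": (0.2, 1.1, 0.3)}})
print(vr_side.avatar.position, mr_side.avatar.position)
```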
- the results of actions taken on real objects that actually exist in the conference room 21, which is a specified real space, can also be shared between the VR user and the MR user 6.
- the following describes an example of control over a real object on which a specific piece of writing can be made (e.g., a blackboard or whiteboard).
- a blackboard 35 is placed as a real object in the mixed reality space 21
- a virtual blackboard 36 is placed as a virtual object corresponding to the blackboard 35 in the virtual space 11 at a position corresponding to the blackboard 35.
- the action (in other words, the result of the action) taken by the MR user 6 on the blackboard 35 as a real object in the mixed reality space 21 is reflected on the blackboard 35 in the mixed reality space 21. Also, in this embodiment, the action is reflected on the virtual blackboard 36 in the virtual space 11.
- the MR user 6 can make predetermined entries on the blackboard 35.
- the MR user 6's writing action on the blackboard 35 is detected by a detection means for detecting the writing action (hereinafter referred to as the "MR side writing detection means"), and the output from the MR side writing detection means (in other words, information regarding the writing action) is received by the MR side input receiving unit 816 as an input from the MR user 6.
- the MR side writing detection means may be, for example, a detection means capable of detecting the movement of the MR user 6's hand.
- the MR side writing detection means may be, for example, a pen-type controller or the like as the controller 480.
- the MR side writing detection means may be, for example, a camera or the camera 413 as the detection device 460.
- the MR side writing detection means may be, for example, a device worn by the MR user 6.
- the MR-side writing detection means may be a sensor provided on the blackboard 35 (e.g., a touch sensor or a sensor that reads reflected infrared light from an infrared light emitting device provided on the blackboard 35).
- the blackboard 35 may be capable of communicating with the computer 500, and information regarding the writing action of the MR user 6 detected by the sensor provided on the blackboard 35 may be received by the MR-side input receiving unit 816 as input from the MR user 6.
- the writing action of the MR user 6 is reflected on the blackboard 35 in the mixed reality space 21.
- the blackboard 35 changes to a state displaying the writing content of the MR user 6 (for example, a picture of a moon, a picture of a heart, and the word "Sample"), as shown in FIG. 10(a).
- the blackboard 35 may be an electronic blackboard that electronically displays (in other words, reflects) the writing content of the MR user 6, and when the MR user 6 performs a writing action, it may change to a state displaying the writing content.
- the writing surface of the blackboard 35 itself may be a display, and the display may display the writing content.
- the blackboard 35 may also be equipped with a projector that projects an image onto the writing surface, and the projector may project the writing content onto the writing surface.
- the reflection of the user's behavior in the mixed reality space 21 (in other words, a change in the mixed reality space 21 seen through the MR device 410 due to the behavior of the MR user 6) may be realized by some change (for example, a change in the display content) occurring in the real object itself.
- the computer of the blackboard 35 may control the display based on the detection result of a sensor equipped in the blackboard, or the computer 500 may control the display based on the detection result of the MR side writing detection means. That is, for example, the control unit 510 of the computer 500 or the like may function as a control means (hereinafter referred to as "blackboard display control means") that controls the display of the blackboard 35 based on the input from the MR user 6 received by the MR side input receiving unit 816. Specifically, the blackboard display control means may control the image displayed by the display or projector of the blackboard 35 to be an image showing the written content based on the input from the MR user 6.
- the writing action of the MR user 6 on the blackboard 35 in the mixed reality space 21 may be reflected by the MR user 6 actually writing characters or pictures on the blackboard 35 with a writing implement such as chalk.
- the action of actually writing characters or pictures on the blackboard 35 with a writing implement may be detected by the MR-side writing detection means as the writing action of the MR user 6.
- the MR-side object control unit 820 may generate a virtual object (hereinafter referred to as "entry object 37") indicating the contents entered by the MR user 6, and place it at the position of the blackboard 35 in the mixed reality space 21 (see FIG. 10(a)).
- the MR-side object control unit 820 may generate the entry object 37 based on the input from the MR user 6 received by the MR-side input receiving unit 816.
- when the blackboard 35 is viewed through the MR device 410, the state in which the blackboard 35 displays the entry contents (in other words, the entry object 37) may be visually recognized.
- the reflection of the user's actions in the mixed reality space 21 may be realized by adding a predetermined virtual object to the real object.
- the VR-side object control unit 920 applies a predetermined change to the virtual object corresponding to the real object based on the input from the MR user 6 received by the MR-side input receiving unit 816. Specifically, as shown in FIG. 10(b), the VR-side object control unit 920 reflects the input from the MR user 6 on the virtual blackboard 36, and controls the virtual blackboard 36 to display the contents written by the MR user 6. In other words, the VR-side object control unit 920 reflects a change to a real object (e.g., the blackboard 35) in the mixed reality space 21 based on the behavior of the MR user 6 on a virtual object (e.g., the virtual blackboard 36) in the virtual space 11 corresponding to the real object.
- the VR-side object control unit 920 may place an entry object 38 indicating the contents written by the MR user 6 at the position of the virtual blackboard 36 in the virtual space 11 (see FIG. 10(b)).
- the VR side object control unit 920 may control the virtual blackboard 36 to display the writing contents by changing the form of the virtual blackboard 36 itself.
- the virtual blackboard 36 may be a virtual object that displays an image displayed by the display or projector of the blackboard 35, and the VR side object control unit 920 may display the image on the virtual blackboard 36.
- the MR system 400 may be provided with a camera that captures the blackboard 35 in the mixed reality space 21, the virtual blackboard 36 may be a virtual object that displays an image captured by the camera, and the VR side object control unit 920 may display the image on the virtual blackboard 36.
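- As a non-limiting illustration of the writing-sharing flow described above, the following sketch applies one detected writing action to both the real blackboard 35 and the virtual blackboard 36; the stroke representation and the helper names (Blackboard, on_mr_writing_input, etc.) are assumptions for illustration only.

```python
# Illustrative sketch only: a writing action detected by the MR-side writing detection
# means is reflected both on the real blackboard 35 (e.g. via its display or an
# entry object 37) and on the virtual blackboard 36 (e.g. via an entry object 38).
from dataclasses import dataclass, field
from typing import List, Tuple

Stroke = List[Tuple[float, float]]  # a stroke as points on the writing surface

@dataclass
class Blackboard:
    name: str
    strokes: List[Stroke] = field(default_factory=list)

    def show(self, stroke: Stroke) -> None:
        # Stand-in for driving a display/projector or placing an entry object.
        self.strokes.append(stroke)

blackboard_35 = Blackboard("blackboard 35 (mixed reality space 21)")
virtual_blackboard_36 = Blackboard("virtual blackboard 36 (virtual space 11)")

def on_mr_writing_input(stroke: Stroke) -> None:
    """Output of the MR-side writing detection means, received as input from the MR user 6."""
    blackboard_35.show(stroke)           # reflection in the mixed reality space 21
    virtual_blackboard_36.show(stroke)   # reflection in the virtual space 11

on_mr_writing_input([(0.1, 0.2), (0.3, 0.4)])  # e.g. part of the heart drawing in FIG. 10
print(len(blackboard_35.strokes), len(virtual_blackboard_36.strokes))
```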
- the action (in other words, the result of the action) taken by the VR user in the virtual space 11 on the virtual blackboard 36 serving as a virtual object is reflected on the virtual blackboard 36 in the virtual space 11. Furthermore, in this embodiment, the action is reflected on the blackboard 35 in the mixed reality space 21.
- the VR user is able to make predetermined entries on the virtual blackboard 36. Furthermore, the VR user's writing action on the virtual blackboard 36 is detected by a detection means for detecting the writing action (hereinafter referred to as "VR side writing detection means"), and the output from the VR side writing detection means is received by the VR side input receiving unit 916 as an input from the VR user.
- the VR side writing detection means may be, for example, a detection means capable of detecting the movement of the VR user's hands.
- the VR side writing detection means may be, for example, a controller 280 or the like.
- the VR side writing detection means may be a camera or the like serving as the detection device 260.
- the VR side writing detection means may be, for example, a device worn by the VR user.
- the writing action of the VR user is reflected on the blackboard 35 in the mixed reality space 21.
- the blackboard 35 changes to a state in which the contents written by the VR user are displayed.
- the input from the VR user received by the VR side input receiving unit 916 is reflected on the blackboard 35.
- the input from the VR user may be reflected on the blackboard 35, for example, as follows. That is, for example, the MR-side object control unit 820 may place an entry object 37 indicating the contents written by the VR user at the position of the blackboard 35 in the mixed reality space 21.
- the blackboard display control means may control the image displayed by the display or projector of the blackboard 35 based on the input from the VR user so that it becomes an image indicating the contents written by the VR user.
- the MR-side object control unit 820 or the blackboard display control means can function as a reflection means that reflects input from a VR user received by the VR-side input receiving unit 916 in a real object in the mixed reality space 21.
- the VR user's writing action is also reflected on the virtual blackboard 36 in the virtual space 11.
- the VR-side object control unit 920 applies a predetermined change to the virtual blackboard 36 based on the input from the VR user received by the VR-side input receiving unit 916. More specifically, the VR-side object control unit 920 reflects the input from the VR user on the virtual blackboard 36, and controls the virtual blackboard 36 to a state in which the contents written by the VR user are displayed.
- the VR-side object control unit 920 may place an entry object 38 indicating the contents written by the VR user at the position of the virtual blackboard 36 in the virtual space 11. Also, the VR-side object control unit 920 may control the virtual blackboard 36 to display the contents written by the VR user, for example, by changing the form of the virtual blackboard 36 itself. Also, the virtual blackboard 36 may be a virtual object that displays an image displayed by the display or projector of the blackboard 35, and the VR-side object control unit 920 may display the image on the virtual blackboard 36.
- the MR system 400 may be provided with a camera that captures the blackboard 35 in the mixed reality space 21, the virtual blackboard 36 may be a virtual object that displays an image captured by the camera, and the VR-side object control unit 920 may display the image on the virtual blackboard 36.
- the input from the VR user received by the VR-side input receiving unit 916 may be reflected in a real object in the mixed reality space 21, thereby causing a change in the virtual blackboard 36 based on the input from the VR user.
- actions taken by the MR user 6 on real objects and actions taken by the VR user on virtual objects corresponding to the real objects are reflected in the mixed reality space 21 and the virtual space 11.
- the reflection of each action in the mixed reality space 21 or the virtual space 11 may be as follows:
- the position pointed to by the MR user 6 in the mixed reality space 21 (hereinafter referred to as the "pointed position"), for example with a laser pointer as a real object, may also be made known in the virtual space 11.
- the detection device 460 or the camera 413 of the MR system 400 may function as a pointed position detection means for detecting the pointed position.
- Information on the pointed position detected by the pointed position detection means may be received by the MR side input reception unit 816 as an input from the MR user 6.
- the VR side object control unit 920 may place a virtual object representing light from a laser pointer at a position in the virtual space 11 corresponding to the pointed position in the mixed reality space 21.
- the control unit 310 may control the display of the virtual space 11 in the VR device 210 so that the position in the virtual space 11 corresponding to the pointed position in the mixed reality space 21 is illuminated in a predetermined color representing light from a laser pointer.
- conversely, a position pointed to by the VR user in the virtual space 11 (for example, with a virtual laser pointer) may also be made known in the mixed reality space 21.
- the VR side object control unit 920 may place a virtual object representing light from the virtual laser pointer at the position pointed to by the operation of the VR user (in other words, the position pointed to by the virtual laser pointer).
- the control unit 310 may control the display of the virtual space 11 in the VR device 210 so that the position in the virtual space 11 pointed to by the operation of the VR user is illuminated in a predetermined color representing light from the virtual laser pointer.
- the control unit 510 or the like may control a position in the mixed reality space 21 corresponding to the designated position in the virtual space 11 (specifically, the position designated by the VR user) to be illuminated in a predetermined color representing light from a virtual laser pointer.
- the MR-side object control unit 820 may place a virtual object representing light from a virtual laser pointer at a position in the mixed reality space 21 corresponding to the designated position in the virtual space 11.
- the blackboard display control means may control, based on an input from the VR user, an image displayed by the display or projector of the blackboard 35 to be an image in which a position corresponding to the predetermined position on the virtual blackboard 36 is illuminated (i.e., an image indicating that a predetermined place is being pointed to by a laser pointer).
- the detection device 460, the MR device 410, or the like may be equipped with a device that emits laser light, and the control unit 510 may control the device in response to input from the VR user received by the VR-side input receiving unit 916 to direct the laser light at a position in the mixed reality space 21 that corresponds to the indicated position in the virtual space 11.
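- As a non-limiting illustration of the pointed-position sharing described above, the following sketch converts a pointed position in one space to the corresponding position in the other space and places a light object there; the simple offset mapping and the function names are assumptions for illustration only.

```python
# Illustrative sketch only: a pointed position detected in one space is converted to
# the corresponding position in the other space, where a virtual object representing
# laser light is placed (or the spot is rendered in a predetermined color).
OFFSET = (10.0, 0.0, -5.0)  # assumed fixed offset between the two coordinate systems

def mr_to_vr(p):
    return tuple(a + b for a, b in zip(p, OFFSET))

def vr_to_mr(p):
    return tuple(a - b for a, b in zip(p, OFFSET))

def place_laser_light_object(space: str, pos) -> None:
    # Stand-in for the VR-side/MR-side object control unit placing the light object.
    print(f"laser light shown in {space} at {pos}")

def on_mr_pointing(pointed_pos_mr) -> None:
    """Pointed position detected by the pointed position detection means (MR side)."""
    place_laser_light_object("virtual space 11", mr_to_vr(pointed_pos_mr))

def on_vr_pointing(pointed_pos_vr) -> None:
    """Position designated by the VR user with the virtual laser pointer (VR side)."""
    place_laser_light_object("mixed reality space 21", vr_to_mr(pointed_pos_vr))

on_mr_pointing((1.0, 1.5, 0.2))
on_vr_pointing((11.0, 1.5, -4.8))
```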
- the behavior of the MR user 6 or VR user may be reflected in the mixed reality space 21 and the virtual space 11 as follows. That is, for example, a device capable of inputting or outputting sound as a real object (e.g., an electronic piano) may be placed in the mixed reality space 21, and the behavior of the MR user 6 or VR user related to the device may be reflected in the mixed reality space 21 and the virtual space 11.
- an electronic piano is placed as a real object in the mixed reality space 21
- a virtual electronic piano is placed as a virtual object corresponding to the electronic piano in the virtual space 11 at a position corresponding to the electronic piano.
- the electronic piano may be capable of communicating with the computer 500, for example.
- when the MR user 6 operates the electronic piano, information about the operation (for example, sound data corresponding to the operation) may be sent to the computer 300, and the sound control unit 945 may output a sound from the speaker 216 based on the information.
- that is, for example, when the MR user 6 operates a key of the electronic piano (for example, the key of the note "C"), a sound corresponding to the operated key (for example, the note "C") may be output from the speaker 216.
- the MR side input receiving unit 816 may receive an output (for example, sound data corresponding to the operation) from the electronic piano (in other words, a sensor that detects the operation on the key, etc.) as a detection means for detecting the operation of the MR user 6 as an input from the MR user 6, and the control unit 310 (for example, the sound control unit 945) may output a sound corresponding to the operation of the MR user 6 on the electronic piano from the speaker 216 based on the input.
- the sound control unit 945 may also control the direction from which the sound is emitted so that the sound corresponding to an operation on the electronic piano sounds as if it is being output from a virtual electronic piano in the virtual space 11.
- when the VR user operates the virtual electronic piano in the virtual space 11, information on the operation may be sent to the computer 500.
- the sound control unit 845 may output a sound from the speaker 416 based on the information.
- that is, for example, when the VR user operates a key of the virtual electronic piano (e.g., the key of the note "C"), a sound corresponding to the operated key (e.g., the note "C") may be output from the speaker 416.
- the VR side input receiving unit 916 may receive an output from a detection means (e.g., the detection device 260 or the controller 280) that detects the operation of the VR user on the virtual electronic piano as an input from the VR user, and the control unit 510 (e.g., the sound control unit 845) may output a sound corresponding to the operation of the VR user on the virtual electronic piano from the speaker 416 based on the input.
- the control unit 510 may output a sound corresponding to the operation of the VR user on the virtual electronic piano from the speaker of the electronic piano.
- the sound corresponding to the operation may be output in the mixed reality space 21 by the speaker of the electronic piano or by the speaker 416. Also, when the VR user operates the virtual electronic piano, the sound corresponding to the operation may be output in the virtual space 11 by the speaker 216.
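- As a non-limiting illustration of the sound sharing described above, the following sketch routes a key operation on the (virtual) electronic piano to the speaker on the other side; the pitch table and the class and function names are assumptions for illustration only.

```python
# Illustrative sketch only: a key operation on the electronic piano (real or virtual)
# is routed to the speaker on the other side so that both users hear the same note.
from typing import Optional

NOTE_FREQ = {"C": 261.63, "D": 293.66, "E": 329.63}  # illustrative pitch table

class SoundControl:
    def __init__(self, speaker_name: str) -> None:
        self.speaker_name = speaker_name

    def play(self, note: str, source_hint: Optional[str] = None) -> None:
        # source_hint stands in for controlling the apparent direction of the sound,
        # e.g. so that it seems to come from the (virtual) electronic piano.
        print(f"{self.speaker_name}: play {note} ({NOTE_FREQ[note]} Hz), source={source_hint}")

sound_945 = SoundControl("speaker 216 (VR user)")    # stand-in for the sound control unit 945
sound_845 = SoundControl("speaker 416 (MR user 6)")  # stand-in for the sound control unit 845

def on_mr_key_press(note: str) -> None:
    """The MR user 6 presses a key of the real electronic piano."""
    sound_945.play(note, source_hint="virtual electronic piano in the virtual space 11")

def on_vr_key_press(note: str) -> None:
    """The VR user presses a key of the virtual electronic piano."""
    sound_845.play(note, source_hint="electronic piano in the mixed reality space 21")

on_mr_key_press("C")
on_vr_key_press("E")
```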
- In step S101, the processor 301 of the computer 300 identifies the virtual space data and defines the virtual space 11.
- In step S102, the processor 301 places the VR avatar 25 of the VR user in the virtual space 11.
- In step S103, the processor 301 acquires position information indicating the position of the MR user 6 in the mixed reality space 21 from the processor 501 of the computer 500, which is connected to the computer 300 via the network 2 and controls the mixed reality space 21 corresponding to the virtual space 11.
- In step S104, the processor 301 places the MR avatar 26 of the MR user 6 at a position in the virtual space 11 that corresponds to the position of the MR user 6 in the mixed reality space 21, based on the position information acquired in step S103.
- In step S105, the processor 301 places the virtual model 30 as a virtual object in the virtual space 11.
- the processor 301 places the virtual model 30 at a position in the virtual space 11 that corresponds to the position of the virtual model 30 in the mixed reality space 21. Note that placing at a corresponding position may mean that the position in the mixed reality space 21 is determined first, or that the position in the virtual space 11 is determined first.
- In step S106, the processor 301 accepts input from the VR user.
- the processor 301 accepts, for example, the output of a detection means that detects the movements of the VR user as input from the VR user.
- In step S107, the processor 301 makes a predetermined change to the virtual object in the virtual space 11 based on the input from the VR user. Specifically, the processor 301 moves the VR avatar 25 or the virtual model 30 in the virtual space 11 based on, for example, the input from the VR user. For example, if the input is related to the VR avatar 25, the processor 301 moves the VR avatar 25. Also, for example, if the input is related to the virtual model 30, the processor 301 moves the virtual model 30. Also, the processor 301 transmits information related to the input from the VR user to the processor 501.
- In step S108, the processor 301 acquires information regarding the input from the MR user 6 that was accepted by the processor 501.
- In step S109, the processor 301 makes a predetermined change to the virtual object in the virtual space 11 based on the input from the MR user 6 acquired in step S108. Specifically, the processor 301 moves the MR avatar 26 or the virtual model 30 in the virtual space 11, for example, based on the input from the MR user 6. For example, if the input is related to the MR avatar 26, the processor 301 moves the MR avatar 26. Also, if the input is related to the virtual model 30, the processor 301 moves the virtual model 30.
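- As a non-limiting illustration, the VR-side flow of steps S101 to S109 can be outlined as follows; the Processor301 stub, its method names, and the message format are assumptions introduced only for this sketch.

```python
# Illustrative outline only of the VR-side flow (steps S101 to S109); the Processor301
# stub, its method names, and the message format are assumptions for this sketch.
class Processor301:
    def define_virtual_space(self):                                   # S101
        return {"objects": {}}

    def place(self, space, name, pos):                                # S102, S104, S105
        space["objects"][name] = {"pos": pos}

    def accept_vr_user_input(self):                                   # S106
        return {"target": "virtual model 30", "move": (0, 0, 1)}

    def apply_change(self, space, user_input):                        # S107, S109
        obj = space["objects"].get(user_input["target"])
        if obj:
            obj["pos"] = tuple(a + b for a, b in zip(obj["pos"], user_input["move"]))

    def exchange_with_mr_side(self, vr_input):                        # S107 (send), S108 (receive)
        # Network exchange with the processor 501 is omitted in this sketch.
        return {"target": "MR avatar 26", "move": (1, 0, 0)}

p301 = Processor301()
space_11 = p301.define_virtual_space()                 # S101
p301.place(space_11, "VR avatar 25", (0, 0, 0))        # S102
mr_user_pos = (2, 0, 0)                                # S103: position acquired from the processor 501
p301.place(space_11, "MR avatar 26", mr_user_pos)      # S104
p301.place(space_11, "virtual model 30", (1, 1, 0))    # S105
vr_input = p301.accept_vr_user_input()                 # S106
p301.apply_change(space_11, vr_input)                  # S107
mr_input = p301.exchange_with_mr_side(vr_input)        # S107 / S108
p301.apply_change(space_11, mr_input)                  # S109
print(space_11["objects"])
```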
- In step S201, the processor 501 of the computer 500 defines the mixed reality space 21.
- In step S202, the processor 501 detects the position of the MR user 6 in the mixed reality space 21.
- the processor 501 also transmits position information indicating the detected position of the MR user 6 to the processor 301 of the computer 300 that is connected to the computer 500 via the network 2 and that controls the virtual space 11 that corresponds to the mixed reality space 21.
- In step S203, the processor 501 obtains position information indicating the position of the VR avatar 25 in the virtual space 11 from the processor 301.
- In step S204, the processor 501 places the VR avatar 25 at a position in the mixed reality space 21 that corresponds to the position of the VR avatar 25 in the virtual space 11, based on the position information acquired in step S203. Note that placing at a corresponding position may mean that the position in the virtual space 11 is determined first, or that the position in the mixed reality space 21 is determined first.
- In step S205, the processor 501 places the virtual model 30 as a virtual object in the mixed reality space 21.
- the processor 501 places the virtual model 30 at a position in the mixed reality space 21 that corresponds to the position of the virtual model 30 in the virtual space 11.
- In step S206, the processor 501 accepts input from the MR user 6.
- the processor 501 accepts, for example, the output of a detection means that detects the movement of the MR user 6 as input from the MR user 6.
- In step S207, the processor 501 applies a predetermined change to the virtual object in the mixed reality space 21 based on the input from the MR user 6. Specifically, the processor 501 moves the virtual model 30 in the mixed reality space 21, for example, based on the input from the MR user 6. The processor 501 also transmits information regarding the input from the MR user 6 to the processor 301.
- In step S208, the processor 501 acquires information regarding the input from the VR user that was accepted by the processor 301.
- In step S209, the processor 501 makes a predetermined change to the virtual object in the mixed reality space 21 based on the input from the VR user acquired in step S208. Specifically, the processor 501 moves the VR avatar 25 or the virtual model 30 in the mixed reality space 21 based on, for example, the input from the VR user. For example, if the input is related to the VR avatar 25, the processor 501 moves the VR avatar 25. Also, if the input is related to the virtual model 30, the processor 501 moves the virtual model 30.
- In step S301, the processor 501 accepts input from the MR user 6. Specifically, the processor 501 accepts, as input from the MR user 6, the output of a detection means that detects the behavior of the MR user 6 with respect to a specific real object (e.g., the blackboard 35, a laser pointer, or an electronic piano, etc.).
- In step S302, the processor 501 transmits information regarding the input from the MR user 6 received in step S301 to the processor 301 (in other words, the VR system 200).
- In step S303, the processor 301, based on the information transmitted in step S302, applies a predetermined change to the virtual object corresponding to the specific real object on which the MR user 6 performed an action. For example, when the MR user 6 performs a writing action to write predetermined contents on the blackboard 35, the processor 301 changes the virtual blackboard 36 corresponding to the blackboard 35 to a state in which the predetermined contents are displayed.
- In step S351, the processor 301 accepts input from the VR user. Specifically, the processor 301 accepts, as input from the VR user, the output of a detection means that detects the VR user's behavior with respect to a virtual object (e.g., a virtual blackboard 36 or a virtual electronic piano) that corresponds to a specific real object.
- In step S352, the processor 301 transmits information regarding the input from the VR user received in step S351 to the processor 501 (in other words, the MR system 400).
- In step S353, the processor 501 reflects the action performed by the VR user on a virtual object corresponding to a specific real object, in the specific real object, based on the information transmitted in step S352. For example, when the VR user performs a writing action to write a specific entry on the virtual blackboard 36, the processor 501 changes the blackboard 35 to a state in which the specific entry is displayed.
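- As a non-limiting illustration of the flows of steps S301 to S303 and S351 to S353, the following sketch exchanges information on writing actions between the two sides with simple queues; the queue-based message passing and the function names are assumptions for illustration only.

```python
# Illustrative sketch only of steps S301 to S303 and S351 to S353: information on a
# writing action accepted on one side is forwarded to the other side and reflected there.
from queue import Queue

to_vr_side: Queue = Queue()  # processor 501 -> processor 301
to_mr_side: Queue = Queue()  # processor 301 -> processor 501

def mr_side_on_writing(contents: str) -> None:
    """S301-S302: the processor 501 accepts the MR user 6's writing action and forwards it."""
    to_vr_side.put({"object": "virtual blackboard 36", "contents": contents})

def vr_side_apply() -> None:
    """S303: the processor 301 reflects the action on the corresponding virtual object."""
    msg = to_vr_side.get()
    print(f"{msg['object']} now displays: {msg['contents']}")

def vr_side_on_writing(contents: str) -> None:
    """S351-S352: the processor 301 accepts the VR user's writing action and forwards it."""
    to_mr_side.put({"object": "blackboard 35", "contents": contents})

def mr_side_apply() -> None:
    """S353: the processor 501 reflects the action in the specific real object."""
    msg = to_mr_side.get()
    print(f"{msg['object']} now displays: {msg['contents']}")

mr_side_on_writing("Sample")  # the MR user 6 writes on the blackboard 35
vr_side_apply()               # -> shown on the virtual blackboard 36
vr_side_on_writing("Hello")   # the VR user writes on the virtual blackboard 36
mr_side_apply()               # -> shown on the blackboard 35
```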
- the information processing system 100 of the second embodiment may adopt the contents described in the first embodiment to the extent that no contradiction occurs.
- the information processing system 100 of the second embodiment may include a part or all of the configuration of the information processing system of the first embodiment.
- the description of the matters described in the first embodiment will be omitted or simplified.
- the following describes an example in which the information processing system 100 is applied to a service in which events taking place in the virtual space 11 are reflected in the mixed reality space 21.
- the following describes an information processing system 100 in which an MR user 6 who is at a position in the real space corresponding to a specific location (in front of store B in this case) in the virtual space 11 that mimics an actual place (town A in this case) can experience, via the MR device 410, an event (a live music performance in this case) taking place at the specific location (here, the user can watch the live music performance).
- the coordinate definition unit 812 stores information about the correspondence between the coordinates measured by the GPS sensor (in other words, the position detection means) and the virtual space coordinates (in other words, the correspondence between points in the real space and points in the virtual space 11) in the storage unit 611, which serves as a correspondence storage unit.
- information indicating the correspondence between positions in the real space (in other words, the mixed reality space) and the virtual space 11 is stored in advance in the correspondence storage unit.
- the MR side input receiving unit 816 receives input from the position detection means.
- the position detection means acquires position information indicating the position of the MR user 6 using a GPS sensor provided in the MR device 410.
- the MR side input receiving unit 816 accepts input of position information indicating the current position of the MR user 6 in real space (e.g., position information acquired using a GPS sensor).
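- As a non-limiting illustration of the correspondence storage described above, the following sketch stores pairs of GPS coordinates and virtual space coordinates and resolves a reported position to the nearest stored point; the coordinate values, distance threshold, and function names are assumptions for illustration only.

```python
# Illustrative sketch only of the correspondence storage: pairs of GPS coordinates and
# virtual space coordinates are stored in advance, and a reported GPS position is
# resolved to the nearest stored point (values and threshold are made up).
import math

CORRESPONDENCE = [
    {"gps": (35.6595, 139.7005), "virtual": (120.0, 0.0, 80.0), "label": "in front of store B"},
    {"gps": (35.6600, 139.7010), "virtual": (150.0, 0.0, 95.0), "label": "town A plaza"},
]

def to_virtual_position(lat: float, lon: float, max_dist_deg: float = 0.0005):
    """Return the stored point corresponding to a GPS position, or None if none is close enough."""
    best, best_dist = None, float("inf")
    for entry in CORRESPONDENCE:
        dist = math.hypot(lat - entry["gps"][0], lon - entry["gps"][1])
        if dist < best_dist:
            best, best_dist = entry, dist
    return best if best_dist <= max_dist_deg else None

hit = to_virtual_position(35.65951, 139.70052)  # e.g. the MR user 6's current GPS position
print(hit["label"] if hit else "no corresponding point in the virtual space 11")
```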
- the control unit 310 of the VR system 200 controls the execution of a predetermined event that is performed at a specific location in the virtual space 11.
- in this embodiment, the predetermined event is, specifically, a live music concert.
- the VR-side object control unit 920 places an event-related object 50 as a virtual object related to the predetermined event at a specific location in the virtual space 11, as shown in FIG. 15. More specifically, the VR-side object control unit 920 places, for example, an object of a character appearing in a live music concert (hereinafter referred to as a "performing character") or an object of a stage for the live music concert as an event-related object 50 in front of store B in the virtual space 11.
- the VR-side object control unit 920 may place the event-related object 50 based on an instruction from a VR user (in other words, an input from the VR user received by the VR-side input receiving unit 916 (for example, an operation on the controller 280)).
- the VR-side object control unit 920 may also place the event-related object 50 based on an instruction from the operator of this service.
- the VR-side object control unit 920 also moves the event-related object 50 (e.g., a performing character) and controls the progress of the event (in other words, progresses the event).
- the VR-side object control unit 920 may move the performing character based on the VR user's instructions (in other words, input from the VR user received by the VR-side input receiving unit 916 (e.g., operation on the controller 280)).
- the movements of the performing characters may be prepared in advance, and the VR-side object control unit 920 may move the performing character by playing back the prepared data.
- the performing character may be a VR avatar 25 or a non-player character, etc.
- the VR system 200 (in other words, the control unit 310) provides a view related to a music live performance as a specified event to a VR user (in other words, a VR user controlling a character at a specific location) who is at a specific location (in other words, in front of virtual store B) in the virtual space 11 (in other words, virtual town A) in which the user can move around by controlling his or her own character (e.g., VR avatar 25).
- the VR user may be a different user from the user controlling the character appearing in the music live performance (in other words, for example, a spectator at the music live performance), or may be the same user (in other words, for example, a performer at the music live performance).
- the event-related objects 50 are provided with information (in other words, a tag) indicating that they are objects related to a specific event (in other words, a specific event).
- the event-related objects 50 are also provided with position information indicating the position of the event-related objects 50 in virtual space (in other words, coordinates).
- the position information of the event-related objects 50 may be provided individually to each event-related object 50, or may be provided collectively to the event, etc.
- the MR-side object control unit 820 places a virtual object corresponding to the event-related object 50 at a position in the mixed reality space 21 corresponding to the position at which the event-related object 50 is placed in the virtual space 11.
- in this embodiment, as the virtual object corresponding to the event-related object 50 (i.e., the virtual object in the mixed reality space 21 corresponding to the event-related object 50 in the virtual space 11), an event-related object 50 of the same form as the event-related object 50 in the virtual space 11 is placed in the mixed reality space 21.
- the MR-side object control unit 820 moves the event-related object 50 (e.g., a performing character) to progress the event being executed in the mixed reality space 21.
- the progress of the event by the MR-side object control unit 820 (specifically, control of moving the performing character, etc.) and the progress of the event by the VR-side object control unit 920 (specifically, control of moving the performing character, etc.) are linked. That is, the MR-side object control unit 820 reflects the event taking place in the virtual space 11 in the mixed reality space 21. In other words, in this embodiment, the event occurring in the virtual space 11 is also reflected in the mixed reality space 21 in real time.
- the MR-side object control unit 820 controls the event-related object 50 in the mixed reality space 21 to move in the same way as the event-related object 50 in the virtual space 11.
- "reflected in real time” here includes a case where there is a slight delay in the occurrence of the event (in other words, the progress of the event), and may be preceded by either the progress of the event in the virtual space 11 or the progress of the event in the mixed reality space 21.
- "reflected” here does not mean that the event-related object 50 in the mixed reality space 21 does not have to move in exactly the same way as the event-related object 50 in the virtual space 11. For example, some of the movements of the appearing characters in the virtual space 11 may not be reflected in the appearing characters displayed in the mixed reality space 21.
- the control unit 510 of the MR system 400 may provide the MR user 6 with a view that can be determined to be related to the same event as the view provided to the VR user.
- the live music concert takes place in front of store B in town A in mixed reality space 21. That is, the MR-side object control unit 820 places an event-related object 50 relating to a specific event taking place at a specific position in virtual space 11 at a position in mixed reality space 21 corresponding to the specific position, and moves the event-related object 50 so as to move in accordance with the movement in virtual space 11. Then, in accordance with position information input to the MR-side input receiving unit 816, the display control unit 840 causes the display 411 of the MR device 410 to display the event-related object 50 placed at the position in mixed reality space 21 indicated by the position information.
- the control unit 510 causes the display 411 to display an event-related object 50 relating to a live music performance taking place in front of the store B in the virtual space 11.
- This provides the MR user 6 with a view of the mixed reality space 21 in which the event-related object 50 is placed in front of the store B in the real space (in other words, a live music performance taking place in the virtual space 11 in front of the store B in the real space).
- the control unit 510 provides the MR user 6 with a view of the mixed reality space 21 reflecting an event taking place at the specific position in the virtual space 11.
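- As a non-limiting illustration of the processing described above, the following sketch places an event-related object 50 at the corresponding position in the mixed reality space 21 and shows it only to an MR user 6 whose reported position is near that position; the identity mapping, the visibility range, and all names are assumptions for illustration only.

```python
# Illustrative sketch only: an event-related object 50 is mirrored to the corresponding
# position in the mixed reality space 21 and is shown on the display 411 only when the
# MR user 6's reported position is near that position (mapping and range are made up).
import math

event_related_object_50 = {
    "tag": "live music performance",
    "virtual_pos": (120.0, 0.0, 80.0),  # in front of store B in the virtual space 11
}

def corresponding_mr_position(virtual_pos):
    # Stand-in for the correspondence storage lookup (identity mapping assumed here).
    return virtual_pos

def maybe_display_on_mr_device(mr_user_pos, visible_range: float = 30.0) -> None:
    obj_pos = corresponding_mr_position(event_related_object_50["virtual_pos"])
    if math.dist(mr_user_pos, obj_pos) <= visible_range:
        print(f"display 411: show {event_related_object_50['tag']} at {obj_pos}")
    else:
        print("display 411: event not in view")

maybe_display_on_mr_device((118.0, 0.0, 79.0))  # MR user 6 standing in front of store B
maybe_display_on_mr_device((500.0, 0.0, 0.0))   # MR user 6 elsewhere in town A
```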
- the control unit 510 may include a bonus granting unit that grants a bonus to the MR user 6 based on the MR user 6 visiting a location in real space that corresponds to the specific location.
- the bonus may be, for example, a virtual object (e.g., an item that a character can equip, etc.) that the user who has acquired the bonus can use in the game (in other words, in the service), an electronic ticket that can be exchanged for a real object in real space, or a specific point that the user owns in the service.
- "based on visiting" may mean directly determining whether or not the user has visited and granting the bonus, or may mean granting the bonus when a specific action that can be performed by visiting (e.g., starting or finishing watching a live music performance, or continuing to watch it for a specific period of time or more) is performed.
- a view of the mixed reality space 21 reflecting a specific event occurring at a specific position in the virtual space 11 is provided to the MR user 6, but the specific event is not limited to a live music performance.
- the specific event may be an event in which a character operated by a specific user (in other words, an event-related object 50 related to the specific event) moves.
- the VR system 200 provides a game in which a character can be moved in the virtual space 11 simulating town A.
- the VR-side object control unit 920 causes the character to perform a specific action (e.g., an action such as jumping or running) in the virtual space 11 simulating town A based on the instructions of the VR user as the specific user (in other words, input from the VR user received by the VR-side input receiving unit 916 (e.g., operation on the controller 280)). Further, the MR-side object control unit 820 places the character at a position in the mixed reality space 21 corresponding to the position where the character is placed in the virtual space 11, and causes the character to perform an action in the mixed reality space 21 similar to the action of the character in the virtual space 11.
- the control unit 510 reflects the movement of the character in the game (in other words, in the virtual space 11) moved by the specific user in the view of the mixed reality space 21 provided via the display 411 of the MR device 410.
- the virtual object operated by the specific user moves in the same manner both in the view of the virtual space 11 provided on the VR device 210 to the VR user serving as the specific user and in the view of the mixed reality space 21 provided on the MR device 410 to the MR user 6 who is in town A in the real space.
- This configuration makes it possible to replicate the movements of characters on the game screen in a real city, improving the entertainment value of the service.
- the MR user 6 who can receive a view of the mixed reality space 21 reflecting a specific event occurring at a specific position in the virtual space 11 may be limited.
- the storage unit 511 functions as a rights information storage unit that stores information regarding the right to receive the view (hereinafter referred to as "rights information"), and the control unit 510 may provide the view to the MR user 6 if the MR user 6 has the right, and may not provide the view to the MR user 6 if the MR user 6 does not have the right.
- the right may be granted to the MR user 6 by, for example, the MR user 6 paying a price (for example, a payment of cash, or consumption of the assets held by the MR user 6 (for example, virtual currency, etc.) within the service).
- the right to watch a live music performance may be available for purchase within the service.
- the rights information may store information regarding the presence or absence of tickets for a live music performance that can be purchased within the service, and the control unit 510 may control so that an MR user 6 who has the ticket can watch the live music performance, and an MR user 6 who does not have the ticket cannot watch the live music performance.
- the rights information may store information regarding the possession of an item (e.g., a ticket, etc.) related to the right to receive the view, and the control unit 510 may provide the view to the MR user 6 when the MR user 6 has the item.
- the right (in other words, the item) may be obtained by completing a specified mission within the service, etc.
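- As a non-limiting illustration of the rights check described above, the following sketch consults rights information held per user before providing the view; the data layout and the function name are assumptions for illustration only.

```python
# Illustrative sketch only of the rights check: rights information held per user (e.g.
# tickets) is consulted before providing the view of the event (data is made up).
RIGHTS_INFORMATION = {
    "mr_user_6": {"tickets": {"live music performance"}},
    "mr_user_7": {"tickets": set()},
}

def can_receive_view(user_id: str, event_name: str) -> bool:
    rights = RIGHTS_INFORMATION.get(user_id, {"tickets": set()})
    return event_name in rights["tickets"]

for user in ("mr_user_6", "mr_user_7"):
    if can_receive_view(user, "live music performance"):
        print(f"{user}: provide the view of the mixed reality space 21 reflecting the event")
    else:
        print(f"{user}: do not reflect the event")
```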
- control unit 510 may provide the MR user 6 with a view of the mixed reality space 21 reflecting a predetermined event occurring at a specific position in the virtual space 11 and related to a VR user selected by the MR user 6 from among multiple users.
- the MR side input receiving unit 816 may receive an input by the MR user 6 related to an operation (e.g., an operation on the controller 480) for selecting which of the multiple users a predetermined event related to should be reflected, and the control unit 510 may reflect the predetermined event related to the VR user selected by the operation in the view provided to the MR user 6.
- the operation may be, for example, an operation by the MR user 6 to select a user from among his/her friends to receive the event reflection.
- alternatively, predetermined events related to all of the MR user 6's friends may be reflected, and the operation may be an operation of registering another user as a friend, etc.
- the MR user 6 may be provided with a view of the mixed reality space 21 reflecting a predetermined event related to his/her friend.
- the operation may be an operation in which the MR user 6 selects a user, such as a professional player, from among multiple players whose predetermined events are to be reflected.
- the control unit 510 may reflect the movement of the character moved by the player in the game (in other words, in the virtual space 11) in the view of the mixed reality space 21 provided to the MR user 6.
- the MR user 6 may be provided with a view of the mixed reality space 21 reflecting a predetermined event related to a user associated with the MR user 6 among multiple users.
- for example, the control unit 510 may reflect a live show related to such an associated user in the view of the mixed reality space 21 provided to the MR user 6. Note that, when there are multiple events related to a specific user, it may be possible to select a specific event related to the specific user as the event to be reflected.
- each user may be able to register other users (e.g., VR user or MR user 6) as friends.
- Each user can register other users as friends, for example, based on a predetermined operation on the controller 280, 480.
- Registering as a friend can also be said to be a user associating another user with himself/herself.
- Registering as a friend can also be said to be bookmarking a specific user other than himself/herself.
- Registering as a friend can also be said to be making information about a certain user (specifically, a friend) (e.g., information about the login state, etc.) easier to call up than information about other users (specifically, non-friends).
- there may be a case where a certain user registers another user as a friend, but the other user does not register the certain user as a friend. That is, there may be a state in which the certain user unilaterally follows the other user. There may also be a state in which the certain user and the other user follow each other. Note that the system may be configured so that a state of one-sided following does not exist.
- a "friend" may refer only to other users with whom a reciprocal follow relationship is established from the perspective of a certain user, or may refer to other users with whom a one-sided follow relationship is established.
- the control unit 510 may also provide the MR user 6 with a view of the mixed reality space 21 reflecting a predetermined event taking place at a specific position in the virtual space 11 and having an attribute corresponding to the attribute set for the MR user 6.
- each MR user 6 may be assigned an attribute related to the user's hobbies and preferences, more specifically, an attribute such as "music lover” or "comedy lover” (or “like a specific entertainer”).
- the control unit 510 may, for example, reflect a predetermined event with the attribute "music” for an MR user 6 with the attribute "music lover", and not reflect a predetermined event with the attribute "music” for an MR user 6 without the attribute "music lover”.
- the control unit 510 may, for example, reflect a predetermined event with the attribute "comedy” for an MR user 6 with the attribute "comedy lover”, and not reflect a predetermined event with the attribute "comedy” for an MR user 6 without the attribute "comedy lover”.
- the attributes set for the MR users 6 may be set by each MR user 6 himself/herself (e.g., input of his/her own preferences via the controller 480), or may be set automatically by AI (Artificial Intelligence) (e.g., based on the user's actions within the service, etc.).
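- As a non-limiting illustration of the attribute-based control described above, the following sketch reflects an event only for users whose attributes match the event's attribute; the attribute tables and the function name are assumptions for illustration only.

```python
# Illustrative sketch only of attribute matching: an event is reflected for an MR user 6
# only when an attribute set for that user corresponds to the event's attribute
# (tables and names are made up).
USER_ATTRIBUTES = {
    "mr_user_6": {"music lover"},
    "mr_user_8": {"comedy lover"},
}
EVENT_ATTRIBUTE = {"live music performance": "music", "street comedy show": "comedy"}
MATCHING_USER_ATTRIBUTE = {"music": "music lover", "comedy": "comedy lover"}

def should_reflect(user_id: str, event_name: str) -> bool:
    needed = MATCHING_USER_ATTRIBUTE[EVENT_ATTRIBUTE[event_name]]
    return needed in USER_ATTRIBUTES.get(user_id, set())

print(should_reflect("mr_user_6", "live music performance"))  # True
print(should_reflect("mr_user_6", "street comedy show"))      # False
```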
- the control unit 510 may reflect in the mixed reality space 21 a specific event selected by the MR user 6 among multiple events taking place in the virtual space 11 (for example, an event related to a specific user selected by the MR user 6, an event having an attribute corresponding to the MR user 6's own attribute set by the MR user 6, an event related to a ticket purchased by the MR user 6, etc.).
- the control unit 510 may not reflect in the mixed reality space 21 an event not selected by the MR user 6 among multiple events taking place in the virtual space 11. In this way, when an unselected event is not reflected in the mixed reality space 21, there may be an event that is reflected regardless of the user's selection.
- not reflecting an unselected event in the mixed reality space 21 means that there is at least one event that is reflected in the mixed reality space 21 if a selection is made, and is not reflected in the mixed reality space 21 if no selection is made.
- the selection of an event may be possible before the specific event selected by the MR user 6 occurs (in other words, starts), or may be possible after the specific event occurs (in other words, while it is taking place).
- the virtual space 11 in which an event takes place does not have to mimic the real space in which the event is reflected.
- an area corresponding to a plaza in front of store B in town A that actually exists may be prepared in the virtual space 11, and a specific event that takes place in that area may be reflected in the MR device 410 of an MR user 6 who is in that plaza in real space.
- the memory unit 611 serving as a correspondence storage unit may store, for example, information indicating the correspondence between that area and the plaza.
- the virtual space 11 in which real-space events are reflected, as described in the third embodiment, does not have to mimic the real space.
- the virtual space 11 in which the event takes place may be one that imitates the topography of the real space in which the event is reflected, but may have a different landscape.
- the virtual space 11 may be one that imitates the topography of a town A that actually exists, but has a different streetscape.
- the virtual space 11 may be one that reproduces an area corresponding to town A in the past (for example, the Edo period), and points in the real space and points in the virtual space 11 may be associated so that each point in the reproduced area and each point in the current town A in the real space indicate the same place on the map (in other words, the same place with the same longitude and latitude).
- a specific event that takes place at a specific position in the reproduced area may be reflected in the MR device 410 of the MR user 6 that is at a position in the real space that corresponds to the specific position.
- the virtual space 11 in which the event in the real space is reflected may also be one that imitates the topography of the real space, but has a different landscape.
- the view of the mixed reality space 21 related to a specific event does not have to be provided by the MR device 410 that superimposes a virtual object on the real space seen through the transparent display 411, but may be provided by, for example, a smartphone that displays, on its display, an image of the real space captured by a camera with a virtual object superimposed on it.
- the view of the mixed reality space 21 may be provided by, for example, displaying the event-related object 50 on a transparent display, or by displaying an image of the real space captured with the event-related object 50 superimposed on it on the display.
- a specific system including a smartphone that displays an image of the real space captured with a virtual object superimposed on it on a display may have at least a part of the configuration of the MR system 400.
- the MR system 400 in this embodiment or other embodiments can be interpreted as the specific system (for example, a smartphone).
- the user of the specific system also corresponds to the MR user 6.
- the view of the virtual space 11 relating to a specified event does not have to be provided by a head-mounted display as the VR device 210, but may be provided, for example, by displaying an image of the virtual space 11 on a PC monitor or a television.
- a specified system including a PC or a game device, or these devices and a specified display may have at least a part of the configuration of the VR system 200.
- the VR system 200 in this embodiment or other embodiments can be interpreted as being the specified system.
- a user of the specified system also corresponds to a VR user.
- the information processing system 100 of the third embodiment may adopt the contents described in the first or second embodiment to the extent that no contradiction occurs.
- the information processing system 100 of the third embodiment may include a part or all of the configuration of the information processing system of the first embodiment.
- the information processing system 100 of the third embodiment may include a part or all of the configuration of the information processing system of the second embodiment.
- the matters described in the first or second embodiment will be omitted or simplified.
- the information processing system 100 may be configured to have both the configuration according to the second embodiment and the configuration according to the third embodiment, so as to reflect an event taking place in the virtual space 11 in the real space and to reflect an event taking place in the real space in the virtual space 11.
- the coordinate definition unit 812 stores information about the correspondence between the coordinates measured by the GPS sensor (in other words, the position detection means) and the virtual space coordinates (in other words, the correspondence between points in the real space and points in the virtual space 11) in the storage unit 611, which serves as a correspondence storage unit.
- information indicating the correspondence between positions in the real space (in other words, the mixed reality space) and the virtual space 11 is stored in advance in the correspondence storage unit.
- a detection device 460 is placed in front of store B as a specific location.
- the detection device 460 is a camera, and is configured to capture an image of the area in front of store B as a specific location in real space.
- multiple detection devices 460 may be placed in front of store B.
- the detection device 460 in this embodiment may not be intended to detect the movements of the MR user 6, but may simply be intended to capture images of a specific event.
- the MR system 400 may or may not include an MR device 410.
- the MR system 400 may include a detection device 460 and a computer 500.
- the computer 500 may be provided outside the detection device 460, or a part or the whole of the computer 500 may be built into the detection device 460.
- the MR system 400 in this embodiment or other embodiments can be interpreted as a camera serving as the detection device 460 installed in front of store B in this embodiment.
- the detection device 460 also includes a GPS sensor and is capable of acquiring location information indicating the location of the detection device 460.
- the MR side input acceptance unit 816 also accepts input of location information acquired using the GPS sensor. That is, the MR side input acceptance unit 816 accepts input of location information indicating a specific location.
- the location information acquired using the GPS sensor can also be considered as location information indicating the location where a specific event photographed by the detection device 460 is taking place.
- the location information acquired using the GPS sensor can also be considered as location information indicating the current location of the specific person.
- the control unit 310 reflects a predetermined event taking place at a specific location in real space in the virtual space 11.
- in this embodiment, the predetermined event is, specifically, a live music concert.
- the control unit 310 performs control so that a live music concert taking place in front of store B in real town A also takes place in front of store B in the virtual space 11.
- the MR side input receiving unit 816 receives as input, for example, an image captured by an image sensor provided in a camera serving as the detection device 460.
- the image is an image of a live music performance serving as a specified event.
- the communication control unit 850 also sends the image and information indicating the specific position where the image was acquired to the communication control unit 950, and the control unit 310 displays the image at a position in the virtual space 11 corresponding to the specific position (i.e., in front of store B) based on this information, as shown in FIG. 16.
- the control unit 310 displays the image (i.e., an image of an event taking place at the specific position) on the display 211.
- in this way, the control unit 310 provides the VR user (in other words, a VR user operating a character at the specific position) with a view of the virtual space 11 (in other words, virtual town A) reflecting an event taking place at the specific position in real space. This allows the VR user to share an experience with a person at the specific position in real space.
- information regarding the correspondence between the coordinates measured by the GPS sensor and the virtual space coordinates is stored in the correspondence storage unit as information indicating the correspondence between points in real space and points in virtual space 11.
- information regarding the correspondence between identification information that enables identification of detection device 460 installed at a specific position and coordinates indicating a position in virtual space 11 corresponding to the specific position may be stored as information indicating the correspondence between points in real space and points in virtual space 11, and a predetermined event may be reflected at a position in virtual space 11 corresponding to the specific position based on the information.
- the identification information that enables identification of detection device 460 may function as information indicating a specific position.
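- As a non-limiting illustration of the correspondence based on identification information described above, the following sketch maps an identifier of the detection device 460 to virtual space coordinates and reflects the captured event there; the identifier, coordinates, and function names are assumptions for illustration only.

```python
# Illustrative sketch only: identification information of a detection device 460 installed
# at a specific position is associated with virtual space coordinates, and the event it
# captures is reflected at that position (identifier and coordinates are made up).
DEVICE_CORRESPONDENCE = {
    "detection_device_460_cam01": (120.0, 0.0, 80.0),  # in front of store B in the virtual space 11
}

def reflect_event(device_id: str, frame: str) -> None:
    pos = DEVICE_CORRESPONDENCE.get(device_id)
    if pos is None:
        return  # unknown device: nothing to reflect
    # Stand-in for the control unit 310 displaying the captured image (or moving an avatar)
    # at the corresponding position in the virtual space 11.
    print(f"show {frame!r} at virtual position {pos}")

reflect_event("detection_device_460_cam01", "frame 001 (live music performance)")
```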
- the image related to the predetermined event that the control unit 310 causes to be displayed on the display 211 may be captured by the camera 413 of the MR device 410 worn by the predetermined MR user 6.
- the MR device 410 may function as the detection device 460 of this embodiment.
- the MR device 410 may be equipped with a GPS sensor for acquiring location information related to a specific location where the predetermined event is performed, and a camera for acquiring images related to the predetermined event.
- the MR user 6 wearing the MR device 410 may be a performer of the predetermined event, or may be a spectator.
- the acquisition of location information and the acquisition of images related to the predetermined event may be performed by different MR devices 410 (in other words, MR devices 410 used by different users).
- a specific event taking place at a specific position in the real space may be reflected in the virtual space 11 as follows. That is, instead of displaying an image of the specific event as described above, the control unit 310 may cause an avatar of a person related to the specific event (here, a performer in a live music performance) to perform an action related to the specific event.
- the MR user 6 in the first embodiment may be read as the person, and the avatar of the person as the MR avatar 26 may be moved in the virtual space 11. That is, for example, the MR side input receiving unit 816 receives the output of a detection means that detects the movement of the person as an input from the person.
- the MR side input receiving unit 816 may receive information on the movement of the MR user 6 detected by image recognition from an image captured by an image sensor provided in a camera as the detection device 460 as an input from the person.
- the user information acquisition unit 818 acquires identification information that enables the person to be identified.
- the user information acquisition unit 818 may acquire the identification information by identifying the MR user 6 in the mixed reality space 21 by image recognition from an image captured by an image sensor included in the detection device 460.
- the VR-side object control unit 920 places the person's avatar (in other words, the MR avatar 26) at a position in the virtual space 11 corresponding to a specific position in the real space.
- the VR-side object control unit 920 moves the person's avatar in the virtual space 11 based on, for example, an input from the person received by the MR side input receiving unit 816. Specifically, the VR-side object control unit 920 moves the MR avatar 26 based on, for example, information on the movement of the MR user 6 received by the MR side input receiving unit 816. That is, the VR-side object control unit 920 controls the person's avatar so that its movement in the virtual space 11 imitates the movement of the person detected by a detection means in the real space. This allows a predetermined event taking place at a specific location in real space to be reflected in the virtual space 11.
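As a rough illustration of the mirroring just described, the sketch below drives an avatar from motion samples detected in real space while keeping it pinned to the virtual position that corresponds to the specific real-space location. The Pose and AvatarMirror names are invented for this sketch; the embodiment does not define them.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    # Minimal pose: a root position plus joint rotations keyed by joint name (degrees).
    position: tuple[float, float, float]
    joint_rotations: dict[str, tuple[float, float, float]]


class AvatarMirror:
    """Hypothetical controller that makes an avatar imitate motion detected in real space."""

    def __init__(self, anchor: tuple[float, float, float]) -> None:
        # Virtual-space position corresponding to the specific real-space location.
        self.anchor = anchor
        self.current = Pose(anchor, {})

    def on_motion_sample(self, detected: Pose) -> Pose:
        # Keep the avatar at the mapped anchor; copy only the detected joint motion.
        self.current = Pose(self.anchor, dict(detected.joint_rotations))
        return self.current


mirror = AvatarMirror(anchor=(120.0, 0.0, -45.0))
sample = Pose((0.0, 0.0, 0.0), {"right_arm": (0.0, 0.0, 90.0)})
print(mirror.on_motion_sample(sample))
```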
- in this way, based on an image of the predetermined event captured by a specific camera, a view of the virtual space 11 reflecting the event taking place at that specific location in real space can be provided to the VR user (in other words, the VR avatar 25). Alternatively, a view of the virtual space 11 in which the predetermined event is reproduced by the avatar of a person captured by a specific camera can be provided to the VR user.
- the detection of the movement of the person may be performed by the camera 413 of the MR device 410 worn by a specific MR user 6, or by the controller 480, a device worn by the MR user 6, the gaze sensor 412, the sensor 417, or the sensor 486.
- the MR user 6 wearing the MR device 410 may be a performer of the specific event or may be an audience member.
- the display 411 of the MR device 410 of an MR user 6 who is in front of store B (the specific location where the specified event is taking place) may display the VR avatar 25 of a VR user watching the specified event in the virtual space 11.
- the control unit 510 of the MR device 410 may acquire information of the VR avatar 25 at a position in the virtual space 11 corresponding to the specific location via the communication control unit 850, and display the VR avatar 25 on the display 411.
- the control unit 510 may display an image of the VR avatar 25 at a position in the virtual space corresponding to the specific location on the display.
- a person in a place where a specific event is taking place in the real space can see the reaction of the VR user in the virtual space 11.
- the performers or spectators of the live performance as the MR user 6 can visually recognize the VR avatar 25 (in other words, the spectators in the virtual space 11) through the display 411 of the MR device 410.
- the performers or spectators of the live performance as the MR user 6 can visually grasp how many spectators have gathered, including the virtual space 11, and how each spectator is reacting.
- the MR device 410 on which the VR avatar 25 is displayed and the device that captures the specified event may be separate devices or may be the same device (specifically, the MR device 410 worn by a single MR user 6).
- the view of the mixed reality space 21 provided to the MR user 6 at a specific position where a specific event is occurring does not have to be provided by the MR device 410 that superimposes a virtual object (specifically, a VR avatar 25) on the real space seen through the transparent display 411, but may be provided by, for example, a smartphone that displays on a display an image in which a virtual object (specifically, a VR avatar 25) is superimposed on an image of the real space captured by a camera.
- for example, when a performer gives a specific live performance while taking a selfie with his or her smartphone at a specific position in the real space, the information processing system 100 can be configured to display, on the smartphone's display, a VR avatar 25 (e.g., an audience member's avatar) located at the position in the virtual space 11 that corresponds to that specific position.
- the display of the smartphone or the like may display an image in which only the VR avatar 25 is displayed, or an image of the VR avatar 25 with the virtual space 11 (for example, an image of a position in the virtual space 11 corresponding to the specific position) as the background, rather than an image of the mixed reality space 21 in which the VR avatar 25 is superimposed on an image of the real space.
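A minimal sketch of the selection step implied above, assuming the real-space device only needs the VR avatars that stand near the virtual position mapped from the performer's location; the function and field names are placeholders, not part of the embodiment.

```python
def avatars_to_show(avatars: list[dict], anchor: tuple[float, float, float],
                    radius: float = 10.0) -> list[dict]:
    """Return the VR avatars whose virtual-space position lies within `radius`
    of the virtual position corresponding to the performer's real-space location."""
    def dist(a: tuple[float, float, float], b: tuple[float, float, float]) -> float:
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

    return [a for a in avatars if dist(a["position"], anchor) <= radius]


audience = [
    {"id": "vr-user-1", "position": (121.0, 0.0, -44.0)},
    {"id": "vr-user-2", "position": (300.0, 0.0, 10.0)},
]
# Only vr-user-1 is close enough to be drawn on the smartphone or MR display.
print(avatars_to_show(audience, anchor=(120.0, 0.0, -45.0)))
```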
- the control unit 310 may include a reward granting unit that grants a reward to a VR user (in other words, the VR avatar 25) based on the VR user visiting a position in the virtual space 11 that corresponds to a specific position where a specific event is taking place.
- the reward may be, for example, a virtual object (e.g., an item that a character can equip, etc.) that the user who has acquired the reward can use in the game (in other words, in the service), an electronic ticket that can be exchanged for a real object in the real space, or a specific point that the user owns in the service.
- "based on visiting" may mean granting a reward by directly determining whether or not the user has visited, or may mean granting a reward when a specific action that can be performed by visiting (e.g., starting or finishing watching a live music performance, or continuing to watch it for a specific period of time or more) is performed.
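The two readings of "based on visiting" could be sketched as follows; the class, method names, and thresholds are assumptions made only for this example.

```python
import time


class RewardGranter:
    """Hypothetical reward logic: grant on the visit itself (variant A) or on a
    visit-enabled action that continues long enough (variant B)."""

    def __init__(self, min_watch_seconds: float = 60.0) -> None:
        self.min_watch_seconds = min_watch_seconds
        self.watch_started_at: dict[str, float] = {}
        self.granted: set[str] = set()

    def on_visit(self, user_id: str) -> None:
        # Variant A: grant directly when the user visits the corresponding virtual position.
        self._grant(user_id, "visited the position")

    def on_watch_start(self, user_id: str) -> None:
        self.watch_started_at[user_id] = time.monotonic()

    def on_watch_end(self, user_id: str) -> None:
        # Variant B: grant only if the visit-enabled action continued for the required time.
        started = self.watch_started_at.pop(user_id, None)
        if started is not None and time.monotonic() - started >= self.min_watch_seconds:
            self._grant(user_id, "watched the live performance")

    def _grant(self, user_id: str, reason: str) -> None:
        if user_id not in self.granted:
            self.granted.add(user_id)
            print(f"grant reward to {user_id} ({reason})")
```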
- the storage unit 311 functions as a rights information storage unit that stores information regarding the right to receive the view (hereinafter referred to as "rights information"), and the control unit 310 may provide the view to the VR user if the VR user has the right, and may not provide the view to the VR user if the VR user does not have the right.
- the right may be granted to the VR user by, for example, the VR user paying a price (for example, a payment of cash, or consumption of the VR user's assets (for example, virtual currency, etc.) within the service).
- the right to watch a live music performance may be purchased within the service. More specifically, for example, information regarding the presence or absence of tickets for the live music performance that can be purchased within the service may be stored as the rights information, and the control unit 310 may control so that a VR user who has the ticket can watch the live music performance and a VR user who does not have the ticket cannot watch the live music performance. In other words, information regarding possession of an item (e.g., a ticket) related to the right to receive the view is stored as the rights information, and the control unit 310 may provide the view to a VR user if the VR user has the item. Note that the right (in other words, the item) may be obtained by completing a specified mission within the service, etc.
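A minimal sketch of the rights check described above, assuming rights information is reduced to a per-user set of ticket (event) IDs; none of these names appear in the embodiment.

```python
class RightsStore:
    """Hypothetical rights-information storage: which user holds a ticket for which event."""

    def __init__(self) -> None:
        self._tickets: dict[str, set[str]] = {}  # user_id -> event IDs the user may view

    def grant_ticket(self, user_id: str, event_id: str) -> None:
        self._tickets.setdefault(user_id, set()).add(event_id)

    def has_right(self, user_id: str, event_id: str) -> bool:
        return event_id in self._tickets.get(user_id, set())


def provide_view(user_id: str, event_id: str, rights: RightsStore) -> str:
    # Reflect the event in the user's view only if the user holds the right (ticket).
    if rights.has_right(user_id, event_id):
        return f"view of the virtual space with event '{event_id}' reflected"
    return "view of the virtual space without the event"


rights = RightsStore()
rights.grant_ticket("vr-user-1", "live-at-store-B")
print(provide_view("vr-user-1", "live-at-store-B", rights))  # event reflected
print(provide_view("vr-user-2", "live-at-store-B", rights))  # event withheld
```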
- the control unit 310 may also provide the VR user with a view of the virtual space 11 reflecting a predetermined event occurring at a specific position in real space and related to an MR user 6 selected by the VR user from among a plurality of users.
- the VR side input receiving unit 916 may receive an input by the VR user relating to an operation (e.g., an operation on the controller 280) for selecting, from among a plurality of users, the user whose related predetermined event should be reflected, and the control unit 310 may reflect the predetermined event related to the MR user 6 selected by the operation in the view provided to the VR user.
- the operation may be, for example, an operation by the VR user to select, from among his/her friends, the user whose event is to be reflected.
- alternatively, the predetermined events related to all of the VR user's friends may be reflected, in which case the operation may be, for example, an operation to register another user as a friend.
- the VR user may be provided with a view of the virtual space 11 reflecting a predetermined event related to his/her friend.
- the VR user may be provided with a view of the virtual space 11 that reflects a specific event related to a user associated with the VR user among multiple users.
- the control unit 310 may reflect the live performance in the view of the virtual space 11 provided to the VR user. Note that, when there are multiple events related to a specific user, it may be possible to select a specific event related to the specific user as the event to be reflected.
- the control unit 310 may also provide the VR user with a view of the virtual space 11 reflecting a predetermined event occurring at a specific position in the real space and having an attribute corresponding to the attribute set for the VR user.
- each VR user may be assigned an attribute related to the user's hobbies and preferences, more specifically, an attribute such as "music lover" or "comedy lover" (or "likes a specific entertainer").
- the control unit 310 may, for example, reflect a predetermined event with the attribute "music" for a VR user with the attribute "music lover" and not reflect a predetermined event with the attribute "music" for a VR user without the attribute "music lover".
- similarly, the control unit 310 may reflect a predetermined event with the attribute "comedy" for a VR user with the attribute "comedy lover" and not reflect a predetermined event with the attribute "comedy" for a VR user without the attribute "comedy lover".
- the attributes set for a VR user may be set by each VR user themselves (e.g., by inputting their own preferences via the controller 280), or may be set automatically by AI (Artificial Intelligence) (e.g., based on the user's actions within the service, etc.).
- the control unit 310 may reflect in the virtual space 11 a specific event selected by the VR user among multiple events taking place in the real space (for example, an event related to a specific user selected by the VR user, an event having an attribute corresponding to the VR user's own attribute set by the VR user, an event related to a ticket purchased by the VR user, etc.). In other words, the control unit 310 may not reflect in the virtual space 11 an event not selected by the VR user among multiple events taking place in the real space. In this way, when an unselected event is not reflected in the virtual space 11, there may be an event that is reflected regardless of the user's selection.
- not reflecting an unselected event in the virtual space 11 means that there is at least one event that is reflected in the virtual space 11 if a selection is made, and is not reflected in the virtual space 11 if no selection is made. Note that the selection of an event may be made before the occurrence (in other words, start) of the specific event selected by the VR user, or may be made after the occurrence (in other words, while the specific event is taking place).
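The selection, friend, and attribute criteria discussed above could be combined as in the sketch below; the data model (Event, ViewerProfile) is invented for illustration and is not part of the embodiment.

```python
from dataclasses import dataclass, field


@dataclass
class Event:
    event_id: str
    host_user: str        # the user the event relates to (e.g., the performer)
    attributes: set[str]  # e.g., {"music"} or {"comedy"}


@dataclass
class ViewerProfile:
    selected_events: set[str] = field(default_factory=set)
    friends: set[str] = field(default_factory=set)
    attributes: set[str] = field(default_factory=set)        # e.g., "music lover" mapped to {"music"}
    always_reflected: set[str] = field(default_factory=set)  # events shown regardless of selection


def events_to_reflect(events: list[Event], viewer: ViewerProfile) -> list[Event]:
    """Reflect an event if the viewer selected it, it relates to a friend, its attributes
    overlap the viewer's attributes, or it is marked as always reflected."""
    return [ev for ev in events
            if ev.event_id in viewer.selected_events
            or ev.event_id in viewer.always_reflected
            or ev.host_user in viewer.friends
            or (ev.attributes & viewer.attributes)]
```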
- the information processing system 100 of the second embodiment and the information processing system 100 of the third embodiment can also be applied to games that utilize both real space and virtual space.
- a specific example is shown below.
- the information processing system 100 may be applied to a game in which a character moving in the virtual space 11 is searched for in the real world (hereinafter, referred to as a "search game”).
- a virtual space 11 that imitates a theme park existing in the real world is prepared.
- the VR-side object control unit 920 moves a predetermined character (hereinafter, referred to as a "search target character”; in other words, a search target object) so as to move around in the virtual space 11.
- the search target character may be one that is operated by a VR user (in other words, one that moves based on the operation of the VR user), or one that moves without the operation of a user.
- the search target character may be a VR avatar 25 or a so-called non-player character.
- the appearance of the search target character may be, for example, the appearance of a character belonging to a theme park (for example, a character that symbolizes a theme park, etc.).
- the MR system 400 (in other words, the control unit 510) displays the search target character on the display 411 when the MR user 6 is at a position in real space that corresponds to the position in the virtual space 11 where the search target character is located. Conversely, the MR system 400 does not display the search target character on the display 411 when the MR user 6 is not at such a position. That is, this search game is a game in which the player searches in real space for the search target character moving around in the virtual space 11.
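One way to realize this position check is sketched below: the device would compare the MR user's GPS fix with the real-space point that corresponds to the character's virtual position. The mapping callback and the 20 m threshold are assumptions for the sketch only, not values from the embodiment.

```python
import math


def should_display_target(mr_user_latlon: tuple[float, float],
                          target_virtual_pos: tuple[float, float, float],
                          virtual_to_latlon,
                          threshold_m: float = 20.0) -> bool:
    """Display the search-target character only while the MR user is near the
    real-space point corresponding to the character's virtual position."""
    lat1, lon1 = mr_user_latlon
    lat2, lon2 = virtual_to_latlon(target_virtual_pos)
    # Equirectangular approximation; adequate over a theme-park-sized area.
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    distance_m = 6_371_000 * math.hypot(dx, dy)
    return distance_m <= threshold_m


# Stand-in mapping for illustration only: every virtual position maps to one fixed point.
fixed_mapping = lambda pos: (35.6328, 139.8804)
print(should_display_target((35.6329, 139.8805), (10.0, 0.0, 5.0), fixed_mapping))  # True
```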
- the MR side input receiving unit 816 receives an input related to a specific operation by the MR user 6 in a situation where the MR user 6 is in a position in the real space corresponding to the position where the search target character exists in the virtual space 11 (in other words, a situation where the search target character is displayed on the display 411).
- the specific operation may be, for example, an operation of taking a photo of the search target character shown on the display 411 (in other words, an operation of storing an image).
- the specific operation here can also be an operation related to storing information indicating that the search target character has been found.
- the information indicating that the search target character has been found is a photo of the search target character, but it may also be a flag or the like that is set when the search target character is found.
- the information may be automatically stored without the intervention of a specific operation.
- the information indicating that the search target character has been found may be stored based on the MR user 6 visiting a position in the real space corresponding to the position where the search target character exists in the virtual space 11. That is, the control unit 510 stores information indicating that the search target character has been discovered in the storage unit 511 based on the MR user 6 visiting that position in real space.
- the control unit 510 also grants a bonus to the MR user 6 based on the MR user's visit to the position in real space (in other words, when information indicating that the search target character has been discovered is stored).
- the bonus may be an item that can be used in the virtual space 11.
- the bonus may also be an electronic certificate or trophy indicating that the search target character has been discovered.
- the bonus may also be an electronic ticket that can be exchanged for a real object (e.g., merchandise of the search target character) in real space (e.g., at a real theme park).
- the VR user may be able to transmit information to the MR user 6 to assist in the search.
- the VR user may be the user who operates the search target character, or may be a user different from that user. Specifically, for example, the VR user operates his/her own VR avatar 25 in the virtual space 11 to search for the search target character in the virtual space 11 simulating a theme park.
- the VR side input receiving unit 916 also receives an input related to an operation (e.g., an operation on the controller 280) by the VR user to send sighting information of the search target character (e.g., information on the place where the sighting occurred; in other words, information that assists in the search) to the MR user 6.
- the control unit 510 also receives the sighting information via the communication control unit 850, and causes a predetermined output means to execute an output that assists in the search based on the sighting information. Specifically, for example, the control unit 510 causes the display 411 to display a display informing the user of the location in the virtual space 11 where the search target character has been sighted, and causes the speaker 416 to output a sound informing the user of the location where the search target character has been sighted.
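The assist output could be as simple as the sketch below, which turns sighting information received from a VR user into the text to show on the display and to speak from the speaker; the Sighting structure and field names are assumed for the example.

```python
from dataclasses import dataclass


@dataclass
class Sighting:
    place_name: str                              # where in the virtual space the character was seen
    virtual_position: tuple[float, float, float]


def assist_outputs(sighting: Sighting) -> dict[str, str]:
    """Produce the outputs the MR device might emit from one sighting report."""
    message = f"The search target was sighted near {sighting.place_name}."
    return {
        "display_text": message,  # to be shown on the display
        "speech_text": message,   # to be synthesized and played on the speaker
    }


print(assist_outputs(Sighting("the carousel", (52.0, 0.0, 13.0))))
```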
- This configuration makes it possible for an MR user 6 who is actually at a theme park and a VR user who is not actually at the theme park (for example, at home) to cooperate with each other to enjoy a game.
- the information processing system 100 may be applied to a survival game in which a user playing in the virtual space 11 and a user playing in the real space can compete against each other.
- a venue where a user on the real space side plays and a virtual space 11 simulating the venue are prepared.
- the venue may be a so-called survival game field or a predetermined area such as a city.
- a user on the real space side wears an MR device 410
- a user on the virtual space 11 side wears a VR device 210.
- the control unit 310 of the VR system 200 controls the VR avatar 25 of the VR user based on the input from the VR user received by the VR side input receiving unit 916. Specifically, the control unit 310 moves the VR avatar 25 within the virtual space 11 based on the input from the VR user.
- the control unit 310 of the VR system 200 also receives, via the communication control unit 950, input from the MR user 6 received by the MR side input receiving unit 816, and controls the MR avatar 26. Specifically, the control unit 310 moves the MR avatar 26 in the virtual space 11 based on position information (i.e., position information indicating the position of the MR user 6 in the real space) acquired by a GPS sensor (in other words, a position detection means) provided in the MR device 410. More specifically, the control unit 310 moves the MR avatar 26 so that the MR avatar 26 is located at a position in the virtual space 11 corresponding to the position of the MR user 6 in the venue on the real space side.
- that is, in the view of the virtual space 11 provided to the VR user, a display is provided that makes it appear as if the MR user 6 is present at the position in the virtual space 11 (in other words, within the venue) corresponding to the position of the MR user 6 in the mixed reality space 21.
- height information of the MR device 410 may be obtained via a sensor 417 of the MR device 410, etc.
- the control unit 510 of the MR system 400 also moves the VR avatar 25 so that the VR avatar 25 is located at a position in the mixed reality space 21 that corresponds to the position at which the VR avatar 25 is located in the virtual space 11. That is, on the display 411 of the MR device 410, a display is provided that makes it appear as if the VR user is present at a position in the mixed reality space 21 (in other words, within the venue) that corresponds to the position at which the VR user is located in the virtual space.
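The two-way placement described in the last two items amounts to converting positions between the venue and the virtual space. A planar approximation such as the following would suffice for a venue-sized area; the origin, scale, and function names are assumptions of this sketch, not part of the embodiment.

```python
import math

EARTH_RADIUS_M = 6_371_000


def real_to_virtual(latlon: tuple[float, float],
                    origin_latlon: tuple[float, float],
                    origin_virtual: tuple[float, float, float],
                    metres_per_unit: float = 1.0) -> tuple[float, float, float]:
    """Map an MR user's GPS fix to the corresponding virtual-space position."""
    lat0, lon0 = origin_latlon
    lat, lon = latlon
    dx = math.radians(lon - lon0) * math.cos(math.radians(lat0)) * EARTH_RADIUS_M
    dz = math.radians(lat - lat0) * EARTH_RADIUS_M
    ox, oy, oz = origin_virtual
    return (ox + dx / metres_per_unit, oy, oz + dz / metres_per_unit)


def virtual_to_real(pos: tuple[float, float, float],
                    origin_latlon: tuple[float, float],
                    origin_virtual: tuple[float, float, float],
                    metres_per_unit: float = 1.0) -> tuple[float, float]:
    """Map a VR avatar's virtual-space position back to venue coordinates (lat, lon)."""
    lat0, lon0 = origin_latlon
    ox, oy, oz = origin_virtual
    dlat = math.degrees((pos[2] - oz) * metres_per_unit / EARTH_RADIUS_M)
    dlon = math.degrees((pos[0] - ox) * metres_per_unit
                        / (EARTH_RADIUS_M * math.cos(math.radians(lat0))))
    return (lat0 + dlat, lon0 + dlon)


origin = (35.0000, 139.0000)
p = real_to_virtual((35.0005, 139.0005), origin, (0.0, 0.0, 0.0))
print(p, virtual_to_real(p, origin, (0.0, 0.0, 0.0)))  # round-trips to the original fix
```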
- the VR-side input receiving unit 916 also receives input related to an attack by a VR user against the MR user 6 (in other words, the MR avatar 26).
- the MR-side input receiving unit 816 also receives input related to an attack by the MR user 6 against a VR user (in other words, the VR avatar 25).
- an attack by the VR user or MR user 6 against an opponent may involve, for example, firing a gun at the opponent, and the input operation related to the attack may be performed via a gun-shaped controller 280, 480, etc.
- the control unit 310 also determines whether an attack by the VR user against the MR user 6 has hit the MR user 6. Specifically, the control unit 310 determines that an attack by the VR user has hit the MR user 6, for example, when the VR user performs an input operation while the gun held by the VR avatar 25 is pointed at the MR avatar 26 and a bullet fired from the gun hits the MR avatar 26 directly. The control unit 310 also determines whether an attack by the MR user 6 against the VR user has hit the VR user.
- the control unit 310 determines that an attack by the MR user 6 has hit the VR user, for example, when the MR user 6 performs an input operation while the gun held by the MR avatar 26 is pointed at the VR avatar 25 and a bullet fired from the gun hits the VR avatar 25 directly.
- the direction of the gun held by the MR avatar 26 is linked to the direction of the controller 480 held by the MR user 6, for example, and the aim of the gun of the MR avatar 26 is aligned with a position in the virtual space 11 corresponding to the position where the MR user 6 aims the controller 480 in the mixed reality space 21.
- the control unit 310 determines that an attack by the MR user 6 hits the VR user when the MR user 6 performs an input operation while aiming the gun-shaped controller 480 at the VR avatar 25 in the mixed reality space 21.
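Hit determination of the kind described above could be reduced to a ray-versus-bounding-sphere test, as in the sketch below; the radius and range values are placeholders, and the embodiment does not prescribe this particular geometry.

```python
import math


def attack_hits(gun_pos: tuple[float, float, float],
                gun_dir: tuple[float, float, float],
                target_pos: tuple[float, float, float],
                target_radius: float = 0.5,
                max_range: float = 100.0) -> bool:
    """Treat the shot as a ray from the gun along its aiming direction and test it
    against the target avatar's bounding sphere."""
    norm = math.sqrt(sum(c * c for c in gun_dir))
    if norm == 0.0:
        return False
    d = tuple(c / norm for c in gun_dir)
    # Projection of the gun-to-target vector onto the aiming ray.
    to_target = tuple(t - g for t, g in zip(target_pos, gun_pos))
    t = sum(a * b for a, b in zip(to_target, d))
    if t < 0.0 or t > max_range:
        return False
    # Distance from the target centre to the closest point on the ray.
    closest = tuple(g + t * c for g, c in zip(gun_pos, d))
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(closest, target_pos)))
    return dist <= target_radius


print(attack_hits((0.0, 1.5, 0.0), (0.0, 0.0, 1.0), (0.2, 1.5, 30.0)))  # True: within the radius
```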
- the determination of an attack by the VR user against the MR user 6 can be made in the same way as in a normal shooting game.
- a part or all of the determination of an attack by the MR user 6 against the VR user or an attack by the VR user against the MR user 6 may be made by the control unit 510 or the control unit 610, etc.
- the determination unit that determines whether an MR user 6 has attacked a VR user may be realized by any of the control unit 310, the control unit 510, and the control unit 610.
- the survival game may also be capable of allowing the VR user and the MR user 6 to play cooperatively. That is, for example, in a survival game in which multiple teams (e.g., two teams) made up of multiple users can compete against each other, one of the multiple teams may be made up of only VR users (in other words, users on the virtual space 11 side), and the other team may be made up of only MR users 6 (in other words, users on the real space side). Also, one team may be capable of including both a VR user and an MR user 6.
- the information processing system 100 may be applied to an escape game (in other words, a puzzle-solving game) in which a user on the real space side and a user on the virtual space 11 side cooperate to play.
- an escape game in other words, a puzzle-solving game
- a venue where the user on the real space side plays and a virtual space 11 simulating the venue are prepared.
- the venue may be a specified area such as a city center, or may be a specific facility.
- the user on the real space side wears the MR device 410, and the user on the virtual space 11 side wears the VR device 210.
- a user wearing the MR device 410 and a user wearing the VR device 210 cooperate to accomplish a set mission.
- the MR device 410 and the VR device 210 display virtual objects related to the mission at corresponding positions.
- the virtual object may be, for example, an object related to the reception of an action performed by the VR user and the MR user 6 in cooperation.
- for example, when the VR user performs a predetermined action (e.g., a hand-waving operation) on the object and the MR user 6 also performs a predetermined action (e.g., a hand-waving operation; the action may be the same as or different from the action performed by the VR user), a predetermined device may be started (in other words, the game (event) may progress), or the like.
- the virtual object may also be, for example, an item that can be acquired by at least one of the VR user or the MR user 6, such as a treasure chest.
- the virtual object may also be an object that includes information about solving the puzzle (for example, a sentence about the puzzle or hints for solving the puzzle).
- a second action related to the progress of the game by the MR user 6 or the VR user may be possible on the condition that one of the MR user 6 and the VR user has performed a first action related to the progress of the game.
- the first action is an action that can only be performed by the one user and cannot be performed by the other user.
- the second action is an action that can only be performed by the other user and cannot be performed by the one user.
- the VR system 200 may receive an input related to the second action by the VR user.
- the second action by the VR user may be an action that can be executed based on the MR user 6 visiting a specific place in the real space.
- the MR system 400 may accept an input related to the second action by the MR user 6.
- the second action by the MR user 6 may be an action that can be executed based on the VR user visiting a specific place in the virtual space 11.
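The first-action/second-action gating could be captured as a small state machine, as sketched below; the role names and messages are illustrative assumptions only, and the roles could equally be reversed as described above.

```python
class CooperativeMission:
    """Hypothetical gating: the second action is allowed only after the other user's
    first action, and each action is reserved to one side (real space or virtual space)."""

    def __init__(self) -> None:
        self.first_action_done = False

    def perform_first_action(self, actor_role: str) -> None:
        # e.g., only the MR user can physically visit the specific real-space place.
        if actor_role != "mr":
            raise PermissionError("only the real-space user can perform the first action")
        self.first_action_done = True

    def perform_second_action(self, actor_role: str) -> str:
        # e.g., only the VR user can perform the second action, and only once it is unlocked.
        if actor_role != "vr":
            raise PermissionError("only the virtual-space user can perform the second action")
        if not self.first_action_done:
            return "locked: waiting for the partner's first action"
        return "second action executed: the event progresses"


mission = CooperativeMission()
print(mission.perform_second_action("vr"))  # still locked
mission.perform_first_action("mr")
print(mission.perform_second_action("vr"))  # the game progresses
```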
- for example, the VR device 210 may display a predetermined display related to the progress of the game (for example, a display of a hint related to solving a puzzle) on the display 211 only while the MR user 6 is pointing the camera of a smartphone serving as the MR device 410 at a specific place in the real space.
- the virtual space 11 may represent the past of the mixed reality space 21, and the VR system 200 may receive input related to a predetermined action (e.g., an action of placing an item at a specific position in the virtual space 11) as a first action by the VR user in the virtual space 11 (in other words, the past).
- the MR system 400 may receive input related to a predetermined action (e.g., an action of going to a position in the mixed reality space 21 corresponding to the specific position and finding the item) as a second action by the MR user 6 in the real space (in other words, the present), and may be configured to progress the game (in other words, an event) based on the input.
- the present invention is not limited to the above-described embodiment, and can be modified in various ways without departing from the spirit of the invention.
- the components may be freely combined, and any component may be modified or omitted.
- the process flow described in this specification is merely an example, and the order and configuration of each process can be different.
- (Appendix 1) An information processing system comprising: a first control means (e.g., a control unit 310) for providing a first view, which is a view of a virtual space, to a first user; an input receiving means (e.g., an MR side input receiving unit 816) that receives an output of a detection means that detects a movement of a second user as an input from the second user; a position information receiving means (e.g., an MR side input receiving unit 816) that receives position information indicating a current position of the second user in real space; and a correspondence relationship storage means (e.g., a storage unit 611) that stores information indicating a correspondence relationship between positions in the real space and the virtual space, wherein the first control means places an avatar of the second user at a position in the virtual space corresponding to the position of the second user indicated by the position information, and reflects the movement of the second user detected by the detection means in the avatar of the second user.
- (Appendix 2) The information processing system according to Appendix 1, wherein the detection means is a wearable device or an implantable device.
- the device worn by the second user can detect the movements of the second user's hands and feet and reflect them in the avatar, so that the movements of the avatar can be faithfully reproduced to those of the second user, expanding the range of uses of the virtual space.
- the information processing system may further include a second control means (e.g., a control unit 510) for displaying an image of an avatar of the first user on a display of a predetermined device used by the second user, wherein the second control means causes the image of the first user's avatar to be displayed on the display when the position information indicates that the second user is at a specific position and the first user's avatar is at a position in the virtual space corresponding to the specific position.
- a second user who is in a specific position in the real space can see an avatar of a first user who is in a position corresponding to the specific position in the virtual space, thereby expanding the range of uses of the virtual space.
- A program causing a computer of an information processing system to function as: a first control means (e.g., a control unit 310) for providing a first view, which is a view of a virtual space, to a first user; an input receiving means (e.g., an MR side input receiving unit 816) that receives an output of a detection means that detects a movement of a second user as an input from the second user; a position information receiving means (e.g., an MR side input receiving unit 816) that receives position information indicating a current position of the second user in real space; and a correspondence relationship storage means (e.g., a storage unit 611) that stores information indicating a correspondence relationship between positions in the real space and the virtual space, wherein the first control means places an avatar of the second user at a position in the virtual space corresponding to the position of the second user indicated by the position information, and reflects the movements of the second user detected by the detection means in the avatar of the second user.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Computer Hardware Design (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Graphics (AREA)
- Environmental & Geological Engineering (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
This information processing system comprises: a first control means for providing a first view, which is a view of a virtual space, to a first user; an input receiving means for receiving, as an input from a second user, an output of a detection means that detects a movement of the second user; a position information receiving means for receiving position information indicating the current position of the second user in real space; and a correspondence relationship storage means for storing information indicating a correspondence relationship between positions in the real space and in the virtual space. The first control means places an avatar of the second user at a position in the virtual space corresponding to the position of the second user indicated by the position information, and reflects the movements of the second user detected by the detection means in the avatar of the second user.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2023018930A JP7412617B1 (ja) | 2023-02-10 | 2023-02-10 | 情報処理システムおよびプログラム |
| JP2023-018930 | 2023-09-15 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024166715A1 true WO2024166715A1 (fr) | 2024-08-15 |
Family
ID=89451980
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/002463 Ceased WO2024166715A1 (fr) | 2023-02-10 | 2024-01-26 | Système de traitement d'informations et programme |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP7412617B1 (fr) |
| WO (1) | WO2024166715A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020126455A (ja) * | 2019-02-05 | 2020-08-20 | 凸版印刷株式会社 | 空間情報管理装置 |
| JP2021175043A (ja) * | 2020-04-22 | 2021-11-01 | セイコーエプソン株式会社 | 頭部装着型表示装置、音声画像出力システム、及び、音声画像出力方法 |
| JP2021189695A (ja) * | 2020-05-28 | 2021-12-13 | 株式会社Spacial | 方法、プログラム、情報処理装置 |
| WO2021261346A1 (fr) * | 2020-06-23 | 2021-12-30 | 株式会社ソニー・インタラクティブエンタテインメント | Dispositif, procédé et programme de traitement d'informations, et système de traitement d'informations |
- 2023
- 2023-02-10 JP JP2023018930A patent/JP7412617B1/ja active Active
- 2024
- 2024-01-26 WO PCT/JP2024/002463 patent/WO2024166715A1/fr not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| JP7412617B1 (ja) | 2024-01-12 |
| JP2024113768A (ja) | 2024-08-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6276882B1 (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
| JP6306442B2 (ja) | プログラム及びゲームシステム | |
| JP2022549853A (ja) | 共有空間内の個々の視認 | |
| CN109643161A (zh) | 动态进入和离开由不同hmd用户浏览的虚拟现实环境 | |
| CN107656615A (zh) | 大量同时远程数字呈现世界 | |
| JP6785325B2 (ja) | ゲームプログラム、方法、および情報処理装置 | |
| JP2018124665A (ja) | 情報処理方法、コンピュータ、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
| JP2020107123A (ja) | プログラム、情報処理装置、および方法 | |
| JP2009070076A (ja) | プログラム、情報記憶媒体及び画像生成装置 | |
| JP6622832B2 (ja) | プログラム及びゲームシステム | |
| JP6826626B2 (ja) | 視聴プログラム、視聴方法、および視聴端末 | |
| JP6722316B1 (ja) | 配信プログラム、配信方法、コンピュータ、および視聴端末 | |
| JP2019032844A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
| JP2019168962A (ja) | プログラム、情報処理装置、及び情報処理方法 | |
| JP2018195287A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
| JP2020179184A (ja) | ゲームプログラム、方法、および情報処理装置 | |
| US20240013502A1 (en) | Storage medium, method, and information processing apparatus | |
| JP6718933B2 (ja) | プログラム、情報処理装置、および方法 | |
| JP5479503B2 (ja) | プログラム、情報記憶媒体及び画像生成装置 | |
| JP2018190196A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
| JP7354466B1 (ja) | 情報処理システムおよびプログラム | |
| JP7412613B1 (ja) | 情報処理システムおよびプログラム | |
| JP7412617B1 (ja) | 情報処理システムおよびプログラム | |
| JP2019049987A (ja) | 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム | |
| JP7413472B1 (ja) | 情報処理システムおよびプログラム |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24753159; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |