WO2025022510A1 - Information processing system, information processing method, display device, and computer program - Google Patents
Information processing system, information processing method, display device, and computer program
- Publication number
- WO2025022510A1 (PCT/JP2023/026906)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- information
- content
- unit
- processing system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B10/00—Instruments for taking body samples for diagnostic purposes; Other methods or instruments for diagnosis, e.g. for vaccination diagnosis, sex determination or ovulation-period determination; Throat striking implements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/024—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for determining the visual field, e.g. perimeter types
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
Definitions
- the present invention relates to an information processing system, an information processing method, and a display device.
- head-mounted displays (HMDs)
- Patent Document 1 discloses a rehabilitation support system that uses an HMD.
- the object of the present invention is therefore to provide a technical improvement that solves or alleviates at least some of the problems of the prior art described above.
- One of the more specific objectives of this disclosure is to provide an information processing system, an information processing method, a display device, and a computer program that are capable of appropriately determining the user's condition.
- the information processing system of the present invention is an information processing system having one or more computer processors, and the one or more computer processors are characterized by having a transmitting unit that transmits information for displaying a virtual space having a display area in which specific content can be played to a specified display device, a receiving unit that receives behavior information of a user playing a first content in the display area, transmitted from the specified display device, and a determining unit that determines the state of the user based on the user behavior information received by the receiving unit.
- the user's behavior information can be gaze information related to the user's gaze, operation information related to the user's operation, and/or speech information related to the user's speech.
- the user's state may be related to the function of the user's eyes, brain and/or mind.
- the first content can be game content or video content for obtaining information to determine the user's state.
- the one or more computer processors may further include an association unit that associates a first reward with the user information of the user when the gaze determination unit determines that the gaze position is within the first area.
- the one or more computer processors may further include a display processing unit that causes a screen to be displayed in the display area, including a first object displayed in a first area set in the center of the display area, a second object different from the first object and moving at least within the first area, and a third object different from the first object and the second object and displayed for at least a predetermined period of time in a second area outside the first area.
- the virtual space can include a first space having a display area in which specific content can be played.
- the information provided can be guidance to a service that provides information about the user's status in virtual space.
- the provided information can be information that guides the user to second content for maintaining or improving the user's status in the virtual space.
- the second content can be dedicated game content or video content to maintain or improve the user's condition.
- the one or more computer processors can change the display position of the display area in response to the user's head movement while the first content is being played back.
- the one or more computer processors can change the display position of the display area in response to the movement of the user's line of sight while the first content is being played back.
- a character object controlled by one or more computer processors is displayed, and the character object can act according to the user's behavior information.
- the receiving unit further receives biometric information of the user, and the determining unit can determine the user's condition based on the user's behavior information and the user's biometric information received by the receiving unit.
- the one or more computer processors may further include a notification unit that notifies the user terminals of other users associated with the user of the result of the determination made by the determination unit.
- the one or more computer processors further include a generation unit that generates information to be provided to the user based on the result of the determination unit, the information to be provided being guidance information to a service that provides information on the user's status in the virtual space, and the notification unit can further notify the user of the status of the provision of the service based on the guidance information.
- the one or more computer processors may allow access to the virtual space from user terminals of other users associated with the user, and enable conversation between the user and other users within the virtual space and/or display of the user's avatar and the avatars of the other users.
- the service that provides information about the user's condition in the virtual space can be an expert consultation service, a food delivery service, and/or an insurance service.
- the information processing method disclosed herein is characterized in that one or more computer processors are caused to execute a transmission step of transmitting, to a specified display device, information for displaying a virtual space having a display area capable of playing back specific content, a reception step of receiving behavior information of a user playing back a first content in the display area, transmitted from the specified display device, and a determination step of determining a state of the user based on the user behavior information received in the reception step.
- the display device of the present disclosure is characterized by comprising a display processing unit that causes a display unit to display a virtual space having a display area in which specific content can be played, an acquisition unit that acquires behavior information of a user playing a first content in the display area, and a determination unit that determines the state of the user based on the user behavior information acquired by the acquisition unit.
- the computer program of the present disclosure is characterized in that it causes one or more computer processors to realize a display processing function for displaying on a display unit a virtual space having a display area capable of playing specific content, an acquisition function for acquiring behavior information of a user playing a first content in the display area, and a determination function for determining the state of the user based on the user behavior information acquired by the acquisition function.
- the present invention provides a technical improvement that solves or alleviates at least some of the problems with the prior art described above.
- the user's state can be appropriately determined according to the particularities of the content and/or the determination environment.
- FIG. 1 is a system configuration diagram showing an example of a system configuration of an information processing system according to the present disclosure.
- FIG. 2 is a system configuration diagram showing another example of the system configuration of the information processing system according to the present disclosure.
- FIG. 3 is a configuration diagram illustrating an example of a hardware configuration of a server device according to the present disclosure.
- FIG. 4 is a configuration diagram illustrating an example of a functional configuration of an information processing system according to the present disclosure.
- FIG. 5 is a conceptual diagram illustrating an example of a screen displayed on a display device according to the present disclosure.
- FIG. 6 is a conceptual diagram illustrating an example of a screen displayed on a display device according to the present disclosure.
- FIG. 7 is a configuration diagram showing another example of the functional configuration of the information processing system according to the present disclosure.
- FIG. 8 is a conceptual diagram illustrating an example of a screen displayed on a display device according to the present disclosure.
- FIG. 9 is a conceptual diagram illustrating an example of a screen displayed on a display device according to the present disclosure.
- FIG. 10 is a conceptual diagram illustrating an example of a screen displayed on a display device according to the present disclosure.
- FIG. 11 is a conceptual diagram illustrating an example of a screen displayed on a display device according to the present disclosure.
- FIG. 12 is a flow diagram illustrating an example of an information processing method in the information processing system of the present disclosure.
- FIG. 13 is a circuit configuration diagram showing an example of a circuit configuration for realizing a computer program executed in the information processing system of the present disclosure.
- FIG. 14 is a functional configuration diagram showing an example of a functional configuration of a display device according to the present disclosure.
- FIG. 15 is a flow diagram showing an example of an information processing method in the display device of the present disclosure.
- FIG. 16 is a circuit configuration diagram showing an example of a circuit configuration for realizing a computer program executed by a display device of the present disclosure.
- an information processing system 1000 may include a server device 100 connected to a display device 200 via a network. As shown in Fig. 2, the information processing system 1000 may also include the display device 200 in addition to the server device 100.
- the display device 200 can be a television device, a smartphone (multifunction telephone terminal), a tablet terminal, a personal computer, a console game machine, a head mounted display (HMD), a wearable computer such as a glasses-type wearable terminal (AR glasses, etc.), or any other information processing device capable of playing videos.
- these terminals may be standalone devices that operate independently, or may be composed of multiple devices connected to each other so that they can send and receive various data.
- the server device 100 includes a processor 101, a memory 102, a storage 103, an input/output interface (input/output I/F) 104, and a communication interface (communication I/F) 105.
- the components are connected to each other via a bus B.
- the server device 100 can realize the functions and methods described in this embodiment through cooperation between the processor 101, memory 102, storage 103, input/output I/F 104, and communication I/F 105.
- the processor 101 executes functions and/or methods realized by codes or instructions included in a program stored in the storage 103.
- the processor 101 may include, for example, a central processing unit (CPU), a microprocessing unit (MPU), a graphics processing unit (GPU), a microprocessor, a processor core, a multiprocessor, an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA), etc., and may realize each process disclosed in each embodiment by a logic circuit (hardware) or a dedicated circuit formed in an integrated circuit (an integrated circuit (IC) chip, large-scale integration (LSI), etc.). These circuits may be realized by one or more integrated circuits, and multiple processes shown in each embodiment may be realized by one integrated circuit.
- an LSI may be called a VLSI, a super LSI, an ultra LSI, etc., depending on the degree of integration.
- Memory 102 temporarily stores programs loaded from storage 103 and provides a working area for processor 101. Memory 102 also temporarily stores various data that is generated while processor 101 is executing a program. Memory 102 includes, for example, RAM (Random Access Memory), ROM (Read Only Memory), etc.
- Storage 103 stores programs.
- Storage 103 includes, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), flash memory, etc.
- the communication I/F 105 is implemented as hardware such as a network adapter, communication software, or a combination of these, and transmits and receives various types of data via a network.
- the communication may be performed either wired or wirelessly, and any communication protocol may be used as long as mutual communication can be performed.
- the communication I/F 105 communicates with other devices via the network.
- the communication I/F 105 transmits various types of data to other devices according to instructions from the processor 101.
- the communication I/F 105 also receives various types of data transmitted from other devices and transmits it to the processor 101.
- the input/output I/F 104 includes an input device for inputting various operations to the server device 100, and an output device for outputting the results of processing by the server device 100.
- the input/output I/F 104 may integrate the input device and the output device, or the input device and the output device may be separate.
- the input device is realized by any of a variety of devices, or a combination of devices, that can receive input from a user and transmit information related to the input to the processor 101.
- Examples of input devices include hardware keys such as a touch panel, a touch display, and a keyboard, pointing devices such as a mouse, a camera (operation input via images), and a microphone (operation input via voice).
- the display device 200 in this disclosure can be configured with the same hardware configuration as in FIG. 3, unless otherwise noted.
- the information processing system 1000 includes at least a server device 100.
- the one or more computer processors included in the information processing system 1000 include a transmitting unit 110, a receiving unit 120, and a determining unit 130, as shown as an example in FIG. 4.
- the above-mentioned transmitting unit 110, receiving unit 120, and determining unit 130 will be described as being provided in the server device 100, but some or all of them may be provided in the display device 200.
- the transmission unit 110 transmits information for displaying the virtual space to the display device 200.
- the display device 200 can be, for example, a wearable display device.
- Wearable display devices include, for example, HMDs (Head Mounted Displays) and glasses-type displays.
- the display device 200 can include at least a display unit, a processing unit, and a communication unit.
- the display device 200 may also be equipped with an acceleration sensor and a posture sensor for detecting the movement of the user's head.
- the display device 200 may also include sensors for detecting the movement of the user's eyes, the state of the pupils, and the user's facial expressions. In other words, the display device 200 may have an eye tracking function and a facial tracking function.
- the display device 200 can also be connected, via wire or wirelessly, to a specific controller that can be operated by the user.
- the virtual space has a display area in which specific content can be played.
- the virtual space may include at least a first space, and the first space may include a display area in which the specific content can be played.
- FIG. 5 is an image diagram showing an example of a virtual space 10 displayed on the display unit of the display device 200.
- the first space 11 is a space representing the inside of a spaceship
- the display area 12 is represented as a window of the spaceship. The user can then view the virtual space of outer space or specific content from inside the spaceship through the window of the spaceship.
- the specific content can be, for example, game content or video content, but is not limited thereto as long as it includes the first content described below.
- Specific content can be made to start playing in response to a user's operation on an operation panel object 13 located inside the spaceship.
- playback of a particular piece of content may be triggered by the user taking the action of sitting down on a chair object 14 placed inside the spaceship.
- the display area 12 during playback of a particular piece of content can extend over almost the entire field of view of the user.
- playback of a particular piece of content can be stopped or ended in response to a user's operation on an operation UI object 15 that is displayed during playback of the particular piece of content.
- playback of a particular piece of content may be stopped or ended when the user stands up from a chair object placed inside the spaceship.
- the receiving unit 120 receives behavior information of the user while playing the first content, which is transmitted from the display device 200.
- the first content is one of the specific contents.
- the first content can be game content or video content dedicated to obtaining information for determining the user's state. Details will be described later.
- the user's behavior information can be, for example, gaze information about the user's gaze, operation information about the user's operation, and/or speech information about the user's speech. Details will be described later.
- the determination unit 130 determines the user's state based on the user's behavior information received by the receiving unit 120.
- the user's state may be, for example, a state related to the user's eye function, brain function and/or mental function. More details are provided below.
- the above configuration provides a technical improvement that solves or alleviates at least some of the problems with the conventional technology described above.
- the above configuration makes it possible to appropriately determine the state of the user based on the behavior information of the user while the first content is being played back.
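As one concrete (purely illustrative) reading of the receive-then-determine flow, the determination unit could score how often the user responded to peripheral targets during the first content. The data schema, field names, and the 50% threshold below are all assumptions for the sketch, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class BehaviorInfo:
    """Behavior information received from the display device (hypothetical schema)."""
    gaze_points: list        # (x, y) gaze positions sampled during playback
    missed_peripheral: int   # peripheral targets the user failed to respond to
    total_peripheral: int    # peripheral targets shown

def determine_user_state(info: BehaviorInfo) -> str:
    """Toy determination: flag a possible visual-field issue when the user
    misses a large share of peripheral targets."""
    if info.total_peripheral == 0:
        return "undetermined"
    miss_rate = info.missed_peripheral / info.total_peripheral
    return "possible visual-field abnormality" if miss_rate > 0.5 else "normal"
```

A production system would of course use clinically validated criteria; the point is only that the received behavior information is the sole input the determining unit needs.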
- the user's behavior information can be gaze information related to the user's gaze, operation information related to the user's operation, and/or speech information related to the user's speech.
- Gaze information regarding the user's gaze can be acquired by a camera provided in the display device 200. Furthermore, in the case of a display device 200 that has an infrared irradiation function, data such as the gaze direction, eye opening, and pupil size and position can be acquired by combining the camera with infrared irradiation.
- the user's behavior information may also include facial expression information relating to the user's facial expressions.
- the facial expression information relating to the user's facial expressions may be acquired by a camera provided in the display device 200. Specifically, each part of the user's face may be tracked to acquire data on the user's facial expressions and mouth movements.
- Operation information regarding the user's operation can be acquired by a controller connected to the display device 200.
- the operation information can be acquired as the user's hand movements captured by a camera provided in the display device 200.
- the speech information regarding the user's speech can be acquired by a microphone provided in the display device 200.
- the user's state may relate to the function of the user's eyes, brain and/or mind.
- the condition of eye function can be, for example, the condition of the presence or absence of glaucoma symptoms, the condition of eyesight, etc.
- brain function can be the state of whether or not there are symptoms of dementia.
- Mental function can be, for example, the state of whether or not there are symptoms of a mental illness.
- the one or more computer processors in this disclosure may further include a generation unit 140, as shown as an example in FIG. 7.
- the generation unit 140 generates information to be provided to the user based on the results of the determination unit.
- the provided information can be information directing to a service that provides information about the user's status in a virtual space.
- a service that provides information about the user's condition is, for example, a service that allows the user to receive consultation or medical treatment regarding their condition in a virtual space, such as a hospital in a virtual space.
- the provided information can be, for example, guidance information for the second content.
- the second content can be, for example, game content or video content.
- the second content can be game content or video content dedicated to maintaining or improving the user's condition. Details will be provided later.
- the provided information can be information directing users to a service that provides information about the user's status in virtual space.
- the guidance information can be information such as a display of a route to the location in the virtual space where the above-mentioned service is provided, or navigation.
- FIG. 8 shows an example of a route that a spacecraft can take to reach a service location P1, with the route shown in the display area 12 by an arrow P2, etc.
- the guidance information may also be link information for warping to a location in virtual space where the above service is provided.
- the guidance information may be information (such as a button) for displaying the user (doctor, etc.) involved in the service provision or the user's avatar CO in the display area 12 and enabling a conversation, as shown as an example in FIG. 9.
- For example, if it is determined that the user's eye function is abnormal, guidance information to an ophthalmology clinic in the virtual space is displayed.
- the user can access the ophthalmology clinic in the virtual space by following the guidance information.
- the user can consult with a doctor or the like about their symptoms and receive online medical treatment.
- the user and/or the doctor may be displayed as an avatar.
- Eye function can be determined with greater accuracy by using a head-mounted display device as the display device.
- Brain function can be determined more accurately by using a head-mounted display device as the display device.
- For example, if it is determined that the user's mental function is abnormal, guidance information to a psychiatrist, psychoneurologist, etc. in the virtual space is displayed.
- the user can access a psychiatrist, psychoneurologist, etc. in the virtual space by following the guidance information.
- the user can consult with a doctor, etc., and an online medical examination can be performed.
- the user and/or the doctor may be displayed as an avatar.
- By using a head-mounted display device as the display device, it is possible to determine mental function with greater accuracy.
- the provided information can be information to guide the user to second content for maintaining or improving the user's state in the virtual space.
- the first content is game content or video content dedicated to obtaining information for determining the user's state.
- the second content can be dedicated game content or video content to maintain or improve the user's condition.
- Game content can be content that progresses in response to user operations. However, rather than game content for entertainment purposes, game content can be dedicated to obtaining information for determining the user's condition or for maintaining or improving the user's condition.
- Video content can be content that a user watches. However, it can be video content that is not for entertainment purposes, but is dedicated to obtaining information for determining the user's condition or for maintaining or improving the user's condition.
- the first content may be content for evaluating the user's field of view.
- the first content may be game content simulating a shooting game, as shown as an example in FIG. 10.
- Such game content is, for example, a game that progresses by adding up a score as follows: the user provides input (for example, by pressing a first button on the controller) when a moving first object 17 overlaps with a crosshair 16 fixed in the central region R1 of the display area, and provides input (for example, by pressing a second button on the controller) when a second object 18 appears in the peripheral region R2 of the display area.
- Alternatively, a configuration can be adopted in which the user provides input (such as pressing a first button on the controller) when a moving second object 17 overlaps with a crosshair (first object) 16 fixed in the central region R1 of the display area, and provides input (such as pressing a second button on the controller) when a third object 18 appears in the peripheral region R2 of the display area, thereby adding up the score.
- a configuration can be adopted in which the user's gaze is tracked, and if the user's gaze is in the central region R1, an additional score is added.
- a configuration can be adopted in which a crosshair that follows the user's line of sight is displayed, and a score is added when the crosshair is in the central region R1 and overlaps with the first object 17.
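The gaze-following scoring described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the region shape (a circle), the hit distance, and all function names are assumptions.

```python
def gaze_in_region(gaze, center, radius):
    """True if the gaze position (x, y) lies inside a circular central
    region R1 of the given radius around the display-area center."""
    dx, dy = gaze[0] - center[0], gaze[1] - center[1]
    return dx * dx + dy * dy <= radius * radius

def score_gaze_sample(gaze, target, center, radius, score, hit_dist=10.0):
    """Add a point when the gaze-following crosshair is inside R1 and
    overlaps the moving object (here: within a small hit distance)."""
    if gaze_in_region(gaze, center, radius):
        dx, dy = gaze[0] - target[0], gaze[1] - target[1]
        if dx * dx + dy * dy <= hit_dist * hit_dist:
            score += 1
    return score
```

In a real system this check would run per gaze sample from the eye tracker, with the additional score for keeping the gaze in R1 accumulated the same way.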
- a predetermined sound (for example, an alarm sound) can be played.
- the time until the first object 17 (the target) collides with the central region R1 may be displayed on the screen.
- the sound effects can be changed according to the distance between the first object 17, which is the target, and the crosshairs 16 or the central region R1.
- changes in the sound effects include changing the frequency or rhythm, and the sound effects can be configured to gradually increase the sense of tension.
- the first content may be content for evaluating the user's eyesight.
- the first content may be game content simulating a vision test, as shown as an example in FIG. 11.
- Such game content is a game that progresses by the user pressing a corresponding button on the controller, or speaking the corresponding word, in accordance with the shape of the target icon 19 displayed in the display area.
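The response check in such a vision-test game might look like the sketch below. The Landolt-ring-style answer set, the mapping of directions to buttons, and both function names are hypothetical; the disclosure only says the input must match the displayed icon.

```python
# Hypothetical answer key: the target icon 19 has an opening direction,
# and the user answers with a button press or a recognized spoken word.
ANSWERS = {"up": "U", "down": "D", "left": "L", "right": "R"}

def check_response(icon_direction: str, user_input: str) -> bool:
    """True when the button press (or spoken word) matches the icon."""
    return (ANSWERS.get(icon_direction) == user_input.upper()
            or icon_direction == user_input.lower())

def acuity_score(trials):
    """Fraction of correct answers over (icon_direction, user_input) trials,
    usable as raw input for the eyesight determination."""
    if not trials:
        return 0.0
    correct = sum(check_response(d, a) for d, a in trials)
    return correct / len(trials)
```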
- the second content may be content for maintaining or improving the user's field of vision.
- the second content may be game content simulating a shooting game such as that shown as an example in FIG. 10.
- a configuration can be applied that allows the user to perform exercises that involve large eye movements up, down, left and right.
- a configuration can be adopted in which the first object 17 moves in the surrounding area R2, and a score is added when the aim that follows the line of sight overlaps with the first object 17.
- the appearance of the second object 18 is not essential.
- the one or more computer processors in the present disclosure may further include a gaze determination unit 150, as shown as an example in FIG. 7.
- the gaze determination unit 150 determines whether the user's gaze position is within a first region R1 ( Figure 10) set in the center of the display area 12 based on gaze information regarding the user's gaze, which is user behavior information received by the receiving unit 120.
- One or more computer processors in the present disclosure may further include an association unit 160, as shown as an example in FIG. 7.
- the association unit 160 associates a first reward with the user information of the user when the gaze determination unit 150 determines that the gaze position is within the first region R1.
- Such first reward may be an addition to the score as described above.
- the one or more computer processors in this disclosure may further include a playback unit 170, as shown as an example in FIG. 7.
- the playback unit 170 plays a predetermined sound on a predetermined display device 200 when the gaze determination unit 150 determines that the gaze position is in the second region R2 (FIG. 10).
- the specified sound can be the alarm sound mentioned above.
- the display processing unit 180 displays in the display area 12 a screen including a first object 16 displayed in a first area R1 set in the center of the display area, a second object 17 different from the first object, the second object 17 moving at least within the first area R1, and a third object 18 different from the first object 16 and the second object 17, the third object 18 being displayed for at least a predetermined period of time in a second area R2 outside the first area R1.
- the one or more computer processors in the present disclosure can change the display position of the display area in response to the movement of the user's head while the first content is being played.
- the center of the display area can always be displayed in the center (front) of the user's field of vision, even if the user moves their head.
- This configuration makes it possible to prevent a decrease in the accuracy of determining the user's state (particularly the state of the field of vision) due to the movement of the user's head.
- one or more computer processors in this disclosure can change the display position of the display area in response to the user's eye movement while playing specific content.
- the center of the display area can always be displayed in front of the user's line of sight, even if the user moves their line of sight.
- This configuration makes it possible to prevent a decrease in the accuracy of determining the user's state (particularly the state of the field of vision) due to the movement of the user's line of sight.
- the display position of the display area can be appropriately adjusted, making it possible to appropriately determine the user's state.
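Keeping the display-area center in front of the user amounts to re-placing it along the current head (or gaze) direction each frame. The Euler-angle formulation and the fixed 2-metre distance below are simplifying assumptions; real HMD runtimes typically work with quaternions.

```python
import math

def display_area_center(head_yaw_deg, head_pitch_deg, distance=2.0):
    """Position of the display-area center, placed `distance` ahead of the
    user along the current head direction so that the center of the display
    area always stays at the center (front) of the field of vision."""
    yaw = math.radians(head_yaw_deg)
    pitch = math.radians(head_pitch_deg)
    x = distance * math.cos(pitch) * math.sin(yaw)
    y = distance * math.sin(pitch)
    z = distance * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)
```

Feeding gaze angles instead of head angles into the same function gives the line-of-sight-following variant described above.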
- a character object that acts according to the user's behavior information can be displayed.
- the receiving unit 120 further receives biometric information of the user, and the determining unit 130 can determine the user's condition based on the user's behavior information and the user's biometric information received by the receiving unit 120.
- biometric information refers to biometric information other than the biometric information contained in the behavior information described above.
- biometric information can include information related to brain waves, pulse rate, body temperature, etc.
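A minimal, hypothetical sketch of how a determination unit might combine behavior information with such biometric information; the field names and thresholds below are assumptions for illustration, not part of the disclosure:

```python
def determine_user_state(behavior, biometrics,
                         miss_threshold=0.3, pulse_range=(50, 110)):
    """Return a coarse list of state flags from behavior and biometric data."""
    flags = []
    # Behavior: fraction of displayed peripheral objects the user missed.
    total = max(behavior.get("total_targets", 1), 1)
    missed_ratio = behavior.get("missed_targets", 0) / total
    if missed_ratio > miss_threshold:
        flags.append("possible_visual_field_issue")
    # Biometrics: pulse rate outside an assumed normal resting range.
    pulse = biometrics.get("pulse_rate")
    if pulse is not None and not (pulse_range[0] <= pulse <= pulse_range[1]):
        flags.append("abnormal_pulse")
    return flags or ["no_issue_detected"]
```

A determination combining both sources, as described above, can flag a visual-field concern even when the biometric readings alone look normal.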
- the notification unit 150 notifies the user terminals of other users associated with the user of the results of the determination made by the determination unit 130.
- the other users may be the user's family, friends, acquaintances, or other users who have a relationship with the user in the real world.
- the other users may also be administrators of a real-world community (such as a company, school, nursing home, hospital, etc.) to which the user belongs.
- the notification unit 150 may notify other users of every result of the determination made by the determination unit 130, or may decide whether or not to notify them depending on the result.
- the notification unit 150 can further notify the user of the status of the service provided based on the guidance information.
- the service provision status refers to whether or not the above-mentioned service has been provided to the user, i.e., whether or not the user has received the service based on the guidance information.
- This configuration can encourage users to receive services.
- the one or more computer processors in the present disclosure can allow access to the virtual space from the user terminals of other users associated with the user.
- one or more computer processors in the present disclosure can enable conversation between the user and other users in the virtual space and/or display of the user's avatar and the avatars of other users.
- the other users need only be associated with the user, and are not limited to the above-mentioned family members, administrative users, etc.
- the other users referred to here can also include the user's friends who exist only in the virtual space and have no relationship with the user in the real world.
- multiple users, including the user and the other users, can access the same virtual space.
- the avatars of the user and the other users may be displayed in the virtual space.
- the avatar may also reflect the facial expressions and/or movements of the user and other users, and may move in response to operations by the user and other users.
- conversations between the user and other users include conversations via voice chat and text conversations via comment input.
- the above configuration encourages communication with other users in the virtual space and helps stimulate the user's brain.
- the service that provides information about the user's condition in the virtual space can be a consultation service with an expert, a food delivery service, and/or an insurance service.
- the expert consultation service allows users to consult with experts such as doctors, nurses, and counselors about their condition via virtual space.
- the food delivery service allows users to request delivery of products they need to improve their condition via virtual space.
- the insurance service proposes insurance corresponding to the user's condition via the virtual space.
- the terms of insurance enrollment (such as the amount) may change depending on the user's condition and the usage status of the service.
- the user can receive services in a virtual space that correspond to the user's status.
- the information processing method disclosed herein is characterized in that one or more computer processors execute a transmission step S110, a reception step S120, and a determination step S130, as shown by way of example in FIG. 12.
- In the transmission step S110, information for displaying a virtual space having a display area in which specific content can be played is transmitted to a specific display device.
- the transmission step S110 can be executed by the above-mentioned transmission unit 110. Details of the transmission unit 110 are as described above.
- the receiving step S120 receives behavior information of a user who is playing back a first content in a display area, the behavior information being sent from a specific display device.
- the receiving step S120 can be executed by the receiving unit 120 described above. Details of the receiving unit 120 are as described above.
- the determination step S130 determines the user's state based on the user behavior information received in the reception step S120.
- the determination step S130 can be executed by the determination unit 130 described above. Details of the determination unit 130 are as described above.
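The three steps S110 to S130 can be sketched, under assumed class and message names (a hypothetical illustration, not the disclosed implementation), as a simple server-side flow:

```python
class InformationProcessingServer:
    """Illustrative server combining the transmission, reception, and
    determination steps (S110-S130). All names are assumptions."""

    def __init__(self, determine_fn):
        self.determine_fn = determine_fn  # pluggable determination logic
        self.results = []

    def transmit(self, display_device):               # step S110
        # Send information for displaying the virtual space with its
        # content-playback display area to the display device.
        display_device.show({"virtual_space": True,
                             "display_area": "first_content"})

    def receive(self, behavior_info):                 # step S120
        # Behavior information sent back from the display device while
        # the first content is playing.
        state = self.determine_fn(behavior_info)      # step S130
        self.results.append(state)
        return state
```

A caller would supply its own determination function and display-device object, keeping the three-step structure of FIG. 12 intact.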
- the above configuration provides a technical improvement that solves or alleviates at least some of the problems with the conventional technology described above.
- the computer program in this disclosure is characterized by causing one or more computer processors to realize a transmission function, a reception function, and a determination function.
- the transmission function transmits information to a specified display device to display a virtual space with a display area capable of playing specific content.
- the receiving function receives behavior information of a user who is playing back a first content in a display area, the behavior information being sent from a specific display device.
- the determination function determines the user's status based on the user's behavior information received by the reception function.
- The above functions can be realized by circuits 1110 to 1130 shown in FIG. 13.
- the transmission circuit 1110, the reception circuit 1120, and the determination circuit 1130 are realized by the transmission unit 110, the reception unit 120, and the determination unit 130 described above, respectively. Details of each unit are as described above.
- the above configuration provides a technical improvement that solves or alleviates at least some of the problems with the conventional technology described above.
- the display device 200 in this disclosure is characterized by having a display processing unit 210, an acquisition unit 220, and a determination unit 230, as shown in FIG. 14.
- the display processing unit 210 displays a virtual space on the display unit that has a display area in which specific content can be played.
- the acquisition unit 220 acquires behavior information of a user while the first content is being played back in the display area.
- the determination unit 230 determines the state of the user based on the user behavior information acquired by the acquisition unit 220.
- the determination unit 230 can have the same configuration as the determination unit 130 described above.
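A corresponding hedged sketch of the display-device side (units 210 to 230); the method names and the placeholder determination logic are assumptions for illustration only:

```python
class DisplayDevice:
    """Illustrative display device with display processing, acquisition,
    and determination roles (units 210, 220, 230)."""

    def __init__(self):
        self.log = []

    def display(self, content_id):            # display processing unit 210
        # Display the virtual space with a display area playing `content_id`.
        self.log.append(("display", content_id))

    def acquire(self, sensor_samples):        # acquisition unit 220
        # e.g. eye-tracking samples captured while the first content plays.
        self.log.append(("acquire", len(sensor_samples)))
        return sensor_samples

    def determine(self, samples):             # determination unit 230
        # Placeholder for logic equivalent to determination unit 130:
        # here, simply flag the absence of any captured responses.
        return "no_response" if not samples else "responded"
```

This local pipeline mirrors the server-side flow, but performs the determination on the device itself.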
- the above configuration provides a technical improvement that solves or alleviates at least some of the problems with the conventional technology described above.
- the information processing method disclosed herein is characterized in that one or more computer processors execute a display processing step S210, an acquisition step S220, and a determination step S230, as shown in FIG. 15.
- In the display processing step S210, a virtual space having a display area in which specific content can be played is displayed on the display unit.
- the display processing step S210 can be executed by the display processing unit 210 described above. Details of the display processing unit 210 are as described above.
- In the acquisition step S220, behavior information of the user while the first content is playing in the display area is acquired.
- the acquisition step S220 can be executed by the acquisition unit 220 described above. Details of the acquisition unit 220 are as described above.
- the determination step S230 determines the user's state based on the user's behavior information acquired in the acquisition step S220.
- the determination step S230 can be executed by the determination unit 230 described above. Details of the determination unit 230 are as described above.
- the above configuration provides a technical improvement that solves or alleviates at least some of the problems with the conventional technology described above.
- the computer program in this disclosure is characterized by causing one or more computer processors to realize a display processing function, an acquisition function, and a determination function.
- the display processing function causes the display unit to display a virtual space in which a first space having a display area capable of playing specific content is provided.
- the acquisition function acquires behavior information of a user while the first content is being played in the display area.
- the determination function determines the user's status based on the user's behavior information acquired by the acquisition function.
- The above functions can be realized by circuits 1210 to 1230 shown in FIG. 16.
- the display processing circuit 1210, the acquisition circuit 1220, and the determination circuit 1230 are realized by the display processing unit 210, the acquisition unit 220, and the determination unit 230 described above, respectively. Details of each unit are as described above.
- the above configuration provides a technical improvement that solves or alleviates at least some of the problems with the conventional technology described above.
- an information processing device such as a computer or a mobile phone can be suitably used to function as the server device or terminal device according to the above-mentioned embodiments.
- Such an information processing device can be realized by storing a program describing the processing contents for realizing each function of the server device or terminal device according to the embodiments in a memory unit of the information processing device, and having the CPU of the information processing device read and execute the program.
- the method described in the embodiments can be stored as a computer-executable program on a recording medium such as a magnetic disk (floppy disk, hard disk, etc.), an optical disk (CD-ROM, DVD, MO, etc.), or semiconductor memory (ROM, RAM, flash memory, etc.), and can also be distributed by transmission via a communication medium.
- the program stored on the medium also includes a setting program that configures the software means (including not only execution programs but also tables and data structures) that the computer executes.
- the computer that realizes this device reads the program recorded on the recording medium, and in some cases, configures the software means using the setting program, and executes the above-mentioned processing by controlling the operation of this software means.
- the recording medium referred to in this specification is not limited to a storage medium for distribution, but also includes storage media such as a magnetic disk or semiconductor memory provided inside the computer or in a device connected via a network.
- the storage unit may function as a main storage device, an auxiliary storage device, or a cache memory, for example.
- Server device
- 110 Transmission unit
- 120 Reception unit
- 130 Determination unit
- 200 Display device
Abstract
An information processing system according to the present invention comprises one or more computer processors, and is characterized in that the one or more computer processors comprise: a transmission unit (110) that transmits, to a prescribed display device, information for displaying a virtual space provided with a display area in which specific content can be played; a reception unit (120) that receives information relating to the behavior of a user during playback of a first content in the display area, the information being transmitted from the prescribed display device; and a determination unit (130) that determines the state of the user on the basis of the information relating to the behavior of the user received by the reception unit.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2023/026906 WO2025022510A1 (fr) | 2023-07-22 | 2023-07-22 | Système de traitement d'informations, procédé de traitement d'informations, dispositif d'affichage et programme informatique |
| PCT/JP2024/025035 WO2025023030A1 (fr) | 2023-07-22 | 2024-07-10 | Système de traitement d'informations, procédé de traitement d'informations, dispositif d'affichage et programme informatique |
| JP2024556702A JP7612130B1 (ja) | 2023-07-22 | 2024-07-10 | 情報処理システム、情報処理方法、表示装置およびコンピュータプログラム |
| JP2024226985A JP2025036489A (ja) | 2023-07-22 | 2024-12-24 | 情報処理システム、情報処理方法、表示装置およびコンピュータプログラム |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2023/026906 WO2025022510A1 (fr) | 2023-07-22 | 2023-07-22 | Système de traitement d'informations, procédé de traitement d'informations, dispositif d'affichage et programme informatique |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025022510A1 true WO2025022510A1 (fr) | 2025-01-30 |
Family
ID=94374165
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/026906 Pending WO2025022510A1 (fr) | 2023-07-22 | 2023-07-22 | Système de traitement d'informations, procédé de traitement d'informations, dispositif d'affichage et programme informatique |
| PCT/JP2024/025035 Pending WO2025023030A1 (fr) | 2023-07-22 | 2024-07-10 | Système de traitement d'informations, procédé de traitement d'informations, dispositif d'affichage et programme informatique |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2024/025035 Pending WO2025023030A1 (fr) | 2023-07-22 | 2024-07-10 | Système de traitement d'informations, procédé de traitement d'informations, dispositif d'affichage et programme informatique |
Country Status (1)
| Country | Link |
|---|---|
| WO (2) | WO2025022510A1 (fr) |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170035317A1 (en) * | 2014-04-17 | 2017-02-09 | The Regents Of The University Of California | Portable brain activity sensing platform for assessment of visual field deficits |
| JP2017058527A (ja) * | 2015-09-16 | 2017-03-23 | 株式会社エクシング | カラオケ装置及びカラオケ用プログラム |
| JP2017204206A (ja) * | 2016-05-13 | 2017-11-16 | 国立大学法人島根大学 | アバターを表示させるスマートフォン装置および健康管理システム |
| JP2017208676A (ja) * | 2016-05-17 | 2017-11-24 | 株式会社コロプラ | 仮想空間を提供する方法、プログラム及び記録媒体 |
| JP7103744B1 (ja) * | 2022-04-01 | 2022-07-20 | 株式会社仙台放送 | 視野評価用情報処理システム、視野評価用情報処理方法、視野評価用情報コンピュータプログラムおよび情報処理装置 |
| JP2022539539A (ja) * | 2019-06-25 | 2022-09-12 | コリア ユニバーシティ リサーチ アンド ビジネス ファウンデーション | 仮想現実ゲーム及び生体信号センサー基盤の前庭眼反射の評価及び再活装置 |
| JP2023032224A (ja) * | 2021-08-26 | 2023-03-09 | 国立大学法人 宮崎大学 | 眼位異常検出システム、眼位異常検出方法、及び、眼位異常検出プログラム |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1407710B1 (fr) * | 2002-10-08 | 2005-08-10 | Inami & Co., Ltd. | System de perimetrie controlé par un ordinateur |
| JP4425666B2 (ja) * | 2004-02-26 | 2010-03-03 | 株式会社バンダイナムコゲームス | プログラム、情報記憶媒体及びゲーム装置 |
| JP7413147B2 (ja) * | 2020-05-21 | 2024-01-15 | キヤノン株式会社 | 画像処理装置、画像処理方法、及びプログラム |
- 2023-07-22: WO PCT/JP2023/026906, published as WO2025022510A1 (active, Pending)
- 2024-07-10: WO PCT/JP2024/025035, published as WO2025023030A1 (active, Pending)
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025023030A1 (fr) | 2025-01-30 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23946596; Country of ref document: EP; Kind code of ref document: A1 |