
WO2020246127A1 - Incentive giving system, incentive giving device, incentive giving method, and incentive management program in a virtual reality space - Google Patents

Incentive giving system, incentive giving device, incentive giving method, and incentive management program in a virtual reality space

Info

Publication number
WO2020246127A1
WO2020246127A1 PCT/JP2020/015497 JP2020015497W WO2020246127A1 WO 2020246127 A1 WO2020246127 A1 WO 2020246127A1 JP 2020015497 W JP2020015497 W JP 2020015497W WO 2020246127 A1 WO2020246127 A1 WO 2020246127A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual
virtual reality
incentive
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/015497
Other languages
English (en)
Japanese (ja)
Inventor
拓宏 水野
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alpha Code Inc
Original Assignee
Alpha Code Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alpha Code Inc filed Critical Alpha Code Inc
Publication of WO2020246127A1 publication Critical patent/WO2020246127A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/69 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/02 Marketing; Price estimation or determination; Fundraising
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to an incentive giving system, an incentive giving device, an incentive giving method, and an incentive management program in a virtual reality space, and is particularly suitable for a system that gives a predetermined incentive to a user who views virtual reality content.
  • An advertisement displayed in a virtual reality (VR) space is called a VR advertisement.
  • For such VR advertisements, as with Internet advertisements placed on websites and the like, it is desirable to enhance the advertising effect as much as possible. Mechanisms have therefore been devised to display advertisements whose content matches the attributes, interests, and behavior of the user viewing the VR image.
  • A system has also been proposed in which a plurality of communities are formed in the VR space in order to increase the number of participants in the events held in each community (see, for example, Patent Document 1).
  • In the VR space providing system described in Patent Document 1, the behavior of a user who participates in a community in the VR space via an avatar (participation in an event held in the community, the user's behavior at the event, wins and losses in games held at the event, and the like) is evaluated, and points that can be used in the VR space are given to the user according to the evaluation.
  • To enhance the advertising effect, a VR advertisement may be displayed repeatedly in the VR space, or may be displayed prominently.
  • However, such a display method is sometimes unpleasant for the user and may be counterproductive as advertising. That is, even when targeted advertising is performed based on information about the user's attributes and behavior, the displayed advertisement does not always match the user's interests and tastes. The more prominent such an advertisement is, the more annoying and unpleasant it appears to users who are not interested in it.
  • As in Patent Document 1, if the user's behavior in a community in the VR space is evaluated and points are given, the user can be motivated to participate in the community. For example, if a community for promoting products and services is created in the VR space and points are given to users who participate in the community and perform a predetermined action, the number of users who participate in the community can be expected to increase, and the sales promotion effect can be expected to increase to some extent.
  • The present invention has been made to solve such problems, and its purpose is to effectively give an incentive to a user who experiences content in the VR space so that the user readily feels motivated to do so.
  • In the present invention, it is determined whether or not a virtual user corresponding to a real user exists in the virtual reality space, and when it is determined that the virtual user exists in the virtual reality space, a predetermined incentive is given to the real user.
  • According to the present invention, an incentive is given simply because the virtual user is present in the virtual reality space. In other words, actively acting in the virtual reality space, or clearing a certain condition through such activity, is not a prerequisite for receiving the incentive, so the user can obtain the incentive relatively easily. It is therefore possible to effectively give an incentive to a user who experiences content in the virtual reality space so that the user readily feels motivated.
  • FIG. 1 shows an overall configuration example of a VR image providing system to which the incentive giving system in a virtual reality space according to the first and second embodiments is applied.
  • FIG. 1 is a diagram showing an overall configuration example of a VR image providing system to which the incentive giving system in the virtual reality space according to the first embodiment is applied.
  • The incentive giving system (VR image providing system) according to the first embodiment includes a client device 100 and a server device 200.
  • The client device 100 includes a head-mounted display (HMD) 101 that is worn on the user's head, various sensors 102 to 104, a controller 105 that the user holds and operates, and an arithmetic unit 106. Although a configuration in which the HMD 101 and the arithmetic unit 106 are separate bodies is shown here, the function of the arithmetic unit 106 may be built into the HMD 101.
  • The server device 200 corresponds to the incentive giving device, and the client device 100 is connected to it via a communication network 300 such as the Internet or a mobile phone network. Although only one client device 100 is shown in FIG. 1, there may be a plurality of client devices 100.
  • The HMD 101 included in the client device 100 displays a virtual reality image (VR image) of the virtual reality space (VR space).
  • The HMD 101 may be of any type. That is, the HMD 101 may be a binocular type or a monocular type. It may be a non-transparent type that completely covers the eyes or a transparent type that does not. It may also be a goggle type, a glasses type, or a hat type.
  • The various sensors 102 to 104 are a motion detection sensor 102 mounted on the HMD 101 to detect the movement of the user's head, a line-of-sight detection sensor 103 mounted on the HMD 101 to detect the movement of the user's line of sight, and a motion detection sensor 104 mounted on the controller 105 to detect the movement of the user's hand.
  • The motion detection sensors 102 and 104 are known sensors configured by combining an acceleration sensor, a gyro sensor, and the like, and detect acceleration and angular velocity according to changes in the direction and speed of movement of an object, changes in its posture, and so on. The resulting motion detection information is wirelessly transmitted to the arithmetic unit 106 via the HMD 101 or the controller 105.
  • The HMD 101 and the arithmetic unit 106, as well as the controller 105 and the arithmetic unit 106, are connected by a short-range wireless communication means such as Bluetooth (registered trademark).
  • The line-of-sight detection sensor 103 is a known sensor configured by combining, for example, a small camera and an image processing device. It detects eye movement by image recognition processing on captured images, and the resulting movement detection information is wirelessly transmitted to the arithmetic unit 106 via the HMD 101.
  • Alternatively, the line-of-sight detection sensor 103 may be composed of transparent optical sensors arranged side by side on the spectacle lenses of the HMD 101.
  • The controller 105 is used by the user to give desired instructions to the arithmetic unit 106 and is provided with predetermined operation buttons. When operated by the user, the controller 105 wirelessly transmits instruction information corresponding to the operation to the arithmetic unit 106.
  • The arithmetic unit 106 is composed of, for example, a smartphone, a tablet terminal, or a personal computer. Based on the motion detection information supplied from the various sensors 102 to 104, the arithmetic unit 106 generates user state information, described later, and transmits it to the server device 200.
  • The arithmetic unit 106 also transmits the instruction information supplied from the controller 105 to the server device 200. Further, the arithmetic unit 106 receives from the server device 200 a VR image that varies according to the user state information and the instruction information, and displays it on the HMD 101.
  • FIG. 2 is a block diagram showing a functional configuration example of the arithmetic unit 106 included in the client device 100. Note that FIG. 2 shows only the main functional configurations related to the subject of the first embodiment, and configurations not directly related to it are omitted. For example, the functional configuration for processing the instruction information supplied from the controller 105 is not shown. The same applies to the functional configuration of the server device 200 in FIG. 3, described later.
  • As shown in FIG. 2, the arithmetic unit 106 includes, as functional configurations, a communication unit 10, a head movement detection unit 11, a line-of-sight detection unit 12, a hand movement detection unit 13, a user state information generation unit 14, a VR image acquisition unit 15, and a display control unit 16.
  • These functional blocks 11 to 16 can be implemented in hardware, in a DSP (Digital Signal Processor), or in software. When implemented in software, each of the functional blocks 11 to 16 is actually realized by a computer CPU, RAM, ROM, and the like, through the operation of a program stored in a recording medium such as a RAM, a ROM, a hard disk, or a semiconductor memory.
  • The communication unit 10 communicates bidirectionally with the server device 200 via the communication network 300.
  • The head movement detection unit 11 detects the movement of the direction and posture of the head of the user wearing the HMD 101 (corresponding to the "real user" in the claims; in the following description, the term "user" by itself means the real user), based on the motion detection information transmitted from the motion detection sensor 102 mounted on the HMD 101.
  • The line-of-sight detection unit 12 detects the movement of the direction in which the line of sight of the user wearing the HMD 101 is facing, based on the motion detection information transmitted from the line-of-sight detection sensor 103 mounted on the HMD 101.
  • The hand movement detection unit 13 detects the movement of the direction and posture of the hand of the user holding the controller 105, based on the motion detection information transmitted from the motion detection sensor 104 mounted on the controller 105.
  • The user state information generation unit 14 generates user state information indicating the movement state of each part of the user, based on the detection results of the head movement detection unit 11, the line-of-sight detection unit 12, and the hand movement detection unit 13.
  • The format of the user state information is arbitrary. For example, the user state information can be generated as a single piece of information having fields indicating the movement of the head, the movement of the line of sight, and the movement of the hand. Alternatively, separate pieces of user state information for the movement of the head, the movement of the line of sight, and the movement of the hand may be generated individually.
  • The user state information generation unit 14 transmits the generated user state information to the server device 200 via the communication unit 10, and repeats this processing at predetermined time intervals.
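  • As one concrete illustration of this flow, the following minimal Python sketch shows how the three detector outputs could be packed into a single user-state record and sent at a fixed interval. It is not taken from the patent: the field names, the sensor-reading callables, and the transmission interval are assumptions introduced only for illustration.

    import json
    import time
    from dataclasses import dataclass, asdict

    @dataclass
    class UserState:
        """One user-state record: head, gaze, and hand movement (hypothetical fields)."""
        head_orientation: tuple[float, float, float]   # yaw, pitch, roll from sensor 102
        gaze_direction: tuple[float, float]            # horizontal/vertical gaze from sensor 103
        hand_orientation: tuple[float, float, float]   # yaw, pitch, roll from sensor 104
        timestamp: float

    def generate_user_state(read_head, read_gaze, read_hand) -> UserState:
        """Combine the three detector outputs into a single user-state record."""
        return UserState(read_head(), read_gaze(), read_hand(), time.time())

    def run_state_loop(read_head, read_gaze, read_hand, send_to_server, interval_s=0.1):
        """Repeatedly build user state information and push it to the server."""
        while True:
            state = generate_user_state(read_head, read_gaze, read_hand)
            send_to_server(json.dumps(asdict(state)))  # e.g. handed to the communication unit
            time.sleep(interval_s)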
  • As described above, the client device 100 detects the movement of the head and the line of sight of the user wearing the HMD 101 and the movement of the hand holding the controller 105 with the various sensors 102 to 104, and transmits user state information based on the detection results to the server device 200. In addition, although the related functional configuration is not shown in FIG. 2, the client device 100 transmits the instruction information generated when the user operates the controller 105 to the server device 200.
  • The VR image acquisition unit 15 transmits a VR image provision request to the server device 200 and, in response, acquires a VR image from the server device 200 via the communication unit 10.
  • The server device 200 provides a Web portal site through which a plurality of VR contents that can be viewed on the client device 100 are offered. The client device 100 accesses the portal site, logs in, and the user selects desired VR content from the plurality of VR contents.
  • The VR image acquisition unit 15 then transmits to the server device 200 a provision request for the VR image corresponding to the selected VR content, and acquires the corresponding VR image from the server device 200.
  • The VR content selection operation may be performed using an operation unit (touch panel, keyboard, mouse, etc.) included in the arithmetic unit 106, or may be performed using the controller 105.
  • When the selection operation is performed using the operation unit, the user executes the selection operation while looking at the screen of the portal site displayed on a display provided in the arithmetic unit 106, without wearing the HMD 101.
  • When the selection operation is performed using the controller 105, the user executes the selection operation while wearing the HMD 101 and looking at the screen of the portal site displayed on the HMD 101.
  • The display control unit 16 controls the HMD 101 so as to display the VR image acquired by the VR image acquisition unit 15.
  • As described above, the VR image is an image that varies according to the user state information and the instruction information. For example, the field of view in the three-dimensional VR space varies according to the movement of the user's head and the movement of the line of sight indicated by the user state information. In addition, according to the instruction information, the view of the VR space changes, the state of objects existing in the VR space changes, and the scene in the VR space changes.
  • FIG. 3 is a block diagram showing a functional configuration example of the server device 200 according to the first embodiment.
  • As shown in FIG. 3, the server device 200 includes, as functional configurations, a communication unit 20, an image provision request acquisition unit 21, a user state information acquisition unit 22, a VR image providing unit 23 (corresponding to the virtual reality image providing unit in the claims), a user presence determination unit 24, and an incentive giving unit 25.
  • The server device 200 also includes a VR image storage unit 27 and a user information storage unit 28 as storage media.
  • Each of the above functional blocks 21 to 25 can be configured by any of hardware, DSP, and software.
  • When implemented in software, each of the functional blocks 21 to 25 is actually realized by a computer CPU, RAM, ROM, and the like, through the operation of a program (including the incentive management program) stored in a recording medium such as a RAM, a ROM, a hard disk, or a semiconductor memory.
  • The communication unit 20 communicates bidirectionally with the arithmetic unit 106 of the client device 100 via the communication network 300.
  • The image provision request acquisition unit 21 acquires the image provision request transmitted from the client device 100.
  • The image provision request is information transmitted from the client device 100 to the server device 200 when the user selects desired VR content from the plurality of VR contents available on the portal site provided by the server device 200 to the client device 100.
  • The image provision request includes identification information that can uniquely identify the user who made the request (for example, the user ID used when accessing and logging in to the portal site).
  • The user state information acquisition unit 22 sequentially acquires the user state information that is repeatedly transmitted by the user state information generation unit 14 of the client device 100.
  • The VR image providing unit 23 provides the client device 100 with a VR image that varies according to the behavior of the user who is viewing the VR content. That is, when the image provision request acquisition unit 21 acquires the image provision request transmitted from the client device 100, the VR image providing unit 23 generates a VR image that varies according to the user state information and the instruction information transmitted from the client device 100, and transmits it to the client device 100. The manner in which the VR image varies based on the user state information and the instruction information is as described above.
  • The source data of the VR image generated by the VR image providing unit 23 is stored in the VR image storage unit 27. That is, the VR image storage unit 27 stores the image data of the entire VR space, set as the range that can be displayed according to the movement of the user's head and the movement of the line of sight, as well as the image data of the various objects existing in the VR space. The VR image storage unit 27 also stores various data necessary for displaying the VR image on the HMD 101.
  • Based on the information about the movement of the user's head and the movement of the line of sight indicated by the user state information acquired by the user state information acquisition unit 22, the VR image providing unit 23 reads from the VR image storage unit 27 the image data of the VR space and the image data of the objects included in the field of view in the direction the user is looking, generates a VR image of that field of view, and transmits the generated VR image to the client device 100.
  • In the initial state, when the image provision request acquisition unit 21 acquires the image provision request transmitted from the client device 100, the VR image providing unit 23 generates a VR image having a predetermined field of view and transmits it to the client device 100.
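  • The selection of what falls inside the user's field of view can be pictured with the sketch below. It is a simplified illustration, not the patent's rendering pipeline: the combination of head orientation and gaze offset into one view direction, and the cone test against stored object directions, are assumptions made only to show the idea.

    import math

    def view_direction(head_yaw, head_pitch, gaze_yaw=0.0, gaze_pitch=0.0):
        """Combine head orientation and gaze offset (radians) into a unit view vector."""
        yaw, pitch = head_yaw + gaze_yaw, head_pitch + gaze_pitch
        return (math.cos(pitch) * math.sin(yaw),
                math.sin(pitch),
                math.cos(pitch) * math.cos(yaw))

    def objects_in_view(view_dir, objects, fov_deg=100.0):
        """Return the stored objects whose direction from the viewer lies inside the field of view.

        `objects` is a list of (object_id, unit_direction) pairs, standing in for the
        object image data held in the VR image storage unit 27.
        """
        cos_half_fov = math.cos(math.radians(fov_deg / 2.0))
        visible = []
        for object_id, direction in objects:
            dot = sum(a * b for a, b in zip(view_dir, direction))
            if dot >= cos_half_fov:  # inside the viewing cone
                visible.append(object_id)
        return visible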
  • The user presence determination unit 24 determines whether or not a virtual user corresponding to the real user exists in the VR space.
  • Here, the virtual user corresponding to the real user is a virtual user who acts in the VR space as the alter ego of the real user. The virtual user may be displayed in the VR space as an avatar image including the whole body, upper body, face, or the like (part or all of the user's own figure is visualized as an avatar image), or only an image of the virtual user's hand or of the controller may be displayed in the VR space (the user's own figure is not visualized as an avatar image).
  • In the first embodiment, when the image provision request acquisition unit 21 acquires the image provision request transmitted from the client device 100 and the VR image providing unit 23 transmits the VR image in the initial state to the client device 100, the user presence determination unit 24 determines that the virtual user exists in the VR space. That is, when the user selects desired VR content on the portal site, an image provision request is sent to the server device 200, and viewing of the VR content starts, the user presence determination unit 24 determines that a virtual user corresponding to the real user who made the image provision request exists in the VR space. After that, when the real user stops viewing the VR content, the user presence determination unit 24 determines that the virtual user no longer exists in the VR space.
  • The incentive giving unit 25 gives a predetermined incentive to the real user when the user presence determination unit 24 determines that the virtual user exists in the VR space.
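  • The determination rule described above can be read as the following Python sketch. It is an illustrative reading of the embodiment under stated assumptions; the class and method names are hypothetical and not the patent's own API.

    class UserPresenceDeterminationUnit:
        """Tracks which virtual users are currently present in the VR space.

        Presence begins when the initial VR image for a provision request has been sent,
        and ends when the real user stops viewing the VR content.
        """

        def __init__(self):
            self._present_user_ids = set()

        def on_initial_image_sent(self, user_id: str) -> bool:
            """Called after the VR image providing unit sends the initial-state image."""
            self._present_user_ids.add(user_id)
            return True  # the virtual user is now considered present

        def on_viewing_finished(self, user_id: str) -> None:
            """Called when the real user ends viewing of the VR content."""
            self._present_user_ids.discard(user_id)

        def is_present(self, user_id: str) -> bool:
            return user_id in self._present_user_ids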
  • The incentive to be given may be one that the real user can use in the real world, or one that the virtual user can use in the VR space.
  • Incentives that the real user can use in the real world are, for example, coupons that can be used when purchasing a specific product or service, or points issued by a specific business operator. In this case, the incentive giving unit 25 transmits the incentive information representing such a coupon or points to the client device 100 via the communication unit 20, and the client device 100 stores the incentive information acquired from the server device 200. The user can then use the coupon or points by presenting the incentive information at a designated online shop or physical store.
  • Incentives that the virtual user can use in the VR space are, for example, virtual currency and items usable in the VR space. In this case, the incentive giving unit 25 stores such incentive information in the user information storage unit 28.
  • The user information storage unit 28 stores the incentive information given to the user in association with, for example, the user ID used when the user logs in to the portal site.
  • The incentive information stored in the user information storage unit 28 can be used, at the user's request, in the VR content for which the VR image providing unit 23 provides VR images. For example, the use of the incentive information within the VR content is instructed by the user operating the controller 105.
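  • As a minimal sketch of how granted incentives could be kept keyed by the login user ID, consider the following. The record layout and class names are assumptions for illustration only, not the structure of the user information storage unit 28 itself.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Incentive:
        """A granted incentive: either real-world (coupon/points) or in-VR (currency/item)."""
        kind: str       # e.g. "coupon", "points", "virtual_currency", "item"
        value: str      # e.g. a coupon code, a point amount, or an item identifier
        usable_in_vr: bool

    class UserInformationStore:
        """Keeps granted incentives keyed by the user ID used to log in to the portal site."""

        def __init__(self):
            self._incentives_by_user = defaultdict(list)

        def grant(self, user_id: str, incentive: Incentive) -> None:
            self._incentives_by_user[user_id].append(incentive)

        def usable_in_content(self, user_id: str) -> list[Incentive]:
            """Incentives the virtual user can spend inside the VR content."""
            return [i for i in self._incentives_by_user[user_id] if i.usable_in_vr]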
  • The content of the incentives described here is only an example, and the incentives are not limited to these. Likewise, the method of giving an incentive described here is only an example, and the method is not limited to this.
  • The plurality of VR contents selectable on the portal site may include both VR contents targeted for incentives and VR contents that are not.
  • In that case, the processing of the user presence determination unit 24 and the incentive giving unit 25 may be executed only when VR content targeted for incentives is selected.
  • For example, when a certain business operator promotes a product or service, it is conceivable to create VR content targeted for incentives and provide it on the portal site.
  • For example, when a travel company or a transportation company wants to promote travel to a specific tourist destination, it creates VR content that introduces the appeal of that destination, and an incentive is given to the real user (the user viewing the VR content) corresponding to the virtual user existing in the VR space provided by that VR content.
  • In this case, the VR content itself conveys the attractiveness of the tourist destination to the user, and no explicit advertisement image like a conventional Internet advertisement or VR advertisement is displayed in the VR space, so the user is not annoyed or made uncomfortable by advertisement images.
  • The content of the VR content targeted for incentives can take various forms other than sales promotion of tourist destinations as described above.
  • For example, when an administrative agency, or a party requested by an administrative agency, wants to revitalize a town in a specific area, it is also possible to create VR content that conveys the appeal of the area and give incentives to users who view that VR content.
  • Likewise, when a specific company wants to promote sales of a BI (Business Intelligence) tool, it is also possible to create VR content that allows users to experience the functions of the BI tool, and to give incentives to users who view that VR content.
  • The VR space provided to the client device 100 may be a VR space, such as a community, in which a plurality of virtual users corresponding to a plurality of client devices 100 can exist at the same time and in which the virtual users existing at the same time can interact with each other.
  • The dialogue between virtual users may be carried out using, for example, an electronic bulletin board system, or in a format in which a speech-balloon annotation image is added near the avatar image.
  • Alternatively, the client device 100 may further include a microphone and a speaker; the voice spoken by the user is input from the microphone, the voice data is supplied to the arithmetic unit 106, and the voice data is transmitted from the arithmetic unit 106 to the server device 200.
  • In that case, the VR image providing unit 23 of the server device 200 generates a VR image in which the avatar images of the plurality of virtual users corresponding to the plurality of users viewing the VR content are drawn in the VR space, and transmits the voice data of the users' remarks to the client devices 100 together with the VR image. On each client device 100, the VR image is displayed on the HMD 101, and the voice of the remarks is output from the speaker.
  • In such community-type VR content, which enables a plurality of users viewing the same VR content to interact with each other, an incentive is likewise given to the real user corresponding to a virtual user existing in the VR space provided by that VR content.
  • The incentive giving unit 25 may change the way incentives are given according to the length of time the virtual user remains in the VR space after the user presence determination unit 24 determines that the virtual user exists in the VR space. For example, the incentive giving unit 25 may give an incentive when the virtual user enters the VR space and then give an incentive again after a predetermined time has elapsed without the virtual user leaving the VR space. The incentive giving unit 25 may also give an incentive every time a predetermined time elapses after the virtual user enters the VR space. Furthermore, the content of the incentive may be changed according to the length of time the virtual user has existed in the VR space.
  • The incentive giving unit 25 may also change the way incentives are given according to the number of times the user presence determination unit 24 has determined, for a given piece of VR content, that the virtual user exists in the VR space.
  • For example, the incentive giving unit 25 may give an incentive until the number of times the virtual user has entered the VR space exceeds a predetermined value, and stop giving incentives after that number exceeds the predetermined value. Conversely, the incentive may be withheld until the number of times the virtual user has entered the VR space exceeds the predetermined value, and given only after that number exceeds the predetermined value. The content of the incentive may also be changed according to the number of times the virtual user has entered the VR space.
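  • One way such a dwell-time and entry-count policy could look is sketched below. It is only an illustration of the variations just described; the interval, the entry limit, and the grant labels are assumed values, not parameters from the patent.

    import time

    class IncentivePolicy:
        """Grants an incentive on entry, again at fixed dwell-time intervals, and stops
        once a per-content entry limit is exceeded."""

        def __init__(self, interval_s: float = 600.0, max_entries: int = 10):
            self.interval_s = interval_s
            self.max_entries = max_entries
            self._entered_at: dict[str, float] = {}
            self._last_grant_at: dict[str, float] = {}
            self._entry_count: dict[str, int] = {}

        def on_enter(self, user_id: str, grant) -> None:
            self._entry_count[user_id] = self._entry_count.get(user_id, 0) + 1
            self._entered_at[user_id] = self._last_grant_at[user_id] = time.time()
            if self._entry_count[user_id] <= self.max_entries:
                grant(user_id, "entry_bonus")

        def on_tick(self, user_id: str, grant) -> None:
            """Call periodically while the virtual user remains in the VR space."""
            if user_id not in self._entered_at:
                return
            now = time.time()
            if (self._entry_count[user_id] <= self.max_entries
                    and now - self._last_grant_at[user_id] >= self.interval_s):
                self._last_grant_at[user_id] = now
                grant(user_id, "dwell_bonus")

        def on_leave(self, user_id: str) -> None:
            self._entered_at.pop(user_id, None)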
  • Further, when a virtual user who has been invited by a certain virtual user enters the VR space, an incentive may be given to the real user corresponding to that inviting virtual user.
  • The invitation may be made using an invitation function prepared as one of the functions of the VR content. Alternatively, a real user may make the invitation in the real space, and information about who was invited by whom may be added to the user information.
  • FIG. 4 is a flowchart showing an operation example of the server device 200 according to the first embodiment configured as described above.
  • First, the image provision request acquisition unit 21 determines whether or not an image provision request transmitted from the client device 100 has been acquired (step S1). If no image provision request has been acquired, the image provision request acquisition unit 21 repeats the determination in step S1.
  • When the image provision request acquisition unit 21 acquires an image provision request, the VR image providing unit 23 generates a VR image having a predetermined field of view as the initial state (step S2) and transmits the VR image to the client device 100 (step S3). With the transmission of this VR image, the user presence determination unit 24 determines that a virtual user corresponding to the user who transmitted the image provision request (the user identified by the user ID included in the image provision request) exists in the VR space (step S4).
  • In response, the incentive giving unit 25 gives a predetermined incentive to the user who sent the image provision request (step S5).
  • Next, the user state information acquisition unit 22 determines whether or not user state information transmitted from the client device 100 has been acquired (step S6).
  • When the user state information has been acquired, the VR image providing unit 23 generates a VR image that varies according to the user's behavior indicated by the user state information (step S7) and transmits the VR image to the client device 100 (step S8). The process then proceeds to step S9.
  • In step S9, the VR image providing unit 23 determines whether or not the provision of the VR content requested by the user has finished. Provision of the VR content finishes either when the user instructs the end of viewing of the VR content or when playback of the VR content has been completed to the end.
  • If the provision of the VR content has not finished, the process returns to step S6. If it has finished, the user presence determination unit 24 determines that the virtual user corresponding to the user who sent the image provision request no longer exists in the VR space (step S10), and the processing of the flowchart shown in FIG. 4 ends.
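  • Read as pseudocode, the flow of FIG. 4 can be sketched as below. This is an illustrative Python rendering under assumed helper objects and method names (wait_for_request, generate_vr_image, and the presence and incentive units from the sketches above); it is not the patent's implementation.

    def serve_vr_content(server):
        """Illustrative rendering of the FIG. 4 flow (steps S1 to S10)."""
        request = server.wait_for_request()                           # S1: wait for an image provision request
        initial_image = server.generate_initial_vr_image()            # S2: predetermined field of view
        server.send_image(request.client, initial_image)              # S3
        server.presence_unit.on_initial_image_sent(request.user_id)   # S4: virtual user now present
        server.incentive_unit.grant(request.user_id)                  # S5: predetermined incentive

        while not server.content_finished(request):                   # S9: end of viewing or end of playback
            state = server.wait_for_user_state(request.client)        # S6
            if state is None:
                continue
            image = server.generate_vr_image(state)                   # S7: image varies with user behavior
            server.send_image(request.client, image)                  # S8

        server.presence_unit.on_viewing_finished(request.user_id)     # S10: virtual user no longer present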
  • As described above in detail, in the first embodiment it is determined whether or not a virtual user corresponding to a real user exists in the VR space, and when it is determined that the virtual user exists in the VR space, a predetermined incentive is given to the real user.
  • According to the first embodiment configured in this way, an incentive is given simply because the virtual user is present in the VR space. That is, actively acting in an activity or community set up in the VR space, or clearing a certain condition through such activity, is not a prerequisite for receiving the incentive, so the user can obtain the incentive relatively easily. It is therefore possible to effectively give an incentive to a user who experiences the VR content so that the user readily feels motivated.
  • FIG. 5 is a block diagram showing a functional configuration example of the server device 200' according to the second embodiment. In FIG. 5, components bearing the same reference numerals as those shown in FIG. 3 have the same functions, and duplicate descriptions are therefore omitted here.
  • As shown in FIG. 5, the server device 200' according to the second embodiment further includes an appropriateness determination unit 26 as a functional configuration, and includes an incentive giving unit 25' in place of the incentive giving unit 25.
  • The appropriateness determination unit 26 evaluates at least one of the behavior of the virtual user and the variation of the VR image, and determines, based on the evaluation result, whether or not the presence of the virtual user in the VR space is appropriate.
  • The behavior of the virtual user to be evaluated is, for example, as follows: movement of the area that the virtual user pays attention to in the VR space; and movement of the virtual user's hand in the VR space, the posture of the hand, or the positional relationship between both hands.
  • The variation of the VR image to be evaluated is, for example, movement in the VR space (movement of the field of view).
  • The area that the virtual user pays attention to in the VR space (hereinafter referred to as the area of interest) varies depending on the direction of the head and the direction of the line of sight of the user wearing the HMD 101.
  • However, in the case of a user who is not wearing the HMD 101 after entering the VR space, the area of interest does not change at all. If the HMD 101 is placed on some device that moves or vibrates automatically, the area of interest will vary, but it will either repeat variations according to a fixed pattern or move in a way that is impossible for human behavior. Therefore, when the area of interest does not move at all or moves unnaturally in this way, it can be said that the presence of the virtual user may not be appropriate.
  • The hand of the virtual user moves according to the movement of the hand of the user holding the controller 105.
  • However, in the case of a user who is not holding the controller 105 after entering the VR space, the virtual user's hand does not move at all. If the controller 105 is placed on some device that automatically moves or vibrates, the virtual user's hand will move, but it will repeat variations according to a fixed pattern, both hands will move in exactly the same way, or the movement will be impossible for human behavior. Therefore, if the virtual user's hand does not move at all or moves unnaturally in this way, it can be said that the presence of the virtual user may not be appropriate.
  • In addition, the posture of the virtual user's hands or the positional relationship between both hands may become unnatural. That is, when the user is not holding the controller 105 but has placed it somewhere, the posture or positional relationship may be one that would not be possible if the user were actually holding the controller 105. In such a case as well, it can be said that the presence of the virtual user may not be appropriate.
  • The field of view of the VR space varies depending on the direction of the head of the user wearing the HMD 101.
  • However, in the case of a user who is not wearing the HMD 101 after entering the VR space, the field of view of the VR space does not change at all. If the HMD 101 is placed on some device that automatically moves or vibrates, the field of view of the VR space will vary, but it will either repeat variations according to a fixed pattern or move in a way that is impossible for a human head. Therefore, when the field of view of the VR space does not move at all or moves unnaturally in this way, it can likewise be said that the presence of the virtual user may not be appropriate.
  • The appropriateness determination unit 26 evaluates at least one of the behavior of the virtual user and the variation of the VR image illustrated above, and when there is a possibility that the presence of the virtual user is not appropriate as described above, it determines that the presence of the virtual user in the VR space is not appropriate.
  • The behavior of the virtual user and the variation of the VR image can be recognized using, for example, the VR image generated by the VR image providing unit 23. Since the behavior of the virtual user and the variation of the VR image are specified based on the user state information acquired by the user state information acquisition unit 22, the appropriateness determination unit 26 may also determine the appropriateness of the virtual user's presence using the user state information.
  • That is, the appropriateness determination unit 26 evaluates at least one of the behavior of the virtual user and the variation of the VR image based on the user state information transmitted from the client device 100, or on the VR image generated accordingly by the VR image providing unit 23. This evaluation is performed, for example, based on the user state information, or the VR image generated in response to it, obtained during a predetermined time after the user presence determination unit 24 determines that the virtual user exists in the VR space. Alternatively, the evaluation may be performed based on the user state information, or the corresponding VR image, obtained between the time the user presence determination unit 24 determines that the virtual user exists in the VR space and the time the virtual user leaves the VR space.
  • In this case, the incentive giving unit 25' determines whether or not to give the incentive after the evaluation by the appropriateness determination unit 26 has been performed.
  • The content of the evaluation performed by the appropriateness determination unit 26 can be determined arbitrarily. For example, when it is evaluated, for at least one of the movement of the virtual user's area of interest in the VR space, the movement of the virtual user's hand in the VR space, the posture of the hand or the positional relationship between both hands, and the movement of the field of view of the VR space, that the presence of the virtual user may not be appropriate, it is conceivable to determine that the presence of the virtual user in the VR space is not appropriate.
  • Alternatively, it is also possible to score, for each of the movement of the virtual user's area of interest in the VR space, the movement of the virtual user's hand in the VR space, the posture of the hand or the positional relationship between both hands, and the movement of the field of view of the VR space, the likelihood that the presence of the virtual user is not appropriate, and to judge whether or not the presence of the virtual user in the VR space is appropriate based on the total of these scores. The scoring in this case may be performed based on a predetermined function, or may be performed using a learning model constructed by machine learning.
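  • One way to picture such scoring is the following sketch. The concrete signals (a variance-based "no movement" check and a crude periodicity check) and the threshold are illustrative assumptions; as the text notes, a predetermined function or a model trained by machine learning could be used instead.

    import statistics

    def no_movement_score(samples: list[float]) -> float:
        """1.0 if a signal (e.g. head yaw over time) barely changes, else 0.0."""
        return 1.0 if len(samples) > 1 and statistics.pvariance(samples) < 1e-6 else 0.0

    def periodic_score(samples: list[float], period: int) -> float:
        """Crude check for a fixed repeating pattern of the given period."""
        if len(samples) < 2 * period:
            return 0.0
        diffs = [abs(samples[i] - samples[i - period]) for i in range(period, len(samples))]
        return 1.0 if max(diffs) < 1e-3 else 0.0

    def presence_is_appropriate(attention, hand, view, threshold: float = 1.5) -> bool:
        """Sum per-signal suspicion scores; presence is judged inappropriate above threshold.

        `attention`, `hand`, and `view` are lists of scalar samples standing in for the
        area-of-interest movement, hand movement, and field-of-view movement respectively.
        """
        total = 0.0
        for signal in (attention, hand, view):
            total += no_movement_score(signal)
            total += periodic_score(signal, period=10)
        return total < threshold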
  • In the above description, the movement of the virtual user's area of interest in the VR space, the movement of the virtual user's hand in the VR space, the posture of the hand, and the positional relationship between both hands have been shown as examples of the behavior of the virtual user to be evaluated, but the behavior is not limited to these.
  • For example, the movement of the virtual user's body, the posture of the body, or the positional relationship between the body and the hands may be used.
  • In this case, the user wears a motion detection sensor on the body with a belt or the like.
  • The locations on the user's body to which the motion detection sensor is attached via a belt or the like may be any or all of the shoulders, elbows, wrists, hips, knees, ankles, and the like.
  • The client device 100 includes the motion detection information detected by the motion detection sensor attached to the user's body in the user state information and transmits it to the server device 200.
  • The body of the virtual user moves according to the movement of the body of the user wearing the motion detection sensor. However, in the case of a user who does not wear the motion detection sensor on the body after entering the VR space, the body of the virtual user does not move at all. If the motion detection sensor is placed on some device that automatically moves or vibrates, the virtual user's body will move, but it will repeat movements in a fixed pattern or move in a way that is impossible for human behavior. Therefore, when the virtual user's body does not move at all or moves unnaturally in this way, it can be said that the presence of the virtual user may not be appropriate.
  • In addition, the posture of the virtual user's body or the positional relationship between the virtual user's body and both hands may become unnatural. That is, if the user does not wear the motion detection sensor and puts it somewhere, or does not hold the controller 105 and puts it somewhere, the virtual user's body and hands may be in a state that would be impossible for a living human being. In such a case as well, it can be said that the presence of the virtual user may not be appropriate.
  • In addition, the client device 100 may include, for example, a position detection sensor in the HMD 101.
  • In this case, the position of the virtual user in the VR space changes according to the movement of the user wearing the HMD 101 equipped with the position detection sensor.
  • However, in the case of a user who is not wearing the HMD 101 after entering the VR space, the position of the virtual user does not move at all. Therefore, if the position of the virtual user does not move at all in this way, it can be said that the presence of the virtual user may not be appropriate.
  • Furthermore, the remarks made by the virtual user in community-type VR content may be used for the evaluation.
  • For example, when the virtual user repeats the same remarks, makes remarks that have no semantic connection to the remarks of others, or makes unintelligible remarks, it can be said that the presence of the virtual user may not be appropriate.
  • In the above description, the movement of the field of view in the VR space has been shown as an example of the variation of the VR image to be evaluated, but the variation is not limited to this.
  • For example, the movement of an object image displayed in the VR space may be used.
  • An object image displayed in the VR space can be changed by operating the controller 105 (hand movements or button operations of the user holding the controller 105).
  • However, in the case of a user who does not hold the controller 105 after entering the VR space, the controller 105 is not operated, so no object moves in response to operations of the controller 105. Therefore, when no object exhibits movement that varies according to operation of the controller 105, it can be said that the presence of the virtual user may not be appropriate.
  • According to the second embodiment configured as described above, it is possible to prevent incentives from being given to users who remain in the VR space only for the purpose of acquiring incentives, so that incentives can be given more effectively.
  • In the first and second embodiments described above, an example has been described in which an incentive is given to the virtual user simply for being in the VR space; however, a larger incentive may be given for actively participating in an activity carried out in the VR space. Even in this case, since participation in the activity itself is not a condition for giving the incentive, it is possible to give incentives to users who do not participate in the activity, while giving larger incentives to users who actively participate.
  • In the above embodiments, an example has been described in which the server device 200 provides the client device 100 with a Web portal site offering a plurality of VR contents and the user selects desired VR content from the Web portal site, but the present invention is not limited to this.
  • For example, the server device 200 may provide the client device 100 with a VR image of a virtual portal space offering a plurality of VR contents, display the VR image on the HMD 101, and allow the user to select desired VR content within that VR image.
  • In the above embodiments, an example has been described in which an image provision request is transmitted from the client device 100 to the server device 200 and a VR image generated by the server device 200 is transmitted to the client device 100, but the present invention is not limited to this.
  • For example, a portal screen displaying a list of selection interfaces corresponding to a plurality of VR contents installed in the arithmetic unit 106 of the client device 100 may be displayed on the HMD 101 or on the display of the arithmetic unit 106, the user may select the desired VR content, and the VR image of the selected VR content may be generated by the arithmetic unit 106.
  • In this case, the arithmetic unit 106 further includes the functions of the image provision request acquisition unit 21, the user state information acquisition unit 22, the VR image providing unit 23, the user presence determination unit 24, and the appropriateness determination unit 26. The VR image providing unit 23 then generates a virtual reality image that varies according to the user state information based on the detection results of the various sensors 102 to 104 and displays it on the HMD 101. Further, when the image provision request issued internally by the client device 100 is received and the VR image providing unit 23 generates the VR image in the initial state and displays it on the HMD 101, the user presence determination unit 24 determines that the virtual user exists in the VR space and notifies the server device 200 (200') of the determination result.
  • The appropriateness determination unit 26 also notifies the server device 200 (200') of its determination result.
  • The incentive giving unit 25 (25') of the server device 200 (200') receives the determination results of the user presence determination unit 24 and the appropriateness determination unit 26 from the client device 100, and determines whether or not to give the incentive.
  • In the above embodiments, a configuration in which one piece of VR content is selected from a plurality of VR contents has been described, but a configuration in which a single piece of VR content is simply designated and played back may also be used.
  • Any configuration that detects when VR content is viewed by the user (that is, when the virtual user enters the VR space provided by the VR content) and gives an incentive is included in the scope of the present invention.
  • The first and second embodiments described above are merely examples of implementations of the present invention, and the technical scope of the present invention must not be interpreted as being limited by them. That is, the present invention can be implemented in various forms without departing from its gist or its main features.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Finance (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Development Economics (AREA)
  • Accounting & Taxation (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Engineering & Computer Science (AREA)
  • Game Theory and Decision Science (AREA)
  • Software Systems (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention comprises: a user presence determination unit (24) that determines whether a virtual user corresponding to a real user is present in a virtual reality (VR) space; and an incentive giving unit (25) that, when it is determined that the virtual user is present in the virtual reality space, gives a prescribed incentive to the real user, the incentive being granted to the real user simply because the virtual user is present in the virtual reality space. Consequently, acting actively in the virtual reality space, or clearing a prescribed condition through such activity, is not a condition for granting the incentive, and the user can obtain the incentive relatively easily.
PCT/JP2020/015497 2019-06-07 2020-04-06 Incentive giving system, incentive giving device, incentive giving method, and incentive management program in a virtual reality space Ceased WO2020246127A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-106783 2019-06-07
JP2019106783A JP7395172B2 (ja) 2019-06-07 2019-06-07 仮想現実空間におけるインセンティブ付与システム、インセンティブ付与装置、インセンティブ付与方法およびインセンティブ管理用プログラム

Publications (1)

Publication Number Publication Date
WO2020246127A1 true WO2020246127A1 (fr) 2020-12-10

Family

ID=73652523

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/015497 Ceased WO2020246127A1 (fr) 2019-06-07 2020-04-06 Incentive giving system, incentive giving device, incentive giving method, and incentive management program in a virtual reality space

Country Status (2)

Country Link
JP (1) JP7395172B2 (fr)
WO (1) WO2020246127A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114844934A (zh) * 2022-04-28 2022-08-02 北京北建大科技有限公司 基于云渲染的多人大空间vr互动场景搭建方法
CN115082101A (zh) * 2021-03-16 2022-09-20 本田技研工业株式会社 信息处理装置、信息处理方法及存储介质

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7254112B2 (ja) * 2021-03-19 2023-04-07 本田技研工業株式会社 仮想体験提供装置、仮想体験提供方法、及びプログラム
JP7752962B2 (ja) * 2021-05-17 2025-10-14 株式会社バンダイナムコエンターテインメント ゲーム制御システム及びゲーム制御プログラム
JP7387039B1 (ja) 2023-01-25 2023-11-27 Kddi株式会社 情報処理装置及び情報処理方法
KR102741006B1 (ko) * 2023-06-27 2024-12-10 주식회사 맥스트 메타버스 내의 라이브홈쇼핑 제공방법 및 장치
WO2024248227A1 (fr) * 2023-06-02 2024-12-05 주식회사 맥스트 Procédé et appareil pour fournir un contenu lié à la réalité dans un métavers

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080091692A1 (en) * 2006-06-09 2008-04-17 Christopher Keith Information collection in multi-participant online communities
JP2011083508A (ja) * 2009-10-19 2011-04-28 Smile-Lab Co Ltd ビデオゲーム制御サーバ、ビデオゲーム制御方法、およびビデオゲーム制御プログラム
JP2012014676A (ja) * 2010-05-31 2012-01-19 Sony Computer Entertainment Inc 仮想現実空間提供システム、仮想現実空間提供方法およびそのプログラム
US20120270620A1 (en) * 2008-09-26 2012-10-25 International Business Machines Corporation Avatar protection within a virtual universe
US20130110952A1 (en) * 2008-09-23 2013-05-02 International Business Machines Corporation Managing virtual universe avatar behavior ratings
JP2017027477A (ja) * 2015-07-24 2017-02-02 株式会社オプティム 3次元出力サーバ、3次元出力方法及び3次元出力サーバ用プログラム。
JP2018073172A (ja) * 2016-10-31 2018-05-10 株式会社ソニー・インタラクティブエンタテインメント 情報処理装置および画像生成方法
JP2018092416A (ja) * 2016-12-05 2018-06-14 株式会社コロプラ 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム
WO2018225218A1 (fr) * 2017-06-08 2018-12-13 株式会社ソニー・インタラクティブエンタテインメント Dispositif de traitement d'informations et procédé de génération d'image

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006004324A (ja) * 2004-06-21 2006-01-05 Ritz International:Kk コミュニティ運営システム
KR100729459B1 (ko) * 2004-06-28 2007-06-15 에스케이 텔레콤주식회사 아바타 통신 서비스 시스템
JP5043740B2 (ja) * 2008-04-03 2012-10-10 インターナショナル・ビジネス・マシーンズ・コーポレーション 仮想空間においてガイドするためのガイド提供装置、方法及びプログラム
EP3012743A4 (fr) * 2013-06-18 2017-03-08 Sony Corporation Dispositif de traitement d'informations, méthode de traitement d'informations et programme
JP2015071076A (ja) * 2014-12-10 2015-04-16 株式会社コナミデジタルエンタテインメント ゲーム管理装置、ゲームシステム、ゲーム管理方法及びプログラム
JP2018075260A (ja) * 2016-11-10 2018-05-17 株式会社バンダイナムコエンターテインメント ゲームシステム及びプログラム
JP2018181306A (ja) * 2018-02-07 2018-11-15 株式会社コロプラ 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080091692A1 (en) * 2006-06-09 2008-04-17 Christopher Keith Information collection in multi-participant online communities
US20130110952A1 (en) * 2008-09-23 2013-05-02 International Business Machines Corporation Managing virtual universe avatar behavior ratings
US20120270620A1 (en) * 2008-09-26 2012-10-25 International Business Machines Corporation Avatar protection within a virtual universe
JP2011083508A (ja) * 2009-10-19 2011-04-28 Smile-Lab Co Ltd ビデオゲーム制御サーバ、ビデオゲーム制御方法、およびビデオゲーム制御プログラム
JP2012014676A (ja) * 2010-05-31 2012-01-19 Sony Computer Entertainment Inc 仮想現実空間提供システム、仮想現実空間提供方法およびそのプログラム
JP2017027477A (ja) * 2015-07-24 2017-02-02 株式会社オプティム 3次元出力サーバ、3次元出力方法及び3次元出力サーバ用プログラム。
JP2018073172A (ja) * 2016-10-31 2018-05-10 株式会社ソニー・インタラクティブエンタテインメント 情報処理装置および画像生成方法
JP2018092416A (ja) * 2016-12-05 2018-06-14 株式会社コロプラ 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム
WO2018225218A1 (fr) * 2017-06-08 2018-12-13 株式会社ソニー・インタラクティブエンタテインメント Dispositif de traitement d'informations et procédé de génération d'image

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115082101A (zh) * 2021-03-16 2022-09-20 本田技研工业株式会社 信息处理装置、信息处理方法及存储介质
CN114844934A (zh) * 2022-04-28 2022-08-02 北京北建大科技有限公司 基于云渲染的多人大空间vr互动场景搭建方法

Also Published As

Publication number Publication date
JP7395172B2 (ja) 2023-12-11
JP2020201619A (ja) 2020-12-17

Similar Documents

Publication Publication Date Title
JP7395172B2 (ja) 仮想現実空間におけるインセンティブ付与システム、インセンティブ付与装置、インセンティブ付与方法およびインセンティブ管理用プログラム
US12366929B2 (en) Systems, methods, and apparatus for enhanced peripherals
US9906474B2 (en) Automated selection of appropriate information based on a computer user's context
US7073129B1 (en) Automated selection of appropriate information based on a computer user's context
US20170103440A1 (en) Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command
JP7742613B2 (ja) 情報処理システム、情報処理方法、情報処理プログラム
US12357915B2 (en) Advertising display system, method for displaying advertising in virtual space, and calculating fees for the advertising
Brito et al. Augmented reality versus conventional interface: is there any difference in effectiveness?
WO2018122709A1 (fr) Dispositif de communication portable à lunettes à réalité augmentée comprenant un téléphone mobile et un dispositif informatique mobile contrôlé par un geste tactile virtuel et une commande neuronale
US20230020633A1 (en) Information processing device and method for medium drawing in a virtual system
JP2022095357A (ja) サービス提供システムおよびサービス提供方法
JP2022138645A (ja) サービス提供システムおよび広告方法
Tiusanen Virtual reality in destination marketing
US20090258336A1 (en) System and method for development of interpersonal communication
JP2022091892A (ja) 情報処理システム、情報処理方法、情報処理プログラム
JP7592793B1 (ja) プログラム、情報処理装置および情報処理方法
JP7750294B2 (ja) 情報処理装置、情報処理方法、およびプログラム
Eghbali Social acceptability of virtual reality interaction: Experiential factors and design implications
JP7731938B2 (ja) プログラム、情報処理装置および情報処理方法
US20240378802A1 (en) Information processing method, server, and information processing system
Sherwood Designing to support impression management

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20817200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20817200

Country of ref document: EP

Kind code of ref document: A1