
WO2025206103A1 - Information processing system, method, and program - Google Patents

Information processing system, method, and program

Info

Publication number
WO2025206103A1
WO2025206103A1 (PCT/JP2025/012332)
Authority
WO
WIPO (PCT)
Prior art keywords
scene
information processing
processing system
virtual space
event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2025/012332
Other languages
French (fr)
Inventor
Masayuki Inoue
Yukio YAKUSHIJIN
Tatsumi MIYAKE
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of WO2025206103A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/203D [Three Dimensional] animation
    • G06T13/403D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings

Definitions

  • the present disclosure relates to an information processing system, a method, and a program.
  • a motion capture technology for extracting a human skeleton position (each of a plurality of points corresponding to each part of a body, such as a head, a torso, limbs, and the like) from a captured image and recording a change in skeleton position as a human operation is known.
  • Such a motion capture technology is recently used in sports games and the like.
  • a virtual object having motion of a player in a real space is arranged in a virtual space, and thus, a game in the real space can be represented in the virtual space.
  • a technology for installing a virtual advertisement in a virtual space is known as disclosed in PTL 1.
  • the present disclosure proposes a new and improved technology capable of improving an appeal effect of a virtual advertisement.
  • an information processing method including transmitting content data of an event to a user terminal, whether or not each scene of a plurality of scenes of the event is an attention scene being determined based on the content data, a specified scene of the plurality of scenes and an object being displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and the object being displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object being displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
  • a non-transitory computer-readable medium having embodied thereon a program, which when executed by a computer causes the computer to execute an information processing method, the method including transmitting content data of an event to a user terminal, whether or not each scene of a plurality of scenes of the event is an attention scene being determined based on the content data, a specified scene of the plurality of scenes and an object being displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and the object being displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object being displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
  • Fig. 1 is an explanatory diagram illustrating a configuration of an information processing system 1 according to an embodiment of the present disclosure.
  • Fig. 2 is an explanatory diagram illustrating a display screen D1 of a virtual space generated and displayed by a user terminal 20.
  • Fig. 3 is an explanatory diagram illustrating a configuration of a content server 10 according to an embodiment of the present disclosure.
  • Fig. 4 is an explanatory diagram illustrating a specific example of event metadata.
  • Fig. 5 is an explanatory diagram illustrating a configuration of the user terminal 20 according to an embodiment of the present disclosure.
  • Fig. 6 is an explanatory diagram illustrating a display screen D2 of the virtual space generated and displayed by the user terminal 20.
  • FIG. 7 is a flowchart illustrating a flow of a first display control example according to an embodiment of the present disclosure.
  • Fig. 8 illustrates a display screen D3 of the virtual space as a specific example of a display screen of a virtual space generated in a case where a distance between a position of a virtual viewpoint and an arrangement position of an advertisement object is a first distance less than a threshold value.
  • Fig. 9 illustrates a display screen D4 of the virtual space as a specific example of a display screen of a virtual space generated in a case where the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object is a second distance equal to or more than the threshold value.
  • the present disclosure relates to an information processing system that generates and displays a display screen of a virtual space.
  • the present disclosure relates to an information processing system that generates and displays a display screen of a virtual space that reproduces an event in a real space.
  • as an example of the event described above, an example in which an embodiment of the present disclosure is applied to a soccer game will be mainly described, but the event is not limited to the soccer game.
  • embodiments of the present disclosure can be applied to various events such as other sports games, concerts, plays, chats, forums, interviews, promotional events, and various contests.
  • Fig. 1 is an explanatory diagram illustrating a configuration of an information processing system 1 according to an embodiment of the present disclosure.
  • the information processing system 1 includes a content server 10 and a plurality of user terminals 20.
  • the information processing system 1 may include only one user terminal of the plurality of user terminals 20, only the plurality of user terminals 20, only the content server 10, or both the content server 10 and at least one of the plurality of user terminals 20.
  • the content server 10 and the plurality of user terminals 20 are connected to communicate with each other via a communication network 82.
  • the communication network 82 is a wired or wireless transmission path of information transmitted from a device connected to the communication network 82.
  • the communication network 82 may include a public network such as the Internet, a telephone network, or a satellite communication network, various local area networks (LANs) including Ethernet (registered trademark), a wide area network (WAN), and the like.
  • the communication network 82 may include a dedicated line network such as an Internet protocol-virtual private network (IP-VPN).
  • the motion data of the player is time-series data of bone data indicating a position and an orientation of each joint of the player.
  • the motion data also includes time-series data about a position of the player.
  • the motion data of the ball is time-series data about a position of the ball.
  • These pieces of motion data can be acquired by analyzing pieces of data obtained from a plurality of sensors installed in a soccer stadium in the real space. The analysis can include tracking of each player and ball, adjustment of a data format, deletion of unnecessary data, and the like.
  • the content server 10 may perform the analysis, or another server may perform the analysis.
  • the content server 10 stores content data of a past event (game), and may transmit the stored content data to the user terminal 20, may transmit the content data to the user terminal 20 in real time with the event in the real space, or may transmit the content data to the user terminal 20 with a time delay with respect to the event in the real space.
  • servers may be multiplexed in accordance with the number of terminals connected to the content server 10 in order to distribute a processing load of the content server 10.
  • For example, in a case where a certain virtual space is used by 1000 user terminals 20, 10 content servers 10, each connectable to 100 user terminals 20, are prepared, and each content server 10 provides the same content data. In this way, the same content data can be provided to a total of 1000 user terminals 20, in groups of 100 user terminals per server. In each virtual space, rough elements such as a flow of time are synchronized, but an interaction between users can be performed for each virtual space. Note that, management of multiplexing of the content server 10 may be performed by a back-end system (not illustrated).
  • a server environment of the content server 10 is not particularly limited.
  • the content server 10 may be a physical server or a virtual server executed on the physical server.
  • the physical server or the virtual server described here may be a server provided by a hosting service, or may be a server owned by a business operator who provides a service for providing a virtual space.
  • a function of one content server 10 may be realized by a plurality of physical servers or a plurality of virtual servers.
  • the content server 10 can manage positional information of the user corresponding to each of the plurality of user terminals 20 connected for communication.
  • the user terminal 20 is a terminal used by the user.
  • the user terminal 20 arranges a virtual object in the virtual space by using the content data received from the content server 10, and generates and displays a display screen representing the virtual space.
  • the user terminal 20 arranges a player object that reproduces motion of the player in the real space in the virtual space of the soccer stadium on the basis of the motion data of the player included in the content data.
  • the user terminal 20 arranges a ball object that reproduces motion of the ball in the real space in the virtual space of the soccer stadium on the basis of the motion data of the ball included in the content data.
  • the user terminal 20 arranges an advertisement object, which is a virtual advertisement, in the virtual space.
  • the advertisement object is an object for advertising a product, a service, a company, an association, or the like.
  • the advertisement object may be two-dimensional data such as still image data or moving image data, or may be three-dimensional data.
  • the user terminal 20 can arrange the advertisement object described above around a region in the virtual space corresponding to the predetermined region.
  • Fig. 2 is an explanatory diagram illustrating a display screen D1 of the virtual space generated and displayed by the user terminal 20.
  • the display screen D1 of the virtual space includes display of a plurality of player objects P1 to P6, display of a ball object B, and display of an advertisement object A1.
  • the advertisement object A1 is arranged around a region in the virtual space corresponding to a soccer court in the real space (that is, pitch side).
  • the user terminal 20 arranges the virtual viewpoint on an extension line connecting a goal and the ball.
  • the user terminal 20 arranges the virtual viewpoint at a position with a wide angle of view above and behind the ball.
  • These camera positions and orientations may be operable by the user.
  • in the “Standard” and the “Bird's eye”, the virtual viewpoint is arranged at a position where the player is viewed from above, while in the “GK” and the “Shooter”, the virtual viewpoint is arranged at a viewpoint position of the player.
  • a subjective video of the player is generated as the display screen of the virtual space.
  • the display control unit 252 arranges the player object and the ball object in the virtual space of the soccer stadium indicated by the virtual space model stored in the storage unit 220 on the basis of the motion data included in the content data and the human body model stored in the storage unit 220 (S316).
  • the display control unit 252 determines whether or not the scene to be displayed is the attention scene (S320).
  • the display control unit 252 can make the determination on the basis of the event metadata included in the content data.
  • the display control unit 252 arranges different advertisement objects in the virtual space in accordance with a distance between the position of the virtual viewpoint and an arrangement position of the advertisement object. For example, in a case where the above distance is a first distance, the display control unit 252 may arrange an advertisement object having a larger size in the virtual space than in a case where the above distance is a second distance smaller than the first distance. Alternatively, in a case where the above distance is the first distance, the display control unit 252 may arrange, in the virtual space, an advertisement object that has the same size as in the case of the second distance but includes larger characters or images.
  • the display control unit 252 may arrange advertisement objects having different contents between a case where the above distance is the first distance and a case where the above distance is the second distance smaller than the first distance in the virtual space.
  • Fig. 8 illustrates a display screen D3 of the virtual space as a specific example of the display screen of the virtual space generated in a case where the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object is the first distance less than a threshold value.
  • an advertisement object A3 is arranged in the virtual space, and therefore, the display of the advertisement object A3 is included in the display screen D3 of the virtual space.
  • the advertisement object A3 is an image of shoes, and the shoes include a logo mark. The user who has viewed the display screen D3 of the virtual space can visually recognize the logo mark from the advertisement object A3 and can grasp which manufacturer's shoes the advertisement object A3 depicts.
  • Fig. 9 illustrates a display screen D4 of the virtual space as a specific example of the display screen of the virtual space generated in a case where the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object is the second distance that is equal to or more than the threshold value.
  • an advertisement object A4 is arranged in the virtual space, and therefore, the display of the advertisement object A4 is included in the display screen D4 of the virtual space.
  • the advertisement object A4 is an image of a logo mark. The user who has viewed the display screen D4 of the virtual space can visually recognize the logo mark from the advertisement object A4 and can grasp which manufacturer's advertisement the advertisement object A4 is.
  • a size of the advertisement object A3 illustrated in Fig. 8 in the virtual space is the same as a size of the advertisement object A4 illustrated in Fig. 9 in the virtual space.
  • a display size of the advertisement object A4 on the display screen D4 of the virtual space illustrated in Fig. 9 is smaller than a display size of the advertisement object A3 on the display screen D3 of the virtual space.
  • Fig. 9 illustrates the display of the advertisement object A3 in a case where the advertisement object A3 is arranged in the virtual space as a comparative example.
  • the display of the advertisement object A3 is small, and it is difficult for the user to visually recognize the logo mark. When the user cannot visually recognize the logo mark, an advertising effect is reduced.
  • the advertisement object A4 illustrated in Fig. 9 is arranged in accordance with the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object, and thus, the user can visually recognize the logo mark. As a result, the advertising effect can be enhanced.
  • Fig. 10 is a flowchart illustrating the flow of the second display control example according to an embodiment of the present disclosure.
  • the user performs the reproduction operation including the designation of the game on the operation display unit 230 of the user terminal 20 (S304).
  • the communication unit 210 transmits the transmission request of the content data to the content server 10 (S308).
  • the transmission request includes a game ID (event ID) designated by the user in S304. Therefore, the communication unit 210 receives the content data of the designated game from the content server 10 (S312).
  • the display control unit 252 arranges the player object and the ball object in the virtual space of the soccer stadium indicated by the virtual space model stored in the storage unit 220 on the basis of the motion data included in the content data and the human body model stored in the storage unit 220 (S316).
  • the display control unit 252 determines whether or not the distance between the arrangement position of each advertisement object and the position of the virtual viewpoint is equal to or more than the threshold value (S420). Note that, the position of the virtual viewpoint changes in accordance with which item is selected on the viewpoint selection display M illustrated in Fig. 2 and the like, the situation of the game (for example, the position of the ball object), and the like.
  • the display control unit 252 arranges a fourth advertisement object at the arrangement position in the virtual space (S428).
  • the fourth advertisement object is, for example, the advertisement object A4 illustrated in Fig. 9.
  • the display control unit 252 generates the display screen of the virtual space (S332), and causes the operation display unit 230 to display the display screen of the virtual space (S336).
  • the processing from S312 to S336 described above is repeated until the end of reproduction (S340).
  • the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the scene of the display screen of the virtual space viewed by the user. According to such a configuration, an appropriate advertisement object corresponding to the scene is displayed, and thus, an appeal effect by the advertisement object can be improved.
  • the display control unit 252 can arrange, in the virtual space, different advertisement objects between the attention scene in the event and the normal scene. Since it is considered that the number of recognized advertisement objects increases on the attention scene, an advertiser can realize a desired appeal effect by preparing the advertisement object for the attention scene and the advertisement object for the normal scene based on the increase in the number of recognized advertisement objects. Note that, an advertisement rate for displaying the advertisement object on the attention scene may be set to be higher than an advertisement rate for displaying the advertisement object on the normal scene.
  • the attention scene determination unit 152 can determine whether or not each scene is the attention scene on the basis of the sound data acquired in the event or the number of times of reproduction of each scene. According to such a configuration, an appropriate determination result as to whether or not each scene is the attention scene can be obtained.
  • the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the position of the virtual viewpoint. That is, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the scene in the sense of how the virtual space is viewed. According to such a configuration, an appropriate advertisement object corresponding to the method for viewing the virtual space is displayed, and thus, the appeal effect by the advertisement object can be improved.
  • the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object. Since a relationship between the size of the advertisement object and the display size of the advertisement object changes depending on the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object, the control of the advertisement object corresponding to the above distance is reasonable.
  • the first advertisement object A5 may include a plurality of objects, and the second advertisement object A6 may include any combination of one or more new objects and one or more of the plurality of objects.
  • the one or more of the plurality of objects in the advertisement object A6 may be in the same arrangement as, or a different arrangement from, the one or more of the plurality of objects in the advertisement object A5.
  • the second advertisement object A6 may also be the first advertisement object A5 having a portion thereof changed or covered.
  • the input device 908 includes input means, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever, for a user to input information, an input control circuit which generates an input signal on the basis of the input by the user and outputs the signal to the CPU 901, and the like.
  • the input device 908 is operated, and thus, the user can input various kinds of data and can give an instruction about a processing operation.
  • the advertisement object may be used by another method.
  • the display control unit 252 may arrange the advertisement object in a window different from a window including the display screen of the virtual space.
  • the above-described display screen of the virtual space and electronic commerce may be designed in cooperation with each other.
  • the user terminal 20 may access a server that sells a uniform of the player object and may display a purchase screen of the uniform of the player object.
  • the user can purchase the uniform in which a uniform number and a name of the player are imprinted.
  • circuitry is further configured to: initiate display of different objects based on different distances between a position of a virtual viewpoint and an arrangement position of the different objects in the virtual space.
  • circuitry is further configured to: initiate display of a third object having a third appearance in the virtual space based on a distance between a position of a virtual viewpoint and an arrangement position of the third object in the virtual space being equal to or more than a predetermined threshold; and initiate display of a fourth object having a fourth appearance different from the third appearance in the virtual space based on a distance between another position of the virtual viewpoint and the arrangement position of the fourth object in the virtual space being less than the predetermined threshold.
  • the information processing system according to any one of (1) to (16), wherein the content data includes a degree of interest of each scene of the plurality of scenes, and wherein the circuitry is further configured to determine whether or not each scene of the plurality of scenes of the event is the attention scene based on the degree of interest of each scene of the plurality of scenes.
  • the degree of interest includes at least one of sound data associated with each scene of the plurality of scenes, a number of times a reproduction of each scene of the plurality of scenes is stored in a storage device, a number of comments on each scene of the plurality of scenes, or a number of evaluations of each scene of the plurality of scenes.
  • An information processing method including: transmitting content data of an event to a user terminal, wherein whether or not each scene of a plurality of scenes of an event is an attention scene is determined based on the content data, wherein a specified scene of the plurality of scenes and an object are displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and wherein the object is displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object is displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
  • An information processing system including: a display control unit that generates a display screen of a virtual space in which a virtual object having motion corresponding to motion data of a target is arranged, in which the display control unit arranges different advertisement objects in the virtual space in accordance with a scene of the display screen viewed by a user.
  • the information processing system according to (1) in which the motion data of the target is motion data of the target in an event in a real space, and the display control unit arranges the different advertisement objects in the virtual space between a specific scene and another scene in the event.
  • the information processing system according to (2) further including: a scene determination unit that determines whether or not each scene in the event corresponds to the specific scene on a basis of a reaction to each scene.
  • the information processing system according to (8) in which the display control unit arranges, in the virtual space, an advertisement object having a larger size in a case where the distance is a first distance than in a case where the distance is a second distance smaller than the first distance.
  • the display control unit arranges advertisement objects having different contents in the virtual space between a case where the distance is a first distance and a case where the distance is a second distance smaller than the first distance.
  • the information processing system according to any one of (2) to (5) in which the event is performed by using a predetermined region in the real space, and the display control unit arranges the advertisement object around a region in the virtual space corresponding to the predetermined region.
  • the information processing system according to any one of (1) to (11), further including: a user terminal; and an information processing apparatus, in which the information processing apparatus includes the display control unit, and a communication control unit that controls transmission of the display screen to the user terminal, and the user terminal receives the display screen from the information processing apparatus, and displays the display screen on a display unit.
  • the information processing system according to any one of (1) to (11), further including: a user terminal; and an information processing apparatus, in which the information processing apparatus includes a communication control unit that controls transmission of the motion data of the target to the user terminal, and the user terminal includes the display control unit operated on a basis of the motion data of the target received from the information processing apparatus.
  • a method including: generating, by a processor, a display screen of a virtual space in which a virtual object having motion corresponding to motion data of a target is arranged, in which the generating of the display screen of the virtual space includes arranging different advertisement objects in the virtual space in accordance with a scene of the display screen viewed by a user.
  • a program causing a computer to function as a display control unit that generates a display screen of a virtual space in which a virtual object having motion corresponding to motion data of a target is arranged, in which the display control unit arranges different advertisement objects in the virtual space in accordance with a scene of the display screen viewed by a user.
  • 1 Information processing system, 10 Content server, 110 Communication unit, 120 Storage unit, 150 Control unit, 152 Attention scene determination unit, 154 Communication control unit, 20 User terminal, 210 Communication unit, 220 Storage unit, 230 Operation display unit, 240 Sound output unit, 250 Control unit, 252 Display control unit, 254 Output control unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

There is provided an information processing system including circuitry configured to determine whether or not each scene of a plurality of scenes of an event is an attention scene based on content data of the event, and initiate display of a specified scene of the plurality of scenes and an object in a virtual space based on the determination of whether the specified scene is an attention scene, the object being displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object being displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.

Description

INFORMATION PROCESSING SYSTEM, METHOD, AND PROGRAM CROSS REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of Japanese Priority Patent Application JP 2024-051624 filed on March 27, 2024, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an information processing system, a method, and a program.
Recently, a motion capture technology for extracting a human skeleton position (each of a plurality of points corresponding to each part of a body, such as a head, a torso, limbs, and the like) from a captured image and recording a change in skeleton position as a human operation is known.
Such a motion capture technology is recently used in sports games and the like. For example, a virtual object having motion of a player in a real space is arranged in a virtual space, and thus, a game in the real space can be represented in the virtual space. Furthermore, for example, a technology for installing a virtual advertisement in a virtual space is known as disclosed in PTL 1.
JP 2021-076923A
Summary
However, the technology disclosed in PTL 1 mainly relates to synchronization between an advertisement of an electronic signboard and a virtual advertisement in the virtual space, and the technology disclosed in PTL 1 does not sufficiently improve an appeal effect of the virtual advertisement.
Therefore, the present disclosure proposes a new and improved technology capable of improving an appeal effect of a virtual advertisement.
According to an aspect of the present disclosure, there is provided an information processing system including circuitry configured to determine whether or not each scene of a plurality of scenes of an event is an attention scene based on content data of the event, and initiate display of a specified scene of the plurality of scenes and an object in a virtual space based on the determination of whether the specified scene is an attention scene, the object being displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object being displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
According to another aspect of the present disclosure, there is provided an information processing method including transmitting content data of an event to a user terminal, whether or not each scene of a plurality of scenes of the event is an attention scene being determined based on the content data, a specified scene of the plurality of scenes and an object being displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and the object being displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object being displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable medium having embodied thereon a program, which when executed by a computer causes the computer to execute an information processing method, the method including transmitting content data of an event to a user terminal, whether or not each scene of a plurality of scenes of the event is an attention scene being determined based on the content data, a specified scene of the plurality of scenes and an object being displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and the object being displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object being displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
Fig. 1 is an explanatory diagram illustrating a configuration of an information processing system 1 according to an embodiment of the present disclosure.
Fig. 2 is an explanatory diagram illustrating a display screen D1 of a virtual space generated and displayed by a user terminal 20.
Fig. 3 is an explanatory diagram illustrating a configuration of a content server 10 according to an embodiment of the present disclosure.
Fig. 4 is an explanatory diagram illustrating a specific example of event metadata.
Fig. 5 is an explanatory diagram illustrating a configuration of the user terminal 20 according to an embodiment of the present disclosure.
Fig. 6 is an explanatory diagram illustrating a display screen D2 of the virtual space generated and displayed by the user terminal 20.
Fig. 7 is a flowchart illustrating a flow of a first display control example according to an embodiment of the present disclosure.
Fig. 8 illustrates a display screen D3 of the virtual space as a specific example of a display screen of a virtual space generated in a case where a distance between a position of a virtual viewpoint and an arrangement position of an advertisement object is a first distance less than a threshold value.
Fig. 9 illustrates a display screen D4 of the virtual space as a specific example of a display screen of a virtual space generated in a case where the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object is a second distance equal to or more than the threshold value.
Fig. 10 is a flowchart illustrating a flow of a second display control example according to an embodiment of the present disclosure.
Fig. 11 illustrates a display screen D5 of the virtual space as a specific example of a display screen of a virtual space generated in a case of a first condition of an event, the first condition being a first one of a league, a team, or a location of an event or type of an event.
Fig. 12 illustrates a display screen D6 of the virtual space as a specific example of a display screen of a virtual space generated in a case of a second condition of an event, the second condition being a second one of a league, a team, or a location of an event or a type of an event.
Fig. 13 is a block diagram illustrating an example of a hardware configuration 90.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. Note that, in the present specification and drawings, components having substantially the same functional configuration are denoted by the same reference signs, and redundant description is omitted.
Furthermore, the “mode for carrying out the disclosure” is described according to the order of items described below.
1. Configuration of Information Processing System
2. Configuration of Content Server
3. Configuration of User Terminal
4. Display Control Example
4-1. First Display Control Example
4-2. Second Display Control Example
5. Operation and Effect
6. Hardware Configuration
7. Supplement
<1. Configuration of Information Processing System>
The present disclosure relates to an information processing system that generates and displays a display screen of a virtual space. In particular, the present disclosure relates to an information processing system that generates and displays a display screen of a virtual space that reproduces an event in a real space. Hereinafter, as an example of the event described above, an example in which an embodiment of the present disclosure is applied to a soccer game will be mainly described, but the event is not limited to the soccer game. For example, embodiments of the present disclosure can be applied to various events such as other sports games, concerts, plays, chats, forums, interviews, promotional events, and various contests.
First, a configuration of the information processing system according to an embodiment of the present disclosure will be described with reference to Fig. 1.
Fig. 1 is an explanatory diagram illustrating a configuration of an information processing system 1 according to an embodiment of the present disclosure. As illustrated in Fig. 1, the information processing system 1 according to an embodiment of the present disclosure includes a content server 10 and a plurality of user terminals 20. The information processing system 1 may include only one user terminal of the plurality of user terminals 20, only the plurality of user terminals 20, only the content server 10, or both the content server 10 and at least one of the plurality of user terminals 20.
The content server 10 and the plurality of user terminals 20 are connected to communicate with each other via a communication network 82. The communication network 82 is a wired or wireless transmission path of information transmitted from a device connected to the communication network 82. For example, the communication network 82 may include a public network such as the Internet, a telephone network, or a satellite communication network, various local area networks (LANs) including Ethernet (registered trademark), a wide area network (WAN), and the like. Furthermore, the communication network 82 may include a dedicated line network such as an Internet protocol-virtual private network (IP-VPN).
(Content Server 10)
The content server 10 transmits content data including analysis data of the event in the real space to the user terminal 20. For example, the analysis data includes motion data of a target such as a person and a tool involved in the event. In a case where the event is the soccer game, the analysis data includes motion data of a player and motion data of a ball.
The motion data of the player is time-series data of bone data indicating a position and an orientation of each joint of the player. The motion data also includes time-series data about a position of the player. The motion data of the ball is time-series data about a position of the ball. These pieces of motion data can be acquired by analyzing pieces of data obtained from a plurality of sensors installed in a soccer stadium in the real space. The analysis can include tracking of each player and ball, adjustment of a data format, deletion of unnecessary data, and the like. The content server 10 may perform the analysis, or another server may perform the analysis. Furthermore, the content server 10 stores content data of a past event (game), and may transmit the stored content data to the user terminal 20, may transmit the content data to the user terminal 20 in real time with the event in the real space, or may transmit the content data to the user terminal 20 with a time delay with respect to the event in the real space.
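As a concrete illustration, the motion data described above could be organized as in the following minimal sketch. The class and field names are assumptions made for illustration only; the present publication does not specify a concrete data format.

```python
# Minimal sketch of the motion data described above (names are assumptions).
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]          # position in stadium coordinates
Quat = Tuple[float, float, float, float]   # orientation as a quaternion

@dataclass
class BoneSample:
    joint_name: str        # e.g. "left_knee"
    position: Vec3         # position of the joint
    orientation: Quat      # orientation of the joint

@dataclass
class PlayerFrame:
    timestamp: float           # seconds from the start of the event
    root_position: Vec3        # position of the player on the pitch
    bones: List[BoneSample]    # pose of every joint at this timestamp

@dataclass
class PlayerMotion:
    player_id: str             # identification information of the player
    frames: List[PlayerFrame]  # time-series bone data

@dataclass
class BallMotion:
    frames: List[Tuple[float, Vec3]]  # (timestamp, ball position) samples
```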
In the information processing system 1 according to an embodiment of the present disclosure, servers may be multiplexed in accordance with the number of terminals connected to the content server 10 in order to distribute a processing load of the content server 10. For example, in a case where a certain virtual space is used by 1000 user terminals 20, 10 content servers 10, each connectable to 100 user terminals 20, are prepared, and each content server 10 provides the same content data. In this way, the same content data can be provided to a total of 1000 user terminals 20, in groups of 100 user terminals per server. In each virtual space, rough elements such as a flow of time are synchronized, but an interaction between users can be performed for each virtual space. Note that, management of multiplexing of the content server 10 may be performed by a back-end system (not illustrated).
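The load distribution described above can be pictured as a fixed fan-out of user terminals per server replica. The following minimal sketch assumes a simple capacity-based assignment policy; the actual policy is left to the back-end system and is not specified in the present publication.

```python
# Minimal sketch of assigning user terminals to replicated content servers.
# The capacity-based policy is an assumption; the publication only states
# that servers are multiplexed according to the number of terminals.
def assign_terminal(terminal_index: int,
                    terminals_per_server: int = 100,
                    num_servers: int = 10) -> int:
    """Return the index of the content server that serves this terminal."""
    server = terminal_index // terminals_per_server
    if server >= num_servers:
        raise RuntimeError("all content servers are at capacity")
    return server

# 1000 terminals fill 10 servers with 100 terminals each.
assert assign_terminal(0) == 0 and assign_terminal(999) == 9
```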
Furthermore, a server environment of the content server 10 is not particularly limited. For example, the content server 10 may be a physical server or a virtual server executed on the physical server. The physical server or the virtual server described here may be a server provided by a hosting service, or may be a server owned by a business operator who provides a service for providing a virtual space. Furthermore, a function of one content server 10 may be realized by a plurality of physical servers or a plurality of virtual servers. The content server 10 can manage positional information of the user corresponding to each of the plurality of user terminals 20 connected for communication.
(User Terminal 20)
The user terminal 20 is a terminal used by the user. The user terminal 20 arranges a virtual object in the virtual space by using the content data received from the content server 10, and generates and displays a display screen representing the virtual space. For example, the user terminal 20 arranges a player object that reproduces motion of the player in the real space in the virtual space of the soccer stadium on the basis of the motion data of the player included in the content data. Furthermore, the user terminal 20 arranges a ball object that reproduces motion of the ball in the real space in the virtual space of the soccer stadium on the basis of the motion data of the ball included in the content data.
Moreover, the user terminal 20 according to an embodiment of the present disclosure arranges an advertisement object, which is a virtual advertisement, in the virtual space. The advertisement object is an object for advertising a product, a service, a company, an association, or the like. The advertisement object may be two-dimensional data such as still image data or moving image data, or may be three-dimensional data. In a case where the event is performed by using a predetermined region in the real space, the user terminal 20 can arrange the advertisement object described above around a region in the virtual space corresponding to the predetermined region.
Hereinafter, a specific example of the display screen of the virtual space generated and displayed by the user terminal 20 will be described with reference to Fig. 2.
Fig. 2 is an explanatory diagram illustrating a display screen D1 of the virtual space generated and displayed by the user terminal 20. As illustrated in Fig. 2, the display screen D1 of the virtual space includes display of a plurality of player objects P1 to P6, display of a ball object B, and display of an advertisement object A1. In the example illustrated in Fig. 2, the advertisement object A1 is arranged around a region in the virtual space corresponding to a soccer court in the real space (that is, pitch side).
Furthermore, the display screen D1 of the virtual space includes a viewpoint selection display M. The viewpoint selection display M includes an item corresponding to each of a plurality of virtual viewpoints. When the user selects any item on the viewpoint selection display M, the user terminal 20 generates a display screen of the virtual space in which the virtual space is viewed from the virtual viewpoint corresponding to the selected item. In Fig. 2, “Standard”, “Bird's eye”, “GK”, “Shooter”, and “Avatar” are illustrated as a plurality of items. In the example illustrated in Fig. 2, “Standard” is selected among the plurality of items.
In the “Standard”, the user terminal 20 arranges the virtual viewpoint on an extension line connecting a goal and the ball. In “Bird's eye”, the user terminal 20 arranges the virtual viewpoint at a position with a wide angle of view above and behind the ball. These camera positions and orientations may be operable by the user. In these two items, the virtual viewpoint is arranged at a position where the player is viewed from above, while in the “GK” and the “Shooter”, the virtual viewpoint is arranged at a viewpoint position of the player. Thus, a subjective video of the player is generated as the display screen of the virtual space.
Specifically, in the “GK”, the user terminal 20 arranges the virtual viewpoint at a viewpoint position of a goalkeeper. In the “Shooter”, the user terminal 20 arranges the virtual viewpoint at a viewpoint position of a player who shoots at the end of a scene. Note that, the scene is a section of a game obtained by dividing the game into a plurality of sections.
Furthermore, in the “Avatar”, the user terminal 20 generates a display screen of the virtual space in which the user object is viewed from behind by arranging a user object corresponding to an avatar of the user in the virtual space and arranging a virtual viewpoint behind the user object. The user can move the user object in the virtual space by operating the user terminal 20. The user object moves, and thus, the virtual viewpoint also moves. As a result, the display screen of the virtual space also changes.
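To make the viewpoint placement concrete, the following minimal sketch computes candidate camera positions for the “Standard” and “Bird's eye” items. The offset distances and heights are illustrative assumptions; the present publication only states where each viewpoint lies relative to the goal and the ball.

```python
# Minimal sketch of placing the "Standard" and "Bird's eye" virtual viewpoints.
# The offsets (distance_behind, height, behind) are assumptions for illustration.
import math

def standard_viewpoint(goal_pos, ball_pos, distance_behind=8.0, height=2.0):
    """Place the camera on the extension of the goal-to-ball line, behind the ball.

    goal_pos and ball_pos are (x, y) positions on the pitch plane; the return
    value is an (x, y, z) camera position.
    """
    gx, gy = goal_pos
    bx, by = ball_pos
    dx, dy = bx - gx, by - gy
    norm = math.hypot(dx, dy) or 1.0       # avoid division by zero
    ux, uy = dx / norm, dy / norm          # unit vector from the goal toward the ball
    return (bx + ux * distance_behind, by + uy * distance_behind, height)

def birds_eye_viewpoint(ball_pos, height=25.0, behind=15.0):
    """Place the camera above and behind the ball for a wide angle of view."""
    bx, by = ball_pos
    return (bx, by - behind, height)

# Example: goal at (0, 0), ball 11 m in front of it.
print(standard_viewpoint((0.0, 0.0), (0.0, 11.0)))   # (0.0, 19.0, 2.0)
print(birds_eye_viewpoint((0.0, 11.0)))              # (0.0, -4.0, 25.0)
```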
For example, the user terminal 20 described above is realized by a smartphone, a tablet terminal, a personal computer (PC), a head mounted display (HMD) covering the entire field of view, a glasses-type device, a projector, a console game machine, or the like.
Note that, in the present specification, processing in a case where the user terminal 20 stores the advertisement object as application data will be mainly described, but the content server 10 may include the advertisement object in the content data and may transmit the content data to the user terminal 20.
Furthermore, in the above description, an example in which the user terminal 20 generates the display screen of the virtual space on the basis of the content data received from the content server 10 has been described, but a generation subject of the display screen of the virtual space is not limited to the user terminal 20. For example, in a case where the content server 10 stores a virtual space model, the advertisement object, and the like, the content server 10 may generate the display screen of the virtual space and may transmit the generated display screen to the user terminal 20. In this case, the content server 10 has functions corresponding to a storage unit 220 and a display control unit 252 as described later.
<2. Configuration of Content Server>
The configuration of the information processing system 1 according to an embodiment of the present disclosure has been described above. Next, a configuration of the content server 10 according to an embodiment of the present disclosure will be described with reference to Fig. 3.
Fig. 3 is an explanatory diagram illustrating the configuration of the content server 10 according to an embodiment of the present disclosure. As illustrated in Fig. 3, the content server 10 according to an embodiment of the present disclosure includes a communication unit 110, a storage unit 120, and a control unit 150.
(Communication Unit 110)
The communication unit 110 communicates various kinds of data with other devices. For example, the communication unit 110 receives a transmission request of the content data from the user terminal 20, and transmits the content data to the user terminal 20 under the control of a communication control unit 154. Furthermore, the communication unit 110 can receive the analysis data of the event, sound data obtained by sound collection in the event, and the like from an external server.
(Storage Unit 120)
The storage unit 120 stores various kinds of data used for an operation of the content server 10. For example, the storage unit 120 stores motion data, sound data, event metadata, and the number of times of scene-specific reproduction for every event. As described above, the motion data includes the motion data of the player and the motion data of the ball. The motion data of the player includes, for example, a player ID which is identification information of the player. The sound data is data including a shout, a cheer, a ball sound, and the like obtained by sound collection in the event.
The event metadata is time-series data indicating a situation of each scene of the event. In a case where the event is the soccer game, the event metadata includes time information, an occurrence event, detailed information, and type information. Examples of the occurrence event in the game include a score (which team scored, how many points, and which player scored the goal), various warnings by a referee (in the case of soccer, a yellow card, a red card, a violation, or the like), a free kick, a player change, and others. The detailed information is detailed information of the occurrence event. The time information, the occurrence event, and the detailed information may be manually input by an operator, or may be mechanically acquired by image analysis or the like. The type information is a type obtained as a determination result by an attention scene determination unit 152 as described later, and indicates “attention” or “normal”.
Fig. 4 is an explanatory diagram illustrating a specific example of the event metadata. In the example illustrated in Fig. 4, a time “20 minutes to 21 minutes”, an occurrence event “Chance”, detailed information “AAA was called into action at his near post by BBB …”, and a type “normal” are associated with each other. A scene corresponding to the time “20 minutes to 21 minutes” means that “Chance” has occurred, and specifically, an event of "AAA was called into action at his near post by BBB …” has occurred and the scene is a normal scene.
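Expressed as a single record, the Fig. 4 entry could look like the following minimal sketch. The key names are assumptions; the present publication only names the four kinds of information.

```python
# Minimal sketch of the event metadata record shown in Fig. 4 (key names are assumptions).
scene_metadata = {
    "time": "20 minutes to 21 minutes",
    "occurrence_event": "Chance",
    "detail": "AAA was called into action at his near post by BBB ...",
    "type": "normal",  # rewritten to "attention" by the attention scene determination unit 152
}
```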
The number of times of scene-specific reproduction is the number of times managed for every scene, and is a cumulative number of times each scene has been reproduced by the plurality of user terminals 20. The number of times of scene-specific reproduction can be updated as necessary by the control unit 150.
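A minimal sketch of how this cumulative count could be maintained across user terminals is shown below; the storage layout and function name are assumptions.

```python
# Minimal sketch of the cumulative, scene-specific reproduction count.
reproduction_counts: dict = {}   # scene_id -> cumulative number of reproductions

def record_reproduction(scene_id: str) -> None:
    """Called each time any user terminal reproduces the scene."""
    reproduction_counts[scene_id] = reproduction_counts.get(scene_id, 0) + 1
```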
(Control Unit 150)
The control unit 150 controls the overall operation of the content server 10. In particular, the control unit 150 according to an embodiment of the present disclosure has a function as the attention scene determination unit 152 and a function as the communication control unit 154.
- Attention scene determination unit 152
The attention scene determination unit 152 determines whether or not each scene of the event is an attention scene having a high degree of interest from the user, and adds a determination result as the event metadata to the storage unit 120. The attention scene determination unit 152 can make the above determination on the basis of reaction to each scene.
For example, the attention scene determination unit 152 may make the above determination on the basis of the sound data stored in the storage unit 120. Specifically, the attention scene determination unit 152 may determine a scene including a shout with a magnitude exceeding a reference as the attention scene.
Furthermore, the attention scene determination unit 152 may make the above determination on the basis of the number of times of scene-specific reproduction stored in the storage unit 120. Specifically, the attention scene determination unit 152 may determine a scene whose number of times of reproduction exceeds the reference as the attention scene.
Furthermore, in a case where a comment, an expression of high evaluation, or the like is made by the user for each scene and the collected comments, high evaluations, or the like are managed in the storage unit 120, the attention scene determination unit 152 may determine a scene in which the number of collected comments or the number of high evaluations exceeds a reference as the attention scene.
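Putting the three criteria together, the determination by the attention scene determination unit 152 could be sketched as follows. The reference values and field names are assumptions; the present publication only states that each quantity is compared with a reference.

```python
# Minimal sketch of the attention scene determination (thresholds and field
# names are assumptions for illustration).
def is_attention_scene(scene: dict,
                       shout_level_ref: float = 0.8,
                       reproduction_ref: int = 1000,
                       reaction_ref: int = 500) -> bool:
    """Return True if the scene qualifies as an attention scene."""
    if scene.get("max_shout_level", 0.0) > shout_level_ref:
        return True   # a shout louder than the reference was recorded
    if scene.get("reproduction_count", 0) > reproduction_ref:
        return True   # the scene has been reproduced more often than the reference
    if scene.get("comment_count", 0) + scene.get("high_evaluation_count", 0) > reaction_ref:
        return True   # comments and high evaluations exceed the reference
    return False

print(is_attention_scene({"max_shout_level": 0.4, "reproduction_count": 2500}))  # True
```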
- Communication control unit 154
The communication control unit 154 controls communication between the communication unit 110 and the user terminal 20. For example, in a case where the transmission request of the content data is received from the user terminal 20 by the communication unit 110, the communication control unit 154 reads the motion data, the sound data, and the event metadata as the content data from the storage unit 120, and causes the communication unit 110 to transmit the content data to the user terminal 20.
The transmission request of the content data may include an event ID for specifying the event and designation information of a time or a scene. In this case, the communication control unit 154 may cause the communication unit 110 to transmit the motion data and the sound data of the time or scene corresponding to the event ID and designated by the designation information to the user terminal 20 together with the event metadata.
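A minimal sketch of how the communication control unit 154 could serve such a request is shown below. The storage layout, key names, and function name are assumptions for illustration.

```python
# Minimal sketch of answering a content data transmission request
# (storage layout and key names are assumptions).
def handle_transmission_request(storage: dict, request: dict) -> dict:
    event = storage[request["event_id"]]
    scene_id = request.get("scene_id")      # optional designation information
    if scene_id is not None:
        motion = event["motion_by_scene"][scene_id]
        sound = event["sound_by_scene"][scene_id]
    else:
        motion = event["motion"]
        sound = event["sound"]
    # The event metadata is transmitted together with the motion and sound data.
    return {"motion": motion, "sound": sound, "metadata": event["metadata"]}
```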
<3. Configuration of User Terminal>
The configuration of the content server 10 according to an embodiment of the present disclosure has been described above. Next, a configuration of the user terminal 20 according to an embodiment of the present disclosure will be described with reference to Fig. 5.
Fig. 5 is an explanatory diagram illustrating the configuration of the user terminal 20 according to an embodiment of the present disclosure. As illustrated in Fig. 5, the user terminal 20 according to an embodiment of the present disclosure includes a communication unit 210, a storage unit 220, an operation display unit 230, a sound output unit 240, and a control unit 250.
(Communication Unit 210)
The communication unit 210 performs various kinds of communication with the content server 10. For example, the communication unit 210 transmits the transmission request of the content data to the content server 10, and receives the content data from the content server 10.
(Storage Unit 220)
The storage unit 220 stores various kinds of data used for an operation of the user terminal 20. For example, the storage unit 220 stores a human body model, a virtual space model, and an advertisement object.
The human body model includes information such as a shape, a size, a face, and a hairstyle of each skeleton of a human body. In a case where the event is the soccer game, the storage unit 220 stores the human body model of each player as the human body model.
The virtual space model is information representing a structure of the virtual space. In a case where the event is the soccer game, the storage unit 220 stores a three-dimensional model of a soccer stadium as the virtual space model.
As described above, the advertisement object is an object for advertising a product, a service, a company, an association, or the like. The advertisement object may be two-dimensional data such as still image data or moving image data, or may be three-dimensional data.
An event ID indicating on which event's display screen of the virtual space the advertisement object is to be arranged may be set for the advertisement object. Furthermore, the storage unit 220 may store different advertisement objects for the same event. This point will be described in detail in “4. Display Control Example”.
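As one possible illustration of how the storage unit 220 could organize these assets, the following sketch assumes a mapping from an event ID and a scene type to an advertisement object variant; the data types, keys, and file names are hypothetical and are not defined in the present disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class AdvertisementObject:
    # Hypothetical description of one advertisement object variant.
    asset_path: str
    is_three_dimensional: bool = False
    is_moving_image: bool = False

@dataclass
class TerminalStorage:
    human_body_models: dict = field(default_factory=dict)   # player name -> model
    virtual_space_model: str = "stadium.glb"                 # placeholder path
    # Different advertisement objects may be stored for the same event,
    # keyed here by (event_id, variant) as an illustrative convention.
    advertisement_objects: dict = field(default_factory=dict)

storage = TerminalStorage()
storage.advertisement_objects[("game-001", "normal")] = AdvertisementObject("ad_small.png")
storage.advertisement_objects[("game-001", "attention")] = AdvertisementObject(
    "ad_large.mp4", is_moving_image=True)
print(storage.advertisement_objects[("game-001", "attention")])
```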
(Operation Display Unit 230)
The operation display unit 230 has a function as an operation unit operated by the user and a function as a display unit that displays the display screen. These functions may be integrally provided or may be separately provided.
For example, the user performs a reproduction operation on the operation display unit 230. The user can also perform an operation of designating the time or the scene to be reproduced on the operation display unit 230. Furthermore, the operation display unit 230 displays the display screen of the virtual space generated by the display control unit 252 as described later.
(Sound Output Unit 240)
When the display screen of the virtual space is displayed on the operation display unit 230, the sound output unit 240 converts the sound data included in the content data received by the communication unit 210 into air vibration and outputs the air vibration on the basis of the control from the sound output control unit 254.
(Control Unit 250)
The control unit 250 controls the overall operation of the user terminal 20. For example, the control unit 250 controls acquisition of an application including the human body model, the virtual space model, and the advertisement object from a network, and controls storage of the acquired application in the storage unit 220. Furthermore, the control unit 250 also exerts functions as the display control unit 252 and the sound output control unit 254 by executing the application.
The display control unit 252 generates the display screen of the virtual space and causes the operation display unit 230 to display the display screen of the virtual space. The display control unit 252 generates the display screen of the virtual space on the basis of the data stored in the storage unit 220 and the content data received from the content server 10 by the communication unit 210.
For example, the display control unit 252 arranges the player object in the virtual space of the soccer stadium indicated by the virtual space model stored in the storage unit 220 on the basis of the motion data included in the content data. At this time, the display control unit 252 applies motion data of a certain player to a human body model of the player stored in the storage unit 220 to constitute a player object of the player, and arranges the player object at a position indicated by position data included in the motion data. Furthermore, the display control unit 252 arranges the ball object at a position indicated by the motion data of the ball.
Moreover, the display control unit 252 arranges the advertisement object stored in the storage unit 220 in the virtual space of the soccer stadium. The display control unit 252 may arrange the advertisement object in a region set in advance in the virtual space model. Moreover, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the scene of the display screen of the virtual space viewed by the user. Such display control by the display control unit 252 will be described in detail later.
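The arrangement steps performed by the display control unit 252 can be summarized structurally as in the sketch below, which assumes placeholder representations for the human body models, one frame of motion data, and the advertisement object; actual rendering is omitted.

```python
def build_frame(virtual_space, frame, human_body_models, ad_object, ad_region):
    """Arrange player objects, the ball object, and an advertisement object
    for a single frame of motion data. A structural sketch; rendering omitted."""
    objects = []
    for player_id, pose in frame["players"].items():
        model = human_body_models[player_id]           # shape, size, face, hairstyle, ...
        objects.append({"model": model,
                        "skeleton": pose["skeleton"],  # motion applied to the model
                        "position": pose["position"]})
    objects.append({"model": "ball", "position": frame["ball"]["position"]})
    # The advertisement object is arranged in a region set in advance
    # in the virtual space model (for example, the pitch side).
    objects.append({"model": ad_object, "position": ad_region})
    return {"space": virtual_space, "objects": objects}

frame = {
    "players": {"player_10": {"skeleton": [(0.0, 1.7, 0.0)], "position": (10.0, 0.0, 5.0)}},
    "ball": {"position": (11.0, 0.1, 5.5)},
}
models = {"player_10": "player_10_model"}
scene = build_frame("stadium_model", frame, models, "ad_object_A1", (0.0, 1.0, -35.0))
print(len(scene["objects"]))  # players + ball + advertisement
```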
The sound output control unit 254 causes the sound output unit 240 to output sound data.
<4. Display Control Example>
The configuration of the user terminal 20 according to an embodiment of the present disclosure has been described above. Subsequently, some examples of display control performed by the display control unit 252 according to an embodiment of the present disclosure will be described.
(4-1. First Display Control Example)
On the attention scene, which is an example of a specific scene in the event, the display control unit 252 arranges, in the virtual space, a larger advertisement object than on a normal scene, which is another scene.
For example, the display screen D1 of the virtual space illustrated in Fig. 2 corresponds to a normal scene, and a display screen D2 of the virtual space illustrated in Fig. 6 corresponds to an attention scene. As illustrated in Figs. 2 and 6, an advertisement object A2 arranged on the attention scene is larger than the advertisement object A1 arranged on the normal scene. As a result, even though the positions of the virtual viewpoints are the same, the display of the advertisement object A2 is larger than the display of the advertisement object A1.
For the above display control, the storage unit 220 may store the advertisement object for the attention scene and the advertisement object for the normal scene. Alternatively, the display control unit 252 may use the advertisement object stored in the storage unit 220 as it is on the normal scene, and may enlarge and use the advertisement object on the attention scene.
Furthermore, in the examples illustrated in Figs. 2 and 6, contents of the advertisement object A1 and the advertisement object A2 are the same, and sizes of the advertisement object A1 and the advertisement object A2 are different. However, a relationship between the advertisement object A1 and the advertisement object A2 is not limited to such an example. For example, the contents of the advertisement object A1 and the advertisement object A2 may be different, and the sizes of the advertisement object A1 and the advertisement object A2 may be the same, or both the contents and the sizes of the advertisement object A1 and the advertisement object A2 may be different. Furthermore, the advertisement object A1 and the advertisement object A2 may have different colors such as a background color, a product color, and a character color. Furthermore, the advertisement object A1 and the advertisement object A2 may have different data formats. For example, the advertisement object A1 may be a still image, and the advertisement object A2 may be a moving image. Furthermore, the advertisement object A1 may be a two-dimensional object, and the advertisement object A2 may be a three-dimensional object.
Hereinafter, a flow of the above-described first display control example will be organized with reference to Fig. 7.
Fig. 7 is a flowchart illustrating the flow of the first display control example according to an embodiment of the present disclosure. As illustrated in Fig. 7, first, the user performs a reproduction operation including designation of the game on the operation display unit 230 of the user terminal 20 (S304). Subsequently, the communication unit 210 transmits the transmission request of the content data to the content server 10 (S308). The transmission request includes a game ID (event ID) designated by the user in S304. Therefore, the communication unit 210 receives the content data of the designated game from the content server 10 (S312).
Thereafter, the display control unit 252 arranges the player object and the ball object in the virtual space of the soccer stadium indicated by the virtual space model stored in the storage unit 220 on the basis of the motion data included in the content data and the human body model stored in the storage unit 220 (S316).
Subsequently, the display control unit 252 determines whether or not the scene to be displayed is the attention scene (S320). The display control unit 252 can make the determination on the basis of the event metadata included in the content data.
In a case where it is determined that the scene to be displayed is the normal scene (S320/NO), the display control unit 252 arranges a first advertisement object in the virtual space (S324). On the other hand, in a case where it is determined that the scene to be displayed is the attention scene (S320/YES), the display control unit 252 arranges a second advertisement object in the virtual space (S328). The first advertisement object may be, for example, the advertisement object A1 illustrated in Fig. 2, which is smaller than the advertisement object A2 illustrated in Fig. 6 serving as the second advertisement object.
Thereafter, the display control unit 252 generates the display screen of the virtual space (S332), and causes the operation display unit 230 to display the display screen of the virtual space (S336). The processing from S312 to S336 described above is repeated until the end of reproduction (S340).
Note that, although not illustrated in Fig. 7, while the display screen of the virtual space is displayed, the user can perform an operation of giving an instruction about a scene or time to be displayed, an operation of selecting an item on the viewpoint selection display M, and the like, and the user terminal 20 performs control corresponding to these operations.
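The branch from S320 to S328 can be expressed compactly as below; the scene metadata dictionary and the advertisement asset names are hypothetical, and the step numbers in the comments refer to Fig. 7.

```python
def select_advertisement_for_scene(scene_metadata: dict, ad_store: dict) -> str:
    """S320-S328: choose the second (larger) advertisement object on an attention
    scene and the first advertisement object otherwise."""
    if scene_metadata.get("type") == "attention":   # S320/YES
        return ad_store["attention"]                # second advertisement object (e.g. A2)
    return ad_store["normal"]                       # S320/NO: first advertisement object (e.g. A1)

ads = {"normal": "A1_small.png", "attention": "A2_large.png"}
for meta in [{"type": "normal"}, {"type": "attention"}]:
    print(select_advertisement_for_scene(meta, ads))
```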
(4-2. Second Display Control Example)
The display control unit 252 arranges different advertisement objects in the virtual space in accordance with the scene of the display screen of the virtual space viewed by the user. In a second display control example, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with a method for viewing the display screen of the virtual space as the scene of the display screen of the virtual space. Since the method for viewing the display screen of the virtual space changes in accordance with the position of the virtual viewpoint, it can be said that the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the position of the virtual viewpoint.
More specifically, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with a distance between the position of the virtual viewpoint and an arrangement position of the advertisement object. For example, in a case where the above distance is a first distance, the display control unit 252 may arrange an advertisement object having a larger size in the virtual space than in a case where the above distance is a second distance smaller than the first distance. In a case where the above distance is the first distance, the display control unit 252 may arrange, in the virtual space, an advertisement object having the same size as in a case where the above distance is the second distance and having a larger character or image than in a case where the above distance is the second distance.
Alternatively, the display control unit 252 may arrange advertisement objects having different contents between a case where the above distance is the first distance and a case where the above distance is the second distance smaller than the first distance in the virtual space. Hereinafter, the present example will be described with reference to Figs. 8 and 9.
Fig. 8 illustrates a display screen D3 of the virtual space as a specific example of the display screen of the virtual space generated in a case where the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object is less than a threshold value. In the example illustrated in Fig. 8, an advertisement object A3 is arranged in the virtual space, and therefore, the display of the advertisement object A3 is included in the display screen D3 of the virtual space. The advertisement object A3 is an image of shoes, and the shoes include a logo mark. The user who has viewed the display screen D3 of the virtual space can visually recognize the logo mark from the advertisement object A3 and can grasp which manufacturer's shoes are advertised by the advertisement object A3.
Fig. 9 illustrates a display screen D4 of the virtual space as a specific example of the display screen of the virtual space generated in a case where the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object is equal to or more than the threshold value. In the example illustrated in Fig. 9, an advertisement object A4 is arranged in the virtual space, and therefore, the display of the advertisement object A4 is included in the display screen D4 of the virtual space. The advertisement object A4 is an image of a logo mark. The user who has viewed the display screen D4 of the virtual space can visually recognize the logo mark from the advertisement object A4 and can grasp which manufacturer the advertisement object A4 advertises.
Note that, a size of the advertisement object A3 illustrated in Fig. 8 in the virtual space is the same as a size of the advertisement object A4 illustrated in Fig. 9 in the virtual space. Thus, a display size of the advertisement object A4 on the display screen D4 of the virtual space illustrated in Fig. 9 is smaller than a display size of the advertisement object A3 on the display screen D3 of the virtual space. Fig. 9 illustrates the display of the advertisement object A3 in a case where the advertisement object A3 is arranged in the virtual space as a comparative example. In the comparative example, the display of the advertisement object A3 is small, and it is difficult for the user to visually recognize the logo mark. When the user cannot visually recognize the logo mark, an advertising effect is reduced. However, as in the second display control example, the advertisement object A4 illustrated in Fig. 9 is arranged in accordance with the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object, and thus, the user can visually recognize the logo mark. As a result, the advertising effect can be enhanced.
Hereinafter, a flow of the above-described second display control example will be organized with reference to Fig. 10.
Fig. 10 is a flowchart illustrating the flow of the second display control example according to an embodiment of the present disclosure. As illustrated in Fig. 10, first, the user performs the reproduction operation including the designation of the game on the operation display unit 230 of the user terminal 20 (S304). Subsequently, the communication unit 210 transmits the transmission request of the content data to the content server 10 (S308). The transmission request includes a game ID (event ID) designated by the user in S304. Therefore, the communication unit 210 receives the content data of the designated game from the content server 10 (S312).
Thereafter, the display control unit 252 arranges the player object and the ball object in the virtual space of the soccer stadium indicated by the virtual space model stored in the storage unit 220 on the basis of the motion data included in the content data and the human body model stored in the storage unit 220 (S316).
Subsequently, the display control unit 252 determines whether or not the distance between the arrangement position of each advertisement object and the position of the virtual viewpoint is equal to or more than the threshold value (S420). Note that, the position of the virtual viewpoint changes in accordance with which item is selected on the viewpoint selection display M illustrated in Fig. 2 and the like, the situation of the game (for example, the position of the ball object), and the like.
In a case where it is determined that the distance between the arrangement position of a certain advertisement object and the position of the virtual viewpoint is less than the threshold value (S420/NO), the display control unit 252 arranges a third advertisement object at the arrangement position in the virtual space (S424). The third advertisement object is, for example, the advertisement object A3 illustrated in Fig. 8.
On the other hand, in a case where it is determined that the distance between the arrangement position of a certain advertisement object and the position of the virtual viewpoint is equal to or more than the threshold value (S420/YES), the display control unit 252 arranges a fourth advertisement object at the arrangement position in the virtual space (S428). The fourth advertisement object is, for example, the advertisement object A4 illustrated in Fig. 9.
Thereafter, the display control unit 252 generates the display screen of the virtual space (S332), and causes the operation display unit 230 to display the display screen of the virtual space (S336). The processing from S312 to S336 described above is repeated until the end of reproduction (S340).
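The branch at S420 can be summarized as in the following sketch, assuming three-dimensional positions given as coordinate tuples and an illustrative threshold value; the asset names stand in for the advertisement objects A3 and A4.

```python
import math

DISTANCE_THRESHOLD = 30.0   # illustrative value in virtual-space units

def select_advertisement_by_distance(viewpoint, ad_position, near_ad, far_ad):
    """Return the third advertisement object (detailed, e.g. A3) when the virtual
    viewpoint is close to the arrangement position, and the fourth advertisement
    object (simplified, e.g. A4 logo only) when it is far (S420-S428)."""
    distance = math.dist(viewpoint, ad_position)
    if distance >= DISTANCE_THRESHOLD:   # S420/YES
        return far_ad                    # fourth advertisement object
    return near_ad                       # S420/NO: third advertisement object

# Example: a viewpoint near the touchline versus one high in the stands.
print(select_advertisement_by_distance((0, 1.7, 5), (0, 1, 0), "shoes_with_logo", "logo_only"))
print(select_advertisement_by_distance((0, 25, 80), (0, 1, 0), "shoes_with_logo", "logo_only"))
```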
<5. Operation and Effect>
According to an embodiment of the present disclosure described above, various operations and effects can be obtained. For example, according to an embodiment of the present disclosure, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the scene of the display screen of the virtual space viewed by the user. According to such a configuration, an appropriate advertisement object corresponding to the scene is displayed, and thus, an appeal effect by the advertisement object can be improved.
As a specific example, the display control unit 252 can arrange, in the virtual space, different advertisement objects between the attention scene in the event and the normal scene. Since it is considered that the advertisement object is recognized by more users on the attention scene, an advertiser can realize a desired appeal effect by preparing separate advertisement objects for the attention scene and for the normal scene with this increase in recognition in mind. Note that, an advertisement rate for displaying the advertisement object on the attention scene may be set to be higher than an advertisement rate for displaying the advertisement object on the normal scene.
Furthermore, according to an embodiment of the present disclosure, the attention scene determination unit 152 that determines whether or not each scene in the event is the attention scene on the basis of the reaction to each scene is provided. According to such a configuration, human labor for determining the attention scene can be suppressed.
More specifically, the attention scene determination unit 152 can determine whether or not each scene is the attention scene on the basis of the sound data acquired in the event or the number of times of reproduction of each scene. According to such a configuration, an appropriate determination result as to whether or not each scene is the attention scene can be obtained.
Furthermore, the display control unit 252 can arrange, in the virtual space, a larger advertisement object on the attention scene than on the normal scene. According to such a configuration, since the display of the advertisement object becomes large on the attention scene, the appeal effect of the advertisement object can be further improved.
As another example, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the position of the virtual viewpoint. That is, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the scene in the sense of how the virtual space is viewed. According to such a configuration, an appropriate advertisement object corresponding to the method for viewing the virtual space is displayed, and thus, the appeal effect by the advertisement object can be improved.
For example, the display control unit 252 arranges different advertisement objects in the virtual space in accordance with the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object. Since a relationship between the size of the advertisement object and the display size of the advertisement object changes depending on the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object, the control of the advertisement object corresponding to the above distance is reasonable.
Specifically, in a case where the above distance is the first distance, the display control unit 252 may arrange an advertisement object having a larger size in the virtual space than in a case where the above distance is the second distance smaller than the first distance. Alternatively, the display control unit 252 may arrange advertisement objects having different contents between a case where the above distance is the first distance and a case where the above distance is the second distance smaller than the first distance in the virtual space. In either case, the appeal effect is expected to be improved by the advertisement object.
Furthermore, the event is performed by using a predetermined region in the real space, and the display control unit 252 arranges the advertisement object around a region in the virtual space corresponding to the predetermined region. For example, in a case where the predetermined region is a soccer court, the display control unit 252 arranges the advertisement object on the pitch side around the soccer court. The advertisement object arranged in such a region is less likely to hinder the viewing of the event and is easily visible to the user.
Fig. 11 illustrates a display screen D5 of the virtual space as a specific example of the display screen of the virtual space generated in a case of a first condition of an event, the first condition being a first one of a league, a team, or a location of an event or a type of an event. A type of an event may include, for example, football, baseball, Esports, concerts, rugby, or any other event that may be held in a similar venue. In the example illustrated in Fig. 11, an advertisement object A5 having a first appearance is arranged in the virtual space, and therefore, the display of the advertisement object A5 is included in the display screen D5 of the virtual space. The advertisement object A5 is displayed having the first appearance based on a determination of the event being the first condition.
Fig. 12 illustrates a display screen D6 of the virtual space as a specific example of a display screen of a virtual space generated in a case of a second condition of an event, the second condition being a second one of a league, a team, or a location of an event or a type of an event. In the example illustrated in Fig. 12, an advertisement object A6 having a second appearance different than the first appearance is arranged in the virtual space, and therefore, the display of the advertisement object A6 is included in the display screen D6 of the virtual space. The advertisement object A6 is displayed having the second appearance based on a determination of the event being the second condition.
Furthermore, in the examples illustrated in Figs. 11 and 12, contents of the advertisement object A5 and the advertisement object A6 are different, and sizes of the advertisement object A5 and the advertisement object A6 are the same. However, a relationship between the advertisement object A5 and the advertisement object A6 is not limited to such an example. For example, the sizes of the advertisement object A5 and the advertisement object A6 may be different, and the contents of the advertisement object A5 and the advertisement object A6 may be the same, or both the contents and the sizes of the advertisement object A5 and the advertisement object A6 may be different. Furthermore, the advertisement object A5 and the advertisement object A6 may have different colors such as a background color, a product color, and a character color. Furthermore, the advertisement object A5 and the advertisement object A6 may have different data formats. For example, the advertisement object A5 may be a still image, and the advertisement object A6 may be a moving image. Furthermore, the advertisement object A5 may be a two-dimensional object, and the advertisement object A6 may be a three-dimensional object.
Furthermore, the advertisement object A5 may include a plurality of objects, and the advertisement object A6 may include any combination of one or more new objects and one or more of the plurality of objects. The one or more of the plurality of objects in the advertisement object A6 may be in the same arrangement as, or a different arrangement from, that of the corresponding objects in the advertisement object A5. The advertisement object A6 may also be the advertisement object A5 having a portion thereof changed or covered.
Furthermore, in the example illustrated in Fig. 12, the advertisement object A5 is removed and replaced with the advertisement object A6. However, the handling of the advertisement object A5 is not limited to such an example. For example, in the case of the second condition of the event, the advertisement object A5 may be removed without being replaced, such that no advertisement object is shown in place of the advertisement object A5 in the display screen D6.
<6. Hardware Configuration>
Embodiments of the present disclosure have been described above. Information processing such as the determination and display control of the attention scene described above is realized by cooperation of software and hardware. Hereinafter, a hardware configuration example applicable to the content server 10 and the user terminal 20 will be described.
Fig. 13 is a block diagram illustrating an example of a hardware configuration 90. The hardware configuration 90 includes a central processing unit (CPU) 901, a read only memory (ROM) 902, a random access memory (RAM) 903, and a host bus 904. Furthermore, the hardware configuration 90 includes a bridge 905, an external bus 906, an interface 907, an input device 908, an output device 910, a storage device (HDD) 911, a drive 912, and a communication device 915.
The CPU 901 functions as an arithmetic processing device and a control device, and controls the overall operation according to various programs. Furthermore, the CPU 901 may also be a microprocessor. The ROM 902 stores a program, operation parameters, and the like used by the CPU 901. The RAM 903 temporarily stores the program used in execution by the CPU 901, parameters that change as appropriate during the execution, and the like. These are connected to each other by the host bus 904 including a CPU bus and the like. Functions such as the control unit 150 of the content server 10 and the control unit 250 of the user terminal 20 can be realized by cooperation of the CPU 901, the ROM 902, and the RAM 903 with software.
The host bus 904 is connected to the external bus 906 such as a peripheral component interconnect/interface (PCI) bus via the bridge 905. Note that, the host bus 904, the bridge 905, and the external bus 906 are not necessarily provided separately, and functions thereof may be mounted on one bus.
The input device 908 includes an input means such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever for a user to input information, an input control circuit which generates an input signal on the basis of the input by the user and outputs the same to the CPU 901 and the like. The input device 908 is operated, and thus, the user can input various kinds of data and can give an instruction about a processing operation.
The output device 910 includes, for example, a display device such as a liquid crystal display (LCD) device, an organic light emitting diode (OLED) device, and a lamp. Moreover, the output device 910 includes audio output devices such as a speaker and headphones. The output device 910 outputs, for example, a reproduced content. Specifically, the display device displays various kinds of information such as reproduced video data as text or images. On the other hand, the audio output devices convert reproduced audio data and the like into audio and output the audio.
The storage device 911 is a data storage device provided as an example of a storage unit according to an embodiment of the present disclosure. The storage device 911 may include, for example, a storage medium, a recording device that records data in the storage medium, a reading device that reads data from the storage medium, and a deletion device that deletes data recorded on the storage medium. For example, the storage device 911 includes a hard disk drive (HDD). The storage device 911 drives a hard disk and stores programs to be executed by the CPU 901 and various data.
The drive 912 is a reader and writer for the storage medium, and is built in or externally attached to the hardware configuration 90. The drive 912 reads information recorded in a removable storage medium 84 mounted thereon, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903. Furthermore, the drive 912 can also write information to the removable storage medium 84.
The communication device 915 is, for example, a communication interface provided as a communication device and the like for connecting to the communication network 82. Furthermore, the communication device 915 may be a wireless local area network (LAN) compatible communication device, a long term evolution (LTE) compatible communication device, or a wire communication device that performs wired communication.
<7. Supplement>
The embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the present disclosure is not limited to such an example. It is obvious that a person having ordinary knowledge in the technical field to which the present disclosure belongs can conceive various changes or modifications within the scope of the technical idea described in the claims, and it is naturally understood that these also belong to the technical scope of the present disclosure.
For example, in the above description, the example in which the display control unit 252 arranges different advertisement objects on the attention scene and the normal scene has been described, and the example in which different advertisement objects are arranged in accordance with the distance between the position of the virtual viewpoint and the arrangement position of the advertisement object has been described. However, the display control unit 252 may arrange different advertisement objects in accordance with other elements.
For example, in a case where a sunlight condition, brightness, and the like of the virtual space are controlled in accordance with a time zone in which the content is reproduced, the display control unit 252 may arrange different advertisement objects in accordance with the time zone in which the content is reproduced. Furthermore, in a case where the sunlight condition, the brightness, and the like of the virtual space are controlled in accordance with a time zone in which the event is performed in the real space, the display control unit 252 may arrange different advertisement objects in accordance with the time zone in which the event is performed in the real space. According to such a configuration, an advertisement object suitable for the sunlight condition and the brightness of the virtual space (for example, an advertisement object of a color having high visibility under the sunlight condition of the virtual space) is arranged, and thus, the appeal effect of the advertisement object can be improved.
Furthermore, the display control unit 252 may arrange different advertisement objects in accordance with a weather condition in the virtual space. For example, the display control unit 252 may arrange an advertisement object of a sun-protection product in a case where the weather condition in the virtual space is fine, and may arrange an advertisement object of a product for rain such as an umbrella in a case where the weather condition in the virtual space is rainy. This configuration can also improve the appeal effect of the advertisement object.
Furthermore, the display control unit 252 may arrange different advertisement objects in accordance with which player object is to be viewed from above on the display screen. As described above, in “Standard”, “Bird's eye”, and the like, the virtual viewpoint is arranged at a position where the player object is viewed from above, and the virtual viewpoint also moves with the movement of the ball object. For example, on the display screen on which a certain player object is viewed from above, the display control unit 252 can arrange an advertisement object corresponding to popularity of a player corresponding to the player object or a sponsor with which the player has a contract. Note that, an advertisement rate for displaying an advertisement object on a display screen on which a player object of a popular player is viewed from above may be set higher than an advertisement rate for displaying an advertisement object on a display screen on which a player object of a relatively less popular player is viewed from above.
Furthermore, the display control unit 252 may arrange different advertisement objects in accordance with an attribute of the user of the user terminal 20. Examples of the above attribute include a country of residence, gender, and age. With such a configuration, the presence or absence of an advertisement object for an alcoholic beverage can be controlled in accordance with the age of the user, and an advertisement object of a brand can be displayed in accordance with the country of residence, and the like.
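These context-dependent variations can all be expressed as a single selection step over a context record, as in the sketch below; the context fields, rules, and asset names are assumptions for illustration and are not defined in the present disclosure.

```python
def select_contextual_advertisement(context: dict) -> str:
    """Select an advertisement object from the reproduction time zone, the
    virtual-space weather, and user attributes. Purely illustrative rules."""
    if context.get("weather") == "rainy":
        return "umbrella_ad.png"                 # product for rain
    if context.get("time_zone") == "night":
        return "high_visibility_ad.png"          # color easy to see under night lighting
    if context.get("age", 0) < 20:
        return "soft_drink_ad.png"               # no alcoholic-beverage advertisement
    return "default_ad.png"

print(select_contextual_advertisement({"weather": "fine", "time_zone": "night", "age": 34}))
```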
Furthermore, the user terminal 20 can also take a picture representing the inside of the virtual space at a certain point in time. For example, in the “Avatar” mode, a picture of the user's own avatar can be taken. When the picture is taken, the display control unit 252 may change the advertisement object in the virtual space. As a result, an advertisement object different from the advertisement object in the moving image is displayed on the picture. As described above, it is also useful to arrange different advertisement objects in accordance with whether the scene viewed by the user is a moving image scene or a picture scene.
Furthermore, in the above description, the example in which the advertisement object is arranged on the pitch side in the soccer game has been described, but the arrangement position of the advertisement object is not limited to the pitch side. For example, the display control unit 252 may arrange an advertisement object on an event tool such as a uniform of a player, a goal net, or a ball. Moreover, the display control unit 252 may arrange the advertisement object in a three-dimensional object such as a flying object flying in the virtual space.
Furthermore, in the above description, the example in which the advertisement object is arranged in the virtual space and the display of the advertisement object is included in the display screen of the virtual space has been described, but the advertisement object may be used by another method. For example, the display control unit 252 may arrange the advertisement object in a window different from a window including the display screen of the virtual space.
Note that, the above-described display screen of the virtual space and electronic commerce may be designed in cooperation with each other. For example, in a case where the user taps the player object on the display screen of the virtual space, the user terminal 20 may access a server that sells a uniform of the player object and may display a purchase screen of the uniform of the player object. On the purchase screen, the user can purchase the uniform in which a uniform number and a name of the player are imprinted.
Furthermore, each step in the processing of the user terminal 20 is not necessarily processed in time series in the order described as in the flowchart. For example, each step in the processing of the user terminal 20 may be processed in an order different from the order described as in the flowchart, or may be processed in parallel.
Furthermore, a computer program for causing hardware such as the CPU, ROM, and RAM built in the content server 10 and the user terminal 20 to exhibit functions equivalent to each configuration of the content server 10 and the user terminal 20 described above can also be created. Furthermore, a non-transitory storage medium storing the computer program is also provided.
Furthermore, the content data is received from the content server 10, and thus, the user terminal 20 can exhibit each function of the user terminal 20 described above. That is, the user terminal 20 according to an embodiment of the present disclosure and the information processing system 1 including the user terminal 20 can also be regarded as being produced on the basis of provision of the content data from the content server 10 from a certain viewpoint.
Furthermore, the effects described in the present specification are merely exemplary or illustrative, and not restrictive. That is, the technology according to the present disclosure can exhibit other effects apparent to those skilled in the art from the description of the present specification, in addition to the effects described above or instead of the effects described above.
Note that, the following configurations also fall within the technological scope of the present disclosure.
(1) An information processing system including:
circuitry configured to:
determine whether or not each scene of a plurality of scenes of an event is an attention scene based on content data of the event; and
initiate display of a specified scene of the plurality of scenes and an object in a virtual space based on the determination of whether the specified scene is an attention scene,
wherein the object is displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object is displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
(2) The information processing system according to (1), wherein the object is an advertisement object.
(3) The information processing system according to any one of (1) or (2), wherein the object having the first appearance is larger than the object having the second appearance.
(4) The information processing system according to any one of (1) to (3), further including:
a storage device configured to store the first object and the second object.
(5) The information processing system according to any one of (1) to (4), wherein the circuitry is further configured to initiate display of the second object by enlarging the first object.
(6) The information processing system according to any one of (1) to (5), wherein contents of the first object and contents of the second object are different from each other.
(7) The information processing system according to any one of (1) to (6), wherein a size of the first object and a size of the second object are different from each other.
(8) The information processing system according to any one of (1) to (7), wherein a size of the first object and a size of the second object are the same as each other.
(9) The information processing system according to any one of (1) to (8), wherein a color of the first object and a color of the second object are different from each other.
(10) The information processing system according to any one of (1) to (9), wherein the first object and the second object are advertisement objects including at least one of a background, a product, or a character.
(11) The information processing system according to any one of (1) to (10), wherein at least one of a color of the background, a color of the product, or a color of the character are different from each other.
(12) The information processing system according to any one of (1) to (11), wherein a data format of the first object and a data format of the second object are different from each other.
(13) The information processing system according to any one of (1) to (12), wherein the data format of the first object is that of a moving image and the data format of the second object is that of a still image.
(14) The information processing system according to any one of (1) to (13), wherein the data format of the first object is that of a three-dimensional image and the data format of the second object is that of a two-dimensional image.
(15) The information processing system according to any one of (1) to (14), wherein the circuitry is further configured to:
initiate display of different objects based on different distances between a position of a virtual viewpoint and an arrangement position of the different objects in the virtual space.
(16) The information processing system according to any one of (1) to (15), wherein the circuitry is further configured to:
initiate display of a third object having a third appearance in the virtual space based on a distance between a position of a virtual viewpoint and an arrangement position of the third object in the virtual space being equal to or more than a predetermined threshold; and
initiate display of a fourth object having a fourth appearance different from the third appearance in the virtual space based on a distance between another position of the virtual viewpoint and the arrangement position of the fourth object in the virtual space being less than the predetermined threshold.
(17) The information processing system according to any one of (1) to (16), wherein the content data includes a degree of interest of each scene of the plurality of scenes, and
wherein the circuitry is further configured to determine whether or not each scene of the plurality of scenes of the event is the attention scene based on the degree of interest of each scene of the plurality of scenes.
(18) The information processing system according to any one of (1) to (17), wherein the degree of interest includes at least one of sound data associated with each scene of the plurality of scenes, a number of times a reproduction of each scene of the plurality of scenes is stored in a storage device, a number of comments on each scene of the plurality of scenes, or a number of evaluations of each scene of the plurality of scenes.
(19) The information processing system according to any one of (1) to (18), wherein the circuitry is further configured to:
initiate display of different objects based on different conditions of the event.
(20) The information processing system according to any one of (1) to (19), wherein the different conditions include at least one of a league, a team, or a location of the event or a type of an event.
(21) An information processing method including:
transmitting content data of an event to a user terminal,
wherein whether or not each scene of a plurality of scenes of an event is an attention scene is determined based on the content data,
wherein a specified scene of the plurality of scenes and an object are displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and
wherein the object is displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object is displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
(22) A non-transitory computer-readable medium having embodied thereon a program, which when executed by a computer causes the computer to execute an information processing method, the method including:
transmitting content data of an event to a user terminal,
wherein whether or not each scene of a plurality of scenes of an event is an attention scene is determined based on the content data,
wherein a specified scene of the plurality of scenes and an object are displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and
wherein the object is displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object is displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
(23)
An information processing system including:
a display control unit that generates a display screen of a virtual space in which a virtual object having motion corresponding to motion data of a target is arranged,
in which the display control unit arranges different advertisement objects in the virtual space in accordance with a scene of the display screen viewed by a user.
(24)
The information processing system according to (23), in which the motion data of the target is motion data of the target in an event in a real space, and
the display control unit arranges the different advertisement objects in the virtual space between a specific scene and another scene in the event.
(25)
The information processing system according to (24), further including:
a scene determination unit that determines whether or not each scene in the event corresponds to the specific scene on a basis of reaction to the each scene.
(26)
The information processing system according to (25), in which the scene determination unit determines whether or not the each scene corresponds to the specific scene on a basis of sound data acquired in the event in the real space.
(27)
The information processing system according to (25), in which the scene determination unit determines whether or not the each scene corresponds to the specific scene on a basis of the number of times of reproduction of the each scene.
(28)
The information processing system according to any one of (24) to (27), in which the display control unit arranges, in the virtual space, a larger advertisement object on the specific scene than on the other scene.
(29)
The information processing system according to (23), in which the display control unit generates, as the display screen, a screen representing the virtual space viewed from a virtual viewpoint which is a virtual point of view in the virtual space, and
the display control unit arranges the different advertisement objects in the virtual space in accordance with a position of the virtual viewpoint by arranging the different advertisement objects in the virtual space in accordance with the scene.
(30)
The information processing system according to (29), in which the display control unit arranges the different advertisement objects in the virtual space in accordance with a distance between the position of the virtual viewpoint and an arrangement position of the advertisement object.
(31)
The information processing system according to (30), in which the display control unit arranges, in the virtual space, an advertisement object having a larger size in a case where the distance is a first distance than in a case where the distance is a second distance smaller than the first distance.
(32)
The information processing system according to (30), in which the display control unit arranges advertisement objects having different contents in the virtual space between a case where the distance is a first distance and a case where the distance is a second distance smaller than the first distance.
(33)
The information processing system according to any one of (24) to (27), in which the event is performed by using a predetermined region in the real space, and
the display control unit arranges the advertisement object around a region in the virtual space corresponding to the predetermined region.
(34)
The information processing system according to any one of (23) to (33), further including:
a user terminal; and
an information processing apparatus,
in which the information processing apparatus includes the display control unit, and a communication control unit that controls transmission of the display screen to the user terminal, and
the user terminal receives the display screen from the information processing apparatus, and displays the display screen on a display unit.
(35)
The information processing system according to any one of (23) to (33), further including:
a user terminal; and
an information processing apparatus,
in which the information processing apparatus includes a communication control unit that controls transmission of the motion data of the target to the user terminal, and
the user terminal includes the display control unit operated on a basis of the motion data of the target received from the information processing apparatus.
(36)
A method including:
generating, by a processor, a display screen of a virtual space in which a virtual object having motion corresponding to motion data of a target is arranged,
in which the generating of the display screen of the virtual space includes arranging different advertisement objects in the virtual space in accordance with a scene of the display screen viewed by a user.
(37)
A program causing a computer to function as a display control unit that generates a display screen of a virtual space in which a virtual object having motion corresponding to motion data of a target is arranged,
in which the display control unit arranges different advertisement objects in the virtual space in accordance with a scene of the display screen viewed by a user.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
1 Information processing system
10 Content server
110 Communication unit
120 Storage unit
150 Control unit
152 Attention scene determination unit
154 Communication control unit
20 User terminal
210 Communication unit
220 Storage unit
230 Operation display unit
240 Sound output unit
250 Control unit
252 Display control unit
254 Sound output control unit

Claims (22)

  1. An information processing system comprising:
    circuitry configured to:
    determine whether or not each scene of a plurality of scenes of an event is an attention scene based on content data of the event; and
    initiate display of a specified scene of the plurality of scenes and an object in a virtual space based on the determination of whether the specified scene is an attention scene,
    wherein the object is displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object is displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
  2. The information processing system according to claim 1, wherein the object is an advertisement object.
  3. The information processing system according to claim 1, wherein the object having the first appearance is larger than the object having the second appearance.
  4. The information processing system according to claim 1, further comprising:
    a storage device configured to store the first object and the second object.
  5. The information processing system according to claim 1, wherein the circuitry is further configured to initiate display of the second object by enlarging the first object.
  6. The information processing system according to claim 1, wherein contents of the first object and contents of the second object are different from each other.
  7. The information processing system according to claim 1, wherein a size of the first object and a size of the second object are different from each other.
  8. The information processing system according to claim 1, wherein a size of the first object and a size of the second object are the same as each other.
  9. The information processing system according to claim 1, wherein a color of the first object and a color of the second object are different from each other.
  10. The information processing system according to claim 1, wherein the first object and the second object are advertisement objects including at least one of a background, a product, or a character.
  11. The information processing system according to claim 10, wherein at least one of a color of the background, a color of the product, or a color of the character are different from each other.
  12. The information processing system according to claim 1, wherein a data format of the first object and a data format of the second object are different from each other.
  13. The information processing system according to claim 12, wherein the data format of the first object is that of a moving image and the data format of the second object is that of a still image.
  14. The information processing system according to claim 12, wherein the data format of the first object is that of a three-dimensional image and the data format of the second object is that of a two-dimensional image.
  15. The information processing system according to claim 1, wherein the circuitry is further configured to:
    initiate display of different objects based on different distances between a position of a virtual viewpoint and an arrangement position of the different objects in the virtual space.
  16. The information processing system according to claim 15, wherein the circuitry is further configured to:
    initiate display of a third object having a third appearance in the virtual space based on a distance between a position of a virtual viewpoint and an arrangement position of the third object in the virtual space being equal to or more than a predetermined threshold; and
    initiate display of a fourth object having a fourth appearance different from the third appearance in the virtual space based on a distance between another position of the virtual viewpoint and the arrangement position of the fourth object in the virtual space being less than the predetermined threshold.
  17. The information processing system according to claim 1, wherein the content data includes a degree of interest of each scene of the plurality of scenes, and
    wherein the circuitry is further configured to determine whether or not each scene of the plurality of scenes of the event is the attention scene based on the degree of interest of each scene of the plurality of scenes.
  18. The information processing system according to claim 17, wherein the degree of interest includes at least one of sound data associated with each scene of the plurality of scenes, a number of times a reproduction of each scene of the plurality of scenes is stored in a storage device, a number of comments on each scene of the plurality of scenes, or a number of evaluations of each scene of the plurality of scenes.
  19. The information processing system according to claim 1, wherein the circuitry is further configured to:
    initiate display of different objects based on different conditions of the event.
  20. The information processing system according to claim 19, wherein the different conditions include at least one of a league, a team, a location of the event, or a type of the event.
  21. An information processing method comprising:
    transmitting content data of an event to a user terminal,
    wherein whether or not each scene of a plurality of scenes of the event is an attention scene is determined based on the content data,
    wherein a specified scene of the plurality of scenes and an object are displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and
    wherein the object is displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object is displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
  22. A non-transitory computer-readable medium having embodied thereon a program, which when executed by a computer causes the computer to execute an information processing method, the method comprising:
    transmitting content data of an event to a user terminal,
    wherein whether or not each scene of a plurality of scenes of the event is an attention scene is determined based on the content data,
    wherein a specified scene of the plurality of scenes and an object are displayed in a virtual space on the user terminal based on the determination of whether the specified scene is an attention scene, and
    wherein the object is displayed as a first object having a first appearance based on a determination that the specified scene is an attention scene, and the object is displayed as a second object having a second appearance different from the first appearance based on a determination that the specified scene is not an attention scene.
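
The attention-scene logic recited in claims 17, 18, 21, and 22 can be pictured with a short sketch. This is only an illustration: the claims do not fix any scoring formula, weights, threshold, or field names, so every identifier and numeric value below (SceneInterest, the 0.5-2.0 weights, the score threshold of 100) is a hypothetical choice, not part of the application.

```python
# Illustrative sketch only: the claims do not specify a scoring formula,
# weights, or threshold; all names and numbers here are assumed.
from dataclasses import dataclass

@dataclass
class SceneInterest:
    sound_level: float       # e.g. crowd-noise loudness associated with the scene
    replay_saves: int        # times a reproduction of the scene was stored
    comment_count: int       # comments posted on the scene
    evaluation_count: int    # "likes"/ratings given to the scene

def is_attention_scene(interest: SceneInterest, score_threshold: float = 100.0) -> bool:
    """Fold the degree-of-interest signals into one score and threshold it."""
    score = (0.5 * interest.sound_level
             + 2.0 * interest.replay_saves
             + 1.0 * interest.comment_count
             + 1.5 * interest.evaluation_count)
    return score >= score_threshold

def object_for_scene(interest: SceneInterest, first_object, second_object):
    """Display the first object with an attention scene, the second otherwise."""
    return first_object if is_attention_scene(interest) else second_object
```

In practice the weighting and threshold would be tuned (or learned) per event; the sketch only shows that a single interest score derived from these signals can drive the first-object/second-object decision.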
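Claims 15 and 16 recite switching the displayed object according to the distance between the virtual viewpoint and the object's arrangement position. A minimal sketch of that comparison follows; the Euclidean metric, the coordinate tuples, and the 50-unit threshold are assumptions for illustration only.

```python
# Illustrative sketch only: the Euclidean metric and the 50-unit threshold
# are assumptions; the claims only require a comparison against a threshold.
import math

def select_object_by_distance(viewpoint, arrangement_position,
                              far_object, near_object, threshold=50.0):
    """Return the object to arrange at `arrangement_position`:
    the far (third) appearance when the virtual viewpoint is at least
    `threshold` away, otherwise the near (fourth) appearance."""
    if math.dist(viewpoint, arrangement_position) >= threshold:
        return far_object   # e.g. a large, simplified billboard readable from afar
    return near_object      # e.g. a detailed close-up advertisement

# Example: a viewpoint 80 units away selects the far appearance.
print(select_object_by_distance((0, 0, 0), (80, 0, 0), "billboard", "detail"))
```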
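Claims 6 through 14, 19, and 20 only require that the two appearances differ (in contents, size, color, or data format) and that different objects be shown for different conditions of the event such as league, team, location, or event type. The sketch below shows one assumed way to organize such variants; the attribute names, condition keys, and example advertisements are hypothetical.

```python
# Illustrative sketch only: attribute names, condition keys, and example
# values are hypothetical and not taken from the application.
from dataclasses import dataclass

@dataclass
class AdvertisementObject:
    contents: str       # product or message shown by the advertisement
    size: float         # display size in the virtual space
    color: str          # dominant color of background, product, or character
    data_format: str    # e.g. "moving_image"/"still_image" or "3d"/"2d"

# Different first/second object pairs for different conditions of the event.
AD_CATALOG = {
    ("league_a", "team_x"): {
        "first":  AdvertisementObject("Sneaker launch", 3.0, "red",  "moving_image"),
        "second": AdvertisementObject("Sneaker launch", 1.0, "gray", "still_image"),
    },
    ("league_b", "team_y"): {
        "first":  AdvertisementObject("Sports drink", 3.0, "blue", "3d"),
        "second": AdvertisementObject("Sports drink", 1.0, "blue", "2d"),
    },
}

def objects_for_event(league: str, team: str) -> dict:
    """Look up the first/second advertisement pair for the event's conditions."""
    return AD_CATALOG[(league, team)]
```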
PCT/JP2025/012332 2024-03-27 2025-03-27 Information processing system, method, and program Pending WO2025206103A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2024051624A JP2025150632A (en) 2024-03-27 2024-03-27 Information processing system, method and program
JP2024-051624 2024-03-27

Publications (1)

Publication Number Publication Date
WO2025206103A1 true WO2025206103A1 (en) 2025-10-02

Family

ID=97215968

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2025/012332 Pending WO2025206103A1 (en) 2024-03-27 2025-03-27 Information processing system, method, and program

Country Status (2)

Country Link
JP (1) JP2025150632A (en)
WO (1) WO2025206103A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140278847A1 (en) * 2013-03-14 2014-09-18 Fabio Gallo Systems and methods for virtualized advertising
WO2022239403A1 (en) * 2021-05-12 2022-11-17 株式会社コロプラ Program, information processing method, information processing device, and system

Also Published As

Publication number Publication date
JP2025150632A (en) 2025-10-09

Similar Documents

Publication Publication Date Title
KR102740573B1 (en) Expanded VR participation and viewing of esports events
US10609308B2 (en) Overly non-video content on a mobile device
US10099147B2 (en) Using a portable device to interface with a video game rendered on a main display
US11006160B2 (en) Event prediction enhancements
US9143699B2 (en) Overlay non-video content on a mobile device
US10896322B2 (en) Information processing device, information processing system, facial image output method, and program
US20170064240A1 (en) Player position and auxiliary information visualization
WO2019234879A1 (en) Information processing system, information processing method and computer program
CN111148554A (en) Virtual reality presentation of real world space
EP3425483B1 (en) Intelligent object recognizer
KR101400923B1 (en) Method, system and computer-readable recording medium for broadcasting sports game using simulation
CN116963809A (en) In-game dynamic camera angle adjustment
US20210385554A1 (en) Information processing device, information processing method, and information processing program
JP7612384B2 (en) Information processing device, information processing method, and program
JP7343588B2 (en) System and method for customizing and compositing video feeds on client devices
JP6609078B1 (en) Content distribution system, content distribution method, and content distribution program
CN109104619A (en) Image processing method and device for live streaming
WO2025206103A1 (en) Information processing system, method, and program
US11606608B1 (en) Gamification of video content presented to a user
WO2022180973A1 (en) Comment art management system, comment art management method, comment art management program, and computer-readable recording medium
JP6942898B1 (en) Programs, methods, information processing equipment, systems
US20250292446A1 (en) Video graphic overlay device and method
CN120731602A (en) Information processing device, information processing method, and program
JP2020167661A (en) Content distribution system, content distribution method, and content distribution program
JP2020150520A (en) Attention level utilization device, attention level utilization method, and attention level utilization program

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 25777804

Country of ref document: EP

Kind code of ref document: A1